p1p1

Member · Content Count: 22 · Community Reputation: 143 Neutral

About p1p1
  1. Yep... I guess all physics engines have some built-in bias in their joint correction... Therefore it still looks like it's working naturally, though it really isn't %) I will still try multiple joints in Chipmunk; we'll see where that gets me. My initial question was about propagating the impulse that is 'forgotten' in your model through various paths via the different joint-to-body links, according to the actual distribution of the impulse along the 2D normal to the hit surface. I think I also need some stress color-coding for the bricks, so I'll be busy with that for a while ) Thank you for your answers!
  2. In other words, how would broken pieces behave in your wall, if you'd completely removed the projectile from the world right upon impact? The whole wall would be shattered, but would still hold its shape and bricks would still be together in their initial positions, with all broken joints now removed, right?
  3. Ok, but as far as I understand, in the real world momentum is conserved. If I do as you say and remove the joints immediately after the initial touch, but don't apply any additional forces, would it still be correct? Wouldn't a significant part of the impulse be lost on the way? Say a ball hits a single brick in the wall, and the brick is detached from all the other bricks. In the real world, the detached piece would gain some velocity from the impact and fly off the wall; otherwise it would have just stayed in place in the wall, right? Even if the ball bounced back off the wall, there could be some chips flying off the other side, right? So they definitely gain some of the momentum of the impact in the real world. But what we see in the demo is not actually a proper momentum simulation; it is the result of a heavy projectile simply continuing to push the broken pieces inwards into the wall during sequential stepping after the initial hit. In the demo, a projectile hits the wall, a piece is broken off, but it stays in its initial position until the moving projectile shifts it in some direction. It does not gain any velocity upon impact in this demo, right?
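The conservation argument above can be checked with a minimal 1D sketch (hypothetical numbers, perfectly elastic collision assumed; a real cemented wall would be far more inelastic): a projectile hits an already detached brick head-on, and the brick must come away with some velocity for total momentum to be conserved.

```python
def elastic_collision_1d(m1, v1, m2, v2):
    """Post-collision velocities for a 1D perfectly elastic collision."""
    u1 = ((m1 - m2) * v1 + 2 * m2 * v2) / (m1 + m2)
    u2 = ((m2 - m1) * v2 + 2 * m1 * v1) / (m1 + m2)
    return u1, u2

# Heavy projectile (10 kg at 5 m/s) vs. a stationary 1 kg brick:
u_proj, u_brick = elastic_collision_1d(10.0, 5.0, 1.0, 0.0)
# The brick flies off (u_brick > 0) and total momentum is unchanged:
# 10 * u_proj + 1 * u_brick == 10 * 5
```

If the brick stayed frozen in place until pushed, as described for the demo, momentum would indeed not be transferred at the moment of impact.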
  4. That's almost exactly what I am talking about! But how do you propagate collisions in the demo? Do you remove joints and then reapply forces to the detached bodies, or do you simply remove joints on first contact and continue stepping? I have Box2D, Chipmunk and Bullet set up in my engine, so I can use any of them; they seem to have similar APIs, but in any of them I still have to implement user-defined behaviour for breaking joints and force propagation. Breaking joints under forces exceeding some maximum threshold is all clear to me. But what do you do with the impulse upon detachment of the bodies? Also, how did you make the wall break into chunks instead of separate bricks? With a simple 'weak joint' solution, the damage would have been concentrated around the point of impact (it would have detached the closest bricks), but it would not have caused cracks that chip large chunks off the wall. So there must be something going on with recalculating connectivity in this demo, though I haven't figured out exactly what is happening there yet. Can you give some insights on that?
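The "recalculating connectivity" step mentioned above is commonly done with a flood fill: after broken joints are removed from the brick adjacency graph, each remaining connected component is one chunk. A hedged sketch (the `links` adjacency-list shape is illustrative, not any engine's API):

```python
from collections import deque

def find_chunks(links):
    """Return connected components (chunks) of the brick graph.

    `links` maps each brick id to the ids it is still cemented to;
    broken joints are simply absent from the adjacency lists.
    """
    seen, chunks = set(), []
    for start in links:
        if start in seen:
            continue
        chunk, queue = [], deque([start])
        seen.add(start)
        while queue:
            brick = queue.popleft()
            chunk.append(brick)
            for neighbor in links[brick]:
                if neighbor not in seen:
                    seen.add(neighbor)
                    queue.append(neighbor)
        chunks.append(chunk)
    return chunks

# 2x2 wall where joints 0-1 and 0-2 broke: brick 0 becomes its own chunk.
links = {0: [], 1: [3], 2: [3], 3: [1, 2]}
find_chunks(links)  # → [[0], [1, 3, 2]]
```

Running this after each batch of joint breaks is one plausible way a demo could produce multi-brick chunks rather than a spray of single bricks.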
  5. 'In addition to conservation of momentum, I would try to look at the average direction of each surface involved in the collision, both new cracks and the initial surface hit. My guess at a quick solution is that the momentum transfer would be proportional to the dot product of the impact direction and the crack's average normal. Cracks that form with normals parallel to the impact direction will help the impact particle transfer momentum to the fractured chunk, while perpendicular cracks will not.' Thanks, Aledrinker! As for the average normal of the crack: I think I got it. Depending on the crack directions, long chunks in the path of the projectile would impose more resistance on the projectile, is that correct?
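The dot-product weighting suggested above can be sketched in a few lines (an assumption-laden toy, with unit vectors and clamping to zero so back-facing cracks transfer nothing):

```python
def transfer_fraction(impact_dir, crack_normal):
    """Cosine weighting between a 2D impact direction and a crack's
    average normal, clamped to [0, 1] for unit input vectors."""
    dot = impact_dir[0] * crack_normal[0] + impact_dir[1] * crack_normal[1]
    return max(dot, 0.0)

impact = (1.0, 0.0)         # projectile travelling along +x
parallel = (1.0, 0.0)       # crack normal parallel to the impact
perpendicular = (0.0, 1.0)  # crack normal perpendicular to the impact

transfer_fraction(impact, parallel)       # full transfer: 1.0
transfer_fraction(impact, perpendicular)  # no transfer: 0.0
```

A chunk whose bounding cracks mostly face the projectile would then absorb more of the impulse, matching the intuition about long chunks in the projectile's path resisting it.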
  6. Thank you so much, Buckeye! The model you proposed with proportional distribution of energy sounds almost like what I need. The only thing left is to account not only for linear velocities, but also for angular velocities. I mean, I need to calculate points and normals in order to add rotations applied at certain points of the newly separated chunks, not just linear velocities at the chunks' centers. If I just apply a linear velocity to each chunk (directed from the point of impact to each chunk's center), it makes them fly apart, but that looks kind of 'oversimplified' and not as realistic as it could be. One option would be to simply randomize the rotations. That would probably generate better breaking effects, but still not realistic enough. I think I need to make them fly and rotate with proper linear and angular velocities, but I am not good enough at the maths and physics for that. I would appreciate it if you could give a short explanation of how to define the directions of propagation of the hit impulse. I may be misunderstanding something, but what are the directions of the energy/chunk velocities in your previous post?
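For the linear-plus-angular part asked about above, rigid-body mechanics gives a standard answer (this is textbook physics, not necessarily Buckeye's exact method): an impulse J applied at a contact point off the center of mass changes linear velocity by J/m and angular velocity by (r × J)/I, where r runs from the center of mass to the contact point. In 2D the cross product collapses to the scalar rx·Jy − ry·Jx.

```python
def apply_impulse_2d(mass, inertia, center, point, impulse):
    """Return (delta_linear_velocity, delta_angular_velocity) for an
    impulse applied at `point` to a free body centered at `center`."""
    jx, jy = impulse
    rx, ry = point[0] - center[0], point[1] - center[1]
    dv = (jx / mass, jy / mass)          # linear: J / m
    dw = (rx * jy - ry * jx) / inertia   # angular: (r x J) / I
    return dv, dw

# Impulse (10, 0) applied at the top edge of a chunk centered at origin:
dv, dw = apply_impulse_2d(mass=2.0, inertia=1.0,
                          center=(0.0, 0.0), point=(0.0, 0.5),
                          impulse=(10.0, 0.0))
# dv == (5.0, 0.0); dw == -5.0, so the chunk spins as it flies off
```

Feeding each chunk its share of the impulse at the actual contact (or crack) point, rather than at its center, produces the coupled tumble-and-fly motion instead of purely radial separation.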
  7. Thank you, Buckeye, for your answer! The thing is, I know what I want, but I don't know all the details I need to make it... As for randomness: that's actually implied by the random nature of fractures, so yes, I will have randomness, in either a variable strength of the cement or just a runtime random chance. That's the easy part. The part that I don't understand is how to calculate the proportional split of the force of the impact and transfer it to the newly freed chunks, in order to launch them once they are actually split.
  8. Hello, everybody!

     I am trying to implement destructible objects in my game, and I am currently stuck on the 'physics of impulse propagation' (I don't know how to say it properly). My problem can be described in the following way.

     Imagine I have a circular projectile and a rectangular obstacle, both of them on a 2D plane for simplicity. The projectile is launched towards the obstacle. The obstacle is a wall consisting of small rectangular 'bricks' placed next to each other (in some kind of regular pattern, to form a 2D 'wall'). The bricks are atomic (indivisible) solid hard shapes: they cannot break and cannot deform. The bricks are glued together with cement to make a strong 'wall obstacle'. The cement between the bricks is virtual; there is nothing physical in between the bricks. The cement is just a simplification of the weakest points between any two neighboring bricks, so if a crack forms in the wall, it would most probably propagate through the cement. In my simplified model the bricks are atomic, so the only path for a crack to propagate inside the wall is through the cement between bricks. Therefore, when a projectile hits the wall, depending on the energy of the collision, the wall should break into several single bricks and patches of several bricks of various sizes.

     If a projectile hits the wall with significant velocity, it should penetrate the wall. What I want to achieve is somewhat simplified, but still realistic-enough-looking, physics of breaking a hard body into pieces, specifically the physics of the debris produced by a hit. In general, after a penetrating hit, there should be several smaller chips flying off the wall (through the far side), some bigger chunks of several bricks flying sideways, and two big separated parts of the obstacle wall with most of the bricks still in them.

     What I have as 'given' at any point in time:
     1. all the masses and velocities
     2. the impulse of the initial hit
     3. the connectivity graph between the bricks
     4. the strength threshold of the cement (the strength of each link between bricks in the connectivity graph) in arbitrary 'units', say some maxForce / somePeriodOfTime. For simplicity, the strength of the cement is uniform across the wall.

     Here's a rough illustration of the problem, with my comments in blue, yellow and orange. The bricks in this image are not glued together, so the wall simply flies apart. The projectile collided with the green vertical wall a few moments ago: [attachment=28836:Screen-Shot-2015-08-22-at-18.41.45.jpg]

     My problem reduces to calculating how the impulse from a projectile should propagate between the bricks (to make cracks, to penetrate obstacles, and to chip off some bricks). Many games use Voronoi fracturing, but as far as I know it does not account for impulse propagation; it just splits a polygon or a mesh into roughly equal areas or volumes. In my case, a crack should go between bricks, which can be thought of as the result of fracturing with Voronoi, but I'm not using Voronoi because my wall is already 'prefractured' in some sense. So my question is actually this: what kind of physics do I need to dive into to tackle the 'stress/impulse propagation inside a structure of several bodies' problem? Intuitively, the impulse has to split along the fracture lines somehow to disperse the hit, but the actual mechanics of that is unknown to me; I really need some supervising guidance here. Links are much appreciated!

     Thank you in advance for your help, and sorry for such a long text.
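Given #4, the uniform cement strength, one common starting point is a per-step threshold check: accumulate the impulse the solver pushed through each joint during the step and remove any joint that exceeded the cement strength. A hedged sketch; `joint_impulses` and `cement_strength` are illustrative names, not a specific engine's API (Box2D, Chipmunk and Bullet each expose a joint reaction force or impulse you could feed in here):

```python
def break_joints(joint_impulses, cement_strength):
    """Split joints into (surviving, broken) by a strength threshold.

    `joint_impulses` maps a joint key, e.g. a (brick_a, brick_b) pair,
    to the impulse transmitted through it this step.
    """
    surviving, broken = {}, []
    for joint, impulse in joint_impulses.items():
        if abs(impulse) > cement_strength:
            broken.append(joint)
        else:
            surviving[joint] = impulse
    return surviving, broken

# Joints keyed by (brick_a, brick_b) pairs; impulses in arbitrary units:
impulses = {(0, 1): 12.0, (1, 2): 3.0, (2, 3): 8.5}
surviving, broken = break_joints(impulses, cement_strength=8.0)
# broken == [(0, 1), (2, 3)]; only joint (1, 2) survives
```

Edges listed in `broken` would then be removed from given #3, the connectivity graph, before deciding which bricks have separated into chunks.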
  9. Hello, everyone! I have a problem with my Phong shader... I removed all the textures, normal maps and specular maps to make a clean test of the lighting in the following scene. There are five spheres in a line (each sphere has exactly the same Z and Y coordinates and a different X coordinate). Above and to the right of the spheres there's a light source. It has the same Z coordinate as the spheres, but it is shifted to the right and upwards. Somehow the light is distributed in a strange fashion, so that the sphere closer to the light source 'reflects' less light towards the viewpoint (the camera)... [attachment=28500:problem.jpg] From what my eyes are used to seeing, the leftmost sphere should look like the rightmost one, and vice versa. But the actual lit area of the spheres seems to be in an inverse relation to the position of the light source. I don't really know how to express that in proper terms. Can anybody explain this? It does not look intuitive; I think the light is actually reversed somehow, but I cannot understand the reason (it must be some direction vector in the shader code or something)...
Here's the basic Phong shader I'm using:

// vertex shader
#version 300 es

in vec3 position;
in vec3 normal;
in vec2 texcoord;

out vec4 normal0;
out vec4 position0;

uniform mat4 modelMatrix;
uniform mat4 modelViewMatrix;
uniform mat4 modelViewProjectionMatrix;
uniform mat3 normalMatrix;

void main () {
    gl_Position = modelViewProjectionMatrix * vec4 (position, 1.);
    position0 = modelMatrix * vec4 (position, 1.);
    normal0 = modelMatrix * vec4 (normal, 0.);
}

// fragment shader
#version 300 es

// fragment properties
in highp vec4 position0;
in highp vec4 normal0;

// camera properties
uniform highp vec3 cameraPosition;

// light properties
uniform highp vec3 lightPosition;
uniform lowp vec4 lightColor;

// material properties
uniform lowp vec4 emissiveColor;
uniform lowp vec4 diffuseColor;
uniform lowp vec4 specularColor;
uniform highp float shininess;

// global ambient light
uniform lowp vec4 ambientColor;

// output
out highp vec4 fragmentColor;

void main () {
    // emissive component is as it is
    // diffuse component
    highp vec4 N = normalize (normal0);
    highp vec4 L = normalize (vec4 (lightPosition, 1.) - position0);
    fragmentColor = (emissiveColor
        + ambientColor
        + max (dot (N, L), 0.) * lightColor * diffuseColor
        + pow (max (dot (reflect (-L, N),
                         normalize (vec4 (cameraPosition, 1.) - position0)), 0.),
               shininess) * lightColor * specularColor);
}
  10. Thank you all for the clarifications... the difference is much clearer when illustrated )
  11. Hello! Speaking of 3D and lighting, is it true that a 'normal map' is the same thing as a 'height map' or 'bump map'? Why does this map go by different names? Is it just because it does not have a single conventional name, or do these things really differ in some specific way? As far as I understand, raising or lowering a point on a lit surface is visually the same as displacing the normal to the surface at that point. Is that correct?
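For what it's worth, the two maps store different data but are related: a height map stores one scalar elevation per texel, while a normal map stores a per-texel surface direction, and a normal map can be derived from a height map by taking finite differences. A minimal sketch of that derivation (illustrative only; real tools also scale the gradient by a bump-strength factor):

```python
import math

def height_to_normal(height, x, y):
    """Approximate the surface normal at (x, y) of a 2D height grid
    using central differences, clamped at the borders."""
    h, w = len(height), len(height[0])
    dx = height[y][min(x + 1, w - 1)] - height[y][max(x - 1, 0)]
    dy = height[min(y + 1, h - 1)][x] - height[max(y - 1, 0)][x]
    # 2.0 accounts for the two-texel span of the central difference
    nx, ny, nz = -dx, -dy, 2.0
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    return (nx / length, ny / length, nz / length)

# A flat height map yields a normal pointing straight out of the surface:
flat = [[0.0, 0.0], [0.0, 0.0]]
height_to_normal(flat, 0, 0)  # → (0.0, 0.0, 1.0)
```

So the intuition in the question is roughly right: perturbing heights and perturbing normals produce similar shading effects, but the normal map bakes the derivative in, which is why the names are not interchangeable.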
  12. Ok, here's my final result. I finally got it working. Most of the code of my shader was taken from here: http://www.3dgep.com/texturing-and-lighting-with-opengl-and-glsl/ . I just tweaked it a little bit for my naming and structures... Thanks, IYP! [attachment=28419:Screen Shot 2015-08-04 at 00.38.50.png]
  13. Ok, thanks again for the answer! I'll try to move it to fragment shader and I'll post the results here as soon as it is done... Thank you.
  14. To be more specific, I get the following in the program link log: WARNING: Could not find vertex shader attribute 'normal' to match BindAttributeLocation request. (I am not using the 'normal' attribute in that shader, for example.) The program still compiles, links and works, but I wanted to get rid of those annoying warnings, and not just silence them but remove the cause.
  15. Thank you for your answer. The problem is that I want to know the names of the attributes prior to binding, because I want to determine the set of bound attributes at runtime when loading the shader, so that I can link the attributes with my parameters later, at the rendering stage. Therefore I need to know the names and locations of the attributes. Say I have a shader which receives everything except normals, or everything except texture coordinates. A call to glBindAttribLocation would then complain that it cannot find a named attribute 'normal' or 'texcoord0' if they are not used in the shader code itself. So, how do I know which attributes (with their respective names) to bind when loading and building the shader, without parsing the shader code manually?