# coderchris

• Rank: Member
• Role: Programmer
• Interests: Programming


1. ## sweep test with two moving bodies

Spot on. That's typically how you would handle the case of two moving rigid objects: treat object A as fixed and subtract its motion from object B, then run a one-moving-body sweep. I'm not sure how Bullet's swept function works internally, but if you want to implement it yourself, you could look into conservative advancement.
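
For anyone implementing the one-body reduction themselves, here's a minimal sketch for two moving spheres (a plain quadratic swept-sphere test, not conservative advancement or Bullet's internals; all names are mine):

```python
import math

def sweep_spheres(pa, va, ra, pb, vb, rb):
    """Earliest time of impact in [0, 1] for two moving spheres,
    or None if they never touch during the step.
    Works by fixing B: A travels with the relative velocity va - vb."""
    d = [va[i] - vb[i] for i in range(3)]    # relative motion over one step
    s = [pa[i] - pb[i] for i in range(3)]    # A's start relative to B
    r = ra + rb                              # sweep a point against an inflated sphere
    # Solve |s + t*d|^2 = r^2  ->  a*t^2 + 2*b*t + c = 0
    a = sum(di * di for di in d)
    b = sum(s[i] * d[i] for i in range(3))
    c = sum(si * si for si in s) - r * r
    if c <= 0.0:
        return 0.0                           # already overlapping at t = 0
    if a == 0.0:
        return None                          # no relative motion
    disc = b * b - a * c
    if disc < 0.0:
        return None                          # trajectories miss entirely
    t = (-b - math.sqrt(disc)) / a
    return t if 0.0 <= t <= 1.0 else None
```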
2. ## Geometric stiffness term in Stable Constrained Dynamics

Yeah, that sounds right. I emailed the author and asked for clarification on this. He confirmed that the final K matrix is indeed the linear combination of Hessians. I've implemented it this way in my physics engine and it seems to be working as described in the paper. I found MATLAB to be of great help when deriving K. For anyone else who gets stuck, here is an example of computing K for a simple length constraint between points "A" and "B" with rest length "l":

```matlab
syms A1 A2 B1 B2 lambda l
A = [A1; A2];
B = [B1; B2];
phi = norm(A - B) - l;
J = jacobian(phi, [A1, A2, B1, B2]);
K_til = jacobian(transpose(J), [A1, A2, B1, B2]) * lambda;
pretty(K_til)
```

The same logic should work for any type of constraint.
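
If symbolic math isn't handy, the same K can be cross-checked numerically. This is just my own sketch (central differences of Jᵀ for the 2D distance constraint, not code from the paper):

```python
def distance_jacobian(q):
    """J of phi = |A - B| - l for 2D points A = (q0, q1), B = (q2, q3).
    (The rest length l drops out of the gradient.)"""
    ax, ay, bx, by = q
    dx, dy = ax - bx, ay - by
    n = (dx * dx + dy * dy) ** 0.5
    return [dx / n, dy / n, -dx / n, -dy / n]

def geometric_stiffness(q, lam, h=1e-6):
    """K = lambda * d(J^T)/dq via central differences; 4x4 here."""
    K = [[0.0] * 4 for _ in range(4)]
    for j in range(4):
        qp = list(q); qp[j] += h
        qm = list(q); qm[j] -= h
        Jp = distance_jacobian(qp)
        Jm = distance_jacobian(qm)
        for i in range(4):
            K[i][j] = lam * (Jp[i] - Jm[i]) / (2 * h)
    return K
```

A useful sanity check: K should come out symmetric, since it is a scaled Hessian of a scalar function.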
3. ## Geometric stiffness term in Stable Constrained Dynamics

This is in reference to "Stable Constrained Dynamics": https://hal.inria.fr/hal-01157835/document, equations (18) and (21). I'm having trouble understanding how to build this K "geometric stiffness" term:

K = (∂J^T / ∂x) λ

where J is the constraint Jacobian and λ holds the constraint force magnitudes.

What I do know: based on its usage in (21), K should be a (3n x 3n) matrix in 3D, where n is the number of particles.

What I'm confused about: the Jacobian J is a (C x 3n) matrix, where C is the number of constraints, and λ is (C x 1). This doesn't seem to work out in terms of the matrix dimensions... What am I missing here?

If I consider only a single constraint, the dimensions do appear to work out: λ ends up a scalar and K ultimately ends up (3n x 3n). However, that leads to the question of how to build K such that it contains all of the individual constraint K's (one K for each constraint, I guess)?
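
For what it's worth, the single-constraint case generalizes by summation, K = Σ_c λ_c H_c, where H_c is the Hessian of constraint c's scalar function with respect to all DOFs (this is the "linear combination of Hessians" the reply above ends up confirming). A toy sketch of that assembly, with dense matrices and invented names:

```python
def assemble_K(hessians, lambdas, n_dof):
    """K (n_dof x n_dof) = sum over constraints c of lambda_c * H_c.
    Each H_c is the (n_dof x n_dof) Hessian of one constraint function."""
    K = [[0.0] * n_dof for _ in range(n_dof)]
    for H, lam in zip(hessians, lambdas):
        for i in range(n_dof):
            for j in range(n_dof):
                K[i][j] += lam * H[i][j]
    return K
```

In practice each H_c is sparse (only the DOFs the constraint touches are nonzero), so a real engine would accumulate per-constraint blocks rather than dense matrices.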
4. ## Divide and conquer broadphase

Thanks for the feedback Aressera. Yeah, it's a tricky comparison. I guess it comes down to whether or not you are tracking pairs between frames. I have a benchmark setup in which about half my scenes have everything moving. When I'm comparing methods, I use the total time required to simulate all scenes for some number of steps. So it's not a very scientific comparison, to be sure.

For my BVH, I refit each frame and apply internal tree rotations based on the SAH cost (using the method presented here: https://www.cs.utah.edu/~aek/research/tree.pdf). It's very likely that I have a sub-optimal BVH implementation though (no SIMD, poor cache locality). Indeed, if I track pairs between frames and don't query objects which haven't left their bounds, it costs very little. That requires me to use a fattened AABB so that I don't re-test an object unless it has moved out of its bounds. However, the fattened AABB generates a bunch of extra pairs that my narrowphase has to check, which is particularly expensive in my case. I think that's why I'm seeing lower than normal overall performance. One day I will do a more scientific comparison with some existing BVH implementations.

Overall, the divide and conquer method is just a novelty due to the memory savings and simplicity - I don't foresee using it in my own engine because I do want to optimize for the case where nothing is moving.

That's very interesting. The octree is one I haven't tried yet. I will have to give it a go! I'm still trying to beat my uniform hash grid. For the octree, do you only insert an object into the node in which it entirely fits, or into all nodes it overlaps? I'm guessing just the one it fits in, since the object has a pointer to the node it's in. What metric did you use for determining when to subdivide a node? I suppose when testing for collisions, you need to check the node an object is in, as well as traverse down any intersecting children, right?
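
As a side note, the "don't re-test until it leaves its fat bounds" bookkeeping is tiny. A sketch with an illustrative margin and my own function names:

```python
def aabb_contains(outer, inner):
    """True if `inner` (lo, hi) lies entirely inside `outer`."""
    return all(outer[0][i] <= inner[0][i] and inner[1][i] <= outer[1][i]
               for i in range(3))

def fatten(aabb, margin):
    """Grow an AABB by `margin` on every side before storing it."""
    lo, hi = aabb
    return ([c - margin for c in lo], [c + margin for c in hi])

def needs_update(fat_aabb, tight_aabb):
    """Only re-insert / re-query an object once its tight bounds
    escape the fat bounds stored in the broadphase structure."""
    return not aabb_contains(fat_aabb, tight_aabb)
```

The trade-off discussed above lives in the margin: a bigger margin means fewer re-insertions but more false-positive pairs handed to the narrowphase.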
5. ## Divide and conquer broadphase

Ah yes, that should be std::partition (I "cleaned" up my implementation for the post but missed that :P). Yeah, it's sort of like a top-down BVH build with pair checking at the same time. The main difference from a BVH is that an object can potentially appear in both subtrees, hence the need to partition twice. And yeah, that makes sense - you could copy the index array, then run left and right in parallel. It wouldn't be fixed memory usage anymore, but we've got plenty of memory these days, so no biggie.
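
A toy version of the recursion being described (2D AABBs, brute force at small leaves, my own names; an object straddling the split plane goes into both halves, which is why a set is used to deduplicate the pairs):

```python
def overlap(a, b):
    """AABB overlap test; a and b are (lo, hi) tuples of 2D points."""
    return all(a[0][k] <= b[1][k] and b[0][k] <= a[1][k] for k in range(2))

def dac_pairs(boxes, ids, pairs, axis=0):
    """Divide-and-conquer broadphase: partition `ids` recursively and
    collect overlapping index pairs into the set `pairs`."""
    if len(ids) > 4:
        # split at the mean box center along the current axis
        mid = sum((boxes[i][0][axis] + boxes[i][1][axis]) * 0.5
                  for i in ids) / len(ids)
        left = [i for i in ids if boxes[i][0][axis] <= mid]
        right = [i for i in ids if boxes[i][1][axis] > mid]
        if len(left) < len(ids) and len(right) < len(ids):
            dac_pairs(boxes, left, pairs, 1 - axis)
            dac_pairs(boxes, right, pairs, 1 - axis)
            return
    # small (or degenerate) node: brute force; the set swallows the
    # duplicate pairs found when both objects straddle the split
    for i in range(len(ids)):
        for j in range(i + 1, len(ids)):
            a, b = ids[i], ids[j]
            if overlap(boxes[a], boxes[b]):
                pairs.add((min(a, b), max(a, b)))
```

A pair with one object strictly left of the split and one strictly right can never overlap on that axis, which is what makes the double partition correct.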

7. ## Practical (robust) vertex-face continuous collision detection?

If you are looking for extreme robustness in a swept test, check out the method outlined in this paper: http://gamma.cs.unc.edu/BSC/BSC.pdf It covers two variants. There is a fast floating point version which, while not exact, will be correct in the vast majority of cases. I use this variant in my own project and it works great. The paper also covers an exact version which uses extended precision arithmetic, and it should return the exact answer 100% of the time.

This method says nothing about resolving the collision. You could use a swept test, and then resolve the collision by moving the vertex and triangle such that they lie on the same plane. The issue with this is that on the next iteration / step, it would be impossible to know which side of the triangle the vertex should be on. I've used two approaches to deal with this issue.

1) Track collisions between steps, remembering which side of the triangle the vertex should be on. This way, you can even skip the expensive swept test if a vertex is known to be colliding with a triangle. This is what I would call a corrective approach: you are correcting penetration issues after finding them, potentially over the course of many steps. There are a few issues with this - the biggest being that special handling is required when the vertex reaches the edge of the triangle while still in the "colliding" state. You essentially have to transfer the collision to adjacent triangles to avoid having different saved sides for adjacent triangles.

2) Have a margin on all triangles. This requires two tests - the swept test, as well as a discrete test which simply pushes the vertex away from triangles if it is closer than the margin. This is what I would call a preventative approach: it prevents any collisions from persisting between iterations / steps. This method is less sensitive to the precision of the swept test than #1. The major issue here is that all collisions must be resolved before advancing to the next step, otherwise you will see missed collisions.

#1 is probably more visually pleasing since you don't have an artificial margin separating things; however, I have had more success with #2 since there are fewer strange cases to deal with.
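
The discrete half of approach #2 can be sketched as a simple margin push (point vs. triangle plane only, my own names; a real implementation would also handle the triangle's edges and vertices, plus the swept test):

```python
def margin_push(p, tri_point, tri_normal, margin):
    """If point p is within `margin` of the triangle's plane (measured
    along the normal), push it out to exactly `margin` along the normal.
    tri_normal is assumed to be unit length."""
    d = sum((p[i] - tri_point[i]) * tri_normal[i] for i in range(3))
    if d < margin:
        push = margin - d
        return tuple(p[i] + push * tri_normal[i] for i in range(3))
    return p
```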
8. ## When contact resolution fails

As an absolute last resort, you could look into using "rigid impact zones". The technique was originally devised for cloth simulation, but in my experience it works surprisingly well for complex rigid body collision scenarios. It will not resolve overlap that existed at the beginning of the time step, but it can guarantee that no new overlap exists by the end of the step. The basic idea is to take a group of colliding bodies for which normal collision resolution failed to find a solution, and treat that group as a single rigid body. This is done by computing, for the group, an average linear and angular velocity, then applying that to all objects in the group. This should all be done as the very last step in your solver. The method is detailed here (section 7.5): http://physbam.stanford.edu/~fedkiw/papers/stanford2002-01.pdf
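
The averaging step might look like this for a group of particles (2D so the angular terms stay scalar; rigid bodies would need orientations and full inertia tensors; names are mine, not from the paper):

```python
def rigidify(masses, positions, velocities):
    """Rigid impact zone averaging, 2D particle version: replace every
    velocity with v_cm + w x r so the group moves as one rigid body
    while preserving its total linear and angular momentum."""
    M = sum(masses)
    cx = sum(m * p[0] for m, p in zip(masses, positions)) / M
    cy = sum(m * p[1] for m, p in zip(masses, positions)) / M
    vcx = sum(m * v[0] for m, v in zip(masses, velocities)) / M
    vcy = sum(m * v[1] for m, v in zip(masses, velocities)) / M
    L = 0.0  # angular momentum about the center of mass
    I = 0.0  # scalar moment of inertia (point masses, 2D)
    for m, p, v in zip(masses, positions, velocities):
        rx, ry = p[0] - cx, p[1] - cy
        L += m * (rx * (v[1] - vcy) - ry * (v[0] - vcx))
        I += m * (rx * rx + ry * ry)
    w = L / I if I > 0.0 else 0.0
    out = []
    for p in positions:
        rx, ry = p[0] - cx, p[1] - cy
        out.append((vcx - w * ry, vcy + w * rx))  # v_cm + w x r
    return out
```

Because every member gets the same rigid motion, no two members of the group can approach each other during the step, which is where the "no new overlap" guarantee comes from.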
9. ## Position based dynamics second order integration with inequality constraints

Ah interesting, hadn't seen those slides. In the slides they reference this paper (https://www.cs.ubc.ca/~rbridson/docs/english-siggraph08-cloth.pdf) which talks about performing an explicit velocity correction to counteract the effect. I'm guessing this correction would have to go after the velocity is computed. I will experiment with it some.
10. ## Position based dynamics second order integration with inequality constraints

Hello,

In reference to the position based dynamics survey, section 4.2.5: http://matthias-mueller-fischer.ch/talks/2017-EG-CourseNotes.pdf

The second order integration method they derive works very well for conserving energy. The original PBD integration loses a lot of energy, and it is especially apparent during rotation of groups of constrained particles. With the second order method, it looks almost perfect.

However, I'm seeing bouncing and generally inconsistent reactions when this method is paired with any inequality constraints - most importantly, collisions. They mention that no changes are required to the inner solve loop, but it seems as though inequality constraints have to be treated specially for this type of integration to work. I don't understand the math well enough to know how or why it isn't working. Does anyone know how to treat inequality constraints (collisions) properly with second order integration in position based dynamics?

My current attempt to fix it is to change collisions to equality constraints, and essentially pull the colliding object back towards the collision point (like a distance constraint). This works and no longer bounces. However, now the problem is how to determine when to destroy the constraint...

Thanks!
-Chris
11. ## Help understanding term in XPBD (position based dynamics) paper

Nah, I don't think normalized constraints are any better from what I've seen. I was playing with the normalized distance constraint because in his other paper (strain based dynamics), all the constraints are normalized. That said, I ran into an issue getting the damping to work with strain based constraints - the damping was having little to no effect. This was due to those constraints being normalized. It turns out that the damping mechanism as described only works when the constraints are not normalized, so I'm having to unnormalize / scale the strain based constraints for damping to work on them. I haven't figured out the exactly correct scaling though. Still trying to wrap my head around the math :blink:
12. ## Help understanding term in XPBD (position based dynamics) paper

I think you are correct - that seems to work nicely! Thanks for the input. Everything seems to behave as described in the paper, definitely an improvement over regular PBD.

The stiffness parameter is a bit tricky though, for two reasons.

First, by their definition (if I'm interpreting correctly), alpha = 1 / stiffness. I found this does not completely solve a constraint in one step when stiffness = 1. So I use a slightly modified version: alpha = (1 / stiffness) - 1. This gives alpha = 0 when stiffness is 1, which fully solves a constraint in one step.

Second, although the valid range is [0, 1], how stiff the constraint becomes is highly dependent on the constraint's scaling. For example, with the usual distance constraint C(a,b) = |a - b| - restLength, a stiffness of 0.5 and a timestep of 1 gives a reasonably stiff constraint under XPBD. However, the normalized version C(a,b) = (|a - b| / restLength) - 1 applies almost zero correction, so you need a much higher stiffness value to achieve the same effect. Intuitively, this is due to the alpha in the denominator of the delta lambda "overpowering" the constraint. This is not a huge issue, since you can simply scale the stiffness value based on your choice of constraint function.
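
For reference, here's how I'd sketch one undamped XPBD iteration of a distance constraint (2D, my own names); with alpha = 0 it fully solves the constraint in a single step, matching the modified-alpha observation above:

```python
def xpbd_distance_step(p1, w1, p2, w2, rest, alpha, dt, lam):
    """One XPBD iteration of C = |p1 - p2| - rest (2D, undamped).
    w1/w2 are inverse masses, alpha is the compliance (inverse
    stiffness), lam is the accumulated lambda for this constraint."""
    dx, dy = p1[0] - p2[0], p1[1] - p2[1]
    dist = (dx * dx + dy * dy) ** 0.5
    C = dist - rest
    nx, ny = dx / dist, dy / dist          # grad C w.r.t. p1
    at = alpha / (dt * dt)                 # alpha-tilde from the paper
    dlam = (-C - at * lam) / (w1 + w2 + at)
    p1 = (p1[0] + w1 * dlam * nx, p1[1] + w1 * dlam * ny)
    p2 = (p2[0] - w2 * dlam * nx, p2[1] - w2 * dlam * ny)
    return p1, p2, lam + dlam
```

With the normalized constraint C = (|p1 - p2| / rest) - 1 the gradient shrinks by 1/rest, which is exactly why the same alpha ends up dominating the denominator.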
13. ## Help understanding term in XPBD (position based dynamics) paper

The paper in question is XPBD (http://mmacklin.com/xpbd.pdf). Equation (26) in the paper relates to adding damping to the constraint. It contains a term that I do not understand how to evaluate: the grad C(xi - x^n) in the numerator.

For example, with a distance constraint we have grad C(a, b) = (a - b) / |a - b|. This takes two positions and is itself a vector function, but the equation in question (26) evaluates to a scalar. What might they mean by this term as it relates to a distance constraint?
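
One reading that makes the dimensions work (and matches the replies above) is a dot product of the stacked constraint gradient with the per-particle displacement x - x^n, which collapses to a scalar. A sketch with invented names:

```python
def damping_numerator_term(grads, xs, xs_prev):
    """Interpret grad C (x - x^n) as a scalar: the dot product of the
    stacked gradient with the stacked displacement since the last step.
    grads / xs / xs_prev hold one 2D (or 3D) vector per particle."""
    total = 0.0
    for g, x, xn in zip(grads, xs, xs_prev):
        for gi, xi, xni in zip(g, x, xn):
            total += gi * (xi - xni)
    return total
```

For a distance constraint this sums dot(grad_a, a - a^n) + dot(grad_b, b - b^n), i.e. the velocity of the constraint along its own gradient, which is what a damping term should measure.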
14. ## Cubic interpolation over a triangular surface

This guy gives a good explanation for solving the system in 1D: http://www.paulinternet.nl/?page=bicubic It all makes sense to me. I would like to apply the same approach to my case, but I'm not sure what my system of equations actually is...
15. ## stable constraint solver with no warmstarting

In PBD (position based dynamics), we essentially move the predicted positions of each particle / object / fluid / whatever directly in order to solve any constraint (including friction).

[attachment=27691:PBD.png]

Pardon the crude drawing and probably poor explanation.

Consider this example of a static edge and a moving particle that collides with that edge. The black square is the initial position and the red square is its unconstrained predicted position: predicted position = current position + current velocity * dt.

To solve this collision constraint without friction, we project the predicted (red) position onto the edge, which results in the (purple) point. We simply set our predicted position to this purple point - that's the constraint solve.

To add friction, we can compute the (blue) intersection point between the edge and the particle's trajectory. We then compute a modified (green) position that is some amount between the (blue) intersection point and the (purple) projected point, based on how much friction we want. For example, for 100% friction we would set the predicted position to the blue point directly.

I guess it's not the most accurate way to model friction, but I did it this way because it lets me have a normalized friction parameter [0, 1].

I have not implemented motors, but in the PBD framework you do have a velocity to play with, so I'm fairly sure it could be done without too much hassle.
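
The black/red/purple/blue/green construction can be sketched in 2D like this (infinite edge, my own names; a real solver would clamp to the segment and skip non-colliding cases):

```python
def solve_edge_collision(pred, start, e0, e1, friction):
    """PBD-style collision response against an infinite 2D edge e0->e1.
    pred: unconstrained predicted position (red); start: position at the
    beginning of the step (black). friction in [0, 1]: 0 keeps the full
    tangential slide (purple), 1 pins the point at the impact (blue)."""
    ex, ey = e1[0] - e0[0], e1[1] - e0[1]
    elen2 = ex * ex + ey * ey
    # purple: project the predicted position onto the edge line
    t = ((pred[0] - e0[0]) * ex + (pred[1] - e0[1]) * ey) / elen2
    proj = (e0[0] + t * ex, e0[1] + t * ey)
    # blue: intersect the trajectory start->pred with the edge line
    nx, ny = -ey, ex                      # edge normal (not normalized)
    denom = (pred[0] - start[0]) * nx + (pred[1] - start[1]) * ny
    if denom == 0.0:
        return proj                       # moving parallel to the edge
    s = ((e0[0] - start[0]) * nx + (e0[1] - start[1]) * ny) / denom
    hit = (start[0] + s * (pred[0] - start[0]),
           start[1] + s * (pred[1] - start[1]))
    # green: blend between full slide (proj) and full stick (hit)
    return (proj[0] + friction * (hit[0] - proj[0]),
            proj[1] + friction * (hit[1] - proj[1]))
```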