ZenoGD

Members
  • Content count: 22
Community Reputation: 142 Neutral

About ZenoGD

  • Rank: Member
  1. I'm not sure if this is still true, but a couple of generations ago the GeForce cards stopped supporting user-defined OpenGL clip planes in hardware as another way to differentiate the Quadro line.
  2. Jacobi for Solving Fluid Pressure

    Quote: I had a closer look at the mathematics (wiki) today and I think it is not appropriate for shaders.

    Right, that's the other problem: it does certain things that are not suited to the GPU (like finding maximum values and sums), so it would be difficult to write and may not be faster than a CPU version in the end.

    Quote: On the other hand: The multigrid's restriction and interpolation steps smell like it could be done "abusing" a 2D down/upsampling of the textures, but I really don't know. Haven't found a good multigrid tutorial either, unfortunately. But if the boundary conditions pose a problem... Thanks for the info.

    I had exactly the same thought! Multigrid is just applying something like Jacobi or Gauss-Seidel on multiple "mipmaps" of your grid so that the effects of these solvers (which only access nearest neighbors) don't take so many iterations to travel from one end of the grid to the other. That is, you're helping those algorithms, which work well on small-scale divergences, work efficiently on all scales. GPUs are great at making smaller copies of textures. That's also why I keep saying that CG is needed more for liquids than for gases: with liquids you almost always have gravity, which adds a huge global divergence that's really hard for Jacobi/GS to get rid of.

    I think the most difficult part of a multigrid solver (just like fluid simulation in general) would be getting the boundary conditions right. How do you handle a one-cell-wide boundary after you downsample? That boundary will be lost, but it should still affect the fluid at that larger scale. It might be pretty easy if your sim has no internal boundaries.

    Quote: I'm also considering using a cubic smoothing as shown here at least for the visuals.

    Yep, I did that; it is a big improvement for rendering. Using the same function for lookup during advection will keep a lot of detail in your sim too.

    Quote: @ZenoGD: Your info is very helpful, thanks.

    You're welcome. Maybe I'll post some screenshots tonight comparing these things (cubic on/off, CG vs GS, etc.) to help people decide where to spend their effort.
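    Since the "mipmap" analogy carries the whole idea, here is a minimal CPU sketch of the restriction and prolongation steps it maps to. This is an illustrative sketch, not code from my sim: the grid layout and function names are made up, and boundary handling (the hard part discussed above) is omitted entirely.

        // Sketch: the restrict/prolong pair from a multigrid V-cycle (2D, row-major
        // grid, dimensions assumed even). Between these two steps a real solver
        // would run a few Jacobi or Gauss-Seidel smoothing sweeps on each level.
        #include <vector>

        // Restriction: average each 2x2 block of the fine grid -- exactly like
        // building the next mipmap level of a texture.
        std::vector<float> restrictGrid(const std::vector<float>& fine, int w, int h)
        {
            std::vector<float> coarse((w / 2) * (h / 2));
            for (int j = 0; j < h / 2; ++j)
                for (int i = 0; i < w / 2; ++i)
                    coarse[j * (w / 2) + i] = 0.25f *
                        (fine[(2 * j) * w + 2 * i]     + fine[(2 * j) * w + 2 * i + 1] +
                         fine[(2 * j + 1) * w + 2 * i] + fine[(2 * j + 1) * w + 2 * i + 1]);
            return coarse;
        }

        // Prolongation: upsample the coarse correction and add it onto the fine
        // grid (nearest-neighbor for brevity; bilinear weights work better).
        void prolongAndCorrect(std::vector<float>& fine, const std::vector<float>& coarse,
                               int w, int h)
        {
            for (int j = 0; j < h; ++j)
                for (int i = 0; i < w; ++i)
                    fine[j * w + i] += coarse[(j / 2) * (w / 2) + (i / 2)];
        }

    On the GPU, restrictGrid is just rendering into a half-size texture with linear filtering, which is why the down/upsampling "abuse" described above is exactly right.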
  3. Jacobi for Solving Fluid Pressure

    First, let me make a correction: my sim uses either CG or Gauss-Seidel, not Jacobi as I had previously stated, but the GS and Jacobi methods are pretty similar. Anyway, I think the Jacobi, multigrid, and Gauss-Seidel solvers are all O(N), where N is the number of grid cells, but CG is worse than that - off the top of my head I think it would be O(N^3/2) on a square grid in 2D without a preconditioner.

    I shouldn't try to dissuade you too much - CG might be faster to converge under some combination of grid sizes and error tolerances, so if you have time, give it a try. It's just been my experience that there are a lot of easier things you can do that will give you a bigger visual difference in output - I'd put cubic interpolation at the top of that list.

    Unfortunately I don't know of any good multigrid tutorials, and I haven't implemented it myself. I think the people who write tutorials are mostly researchers who don't care much about speed, so they end up using CG since everyone is familiar with it and the boundary conditions are easier to handle than they are in multigrid.
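    For reference, here is roughly what a Gauss-Seidel pressure sweep looks like. This is a sketch with made-up names, interior cells only, and the cell size folded into the divergence term, not the exact code from my sim.

        // Sketch: in-place Gauss-Seidel sweeps for the pressure Poisson equation
        // on a 2D grid. Unlike Jacobi, each update reuses values already written
        // in the current sweep, which is why it converges in fewer iterations --
        // and also why it doesn't map directly onto a parallel shader pass.
        #include <vector>

        void gaussSeidelPressure(std::vector<float>& p, const std::vector<float>& div,
                                 int w, int h, int iterations)
        {
            for (int it = 0; it < iterations; ++it)
                for (int j = 1; j < h - 1; ++j)
                    for (int i = 1; i < w - 1; ++i)
                        p[j * w + i] = 0.25f * (p[j * w + i - 1] + p[j * w + i + 1] +
                                                p[(j - 1) * w + i] + p[(j + 1) * w + i] -
                                                div[j * w + i]);
        }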
  4. Jacobi for Solving Fluid Pressure

    Quote: Original post by Geometrian
    I think I ought to focus on the conjugate gradient first. Sooo . . . conjugate gradient shaders, anyone?

    Just to clarify: I don't think you'll be happy with CG unless you need the accuracy for simulating liquids where mass loss is really obvious. It will be slower than Jacobi and challenging to implement on GPU.

    I think 80 Jacobi iterations is overkill. Try 10-20 and see if you can tell a difference.
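    The Jacobi loop itself is tiny, which is most of its appeal. A sketch with made-up names (note the two buffers -- that ping-pong structure is exactly what makes it so natural as a GPU pass):

        // Sketch: one Jacobi iteration for pressure, reading only from pOld and
        // writing only to pNew (boundary cells left untouched here). Every cell
        // is independent, so this is trivially parallel. Run it 10-20 times,
        // swapping the buffers after each pass.
        #include <vector>

        void jacobiIteration(const std::vector<float>& pOld, std::vector<float>& pNew,
                             const std::vector<float>& div, int w, int h)
        {
            for (int j = 1; j < h - 1; ++j)
                for (int i = 1; i < w - 1; ++i)
                    pNew[j * w + i] = 0.25f * (pOld[j * w + i - 1] + pOld[j * w + i + 1] +
                                               pOld[(j - 1) * w + i] + pOld[(j + 1) * w + i] -
                                               div[j * w + i]);
        }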
  5. Jacobi for Solving Fluid Pressure

    Congrats on getting your fluid sim to work!

    The reason your backwards advection behaves better than the forward one is that there is only one data "smoothing" event in backwards advection (the interpolated lookup), but there are at least two when you go forward (splatting and gathering). Each time you have to interpolate a data value, it introduces some error, which ends up blurring detail out of your smoke.

    If your end goal is a fast, good-looking simulation, you're usually better off trying to decrease the error in your advection and display methods than increasing the grid size. Almost everything scales better than grid size. For example, cubic interpolation introduces a lot less smoothing than linear. MacCormack advection (like in the GPU Gems article) is better than simple backwards advection, and PIC/FLIP is even better than that.

    As far as the projection step goes, Jacobi is probably the best thing to use for gases because it's simple and fast. It's not as accurate as CG, but extreme accuracy isn't needed for smoke motion to look right. If you want to skip a step, multigrid is the holy grail of solvers (both fast and accurate), but there are fewer examples of it out there and the boundary conditions are tricky.

    Speed-wise, maybe the others aren't doing as much as you are in terms of boundary conditions or diffusion (which I never bothered with). You may also have a bottleneck somewhere; it's hard to tell. Just to throw it out there, my CPU sim gets about 40 fps with 90k cells (45x25x80) on a Core i7 920 using a Jacobi solver and tricubic interpolation. It's about as optimized as I know how to make it without using SSE.
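    To make the "one smoothing event" point concrete, here is a sketch of the backward (semi-Lagrangian) step with a bilinear lookup. Names and grid layout are made up for illustration; a cubic lookup drops into the same spot.

        // Sketch: semi-Lagrangian (backward) advection of a scalar field q by a
        // velocity field (u, v): one interpolated read per cell, one exact write.
        #include <algorithm>
        #include <vector>

        float bilinear(const std::vector<float>& q, int w, int h, float x, float y)
        {
            x = std::clamp(x, 0.0f, w - 1.001f);   // keep the lookup inside the grid
            y = std::clamp(y, 0.0f, h - 1.001f);
            int i = (int)x, j = (int)y;
            float fx = x - i, fy = y - j;
            float top = q[j * w + i] * (1 - fx) + q[j * w + i + 1] * fx;
            float bot = q[(j + 1) * w + i] * (1 - fx) + q[(j + 1) * w + i + 1] * fx;
            return top * (1 - fy) + bot * fy;
        }

        void advect(const std::vector<float>& q, std::vector<float>& qNew,
                    const std::vector<float>& u, const std::vector<float>& v,
                    int w, int h, float dt)
        {
            for (int j = 0; j < h; ++j)
                for (int i = 0; i < w; ++i) {
                    // Trace backward to where this cell's value came from, then do
                    // the single interpolated lookup (swap in cubic here for detail).
                    float x = i - dt * u[j * w + i];
                    float y = j - dt * v[j * w + i];
                    qNew[j * w + i] = bilinear(q, w, h, x, y);
                }
        }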
  6. Solving Fluid Diffusion/Velocity

    I've implemented a real-time 3D fluid sim using both conjugate gradient and Jacobi solvers, so I know how frustrating it can be. It doesn't help that tiny bugs can break the whole thing, so you don't know if you have a typo or are misunderstanding the algorithm. I once spent a long time looking for what ended up being a misplaced minus sign.

    To answer your first question about Ax=b: you know A and b and you're trying to solve for x, which involves inverting the matrix A. This can't be done directly in a reasonable amount of time because A is gigantic, so you have to use an iterative method. These methods guess and check a new x value at each iteration, hopefully getting closer to the correct answer each time.

    In the case of fluids, x is a vector of pressures, one for each grid cell (say there are N of these). b is a vector containing the divergence of the velocity field at each grid cell. A is an NxN matrix whose coefficients tell you how to combine the pressures of neighboring cells so that they will make the velocity divergence-free at each cell. You don't store A explicitly in memory (well, maybe the non-zero elements as an optimization); instead, your CG algorithm will just call a routine that multiplies A*x and returns the resulting vector.

    Your last post is a bit confusing. The advection step is independent of the projection step (the Ax=b part), so whether you advect forward or backward you will still have to do an iterative projection step to remove the divergence from your velocities.

    Doing a backward advection has two advantages. The first is that it is guaranteed to be stable if you are careful about interpolating. The second is that it's easier to interpolate on lookup than on writing. When you backward advect, you do an interpolated lookup but know exactly which grid point you're going to write to. With forward advection, you start on a precise point and your result (the displaced velocity or density) ends up in between grid points, so you have to store those intermediate points and do a gather pass at the end with some weight function.
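    To make the "you never store A" point concrete, here is a sketch of the matrix-free multiply a CG solver would call. It assumes the standard 5-point Laplacian stencil on a 2D grid of uniform interior cells, with names made up for illustration and boundaries ignored.

        // Sketch: matrix-free y = A*x for the 2D pressure Poisson system. The row
        // of A for cell (i,j) is 4 on the diagonal and -1 for each of the four
        // neighbors, so the whole NxN multiply collapses into a stencil sweep.
        #include <vector>

        void applyA(const std::vector<float>& x, std::vector<float>& y, int w, int h)
        {
            for (int j = 1; j < h - 1; ++j)
                for (int i = 1; i < w - 1; ++i)
                    y[j * w + i] = 4.0f * x[j * w + i]
                                 - x[j * w + i - 1] - x[j * w + i + 1]
                                 - x[(j - 1) * w + i] - x[(j + 1) * w + i];
        }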
  7. Shader branching ruins performance

    This may be a dumb idea, but what happens if you allocate all of your local variables (like sphereVector, nearestPoint, and maybe even the loop counters) outside the loop? I'm thinking maybe the compiler is unrolling your loop internally, and doing so requires more temporary registers than it has room for.

    Would you mind showing a screenshot of your result? How does the AO look with only 8 samples?
  8. When a ball hits a plane

    The acceleration on your ball should just be a constant; don't increase it every update, and don't multiply by timeDelta there. Like this:

        this->mAccel = Vector3(0.0f, GRAVITY_ACCELERATION, 0.0f);

    You may also want to be more careful about your time integration. If g is -10 m/s^2, for example, how far should the ball move in one second starting from rest? (A: 5 m) How far does yours move? (A: 10 m)

    Once those things are fixed your answer will be better, but to get it perfect you'll have to calculate the exact time of impact and split the time step during which the collision occurs.
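    Here is a minimal, self-contained sketch of the kind of update that passes the 1-second test above. The Vector3 and Ball definitions here are stand-ins for whatever you already have; only the two lines in update() are the point.

        #include <cstdio>

        struct Vector3 {
            float x, y, z;
            Vector3 operator+(const Vector3& o) const { return {x + o.x, y + o.y, z + o.z}; }
            Vector3 operator*(float s) const { return {x * s, y * s, z * s}; }
            Vector3& operator+=(const Vector3& o) { x += o.x; y += o.y; z += o.z; return *this; }
        };

        const float GRAVITY_ACCELERATION = -10.0f; // m/s^2

        struct Ball {
            Vector3 mPos{0, 0, 0}, mVel{0, 0, 0}, mAccel{0, 0, 0};
            void update(float dt) {
                mAccel = Vector3{0.0f, GRAVITY_ACCELERATION, 0.0f}; // constant, never accumulated
                mPos += mVel * dt + mAccel * (0.5f * dt * dt);      // exact for constant acceleration
                mVel += mAccel * dt;
            }
        };

        int main() {
            Ball b;
            b.update(1.0f);                   // one 1-second step from rest
            std::printf("%.1f\n", b.mPos.y);  // prints -5.0, i.e. 0.5*g*t^2
        }

    The 0.5*a*t^2 term is what gives 5 m instead of 10 m; updating velocity first and then position with the full step is what produces the 10 m answer.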
  9. Fluids and gravity

    I'm working on implementing a liquid simulation along the lines of Foster and Fedkiw's work. I have a working Navier-Stokes solver (following Stam's "Stable Fluids" papers) and it appears to work correctly for a uniform fluid. I'm trying to make it support a water/air simulation but am having some trouble. Here's what I have so far:

    I mark each cell, for now, as being either air or water. Navier-Stokes is only solved on the water cells. Passive particles are placed throughout the water volume and their positions are updated using the velocity field. The particles do not affect the simulation.

    Boundary conditions:
      • Air-water: Pressure is set to air pressure in air cells. Velocity of the air is taken to be the average velocity of any surrounding water cells.
      • Grid edges: The pressure gradient is 0 between the edge cells and the adjacent cells. The component of velocity normal to the edge is opposite that of the adjacent cells.

    Now, the problem: if I fill the bottom 1/4 of my (128x128) grid with water cells and the top half with air cells, and the only force is gravity, then my water particles all compress downward into the bottom 2 or 3 cells before there is a small "rebound"; eventually all the particles settle down and bleed out the bottom. The motion of the fluid otherwise looks correct.

    Shouldn't the pressure projection step, which ensures a divergence-free field, not allow the water cells to be compressed like that? This compression indicates a net flow of fluid into the cells on the bottom. I've tried using shorter time steps (doesn't seem to matter), checking that the compression is coming from gravity and not air pressure, increasing iterations for the linear solver in the projection step, etc., all without luck. It really seems that my velocity field is not truly divergence free, but I can't figure out why.

    Any help would be greatly appreciated.

    Thanks,
    Zeno
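    One direct way to test the "not truly divergence free" suspicion is to measure the residual divergence of the water cells right after projection and watch it as solver iterations increase. A sketch, assuming a MAC-style staggered grid (u stored on x-faces, v on y-faces) with made-up names; adjust the indexing if velocities live at cell centers.

        // Sketch: worst remaining divergence over the water cells. If this does
        // not fall toward zero as projection iterations increase, the solver setup
        // (matrix, divergence, or boundary conditions) is wrong, not the tolerance.
        #include <algorithm>
        #include <cmath>
        #include <vector>

        enum CellType { AIR = 0, WATER = 1 };

        float maxDivergence(const std::vector<float>& u,   // size (w+1)*h, x-faces
                            const std::vector<float>& v,   // size w*(h+1), y-faces
                            const std::vector<int>& mark,  // size w*h, AIR or WATER
                            int w, int h)
        {
            float worst = 0.0f;
            for (int j = 0; j < h; ++j)
                for (int i = 0; i < w; ++i)
                    if (mark[j * w + i] == WATER) {
                        float d = (u[j * (w + 1) + i + 1] - u[j * (w + 1) + i]) +
                                  (v[(j + 1) * w + i]     - v[j * w + i]);
                        worst = std::max(worst, std::fabs(d));
                    }
            return worst;
        }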