Yesterday I made some more editor fixes and had two good meetings with my partner, where we knocked out several design and interface decisions.
I also submitted a talk to GDC 06. Wish me luck.
Today I have been fixing up the AI navigation system. I got it pretty close to working a few weeks ago, and purposely left it alone for a while to let my mind work over it in the background. As I tried different levels, I kept finding slight issues with the navigation. The problems stem from two different areas:
Firstly, I sample the world in a voxel-like fashion. For each 1x1x1 meter voxel, if it has a walkable surface, I look above it to see if there is enough clearance for a 2-meter tall creature. If this passes, then I mark this voxel as walkable.
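The walkability test above can be sketched in a few lines. This is just an illustration of the idea, not the actual code: `solid` is a hypothetical lookup telling you whether a given 1x1x1 m voxel contains geometry, and the names are mine.

```python
# Hypothetical sketch of the walkability test described above.
# `solid` maps (x, y, z) voxel coords to True when that voxel
# contains solid geometry; everything here is illustrative.

CREATURE_HEIGHT = 2  # metres of clearance needed above the surface

def is_walkable(solid, x, y, z):
    """A voxel is walkable if it sits on a solid voxel and has
    CREATURE_HEIGHT empty voxels of clearance above the surface."""
    if not solid.get((x, y - 1, z), False):  # needs a floor beneath it
        return False
    # the voxel itself and the one above it must both be clear
    return all(not solid.get((x, y + i, z), False)
               for i in range(CREATURE_HEIGHT))

# Tiny example: a floor slab at y = 0, with a low ceiling over part of it.
solid = {(0, 0, 0): True, (1, 0, 0): True, (1, 2, 0): True}
print(is_walkable(solid, 0, 1, 0))  # open above -> True
print(is_walkable(solid, 1, 1, 0))  # ceiling only 1 m up -> False
```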
Next I go through all the voxels and try to merge neighbors into rectangular areas. I grow these areas as large as possible, up to length and total-area limits. This rectangularizing is done to reduce the number of nodes the A* algorithm will have to process.
One improvement I made today was to sweep through the rectangular areas in all 4 possible scan orders, so from nsew to snew to snwe to nswe. This ensures that every voxel can be travelled to and from in every direction, which solves a problem with some ramps. For instance, if you scanned the voxels in just one direction, you could accept a voxel that is reachable going e->w but not going back the other way, w->e, due to height differences.
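The greedy growing step can be sketched like this, reduced to 2D for clarity. The limits and names are assumptions, and this shows only a single scan order; as described above, the real code also sweeps in the other orders to validate directional reachability.

```python
# Illustrative 2D sketch of merging walkable cells into rectangles to
# cut down the node count A* has to search. MAX_LEN / MAX_AREA are
# example limits, not the real ones.

MAX_LEN, MAX_AREA = 8, 32

def rectangularize(walkable):
    """Greedily grow axis-aligned rectangles over a set of (x, z) cells."""
    free = set(walkable)
    rects = []
    while free:
        x0, z0 = min(free)           # deterministic seed cell
        w = h = 1
        # grow east while every cell in the new column is free and limits hold
        while (w < MAX_LEN and (w + 1) * h <= MAX_AREA and
               all((x0 + w, z0 + dz) in free for dz in range(h))):
            w += 1
        # then grow south under the same limits
        while (h < MAX_LEN and w * (h + 1) <= MAX_AREA and
               all((x0 + dx, z0 + h) in free for dx in range(w))):
            h += 1
        rects.append((x0, z0, w, h))
        free -= {(x0 + dx, z0 + dz) for dx in range(w) for dz in range(h)}
    return rects

# A 4x2 block of walkable cells collapses into a single A* node.
cells = {(x, z) for x in range(4) for z in range(2)}
print(rectangularize(cells))  # -> [(0, 0, 4, 2)]
```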
The second problematic area is the portalization. This is a similar issue. I put all the area bounding boxes into a bounding volume hierarchy tree, and for each box, I look for intersecting neighbors. Portals between neighbors are zero-volume boxes where they intersect. I then place a 0.5-meter-radius spherical particle on one side of the portal, and try to push it through to the other side, colliding with static geometry along the way. My old code was sort of hacking this, so just now I added a routine that simulates a particle over a given time, with a velocity & acceleration ( gravity in this case ), for a given number of ticks.
This way I can use the same collision routines as the game, and ensure that each potential portal can really be entered. This should solve a problem I was having with stairways: if a stairway had no sides or railings, the enemies thought they could just hop onto the stairs halfway up.
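The particle routine described above amounts to a fixed-step integration with an early-out on collision. A minimal sketch, where `collides(pos)` is a hypothetical stand-in for the game's actual static-geometry collision query, and the timestep and tick count are assumptions:

```python
# Sketch of the "push a particle through the portal" test. The real
# version uses the game's own collision routines; `collides` here is
# an illustrative stand-in for that query.

GRAVITY = (0.0, -9.8, 0.0)

def simulate_particle(pos, vel, collides, dt=1.0 / 60.0, ticks=60):
    """Step a particle under constant acceleration (gravity) for a
    fixed number of ticks; stop early on collision."""
    x, y, z = pos
    vx, vy, vz = vel
    for _ in range(ticks):
        vx += GRAVITY[0] * dt
        vy += GRAVITY[1] * dt
        vz += GRAVITY[2] * dt
        x, y, z = x + vx * dt, y + vy * dt, z + vz * dt
        if collides((x, y, z)):
            return None          # blocked: the portal is not passable
    return (x, y, z)             # survived every tick

def portal_is_enterable(start, toward, collides):
    """True if a particle pushed from one side of the portal makes it
    through without hitting static geometry."""
    return simulate_particle(start, toward, collides) is not None

# Example: a floor plane at y = 0 stops the falling particle.
floor = lambda p: p[1] < 0.0
print(portal_is_enterable((0.0, 1.0, 0.0), (2.0, 0.0, 0.0), floor))
```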
Unfortunately, I'm now having a bug where my rectangular areas are intersecting quite a bit, not just in zero volume portal regions, which is causing my portalizer to find no portals.
I am going to extend the rectangularizer and the 'check for walkable' code to use the same particle simulation routine.
Water Part 2
Ok, last time we had a basic water system with fresnel-oriented reflection. Next, I wanted to add refraction.
True refraction is actually much easier than true reflection in a game with real lighting. Real lighting makes real reflection very expensive, because you have to re-do all the lighting on the reflected scene as well.
Outdoor games don't have enough lights for this to matter, but I think this is why we haven't seen many indoor games with real lighting use render-to-texture reflection.
Refraction is easy. Basically all you do is render what's under the water first, with whatever lighting you want, potentially just as you would if there were no water. Then you copy the back buffer to a texture, using StretchRect(). Then you render the water surface to the back buffer, sourcing this off-screen refraction texture, which contains the underwater scene ( and potentially other things as well ). To generate the texture coordinates for the refraction texture, you take the screen-space texture coordinates and modify them based on the vertex normal's x & y in screen space. The idea is that if the vertex normal is <0,1,0> then there should be no distortion, whereas a normal of <1,0,0> should give maximum refraction ( usually just a dozen pixels or so ).
When you render the water, you DON'T perform alpha blending, but rather blend between the refraction and the reflection with the Fresnel term in the pixel shader. You can also add some specular lighting at this point if you want, and just render the water as an opaque object.
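Spelled out in plain Python, the per-pixel math above might look like the sketch below. The distortion strength, the Schlick approximation for the Fresnel term, and the f0 value for water are my assumptions; they stand in for whatever the shader actually computes.

```python
# Paraphrase of the water pixel shader's math: distort the screen-space
# lookup by the normal's screen x/y, then blend refraction against
# reflection by a Fresnel term. Constants and the Schlick-style Fresnel
# are illustrative assumptions.

def lerp(a, b, t):
    return tuple(ai + (bi - ai) * t for ai, bi in zip(a, b))

def refraction_uv(screen_uv, normal_xy, strength=0.03):
    """Offset the screen-space texcoord by the normal's screen-space
    x/y; a straight-up normal ( (0,0) here ) gives no distortion."""
    u, v = screen_uv
    nx, ny = normal_xy
    return (u + nx * strength, v + ny * strength)

def fresnel_schlick(cos_theta, f0=0.02):
    """Schlick's approximation; f0 ~ 0.02 is typical for water."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def shade_water(refraction_rgb, reflection_rgb, view_dot_normal):
    """Opaque water: blend refraction and reflection by the Fresnel
    term, instead of alpha blending against the frame buffer."""
    f = fresnel_schlick(view_dot_normal)
    return lerp(refraction_rgb, reflection_rgb, f)

# Looking straight down ( cos = 1 ): Fresnel ~ f0, so mostly refraction.
print(shade_water((0.0, 0.3, 0.4), (0.6, 0.7, 0.8), 1.0))
```

Note how the blend weight replaces alpha: at grazing angles the Fresnel term approaches 1 and the reflection dominates, exactly the behavior you see on real water.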
Here are a couple of screenshots of this version of the effect:
Now the main advantage of the 'copy back buffer' trick is that you generally don't need a clip plane to separate what's under the water from what's above it. Of course, you can still get artifacts if you implement it exactly as above. The main problem is that if you don't clip what's under water, but instead draw the whole scene and then copy it offscreen, you might have a pillar in front of the water from the camera's point of view. If, while drawing the water, you happen to displace the refraction texture coordinates enough to grab texels from the pillar, the water will seem to distort the pillar, even though the pillar is in front of the water! It's a very weird and disturbing effect.
One solution is outlined in an excellent article in GPU Gems 2. The upshot is to mark the water surface with 1.0 in dest alpha before copying the back buffer offscreen. Then, in the pixel shader when drawing the water surface, do two lookups into the refraction texture: one with the normal, undistorted texture coordinates, and one with the distortion. Then use the distorted lookup's alpha value ( from the dest alpha rendered earlier ) to lerp between itself and the undistorted version. What this does is choose, on a per-pixel basis, whether to use the distorted lookup ( the normal case ) or, when there is an obstruction in front of that point, the undistorted lookup, which is guaranteed to be obstruction-free.
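Per pixel, that fallback is just an alpha-driven lerp. A minimal sketch, where `sample` is a hypothetical stand-in for the refraction-texture lookup and the texel values are made up for illustration:

```python
# Sketch of the dest-alpha fix: the refraction texture carries
# alpha = 1 where the water surface was marked before the copy, and
# 0 where something in front of the water covers that pixel.

def lerp(a, b, t):
    return tuple(ai + (bi - ai) * t for ai, bi in zip(a, b))

def refraction_lookup(sample, undistorted_uv, distorted_uv):
    """Use the distorted sample's dest alpha to fall back to the
    undistorted ( obstruction-free ) sample when the distorted
    texcoord lands on geometry in front of the water."""
    plain_rgba = sample(undistorted_uv)
    warped_rgba = sample(distorted_uv)
    mask = warped_rgba[3]          # 1 = water was visible at this texel
    return lerp(plain_rgba[:3], warped_rgba[:3], mask)

# The distorted lookup hits a pillar in front of the water ( alpha 0 ),
# so the undistorted colour wins and the pillar doesn't get warped.
texels = {(0.5, 0.5): (0.0, 0.3, 0.4, 1.0),   # under-water colour
          (0.52, 0.5): (0.4, 0.4, 0.4, 0.0)}  # pillar, alpha 0
rgb = refraction_lookup(texels.get, (0.5, 0.5), (0.52, 0.5))
print(rgb)  # -> (0.0, 0.3, 0.4)
```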
This isn't even close to correct, but it looks really good. You need very large distortions, inspected very closely, to even see the problems, and at times it even covers up other artifacts, like hollow objects partly in the water.
Next time I'll talk about moving from per-vertex distortion to per-pixel ( fairly easy ), lighting and shadowing the water, etc...