# [MDX] Triangles Flickering w/ QuadTree


## Recommended Posts

I am rendering a terrain using a quadtree. Each leaf mesh stores the indices of its 33x33 chunk of the whole terrain; the vertices of the entire terrain are stored in one VertexBuffer. My problem is that for the first couple of seconds the terrain renders fine, but then some of the triangles disappear and the whole terrain seems to flicker. When I set the PresentationInterval to One, the problem goes away; on Immediate it happens. My rendering loop looks something like this (note: I'm at work and don't remember the exact parameters for the functions, so some may be wrong here, but they are right in the program):
```csharp
public void Render()
{
    // Frustum-cull the quadtree; fills mVisibleSubgrids with the leaves to draw.
    recursiveFrustumCheck(mRootNode);

    mGDevice.SetStreamSource(0, mVertexBuffer, 0);
    mGDevice.VertexFormat = PosNormTex.Format;

    foreach (SubGrid subGrid in mVisibleSubgrids)
        subGrid.Render(mGDevice);
}
```


where my SubGrid.Render(Device gDevice) function looks like this:
```csharp
public void Render(Device gDevice)
{
    gDevice.Indices = mIndexBuffer;

    // MDX signature: DrawIndexedPrimitives(type, baseVertex, minVertexIndex,
    // numVertices, startIndex, primCount); minVertexIndex/numVertices must
    // cover every vertex this chunk's indices actually reference.
    gDevice.DrawIndexedPrimitives(PrimitiveType.TriangleList,
                                  0, 0, mNumVertices, 0, mNumTris);
}
```


Before this method, I stored each chunk as a Mesh and then rendered the visible submeshes, which worked just fine. Any ideas? Any help is much appreciated. [Edited by - glaeken on June 22, 2007 4:39:40 PM]

##### Share on other sites
Are you overwriting the data in the vertex/index buffers each frame, or do they remain static after you write to them for the first time?

##### Share on other sites
After the original VertexBuffer and the leaves' IndexBuffers are created, they are never written to or modified.

I got a BSOD from the nvidia display driver, so I thought it was a driver error. But the *Programming a Multiplayer FPS* code renders the same way and it runs fine on my system.

Bump... Anyone?

##### Share on other sites
What size indices are you using? If you're using 32-bit indices, check what the maximum allowed index value is; my GeForce 6800 only allows 24-bit addressing. What sort of vertex count do you require?
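The check described above amounts to comparing the largest index you submit against the device's reported limit (in Direct3D this is exposed through the device caps as MaxVertexIndex; a part limited to 24-bit addressing reports 2^24 - 1). A minimal sketch of that comparison, in Python for illustration, with the cap value assumed rather than queried:

```python
# Sketch of the index-limit check; max_vertex_index would normally come from
# the device caps (MaxVertexIndex). A 24-bit-limited part reports 2**24 - 1.
def indices_supported(largest_index_used, max_vertex_index):
    """Return True if every index fits within the device's addressing limit."""
    return largest_index_used <= max_vertex_index

MAX_24BIT = 2**24 - 1          # 16,777,215
terrain_257 = 257 * 257 - 1    # largest index into a 257x257 vertex grid

print(indices_supported(terrain_257, MAX_24BIT))  # prints True: fits easily
```

So a 257x257 grid is nowhere near a 24-bit limit; the cap matters only for much larger single buffers.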

In my vertex-cache-optimized terrain rendering I hit problems with huge numbers of vertices per single draw call, which had a similar result. I doubt the cause is the same here (you're using lots of smaller batches), but the index-addressing limit could be it.

Are you getting any debug output from the runtimes? Any warnings that you're ignoring?

The presentation interval is an odd one - what sort of performance are you achieving? It might well be that you're actually seeing tearing issues.

Also, if you're getting BSODs from the driver, try running your app on the reference rasterizer, and ideally on another IHV's hardware as well if you have access.

hth
Jack

##### Share on other sites
I'm using int indices, so 32-bit. However, the flickering still occurs when I use ushorts. Before I implemented the quadtree, I rendered the entire scene with int indices and it worked fine. I have a GeForce 6600.

The indices index into the whole terrain, so I have to use 32-bit indices for anything larger than a 129x129 grid. For 257x257, that's 66,049 vertices, which is more than a ushort can address (65,535).
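The arithmetic above can be checked quickly; a small sketch (function name mine) of the 16-bit-index cutoff:

```python
# Quick arithmetic for the index-size argument above.
USHORT_MAX = 2**16 - 1  # 65,535: largest value a 16-bit index can hold

def fits_in_ushort(grid_size):
    """Can a grid_size x grid_size terrain be indexed with 16-bit indices?"""
    vertex_count = grid_size * grid_size
    return vertex_count - 1 <= USHORT_MAX  # indices run 0 .. vertex_count-1

print(129 * 129)            # prints 16641  -> fits
print(257 * 257)            # prints 66049  -> does not fit
print(fits_in_ushort(129))  # prints True
print(fits_in_ushort(257))  # prints False
```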

For a 257x257 terrain I get about 240 fps (with multi-texturing), positioned in the center looking down the +z axis. This might be because of the missing triangles, though: for the first couple of seconds it renders intact (no missing triangles) at about 180 fps (about the same frame rate as rendering with meshes), and after that it jumps to about 240.

When I tried rendering with the reference rasterizer, I got an InvalidCallException when a node called device.DrawIndexedPrimitives(). I'm not sure why this happens; I'm pretty sure the node's IndexBuffer contains the correct number of indices. I'll have to think some more about how I build the indices. Unfortunately I don't have access to any ATI hardware.

I tried making the indices local to their chunk instead of the whole terrain, so they are in the range 0 to 33x33 - 1 (instead of 0 to 257x257 - 1), and I store the corresponding vertices in each subgrid. For each node I now call device.SetStreamSource() and set device.Indices, instead of setting the one VertexBuffer containing all of the vertices and having each node set only its indices. This is pretty much what I was doing when rendering with meshes. It seems terribly inefficient, and it's not much faster than rendering with meshes, but the flickering stopped.
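The rebasing described above can be sketched as follows (in Python for illustration, names mine): walk a chunk's global indices, build a compact per-chunk vertex list, and remap each index into the 0..n-1 local range so 16-bit indices suffice.

```python
def localize_chunk(global_indices, all_vertices):
    """Rebase a chunk's global indices onto a chunk-local vertex list.

    Returns (local_vertices, local_indices) where local_indices reference
    only local_vertices, so a 33x33 chunk always fits in 16-bit indices.
    """
    remap = {}            # global index -> local index
    local_vertices = []
    local_indices = []
    for gi in global_indices:
        if gi not in remap:
            remap[gi] = len(local_vertices)       # first sight: assign next slot
            local_vertices.append(all_vertices[gi])
        local_indices.append(remap[gi])
    return local_vertices, local_indices

# Tiny demo: two triangles sharing an edge, indices global into a big buffer.
verts = list(range(1000))                          # stand-in vertex data
lv, li = localize_chunk([500, 501, 757, 501, 758, 757], verts)
print(li)   # prints [0, 1, 2, 1, 3, 2]; every index is now < len(lv)
```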

The only thing I can think of is that the way I compute the indices is somehow incorrect, but that wouldn't explain why it renders correctly when the frame rate is limited to the refresh rate (PresentationInterval of One).
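One way to sanity-check that suspicion is to generate a chunk's triangle-list indices from its offset into the full vertex grid and assert the counts and bounds. A sketch under the thread's numbers (33x33-vertex chunks sharing border rows/columns inside a 257x257 terrain; function and parameter names are mine):

```python
def chunk_indices(chunk_x, chunk_z, chunk_verts, terrain_verts):
    """Triangle-list indices for one (chunk_verts x chunk_verts) chunk whose
    indices address the full (terrain_verts x terrain_verts) vertex grid.
    Adjacent chunks share their border row/column, hence the (chunk_verts - 1)
    stride between chunk origins."""
    ox = chunk_x * (chunk_verts - 1)
    oz = chunk_z * (chunk_verts - 1)
    idx = []
    for z in range(chunk_verts - 1):
        for x in range(chunk_verts - 1):
            i = (oz + z) * terrain_verts + (ox + x)   # top-left of the quad
            idx += [i, i + 1, i + terrain_verts,       # first triangle
                    i + 1, i + terrain_verts + 1, i + terrain_verts]
    return idx

ind = chunk_indices(7, 7, 33, 257)  # last chunk of a 257x257 terrain
print(len(ind) // 3)                # prints 2048 triangles (32 * 32 * 2)
print(max(ind))                     # prints 66048 == 257*257 - 1, in bounds
```

If the largest generated index ever exceeds terrain_verts² - 1, or the triangle count disagrees with what is passed to DrawIndexedPrimitives, that mismatch would explain garbage triangles.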
