Flickering/Instability problem

5 comments, last by EvilDonut 15 years, 9 months ago
I'm working on a simple 3D engine to render large(ish) terrain. I'm getting a kind of instability problem whereby the program shifts into a state where some of the triangles making up the terrain randomly flicker on and off quite fast. The following screenshots should demonstrate this:

My terrain is 14 patches, each of 1024x1024 - each of these patches is then further divided into 128x128 'cells'. During my normal debugging I only enable one of these patches (so 64 cells) to keep loading speed decent. When I do this, the program is a lot more stable and I don't get the flickering. When I enable all of the terrain, it flickers. I notice the problem occurring almost instantly when I use the IMMEDIATE present flag rather than DEFAULT (which caps my fps at 60).

The method I'm using to render each cell is to use frustum culling to determine which cells are visible, then determine the cell's LOD using a simple distance metric. My LOD is a kind of geo-mipmapping, so each subsequent level contains a quarter of the vertices of the previous level. I'm using multiple streams so I only need to store the Y value for each vertex, and I'm using a common set of indices and X,Z values for each cell, if that makes sense. It seems to work a lot better than my previous method, which stored X,Y,Z values for each of the 14 million vertices making up the terrain.

Could this just be an instability in my graphics card (GeForce 7600GS, 256MB of RAM - most of which is used up when I enable all 14 patches), or a problem with my code? I should probably mention that very occasionally, Direct3D apps cause my computer to randomly reboot - it happens at most once or twice a day, even when I play games for most of the day. Any ideas?
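For reference, the distance-based LOD selection described above can be sketched roughly like this. This is not the poster's actual code - the function name, thresholds, and band spacing are all assumptions - but it shows one common way to pick a geo-mip level where each successive band is twice as far out, matching the "quarter the vertices per level" scheme:

```cpp
#include <cmath>

// Hypothetical sketch of a distance-based LOD metric (assumed names and
// thresholds, not the original code). Each LOD level halves resolution
// along each axis, i.e. a quarter of the vertices of the level below.
int selectLod(float cellCenterX, float cellCenterZ,
              float cameraX, float cameraZ,
              float baseDistance, // distance at which LOD first drops
              int maxLod)
{
    float dx = cellCenterX - cameraX;
    float dz = cellCenterZ - cameraZ;
    float dist = std::sqrt(dx * dx + dz * dz);

    // Each successive LOD band covers twice the distance of the previous
    // one, so detail falls off roughly logarithmically with distance.
    int lod = 0;
    while (lod < maxLod && dist > baseDistance * float(1 << lod))
        ++lod;
    return lod;
}
```

With `baseDistance = 100`, a cell 150 units away gets LOD 1 and a cell 1000 units away gets LOD 4.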
A stable system should never ever randomly reboot no matter how hard you push the system!
But then again, the end user will always find a way. I once had a friend try to crash what I believed to be my uber-stable PC by running two instances of the Nvidia fairy demo, which I'd never thought of doing!
Don't talk about writing games, don't write design docs, don't spend your time on web boards. Sit in your house and write 20 games; when you complete them you will either want to do it the rest of your life or not. - Andre Lamothe
I don't think the reboots are directly related to my flickering problem, as I play several games without noticing any flickering or graphical instabilities, yet I still get random reboots (with frequency ranging from once a week to about 3-4 times a day - it's not nearly bad enough for me to bother getting new hardware though, I can live with it).

I don't know what's causing the flickering - it's not a very standard problem and there are probably loads of things that could cause it. There's too much code to post (and it's very disorganised), and I don't think that would help diagnose the problem really. My basic method is to store vertex buffers and index buffers in the D3DPOOL_DEFAULT pool (so on the card) and use multiple streams to feed the custom shaders, to save a bit of memory.
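A quick back-of-envelope calculation shows why the split-stream layout saves so much memory. This is an illustrative sketch (the sizes and function names are assumptions, not the poster's code): with the X,Z positions and indices shared per cell layout, only the height (Y) needs storing per vertex, instead of a full X,Y,Z triple:

```cpp
#include <cstddef>

// Illustrative memory estimate (assumed float-based formats, not the
// original code). Full layout: one X,Y,Z float triple per vertex.
std::size_t bytesFullXyz(std::size_t vertexCount)
{
    return vertexCount * 3 * sizeof(float);
}

// Split-stream layout: one Y float per vertex, plus a single shared
// X,Z stream reused by every cell (verticesPerCell entries).
std::size_t bytesHeightOnly(std::size_t vertexCount,
                            std::size_t verticesPerCell)
{
    std::size_t sharedXz = verticesPerCell * 2 * sizeof(float);
    return vertexCount * sizeof(float) + sharedXz;
}
```

For the ~14 million terrain vertices mentioned in the thread, the full layout needs about 168 MB while the height-only layout needs about 56 MB - a factor of roughly three, which matters a lot on a 256 MB card.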
Well what does PIX tell you? Are those pixels actually being rendered to? Also is there anything from the debug runtimes?
I've not used PIX before, but there isn't any suspicious debug output - just an invalid string pointer message related to my model rendering code, but that's a small bug I've not got around to fixing yet; the terrain flickering problem appeared before that bug did. I can see that the pixels themselves are being rendered to, as you can see through some of the black gaps in the terrain to the white edge of the skyplane - so the pixels are being rendered to, just not by the terrain rendering code, which appears to be missing out triangles almost at random (although it's not truly random, as the same triangles flicker on and off each frame).
Does the same thing happen if you turn off LOD or otherwise maintain the same LOD throughout? I had a similar looking problem some time ago due to an error in my LOD calcs. It wasn't related to the present interval but the visual symptoms are similar. I didn't properly sew high LOD areas to lower LOD areas.

Please don't PM me with questions. Post them in the forums for everyone's benefit, and I can embarrass myself publicly.

You don't forget how to play when you grow old; you grow old when you forget how to play.

Nope, good question though - I tried that a while ago. For each cell there are 7 LOD levels, each with 4 times as many vertices as the previous level...and the flickering problem happens at all levels, as you can see in my screenshots, where the high-LOD cells (ones near the camera, and the ones next to rivers) have lots of missing small triangles, whilst the low-LOD cells have just 1 or 2 really big triangles missing. It's not a problem with the calculation of the LOD, as if that were the case the triangles would be missing all the time.
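For context on the scale involved, the per-level vertex counts in a geo-mipmapping scheme like the one described can be sketched as follows. The exact figures are assumptions (the thread only says each level has a quarter of the vertices of the one below); this assumes a cell of 128x128 quads where level n keeps every 2^n-th sample:

```cpp
#include <vector>

// Hypothetical geo-mipmap vertex counts for a cell of `cellQuads` quads
// per side (assumed sizes, not figures from the thread). Level n samples
// every 2^n-th vertex, so each level has roughly a quarter of the
// vertices of the level below it.
std::vector<int> geoMipVertexCounts(int cellQuads, int levels)
{
    std::vector<int> counts;
    for (int lod = 0; lod < levels; ++lod) {
        int side = (cellQuads >> lod) + 1; // samples along one edge
        counts.push_back(side * side);
    }
    return counts;
}
```

For a 128-quad cell with 7 levels this gives 16641, 4225, 1089, ... down to 9 vertices at the coarsest level, which is why a missing triangle at low LOD looks like one huge hole while at high LOD it looks like many small ones.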

I don't know anything about how things work at a low level in graphics cards etc., but it's as if some of the triangles are simply being discarded somewhere in the pipeline, or the vertex/index buffer isn't being read from graphics card memory properly (i.e. some RAM addresses become unstable and cannot be read from reliably).

Anyone know of a problem that could cause triangles to be randomly discarded prior to rasterization?

I've upgraded my Nvidia drivers now and the problem seems to have gone away. Dunno why I didn't try that ages ago really - they were 1.5 years old.

This topic is closed to new replies.
