It's a plain old Minecraft clone ("oh no.. not another one of these..."): chunked rendering, "infinite" terrain. I have a 2D plane of WorldChunk objects. Each chunk is converted to a minimal mesh of exposed faces and stored in a VBO. Once a chunk is ready to render, I may need to place it very far from the origin, because the chunk sits at, say, (16384, 16384). What happens is that gaps appear between chunks, and as I pan the camera around, each chunk jitters by 1-2 pixels due to lack of precision in its absolute placement, sometimes landing exactly lined up with its neighbor, other times 2 pixels off with a visible gap. The problem gets worse the farther you go from the origin: below roughly (4096, 4096) it isn't obvious, and at the origin it is not possible to spot at all. I know translates have limited precision, so I suspect I need to do this in a fundamentally different way.
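For context on the scale of the error: the spacing between adjacent float values (the "ulp") grows with magnitude, so a coordinate near 16384 can only be represented to within about 0.002 units, and once those large chunk coordinates are multiplied through the modelview matrix together with a similarly large camera translation, the rounding accumulates to pixel scale. A quick standalone check of the granularity (a sketch, not from the original post):

```java
public class FloatPrecision {
    public static void main(String[] args) {
        // Spacing between adjacent representable floats at increasing
        // distances from the origin. The gap doubles every power of two.
        for (float x : new float[] {4096.0f, 16384.0f, 1048576.0f}) {
            System.out.println(x + " -> ulp = " + Math.ulp(x));
        }
        // At 16384 the gap is already ~0.002 world units; at 2^20 it is 0.125.
    }
}
```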
For each chunk, when it has a VBO that is ready to draw, I do:
glTranslatef(16384.0f, 0.0f, 16384.0f);  // absolute chunk position in world space
glDrawArrays(GL_TRIANGLES, 0, this.numVerts);  // draw the whole VBO starting at vertex 0
I thought about making my translates more of a "local world", where I simply place chunks relative to the viewer, but that makes a few other things difficult. I'm sure I'm doing this wrong by passing very large glTranslatef() values relative to the origin.
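The camera-relative idea is the standard fix, and it can stay contained to the render pass: keep all world positions (camera and chunks) in doubles on the CPU, and only hand OpenGL the small difference. A minimal sketch of that subtraction, with illustrative names (camX/chunkOffset are not from the original code):

```java
public class CameraRelative {
    // Camera position tracked in doubles on the CPU (illustrative fields).
    static double camX = 0.0, camY = 0.0, camZ = 0.0;

    // Translation to feed to glTranslatef for a chunk whose corner sits at
    // (worldX, worldY, worldZ). The subtraction happens in double precision,
    // so the float result is small and accurate precisely when the chunk is
    // near the viewer, which is where jitter would otherwise be visible.
    static float[] chunkOffset(double worldX, double worldY, double worldZ) {
        return new float[] {
            (float) (worldX - camX),
            (float) (worldY - camY),
            (float) (worldZ - camZ)
        };
    }

    public static void main(String[] args) {
        camX = 16384.25;
        camZ = 16384.25;
        float[] off = chunkOffset(16384.0, 0.0, 16400.0);
        // Both positions are huge, but the offsets handed to GL stay tiny.
        System.out.println(off[0] + " " + off[1] + " " + off[2]);
    }
}
```

In the render loop, the view matrix would then contain only the camera's rotation (the camera effectively sits at the origin), and each chunk is drawn with glPushMatrix(); glTranslatef(off[0], off[1], off[2]); glDrawArrays(...); glPopMatrix();. The rest of the game logic can keep using absolute world coordinates unchanged.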
Any advice out there?
The rendering test app is at https://voodoo.arcanelogic.net/CYDI-latest.jar if anyone cares to look at it, but it would take a while to fly far enough from the origin to notice the big gaps and jitter.
Edited by voodoodrul, 12 November 2013 - 12:14 PM.