I've got a curious problem I need some help with. When I translate my geometry, e.g. by 6000 units along an arbitrary axis in world space, and then get close to it, the vertices start to distort and jump around a bit as the camera moves. This does not happen with geometry located around the origin. My near plane is 0.01f and my far plane is 50000.0f. Decreasing the far plane does not solve the problem; increasing the near plane only "helps" in the sense that I can no longer get close enough to see it. I'm not sure why this only happens when my geometry is far away in world space (but still in front of my camera). Here's a video showing this:
Geometry jumping around / distorted on close up(Depth problem?)
And btw I'm using a 32bit float depth buffer. (DXGI_FORMAT_D32_FLOAT)
Does it matter that I transform them one by one in the vertex shader?
If the camera and the object are both 6,000 units away from the origin, then you have lost the same amount of precision on both. This adds up to some nasty jumping around.
The best way to rectify this is to treat your camera as the origin of the scene and move everything else around it (which can be done quite simply by subtracting the camera position right before building your model/view matrices).
You mean subtracting the camera position from the geometry's position ? This does not seem to work for me.
Subtracting the camera's position from every object's position, including the camera's own (so the camera ends up located at [0,0,0]).
Hmm, I guess it might be possible to fake my position for the atmosphere by telling it that I'm 6000+ units higher on the Y axis than I actually am.
So the point is that floating-point precision is greatest close to zero and drops off quite drastically at larger magnitudes.
Currently, your camera and object are ~6,000 units away from zero (so the precision is very poor). But conceptually, your camera and object are very close together, so you can exploit that fact by moving the origin closer (and it's conceptually simple to treat the camera position as the origin). If you really need your far-away object to be 6,000 units from the camera, that's fine: after you subtract the camera position they will still be 6,000 units apart from each other.
I imagine, however, that the problem is that the atmosphere shader relies on being centred at the origin (which you should fix - what happens when you want 2 separate planets with atmospheres?). A simple fix here is to do the atmosphere calculations in the planet's local coordinate space (while rendering everything in the camera's coordinate space).
BTW, you may want to read over Sean O'Neil's decade-old-but-still-relevant article on scale in planet rendering.