
## Geometry jumping around / distorted on close-up (depth problem?)


9 replies to this topic

### #1 lipsryme (Members)

Posted 29 January 2013 - 02:01 PM

I've got a curious problem I need some help with. When I translate my geometry by a large amount (e.g. 6000 units) along an arbitrary axis in world space and then move the camera close to it, the geometry starts to distort and jump around as the camera moves. This does not happen with geometry located near the origin. My near plane is 0.01f and my far plane is 50000.0f. Decreasing the far plane does not solve the problem; increasing the near plane only "helps" in the sense that I can no longer get close enough to see the artifact. I'm not sure why this only happens when my geometry is far away in world space (but still in front of my camera). Here's a video showing this:

### #2 AliasBinman (Members)


Posted 29 January 2013 - 02:45 PM

When you are 6000 units from the camera you have effectively chopped off up to 13 bits of fractional precision. Typically this shouldn't matter for rendering, as you should have aggregate transforms that work in camera space. Is it possible you are transforming from local to world and then from world to view in the vertex shader? If so, combine them together on the CPU beforehand.

### #3 lipsryme (Members)


Posted 29 January 2013 - 02:56 PM

But I'm not actually 6000 units away from the camera; my camera is located at those same 6000 units, right in front of my geometry.
And by the way, I'm using a 32-bit float depth buffer (DXGI_FORMAT_D32_FLOAT).
Why does it matter if I transform one by one in the vertex shader?

Edited by lipsryme, 29 January 2013 - 02:59 PM.

### #4 swiftcoder (Senior Moderators)


Posted 29 January 2013 - 03:02 PM

If the camera and the object are both 6,000 units away from the origin, then you have lost the same amount of precision on both. This adds up to some nasty jumping around.

The best way to rectify this is to treat your camera as the origin of the scene and move everything else around it (this can be done quite simply by subtracting the camera position right before building your model/view matrices).

Tristam MacDonald - Software Engineer @ Amazon - [swiftcoding] [GitHub]

### #5 lipsryme (Members)


Posted 29 January 2013 - 03:39 PM

You mean subtracting the camera position from the geometry's position? This does not seem to work for me.

### #6 swiftcoder (Senior Moderators)


Posted 29 January 2013 - 03:50 PM

> You mean subtracting the camera position from the geometry's position? This does not seem to work for me.

Subtracting the camera's position from every object's position (including the camera itself, so the camera ends up located at [0, 0, 0]).


### #7 lipsryme (Members)


Posted 29 January 2013 - 04:02 PM

The problem is that I need my camera to be at 6000+ for my atmosphere shader to work. Also, my geometry completely disappears. Let me get this right: I translate my geometry somewhere, and from that translation value I subtract the camera position? (And the camera has to be at 0, 0, 0?)

Hmm, I guess it might be possible to fake my position for the atmosphere, telling it that I'm 6000+ higher on the y-axis when I'm actually not.

Edited by lipsryme, 29 January 2013 - 04:10 PM.

### #8 swiftcoder (Senior Moderators)


Posted 29 January 2013 - 04:15 PM

So the point is that floating-point precision is greatest close to zero and degrades quite drastically at larger magnitudes.

Currently, your camera and object are both ~6,000 units away from zero, so precision is very bad. But conceptually your camera and object are very close together, and you can exploit that fact by moving the origin closer (it's conceptually simplest to treat the camera position as the origin). If you really need a more distant object to be 6,000 units away from the camera, that's fine: after you subtract the camera position they will still be 6,000 units apart from each other.

I imagine, however, that the problem is that the atmosphere shader relies on being centred at the origin (which you should fix anyway: what happens when you want two separate planets with atmospheres?). A simple fix here is to do the atmosphere calculations in the planet's local coordinate space, while rendering everything in the camera's coordinate space.


### #9 lipsryme (Members)


Posted 29 January 2013 - 04:23 PM

Well, since it relies on precomputed texture lookups and needs the world position of the geometry and the current camera position, I figured I could offset these by 6360 units on the y-axis. Running this, I seem to get the same correct results as before, but I'm actually at 0, 0, 0 now. I will, however, try this trick of making everything relative to the camera to further avoid depth-precision problems. Thanks.

Edited by lipsryme, 29 January 2013 - 04:31 PM.

### #10 swiftcoder (Senior Moderators)


Posted 29 January 2013 - 04:57 PM

BTW, you may want to read over Sean O'Neil's decade-old-but-still-relevant article on scale in planet rendering.

