
### #1 Retsu90

Posted 08 October 2012 - 05:40 AM

Hi,
I've noticed a possible bug on nVidia cards. I'm developing a game that handles 3D textures, and when I need to use a particular layer from a 3D texture, I compute its z coordinate with a formula like this:
z = 1.0f/textureDepth * layer
On my home computer (an ATI Radeon 4800 series) this formula works without problems and renders the layer I want, but on nVidia (and also on Intel HD 3000) it doesn't. It can be worked around by editing the formula:
z = 1.0f/textureDepth * layer + 0.00001
Has anyone noticed this before? I can't find anything about it on GameDev or on Google...

EDIT: This problem happens when the texture depth is an odd number.
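
For context, here is roughly what the lookup looks like in a shader. This is a minimal sketch only, assuming a GLSL fragment shader with a sampler3D uniform; the names (layerTexture, sampleLayer, uv) are illustrative and not from my actual code. It also shows a commonly recommended alternative to the epsilon: sampling at the centre of the layer, (layer + 0.5) / textureDepth, so the coordinate never lands exactly on the boundary between two slices.

```glsl
// Minimal sketch of the layer lookup (GLSL assumed; uniform and function
// names are illustrative, not from the original code).
uniform sampler3D layerTexture;
uniform float textureDepth;   // number of layers in the 3D texture
uniform float layer;          // layer index, 0 .. textureDepth - 1

vec4 sampleLayer(vec2 uv)
{
    // Original formula: z lands right on a slice boundary, so different
    // drivers may round it to either neighbouring layer.
    // float z = 1.0 / textureDepth * layer;

    // Workaround from this post: nudge the coordinate past the boundary.
    // float z = 1.0 / textureDepth * layer + 0.00001;

    // Alternative: sample at the centre of the layer, which avoids the
    // boundary entirely and doesn't depend on a magic epsilon.
    float z = (layer + 0.5) / textureDepth;

    return texture3D(layerTexture, vec3(uv, z));
}
```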
