Depth Textures

Hi, I want to use a texture's surface as the depth-stencil buffer for my rendering step. What I'm doing is:
- Create a texture with D3DUSAGE_DEPTHSTENCIL, D3DFMT_D24S8 and D3DPOOL_DEFAULT.
- Get surface level 0.
- Set the device's depth-stencil surface to the texture's surface 0.
- Draw my primitives.
- Set the texture on the pixel shader and see what happens.

The result is a totally white texture. I've been trying different things in the pixel shader but it still comes out monochromatic. Has anyone ever tried doing this? FYI, my video card is a GeForce 8800 GTX. Cheers.
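In code it's roughly this (a trimmed sketch of the steps above; pDevice, width, height and pOldDepthSurf are placeholders for whatever my renderer already has, error checking omitted):

#include <d3d9.h>

// Create the depth texture (D24S8, default pool, depth-stencil usage).
IDirect3DTexture9* pDepthTex = NULL;
pDevice->CreateTexture(width, height, 1, D3DUSAGE_DEPTHSTENCIL,
                       D3DFMT_D24S8, D3DPOOL_DEFAULT, &pDepthTex, NULL);

// Get surface level 0 and set it as the device's depth-stencil surface.
IDirect3DSurface9* pDepthSurf = NULL;
pDepthTex->GetSurfaceLevel(0, &pDepthSurf);
pDevice->SetDepthStencilSurface(pDepthSurf);

// ... draw the primitives ...

// Restore the previous depth-stencil surface, then bind the texture so the
// pixel shader can sample it.
pDevice->SetDepthStencilSurface(pOldDepthSurf);
pDevice->SetTexture(0, pDepthTex);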
That's not the best way to do it.
The depth is a 3-byte value, and the shader tries to interpret it as three one-byte values (one per channel). Maybe that's the cause of the problem, but I really don't know.
If the 8800 GTX supports a lockable depth buffer, you can lock it and inspect the texture data while debugging.

The best way to do your task is to render depth to a texture with shaders: just scale z in camera space to [0, 1] and write it to the texture. You only need one channel for this, but it does cost you an additional render pass.
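In HLSL it's roughly this (just a sketch; the constant names and the linear division by the far plane distance are my own choice):

float4x4 g_WorldView;      // world -> camera space
float4x4 g_WorldViewProj;  // world -> clip space
float    g_FarPlane;       // camera far plane distance

struct VS_OUT
{
    float4 pos   : POSITION;
    float  depth : TEXCOORD0;  // camera-space z scaled to [0, 1]
};

VS_OUT DepthVS(float4 pos : POSITION)
{
    VS_OUT o;
    o.pos   = mul(pos, g_WorldViewProj);
    o.depth = mul(pos, g_WorldView).z / g_FarPlane;  // linear depth in [0, 1]
    return o;
}

float4 DepthPS(VS_OUT i) : COLOR
{
    // One channel is enough; write it to a single-channel render target.
    return float4(i.depth, 0, 0, 0);
}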
Traditional behaviour of depth textures on NVIDIA hardware, which I suppose continues with the 8800, is that reading from the depth texture returns either 0 or 1, depending on whether its value is lower or higher than the current pixel's depth (maybe it's the other way round). That's useful for shadows, but not for much else.
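If memory serves, the lookup goes something like this (a sketch of the comparison behaviour described above, not tested; the depth to compare against rides in the projective coordinate):

sampler2D g_ShadowMap;  // the D24S8 texture bound to a regular sampler

float4 ShadowPS(float4 shadowCoord : TEXCOORD0) : COLOR
{
    // On NVIDIA hardware the depth-texture fetch compares
    // shadowCoord.z / shadowCoord.w against the stored depth and returns
    // 0 or 1 (values in between with bilinear filtering enabled).
    float lit = tex2Dproj(g_ShadowMap, shadowCoord).r;
    return float4(lit, lit, lit, 1);
}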

There's no general way to use depth as a texture in DX9. ATI supports special depth formats which allow that, and I have no idea if NVIDIA also added something lately. The usual way to go is to write your own Z into a floating point texture, then use that. The ShadowMap sample in the SDK provides an example of this.
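The setup for that is just a render-target texture (a rough sketch; the names are placeholders, error checking omitted):

// Single-channel floating-point render target to hold the linear depth.
IDirect3DTexture9* pDepthRT = NULL;
pDevice->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET,
                       D3DFMT_R32F, D3DPOOL_DEFAULT, &pDepthRT, NULL);

IDirect3DSurface9* pDepthRTSurf = NULL;
pDepthRT->GetSurfaceLevel(0, &pDepthRTSurf);

// Depth pass: render into it with a depth-writing shader.
pDevice->SetRenderTarget(0, pDepthRTSurf);
pDevice->Clear(0, NULL, D3DCLEAR_TARGET | D3DCLEAR_ZBUFFER, 0xFFFFFFFF, 1.0f, 0);
// ... draw the scene with the depth shader ...

// Later passes then bind pDepthRT as an ordinary texture.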
Well, at the moment I have an additional technique in my shaders for writing depth to an R32F RT, but I would love to avoid that in favour of something more transparent...

I've been reading an article in ShaderX 5 (about depth of field) which says that NVIDIA and ATI cards should support this...

Anyway, assuming that NVIDIA cards return either 0 or 1, I don't think I would get only ones...

Also, I know it's not correct to output the result directly to the render target as colours, but it still shouldn't come back blank. I'm just trying to get it working first; then I'll adjust the values to make use of the full 24 bits...

Anyone ever used depth textures?

Thanks..
As I said, the result depends on the comparison of Z. So check your original Z range and the Z you're drawing with.

I haven't read the ShaderX 5 article, so can't comment about this. You might want to check the documentation available on NVIDIA's site to see if there's any new depth texture option. All I know about is what I mentioned above (which is the behaviour of textures with standard depth formats on NVIDIA hardware) and ATI's special depth texture formats.

