Any ATI / NVidia specific hacks to grab depth buffer to texture?
Hi,
Are there any tricks on PC D3D9 that would let me grab the depth buffer into a texture without needing a separate scene render pass? I need the depth buffer for various stuff like depth-fading of particles as well as post-processing.
Maybe some ATI / NVidia specific tricks?
Or maybe the other way around - has anyone tried to create a lockable depth buffer for the D3D device and then, during the frame, manually:
1. lock
2. custom copy depth buffer into texture
3. unlock
OpenGL lets us do this quite easily.
Thanks in advance.
You do NOT want to lock the Z-buffer in the middle of the frame, that will kill performance (even on OGL).
On NVIDIA, IIRC, you can just call CreateTexture with D3DFMT_D16 and D3DUSAGE_DEPTHSTENCIL and it will give you a renderable depth texture. You can render to that and then fetch from it without having to copy.
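Something along these lines (untested sketch - the helper name is made up, and you'd want real error handling):

```cpp
#include <d3d9.h>

// Hypothetical helper: create an NVIDIA-style renderable depth texture
// and bind it as the active depth-stencil surface.
IDirect3DTexture9* CreateDepthTexture(IDirect3DDevice9* device,
                                      UINT width, UINT height)
{
    IDirect3DTexture9* depthTex = NULL;
    if (FAILED(device->CreateTexture(width, height, 1,
                                     D3DUSAGE_DEPTHSTENCIL, D3DFMT_D16,
                                     D3DPOOL_DEFAULT, &depthTex, NULL)))
        return NULL;

    IDirect3DSurface9* surf = NULL;
    depthTex->GetSurfaceLevel(0, &surf);
    device->SetDepthStencilSurface(surf); // render the scene into it
    surf->Release();

    // Later: device->SetTexture(stage, depthTex) to fetch from it.
    // Caveat: on NVIDIA hardware, sampling a depth texture performs a
    // shadow comparison rather than returning raw depth values.
    return depthTex;
}
```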
This doesn't work on ATI, but ATI has a fourCC surface format called DF16, that you can use to do the same thing.
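You should check for DF16 support at startup before relying on it - something like this (sketch, assumes a D3DFMT_X8R8G8B8 adapter format):

```cpp
#include <d3d9.h>

// ATI's fourCC depth format, not a standard D3DFORMAT enum value.
const D3DFORMAT DF16 = (D3DFORMAT)MAKEFOURCC('D', 'F', '1', '6');

bool SupportsDF16(IDirect3D9* d3d)
{
    return SUCCEEDED(d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_DEPTHSTENCIL, D3DRTYPE_TEXTURE, DF16));
}

// If supported, create it exactly like the D3DFMT_D16 texture above:
// device->CreateTexture(w, h, 1, D3DUSAGE_DEPTHSTENCIL, DF16,
//                       D3DPOOL_DEFAULT, &depthTex, NULL);
```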
If you do need to lock the depth buffer, you'll need to use either D3DFMT_D16_LOCKABLE or D3DFMT_D32F_LOCKABLE. Unfortunately, neither supports a stencil buffer, so you're stuck if you need one.
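The lock/copy/unlock steps from the original question would look roughly like this with a lockable format (untested sketch; `dest` is assumed to be a CPU-side buffer of width*height WORDs, and remember the performance warning earlier in the thread - this stalls the GPU):

```cpp
#include <d3d9.h>
#include <string.h>

// Hypothetical helper: read back a D3DFMT_D16_LOCKABLE depth surface.
bool CopyDepthToCpu(IDirect3DSurface9* zbuf, WORD* dest,
                    UINT width, UINT height)
{
    D3DLOCKED_RECT lr;
    if (FAILED(zbuf->LockRect(&lr, NULL, D3DLOCK_READONLY))) // 1. lock
        return false;

    // 2. custom copy, row by row (Pitch may exceed width * 2 bytes)
    for (UINT y = 0; y < height; ++y)
        memcpy(dest + y * width,
               (const BYTE*)lr.pBits + y * lr.Pitch,
               width * sizeof(WORD));

    zbuf->UnlockRect();                                      // 3. unlock
    return true;
}
```

The surface itself would come from CreateDepthStencilSurface with D3DFMT_D16_LOCKABLE instead of a regular depth format.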
ATI's depth format should be accessible, but NVIDIA's won't be.
I think that you could render to a render target depth yourself, and do it in one pass with MRT.
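A sketch of the MRT route (untested; the helper name and `colorSurf` are placeholders, and your pixel shader would need to write view-space depth to the COLOR1 output):

```cpp
#include <d3d9.h>

// Hypothetical helper: bind an R32F texture as a second render target
// so the pixel shader can write depth alongside color in one pass.
IDirect3DTexture9* BindDepthMRT(IDirect3DDevice9* device,
                                IDirect3DSurface9* colorSurf,
                                UINT width, UINT height)
{
    IDirect3DTexture9* depthRT = NULL;
    if (FAILED(device->CreateTexture(width, height, 1,
                                     D3DUSAGE_RENDERTARGET, D3DFMT_R32F,
                                     D3DPOOL_DEFAULT, &depthRT, NULL)))
        return NULL;

    IDirect3DSurface9* surf = NULL;
    depthRT->GetSurfaceLevel(0, &surf);
    device->SetRenderTarget(0, colorSurf); // color output as usual
    device->SetRenderTarget(1, surf);      // shader writes depth here
    surf->Release();
    return depthRT;
}
```

Watch the caps, though: some hardware restricts MRT to targets with matching bit depths, and R32F support isn't universal on older cards.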
This topic is closed to new replies.