Texturing a stage with the Z Buffer

Hey all. Anyone know how to set a texture stage to the depth buffer of the back buffer? SetTexture() needs a texture, not a surface... Are you allowed to do that in DX9?

-programmer_tom
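
Roughly, this is the wall I'm hitting (just a sketch; pDevice and pZBuffer are placeholder names):

    IDirect3DSurface9* pZBuffer = NULL;
    if (SUCCEEDED(pDevice->GetDepthStencilSurface(&pZBuffer)))
    {
        // pDevice->SetTexture(0, pZBuffer);   // won't compile: SetTexture() takes an
        //                                     // IDirect3DBaseTexture9*, not a surface
        pZBuffer->Release();
    }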

You can't.

On nVidia hardware, you can create a depth texture, assign it as your depth buffer, render, then later assign that texture to a sampler. When you do a projected read from the texture, the hardware compares the z/w value stored in the texture with the z/w from the projected lookup, does PCF filtering over several neighbouring pixels, and returns a value that is usually 0 or 1 (not always exactly 0 or 1, because of the filtering). This works on GeForce3 and up (note that a GF4MX is not really "up" from a 3... I'm not sure whether they added support on those).
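
If it helps, the rough shape of that path looks like this (a sketch only; pDevice, width, height and pOriginalZBuffer are placeholders, and whether a format like D3DFMT_D24S8 can be created as a texture is up to the driver, so query it with CheckDeviceFormat first):

    // Create a depth texture (only where the driver exposes it, e.g. GeForce3 and up).
    IDirect3DTexture9* pDepthTex  = NULL;
    IDirect3DSurface9* pDepthSurf = NULL;
    pDevice->CreateTexture(width, height, 1,
                           D3DUSAGE_DEPTHSTENCIL, D3DFMT_D24S8,
                           D3DPOOL_DEFAULT, &pDepthTex, NULL);
    pDepthTex->GetSurfaceLevel(0, &pDepthSurf);

    // Pass 1: render the scene with the texture's surface as the Z buffer.
    pDevice->SetDepthStencilSurface(pDepthSurf);
    // ... draw scene ...

    // Pass 2: restore the normal Z buffer and sample the depth texture.
    // A projected lookup (tex2Dproj in the shader) returns the PCF-compared result.
    pDevice->SetDepthStencilSurface(pOriginalZBuffer);
    pDevice->SetTexture(0, pDepthTex);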

I believe ATI has a different method of creating depth textures, and reading those gives you back the depth value itself, not a filtered set of comparisons. The ATI SDK would be the best place to look that up; I've never looked into it.

The common, card-agnostic, modern way is to create a single-channel float surface, like R16F or R32F, bind it as a second render target, and output your color and a depth value from your shader... but this needs more modern hardware.
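
A rough sketch of that setup (placeholder names again; check D3DCAPS9::NumSimultaneousRTs and R32F render-target support with CheckDeviceFormat before relying on it):

    // Single-channel float texture to receive depth written by the pixel shader.
    IDirect3DTexture9* pDepthRT     = NULL;
    IDirect3DSurface9* pDepthRTSurf = NULL;
    pDevice->CreateTexture(width, height, 1,
                           D3DUSAGE_RENDERTARGET, D3DFMT_R32F,
                           D3DPOOL_DEFAULT, &pDepthRT, NULL);
    pDepthRT->GetSurfaceLevel(0, &pDepthRTSurf);

    // Bind it as the second render target; the pixel shader then writes its
    // normal color to COLOR0 and a depth value (e.g. z/w) to COLOR1.
    pDevice->SetRenderTarget(1, pDepthRTSurf);
    // ... draw scene ...
    pDevice->SetRenderTarget(1, NULL);   // unbind the extra target

    // The depth is now available as an ordinary texture for later passes.
    pDevice->SetTexture(0, pDepthRT);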

Namethatnobodyelsetook-

Thanks. I had a suspicion that was the case. I was just looking for a more efficient way than rendering to a special texture, since all I am looking to store is already in the Z buffer.

I'm quite familiar with rendering to textures, so I guess I'll do it that way.

-programmer_tom
