Simple depth stencil buffer question


Here's a simple question that I can't find documentation on. In DirectX 9, can a depth stencil buffer be created in system memory and used by a video card that does not natively support the buffer's format? For example, I'd like to use a D3DFMT_D24S8 buffer in software when my hardware doesn't support that format, while still using the hardware for just about everything else. Or is what I'm asking just not feasible?
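
To make the question concrete, here's roughly how I'd check whether the hardware supports the format natively in the first place (a minimal sketch; the SupportsD24S8 helper name is just illustrative):

    #include <d3d9.h>

    // Ask the runtime whether the HAL device can use D3DFMT_D24S8 as a
    // depth-stencil surface under the given display-mode format.
    bool SupportsD24S8(IDirect3D9* pD3D, D3DFORMAT adapterFormat)
    {
        HRESULT hr = pD3D->CheckDeviceFormat(
            D3DADAPTER_DEFAULT,      // primary adapter
            D3DDEVTYPE_HAL,          // hardware device
            adapterFormat,           // e.g. D3DFMT_X8R8G8B8
            D3DUSAGE_DEPTHSTENCIL,   // intended usage
            D3DRTYPE_SURFACE,        // depth-stencils are surfaces
            D3DFMT_D24S8);           // the format in question
        return SUCCEEDED(hr);
    }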

Quote:
Or is what I'm asking just not feasible?

Nope, it won't happen. Not in hardware, anyway; the reference rasterizer (refrast) supports all formats and runs entirely on the CPU, so its depth-stencil technically is in system memory...
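
If you really wanted the whole pipeline in software, the closest thing would be creating the device as a refrast device in the first place. A minimal sketch, assuming pD3D and hWnd already exist (and expect it to be dog slow):

    D3DPRESENT_PARAMETERS pp = {0};
    pp.Windowed               = TRUE;
    pp.SwapEffect             = D3DSWAPEFFECT_DISCARD;
    pp.EnableAutoDepthStencil = TRUE;
    pp.AutoDepthStencilFormat = D3DFMT_D24S8;  // refrast accepts any documented format

    IDirect3DDevice9* pDevice = NULL;
    HRESULT hr = pD3D->CreateDevice(
        D3DADAPTER_DEFAULT,
        D3DDEVTYPE_REF,              // reference rasterizer, not D3DDEVTYPE_HAL
        hWnd,
        D3DCREATE_SOFTWARE_VERTEXPROCESSING,
        &pp,
        &pDevice);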

I'm sure that if you were clever you could implement this sort of feature (for example, using a pixel shader to write depth/stencil values to a render target), but that would just be silly.
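
Just to show what I mean, a rough sketch of that idea (the R32F target and the tiny shader are illustrative only):

    // Illustrative only: write depth from the pixel shader into a float
    // render target instead of using a real depth-stencil surface.
    const char* depthPS =
        "float4 main(float depth : TEXCOORD0) : COLOR0\n"
        "{\n"
        "    return float4(depth, 0.0f, 0.0f, 1.0f); // depth in the red channel\n"
        "}\n";

    // The 'depth buffer' is then just an ordinary render target:
    // pDevice->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET,
    //                        D3DFMT_R32F, D3DPOOL_DEFAULT, &pDepthTex, NULL);

You'd still have to do the depth comparison yourself somehow, which is exactly the part the hardware normally does for free, hence "silly".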

At a guess, any hardware fast enough to emulate this on successfully would probably support the format in hardware anyway, thus making it pointless.

Also, most modern hardware is heavily optimized for depth/stencil buffer reads and writes (e.g. ATI's Hyper-Z technology), so moving the buffer to software would throw those fast paths away too.

Just out of curiosity, why do you want to do this?

hth
Jack

There was no specific reason this came up, actually, but the more I thought about it, the more it didn't make sense. Keeping the depth/stencil buffer in software for the hardware to use kind of defeats its purpose.
