

Z-Buffering and Color Depth



This isn't so much a problem as something that really confuses me. I have a DX8.1 program that renders some .x files. I recently started it up after not running it for a while and found that the depth buffering was working a lot worse than before: there was a lot of z-fighting, etc. After a little while I realized that the only difference was that I had recently switched to 16-bit color and forgotten to switch back to 32-bit. When I switched back, it worked fine again. My questions are: why would pixel color depth have an effect on z-buffering, and how can I take this into account when designing DX applications so that they work the same on both 16- and 32-bit settings? Thanks.

- Are you sure it's 16-bit color, not a 16-bit z-buffer?
- Perhaps your card doesn't support 32-bit z-buffering with a 16-bit color depth? Check with CapsViewer.


Muhammad Haggag,
Optimize
Bitwise account: MHaggag

Yeah, I'm sure, because the first thing I did when it wasn't working was make sure none of my z-buffer settings had changed. The problem came when I switched color depths.

Some NVIDIA cards limit the depth buffer to the same precision as the render target. Perhaps you're using one of those, and the driver "fixes" things so the app still runs.

At any rate, you should test that the depth buffer format is compatible with your backbuffer format and react accordingly.


Stay Casual,

Ken
Drunken Hyena
