EbonySeraphim

Texture Format vs Back Buffer Format


Recommended Posts

I was wondering what the performance impact would be if I used a 24-bit texture (D3DFMT_R8G8B8) while rendering to a 32-bit back buffer with alpha blending enabled. I'd like to load 24-bit textures because most of our game assets won't use alpha, so I want to keep them small, but if there's a performance hit in doing this I'd rather not. Assuming performance does drop with a 24-bit texture, would there be any gain from using D3DFMT_X8R8G8B8 over D3DFMT_A8R8G8B8 with the alpha channel filled with 0xFF?

There used to be a huge perf hit when using 24-bit textures. That hit no longer exists because every recent card has dropped support for 24-bit textures, and the D3DXCreateTextureFromFile functions silently promote those files to 32-bit. A8 and X8 should have the same performance characteristics, even if you don't fill the alpha channel with 0xFF.
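
If you want to verify this on your own hardware, here's a minimal sketch (assuming you already have a valid IDirect3D9/IDirect3DDevice9 pair; the file name "diffuse.png" is made up for illustration). It checks whether the adapter even exposes D3DFMT_R8G8B8 for textures, then loads a file through D3DX and inspects the format it actually chose:

#include <d3d9.h>
#include <d3dx9.h>

void InspectTextureFormat(IDirect3D9* pD3D, IDirect3DDevice9* pDevice)
{
    // Most modern adapters fail this check for 24-bit textures.
    HRESULT hr = pD3D->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        0, D3DRTYPE_TEXTURE, D3DFMT_R8G8B8);
    if (FAILED(hr))
    {
        // No native 24-bit texture support; D3DX will promote on load.
    }

    // D3DX silently falls back to a supported format when loading a file;
    // query the top mip level to see what it actually picked.
    IDirect3DTexture9* pTexture = NULL;
    if (SUCCEEDED(D3DXCreateTextureFromFile(pDevice, "diffuse.png", &pTexture)))
    {
        D3DSURFACE_DESC desc;
        pTexture->GetLevelDesc(0, &desc);
        // desc.Format will typically come back as D3DFMT_X8R8G8B8 or
        // D3DFMT_A8R8G8B8 even if the source file stored 24-bit pixels.
        pTexture->Release();
    }
}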

Creating DDS files with 24-bit textures inside rather than 32-bit will cause loads to fail (I think D3DX used to do the 32-bit conversion, but not anymore. Or maybe it only dies on Xbox and the PC converts. Regardless, don't make 24-bit DDS files).
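
If you'd rather be explicit than rely on the promotion, a sketch along these lines (the asset path "opaque_asset.dds" is hypothetical) requests a 32-bit no-alpha format at load time, which also sidesteps the 24-bit DDS problem for assets authored without alpha:

IDirect3DTexture9* LoadOpaqueTexture(IDirect3DDevice9* pDevice)
{
    IDirect3DTexture9* pTexture = NULL;
    D3DXCreateTextureFromFileEx(
        pDevice, "opaque_asset.dds",   // hypothetical asset path
        D3DX_DEFAULT, D3DX_DEFAULT,    // take width/height from the file
        D3DX_DEFAULT,                  // build a full mip chain
        0,                             // no special usage
        D3DFMT_X8R8G8B8,               // force 32-bit; alpha byte unused
        D3DPOOL_MANAGED,
        D3DX_DEFAULT, D3DX_DEFAULT,    // default image and mip filters
        0,                             // no color key
        NULL, NULL,                    // no image info, no palette
        &pTexture);
    return pTexture;                   // NULL if the load failed
}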
