GDW

A8R8G8B8 or X8R8G8B8?


I keep reading the documentation over and over, but I can't understand what the difference is between A8R8G8B8 and X8R8G8B8.

If the format is for a texture or surface, A8R8G8B8 means the surface can carry alpha values, which is useful for color keying and blending.

X8R8G8B8 uses the same amount of memory, but 25% of it (the top byte of each pixel) stays unused.

For a backbuffer this doesn't make any difference.
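
For example, color keying through D3DX only works when the target format can store the alpha it produces. A rough, untested sketch (device stands in for a valid IDirect3DDevice9*, and "sprite.bmp" is a made-up file name):

#include <d3d9.h>
#include <d3dx9.h>

// D3DX replaces every pixel matching the color key with transparent black,
// so the texture format must be able to store that alpha (A8R8G8B8 here).
IDirect3DTexture9* sprite = NULL;
D3DXCreateTextureFromFileEx(device, "sprite.bmp",
    D3DX_DEFAULT, D3DX_DEFAULT, D3DX_DEFAULT, 0,
    D3DFMT_A8R8G8B8, D3DPOOL_MANAGED,
    D3DX_FILTER_NONE, D3DX_FILTER_NONE,
    0xFFFF00FF, // treat opaque magenta as transparent
    NULL, NULL, &sprite);

Ask for X8R8G8B8 instead and there is nowhere to put the transparency.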


Notice: 25% unused may sound wasteful, but it means a pixel is 32 bits wide, which fits today's architectures much better than 24 bits. Most cards offer neither 24-bit acceleration nor display modes for it.
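
To make the layout concrete, a quick sketch (PackARGB is a made-up helper, not a D3D function): both formats store each pixel as one 32-bit value laid out 0xAARRGGBB, and X8R8G8B8 simply ignores the top byte.

#include <windows.h> // DWORD, BYTE

// Hypothetical helper: pack four channel bytes into one 0xAARRGGBB pixel.
DWORD PackARGB(BYTE a, BYTE r, BYTE g, BYTE b)
{
    return (DWORD(a) << 24) | (DWORD(r) << 16) | (DWORD(g) << 8) | DWORD(b);
}

// In D3DFMT_A8R8G8B8 the top byte is real alpha; in D3DFMT_X8R8G8B8 the same
// byte is the unused "X" and its contents are ignored.
DWORD halfTransparentRed = PackARGB(128, 255, 0, 0); // 0x80FF0000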

Quote: "For a backbuffer this doesn't make any difference."
I don't like to nit-pick, but you mean front buffer, right? [smile]

An alpha channel in the render target/back buffer allows the use of destination alpha during blend ops. However, everything is eventually resolved into the front buffer, at which point the alpha isn't of any use anymore...

Check out the D3DFORMAT documentation for details - A8R8G8B8 is valid for back buffers, but not for front buffers.
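
Roughly like this (an untested sketch; pp and device are placeholders for your presentation parameters and a valid IDirect3DDevice9*):

#include <d3d9.h>

// Ask for a back buffer that actually stores alpha; the displayed front
// buffer will still be an X8R8G8B8-style mode.
D3DPRESENT_PARAMETERS pp = {};
pp.BackBufferFormat = D3DFMT_A8R8G8B8;

// Destination alpha as a blend factor. This only works because the
// render target keeps its alpha channel around between draws.
device->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
device->SetRenderState(D3DRS_SRCBLEND,  D3DBLEND_DESTALPHA);
device->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVDESTALPHA);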

hth
Jack
