Beginning D3D9 BackBuffer Format question

Stoic

Hi All - I'm working on converting my previously OpenGL-based engine to DX9, and I'm a little confused by the backbuffer formats. It looks like what I want is D3DFMT_A8R8G8B8. This sounds like standard 32-bit color with an alpha channel (like a standard PFD_RGBA pixel format in OGL). However, it doesn't seem like I can use this format for fullscreen mode.

The tutorials I've been following all seem to use D3DFMT_X8R8G8B8. This format doesn't include an alpha channel, so will it allow alpha blending? In windowed mode, it seems like you should just use whatever format the desktop is set to (off the D3DDISPLAYMODE), which more or less makes sense. But which format should I use if I want alpha blending in fullscreen? Can someone explain this? Sorry for the newbie question...

You can use any of the formats listed as backbuffer formats on the D3DFORMAT help page, provided your card supports them.

For alpha blending, you typically have A8R8G8B8 textures and use the alpha from the texture. You only need alpha in the backbuffer if you're a) storing something extra in that channel, or b) using a blending mode that refers to DESTALPHA (i.e., lay down alpha in pass 1, then use that alpha in pass 2).
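The usual setup looks something like this (a sketch, assuming a valid IDirect3DDevice9* named device and a texture bound at stage 0 - the names are placeholders, not from the post above):

```cpp
// Enable blending and take the blend factor from the source alpha.
device->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
device->SetRenderState(D3DRS_SRCBLEND,  D3DBLEND_SRCALPHA);
device->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);

// Pull alpha from the texture (modulated with the diffuse color).
device->SetTextureStageState(0, D3DTSS_ALPHAOP,   D3DTOP_MODULATE);
device->SetTextureStageState(0, D3DTSS_ALPHAARG1, D3DTA_TEXTURE);
device->SetTextureStageState(0, D3DTSS_ALPHAARG2, D3DTA_DIFFUSE);
```

Note that nothing here reads the backbuffer's alpha channel; only a DESTALPHA-style blend factor (D3DBLEND_DESTALPHA / D3DBLEND_INVDESTALPHA) would, which is why an X8R8G8B8 backbuffer blends fine.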

For windowed mode, you can use any backbuffer format, and it will be converted during Present() to the appropriate display format. If you only require 16-bit color, it's likely faster to use a 16-bit backbuffer: you get a speed boost during all drawing, and only a minor speed hit during Present.
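As a rough sketch (untested; assuming d3d is your IDirect3D9* from Direct3DCreate9 and hWnd is your window handle - those names are mine):

```cpp
// Windowed mode: match the current desktop format.
D3DDISPLAYMODE mode;
d3d->GetAdapterDisplayMode(D3DADAPTER_DEFAULT, &mode);

D3DPRESENT_PARAMETERS pp = {};
pp.Windowed         = TRUE;
pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
pp.BackBufferFormat = mode.Format;  // or a 16-bit format; Present() converts
pp.hDeviceWindow    = hWnd;
```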

If you'd like 32 bits but can make do with 16 if that's all that will be displayed, use the desktop format. And feel free to tack on alpha (i.e., A8R8G8B8) if you need it, even though the desktop format won't include it.

If you're doing many passes, for complex lighting for example, a 16-bit backbuffer may not have the precision you'd like, as you're losing quite a bit of data each pass. Use 32 bits to get more accurate blending, then let Present convert it to 16 bits.

When calling CheckDeviceType with an A8R8G8B8 back buffer, you can't specify A8R8G8B8 as the display mode too. The display mode is X8R8G8B8, as monitors don't have alpha. You never actually tell D3D to use X8R8G8B8 anywhere else; it's just implied. It's annoying that you have to manually strip out the alpha to call CheckDeviceType. What's more annoying is that an exception is made for A2R10G10B10, as there is no X2 variety of that format.
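For example, to verify an A8R8G8B8 back buffer for fullscreen you'd pass the alpha-stripped format as the display format (a sketch, again assuming d3d is your IDirect3D9*):

```cpp
HRESULT hr = d3d->CheckDeviceType(
    D3DADAPTER_DEFAULT,
    D3DDEVTYPE_HAL,
    D3DFMT_X8R8G8B8,  // display mode: alpha manually stripped
    D3DFMT_A8R8G8B8,  // back buffer keeps its alpha
    FALSE);           // FALSE = fullscreen
if (SUCCEEDED(hr)) {
    // An A8R8G8B8 back buffer works fullscreen on this adapter.
}
```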
