D3DFMT_A8R8G8B8 == D3DFMT_X8R8G8B8

Started by Jiia
7 comments, last by Jiia 19 years, 11 months ago
Extreme newbie type question. I'm having trouble understanding the difference between these two modes. What's up with the "X"? Is that a killed byte or what? Are these modes interchangeable when setting the display mode? Are they only different when creating textures? Can I set the primary display mode to D3DFMT_X8R8G8B8 and load textures onto a D3DFMT_A8R8G8B8 surface? I'm in the middle of trying to enumerate display modes, and I'm not sure if I should treat these as the same mode or not. Thanks for any help!
D3DFMT_A8R8G8B8 is a 32-bit ARGB colour format that uses 8 bits each for the (A)lpha, (R)ed, (G)reen and (B)lue channels.

D3DFMT_X8R8G8B8 is a 32-bit RGB colour format that uses 8 bits each for the (R)ed, (G)reen and (B)lue channels. Alpha is unspecified, hence the 'X'.
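
For illustration, a minimal sketch (D3DCOLOR_ARGB and D3DCOLOR_XRGB are the stock macros from d3d9types.h, nothing specific to this thread); both formats pack a pixel into a single 32-bit DWORD:

DWORD halfAlphaRed = D3DCOLOR_ARGB(128, 255, 0, 0);  // A byte carries real alpha
DWORD opaqueRed    = D3DCOLOR_XRGB(255, 0, 0);       // X byte is simply written as 0xFF and ignored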

Regards,
Sharky




---
< SharkyUK.com & Sharky's Coding Corner >
< Professional Engine Programmer - Strangelite Studio >
---
"X" byte keeps the pixels aligned to 32 bits, which is good for the memory transaction speed.

-Nik

Niko Suni

I kind of figured that, but I'm still not sure of the difference in the formats. Both are 32 bits, so what difference does it make if that byte is used for alpha or not? Why even have an "X" mode? Are there video cards that support the X mode but don't support the alpha mode? Why in the hell would they do that? Why would the primary display mode need alpha at all? What is it going to let you see through to? There's nothing behind it!

If a game wanted to use a 32-bit mode but didn't require alpha, couldn't they just set the alpha mode anyway and ignore its byte? What is the difference in what I can do graphically if I use these modes for the primary and back buffers?

And what is the difference in what I can do likewise with textures?

I appreciate any help with this, and thanks for the attempts so far.
You're about to run into the difference between AdapterDisplayMode and BackBufferMode.

If you create your device as A8R8G8B8 or X8R8G8B8, the display mode will be X8R8G8B8 (you can't display alpha on a CRT).

If you create your device as A8R8G8B8 you will have alpha bits in your backbuffer. If you use X8R8G8B8, you won''t. This is useful for rendering two passes where the amount of mixing is determined by the first pass. This is also known as destination alpha. Only newer cards support it, apparently. This difference comes up solely with CheckDeviceFormat() I think. Nothing else, anywhere, makes some weird display mode distinction.

The same holds true with X1R5G5B5 and A1R5G5B5. So, to answer your question, yes, some cards may support the X version, and not the A version... some very old cards.
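
To make the backbuffer-format part concrete, here is a rough sketch of device creation in C++ (g_pD3D, hWnd and the 800x600 size are placeholders set up elsewhere, not anything from this thread):

D3DPRESENT_PARAMETERS pp;
ZeroMemory(&pp, sizeof(pp));
pp.Windowed         = TRUE;
pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
pp.BackBufferWidth  = 800;
pp.BackBufferHeight = 600;
pp.BackBufferFormat = D3DFMT_X8R8G8B8;   // use D3DFMT_A8R8G8B8 here if you need destination alpha

IDirect3DDevice9* pDevice = NULL;
HRESULT hr = g_pD3D->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                                  D3DCREATE_SOFTWARE_VERTEXPROCESSING,
                                  &pp, &pDevice);
// Either way, the adapter display mode itself stays X8R8G8B8.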

When reading a texture without alpha, the alpha will be read as 255 (or 1.0f).
Furthermore, it is theoretically possible that the device will be able to optimize the rasterizer path to ignore the alpha interpolator if no alpha channel is present. This is unlikely, however, as the alpha is transported in parallel with the color data anyway.

Note that it is inefficient to actually skip writing the alpha byte, as IA32 generally wants 32-bit alignment for best performance throughout the entire machine. So just write something to the fourth byte, even if it doesn't matter.
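
As a small sketch of what that means in practice (pTexture, width and height are hypothetical, and the texture is assumed to be a dynamic D3DFMT_X8R8G8B8 one), write a full 32-bit value per pixel with 0xFF in the unused byte:

DWORD red = 255, green = 64, blue = 0;
D3DLOCKED_RECT lr;
if (SUCCEEDED(pTexture->LockRect(0, &lr, NULL, D3DLOCK_DISCARD)))
{
    for (UINT y = 0; y < height; ++y)
    {
        DWORD* row = (DWORD*)((BYTE*)lr.pBits + y * lr.Pitch);
        for (UINT x = 0; x < width; ++x)
            row[x] = 0xFF000000 | (red << 16) | (green << 8) | blue;  // one aligned 32-bit store per pixel
    }
    pTexture->UnlockRect(0);
}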

-Nik

Niko Suni

Thanks, that helps explain a lot.

So generally, you always set the primary display and back buffer to D3DFMT_X8R8G8B8?

And so if I use D3DFMT_X8R8G8B8, my textures will still support alpha blending?

quote:When reading a texture without alpha, the alpha will be read as 255 (or 1.0f).

I'm using the X helper functions to load textures, so I'm only assuming they set the format of the textures to match the image I'm loading. Is that how it works?

I'm going to ask another related, also completely newbie, question (I'm totally full of them). Is it better to use textures in the format of the primary display, or does it matter in 3D? What if an image is saved in 32-bit mode, but the order of color data in each pixel differs from the primary display? Does D3DX convert it automatically? Would it also convert a 24-bit image to 32-bit to match the primary display? 24 to 16? If not, I can't use the *.bmp format without manually loading and converting them myself.

I really appreciate any help. I'm in a deep hole with very little light visible.

[edited by - Jiia on May 8, 2004 11:56:09 AM]
quote:Original post by Jiia

So generally, you always set the primary display and back buffer to D3DFMT_X8R8G8B8?

It is better to enumerate the supported modes of the target machine than to hard-code the format. I know a few people who test their code using Voodoo 3s as their low-end cards.
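
A rough sketch of that enumeration (g_pD3D is assumed to be a valid IDirect3D9* obtained from Direct3DCreate9):

UINT count = g_pD3D->GetAdapterModeCount(D3DADAPTER_DEFAULT, D3DFMT_X8R8G8B8);
for (UINT i = 0; i < count; ++i)
{
    D3DDISPLAYMODE mode;
    if (SUCCEEDED(g_pD3D->EnumAdapterModes(D3DADAPTER_DEFAULT, D3DFMT_X8R8G8B8, i, &mode)))
    {
        // mode.Width, mode.Height and mode.RefreshRate describe one supported mode.
        // Display modes only ever enumerate as X8R8G8B8, never as A8R8G8B8.
    }
}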

quote:
And so if I use D3DFMT_X8R8G8B8, my textures will still support alpha blending?


Yes, if the hardware supports that (and it generally does). This can also be verified at run-time.
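
For example (a sketch, assuming pDevice is the device you created), the blending caps can be checked like this:

D3DCAPS9 caps;
pDevice->GetDeviceCaps(&caps);

// Can the card do ordinary source-alpha blending?
bool canAlphaBlend = (caps.SrcBlendCaps  & D3DPBLENDCAPS_SRCALPHA) &&
                     (caps.DestBlendCaps & D3DPBLENDCAPS_INVSRCALPHA);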

quote:
I'm using the X helper functions to load textures, so I'm only assuming they set the format of the textures to match the image I'm loading. Is that how it works?


There are *Ex versions of the texture loading functions that let you specify the format you want the textures loaded in.
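
D3DXCreateTextureFromFileEx is one of them; a sketch (the file name and variable names are just placeholders):

IDirect3DTexture9* pTexture = NULL;
HRESULT hr = D3DXCreateTextureFromFileEx(
    pDevice, "texture.bmp",
    D3DX_DEFAULT, D3DX_DEFAULT,   // width and height taken from the file
    D3DX_DEFAULT,                 // full mip chain
    0,                            // usage
    D3DFMT_A8R8G8B8,              // the format you want, regardless of the file's own
    D3DPOOL_MANAGED,
    D3DX_FILTER_NONE,             // image filter
    D3DX_FILTER_BOX,              // mip filter
    0,                            // no colour key
    NULL, NULL,                   // no image info or palette needed
    &pTexture);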

quote:
I'm going to ask another related, also completely newbie, question (I'm totally full of them). Is it better to use textures in the format of the primary display, or does it matter in 3D? What if an image is saved in 32-bit mode, but the order of color data in each pixel differs from the primary display? Does D3DX convert it automatically? Would it also convert a 24-bit image to 32-bit to match the primary display? 24 to 16? If not, I can't use the *.bmp format without manually loading and converting them myself.


It is generally more efficient to use the same format throughout the entire rendering. However, on modern cards this will not affect performance nearly as much as the video memory you can save with less-accurate formats, if at all.

The D3DX loader handles a great many color conversions gracefully, if needed. The texture format is usually independent of the display format, but some devices may place restrictions on their compatibility. The restrictions can also be determined at run-time.
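
One way to check such a restriction at run-time (a sketch; g_pD3D and adapterFormat are assumptions, adapterFormat being whatever display format you picked):

HRESULT hr = g_pD3D->CheckDeviceFormat(
    D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
    adapterFormat,        // e.g. D3DFMT_X8R8G8B8, the current display format
    0,                    // no special usage
    D3DRTYPE_TEXTURE,
    D3DFMT_A8R8G8B8);     // the texture format you want to use

bool textureFormatOk = SUCCEEDED(hr);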

-Nik

Niko Suni

That answered everything (which was quite a lot!), thanks!

