RGB is actually BGR...!

Why is it that in DirectX, RGB is actually taken as BGR (textures, vertex data, etc.)? And vertex colors declared as ARGB (in the docs) are actually stored as BGRA! Even for the texture format D3DFMT_R8G8B8, the colors are actually laid out as B8G8R8. This is specified nowhere, at least not in the Microsoft docs. And there is no D3DFMT_B8G8R8, so I have to do the conversion myself! Yet there is a D3DFMT_X8B8G8R8... so wtf? Microsoft is killing me with their formats...
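To show exactly what I mean, here's a minimal standalone sketch (plain C++, no D3D headers needed; the packing just mirrors what the D3DCOLOR_ARGB macro in d3d9types.h does). On my little-endian x86 machine the bytes come out B, G, R, A:

```cpp
#include <cstdint>
#include <cstdio>

int main()
{
    // Pack opaque red the way D3DCOLOR_ARGB does:
    // A in bits 24-31, R in 16-23, G in 8-15, B in 0-7.
    std::uint32_t argb = (0xFFu << 24) | (0xFFu << 16); // A=FF R=FF G=00 B=00

    // On a little-endian x86 machine the low byte sits at the lowest
    // address, so walking memory byte by byte yields B, G, R, A.
    const unsigned char* p = reinterpret_cast<const unsigned char*>(&argb);
    std::printf("%02X %02X %02X %02X\n", p[0], p[1], p[2], p[3]); // 00 00 FF FF
    return 0;
}
```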
You actually expected a Microsoft product to make sense? ;)
There's no point in trying to understand it. I always get confused with ARGB, BGRA, ABGR, etc. even after so many years of working with texture formats.

Why is there no point? Because it depends on machine endianness. If you change the OS (or worse, the CPU architecture), what was previously treated as ARGB is now BGRA (it's flipped), because one machine is little-endian and the other is big-endian.

My advice: write some test apps and see how it looks. And never take anything for granted; it may change in the future. The only guarantee here is that if one texture is "ARGB" and another is "ABGR", they will always be different (to convert between them in this example, you reverse the last three components, i.e. swap R and B), no matter what OS/PC you're on.
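To make that swap concrete, here's a minimal sketch in plain C++ (the helper name is mine, not anything from D3D):

```cpp
#include <cstdint>

// Convert a packed 32-bit ARGB value to ABGR by swapping the R and B
// channels. A and G already sit in the same bit positions in both layouts.
std::uint32_t ArgbToAbgr(std::uint32_t argb)
{
    std::uint32_t ag = argb & 0xFF00FF00u;   // keep A (bits 24-31) and G (bits 8-15)
    std::uint32_t r  = (argb >> 16) & 0xFFu; // old R
    std::uint32_t b  = argb & 0xFFu;         // old B
    return ag | (b << 16) | r;               // B moves up, R moves down
}
```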

Hope this helps
Dark Sylinc
Actually, all D3D formats are named such that, when read at the appropriate word size, the bits are where you'd expect. I think it's not actually mentioned in the docs, but it is in D3D9types.h (just press F12 with your cursor on a format name and Visual Studio should whisk you away to the appropriate header).

If you read A8R8G8B8 as a DWORD, A will be in bits 24-31, R in 16-23, G in 8-15, and B in 0-7. A1R5G5B5 will have A in bit 15 if you read it as a WORD. If you read A8R8G8B8 as individual bytes, you'll get blue first because of the endianness of x86 processors. But not every format can be read element-by-element as bytes: formats such as A4L4, A1R5G5B5, A4R4G4B4, A2R10G10B10, and 16- and 32-bit single-channel formats like L16 or R32F only make sense when read at the appropriate word size.
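In code, the word-size approach looks something like this (a sketch; the helper names are mine). Notice that with shifts and masks the endianness question never comes up:

```cpp
#include <cstdint>

// A8R8G8B8, read as a 32-bit DWORD:
void UnpackA8R8G8B8(std::uint32_t texel, std::uint8_t& a, std::uint8_t& r,
                    std::uint8_t& g, std::uint8_t& b)
{
    a = (texel >> 24) & 0xFF;  // bits 24-31
    r = (texel >> 16) & 0xFF;  // bits 16-23
    g = (texel >>  8) & 0xFF;  // bits 8-15
    b =  texel        & 0xFF;  // bits 0-7
}

// A1R5G5B5, read as a 16-bit WORD -- note it can't be split into bytes:
void UnpackA1R5G5B5(std::uint16_t texel, std::uint8_t& a, std::uint8_t& r,
                    std::uint8_t& g, std::uint8_t& b)
{
    a = (texel >> 15) & 0x01;  // bit 15
    r = (texel >> 10) & 0x1F;  // bits 10-14
    g = (texel >>  5) & 0x1F;  // bits 5-9
    b =  texel        & 0x1F;  // bits 0-4
}
```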

No card currently in existence that I know of actually supports R8G8B8, and D3DX silently promotes it to X8R8G8B8. If you use the creation calls on the D3D device itself, rather than D3DX, they'll simply fail (except in the scratch pool). Since nothing supports 24-bit R8G8B8, why add a 24-bit B8G8R8 that nothing would support either?
To further add to your confusion, D3D10 has changed the ordering, so BGRA in D3D9 is represented as ARGB in D3D10! If you're writing an application that targets both of these APIs you've got to take that into consideration. Not to mention that OpenGL's ordering is also different from that of D3D9.

@Namethatnobodyelsetook,
What's your take on D3D10's ordering of color components? How is it justified? I guess it's been ordered so that, on a little-endian machine, the components are encountered one by one as you move to higher memory addresses. Right?
I haven't really looked into D3D10 yet, so I can't comment on its bit/byte ordering. Sorry.
Quote:Original post by Ashkan
To further add to your confusion, D3D10 has changed the ordering, so BGRA in D3D9 is represented as ARGB in D3D10!
Are you sure?? I don't remember such a semantic change the last time I was playing around with D3D10 [oh]

Jack

Jack Hoxley [ Forum FAQ | Revised FAQ | MVP Profile | Developer Journal ]

If you look at the code sample on the ID3D10Texture1D::Map doc page, you'll see that if you get a pointer to the bits of an RGBA texture, you walk the memory in A, B, G, R order for the same reasons that Namethatnobodyelsetook stated.
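The pattern is roughly the following (a sketch under stated assumptions, not the doc page's exact code; it assumes tex is an ID3D10Texture1D created with D3D10_USAGE_STAGING and D3D10_CPU_ACCESS_READ so the CPU can read it):

```cpp
void* data = nullptr;
if (SUCCEEDED(tex->Map(0, D3D10_MAP_READ, 0, &data)))
{
    const unsigned char* bytes = static_cast<const unsigned char*>(data);
    // bytes[0], bytes[1], ... walk the texels in memory order, which is
    // exactly where the endianness discussion above comes into play.
    tex->Unmap(0);
}
```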

neneboricua
Quote:Original post by jollyjeffers
Quote:Original post by Ashkan
To further add to your confusion, D3D10 has changed the ordering, so BGRA in D3D9 is represented as ARGB in D3D10!
Are you sure?? I don't remember such a semantic change the last time I was playing around with D3D10 [oh]

Jack


Here

The enumerations are definitely reversed in order, which can be misleading. I took that as a sign that the underlying bit order had changed too, but on second thought I might be wrong. So why did they change the ordering in the enums? You're the expert, after all. Would someone shed some light on this issue, please?
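To illustrate what I mean by "reversed", here's a sketch assuming each format's documented bit layout (the helper functions and their names are mine, not API calls):

```cpp
#include <cstdint>

// D3D9 names list components from the high bits of the word down.
// D3DFMT_A8B8G8R8: A = bits 24-31, B = 16-23, G = 8-15, R = 0-7.
std::uint32_t Pack_D3D9_A8B8G8R8(std::uint8_t a, std::uint8_t b,
                                 std::uint8_t g, std::uint8_t r)
{
    return (std::uint32_t(a) << 24) | (std::uint32_t(b) << 16)
         | (std::uint32_t(g) <<  8) |  std::uint32_t(r);
}

// DXGI names list components from the low bits up, which on a
// little-endian machine is also memory order.
// DXGI_FORMAT_R8G8B8A8_UNORM: R = bits 0-7, G = 8-15, B = 16-23, A = 24-31.
std::uint32_t Pack_DXGI_R8G8B8A8(std::uint8_t r, std::uint8_t g,
                                 std::uint8_t b, std::uint8_t a)
{
    return  std::uint32_t(r)        | (std::uint32_t(g) <<  8)
         | (std::uint32_t(b) << 16) | (std::uint32_t(a) << 24);
}

// Both produce identical bits for the same color: in this pairing the
// naming direction flipped between APIs, not the layout itself.
```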
This is like complaining about little vs. big endian the first time you use an Intel vs. a Motorola processor. Just get used to it. It's certainly not something that's difficult to adapt to.

