Pixel Format Encodings?

Started by level10boy · 3 comments, last by level10boy 22 years, 10 months ago
I'm currently making a 2D sprite engine game and have two questions on the tip of my tongue that I just can't find answers for concerning pixel formats.

Q1. When setting the display mode to 16-bit RGB colour, I know that to find out how many bits are used per pixel for the RGB components you check the value of the dwRGBBitCount member of a DDPIXELFORMAT structure. My question is this: if the hardware is using 5.5.5 bit encoding for 16-bit RGB pixels, will dwRGBBitCount return the value 15 or 16?

Q2. When loading 16-bit .bmp files, how can you tell whether the pixels of the bitmap being loaded have been encoded in a 5.5.5 or a 5.6.5 pixel format?

Thankz
level10boy
1. It returns 16. 5.5.5 is still a 16-bit mode; the leftover bit is either unused or used for alpha.

2. You can't, as to my knowledge there are no official specifications for 16-bit BMPs, just 8- or 24-bit. But if there were, I'd bet my car that it would be 5.6.5, as that would give the bitmap more colour while not increasing the file size.
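For what it's worth, here's a minimal sketch of how a loader could at least peek at a 16-bit BMP header. BITMAPFILEHEADER, BITMAPINFOHEADER, BI_RGB and BI_BITFIELDS are the standard Windows names; the function and file names are just illustrative, and the error handling is bare-bones. As far as I can tell, when biCompression is BI_BITFIELDS three colour masks follow the info header and tell you the layout, while a plain BI_RGB 16-bit file is conventionally treated as 5.5.5:

```cpp
// Sketch only: peek at a 16-bit BMP's header to see whether it carries
// explicit colour masks (BI_BITFIELDS) or should be treated as plain 5.5.5.
#include <windows.h>
#include <cstdio>

bool Inspect16BitBmp(const char* fileName)   // name is illustrative
{
    FILE* fp = fopen(fileName, "rb");
    if (!fp) return false;

    BITMAPFILEHEADER bfh;
    BITMAPINFOHEADER bih;
    fread(&bfh, sizeof(bfh), 1, fp);
    fread(&bih, sizeof(bih), 1, fp);

    if (bih.biBitCount == 16)
    {
        if (bih.biCompression == BI_BITFIELDS)
        {
            // Three DWORD masks (red, green, blue) follow the info header.
            DWORD masks[3];
            fread(masks, sizeof(DWORD), 3, fp);
            printf("masks: R=%08lX G=%08lX B=%08lX\n",
                   masks[0], masks[1], masks[2]);
        }
        else
        {
            // A BI_RGB 16-bit bitmap is conventionally X1R5G5B5 (5.5.5).
            printf("no masks stored, assume 5.5.5\n");
        }
    }

    fclose(fp);
    return true;
}
```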
In that case, if the hardware were to use a 5.5.5 pixel encoding and not 5.6.5, how would you detect it?

Thankz
There's another member of the pixel format structure called dwGBitMask.

If it's a 555 card there'll be 5 bits set in it, or 6 for 565.
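To make that concrete, here's a minimal sketch of the check, assuming a DirectDraw 7 surface pointer (the lpSurface parameter and the function name are just for illustration). DDPIXELFORMAT and IDirectDrawSurface7::GetPixelFormat are the real DirectDraw names; counting the set bits in dwGBitMask is what tells the two layouts apart:

```cpp
// Sketch only: query the surface's pixel format and count the bits in the
// green mask to tell 5.5.5 apart from 5.6.5.
#include <windows.h>
#include <ddraw.h>

static int CountBits(DWORD mask)
{
    int count = 0;
    for (; mask != 0; mask >>= 1)
        count += (mask & 1);
    return count;
}

bool IsSurface565(IDirectDrawSurface7* lpSurface)  // e.g. your primary surface
{
    DDPIXELFORMAT ddpf;
    ZeroMemory(&ddpf, sizeof(ddpf));
    ddpf.dwSize = sizeof(ddpf);

    if (FAILED(lpSurface->GetPixelFormat(&ddpf)))
        return false;                       // illustrative error handling

    // dwRGBBitCount reports 16 for both 5.5.5 and 5.6.5;
    // the green mask is what actually distinguishes them.
    return CountBits(ddpf.dwGBitMask) == 6; // 6 bits of green => 5.6.5
}
```

The same bit-counting trick works on the red and blue masks too, and on masks read from a BI_BITFIELDS bitmap, so one helper can cover both the display format and the file format.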
Thanks a lot, I've looked into it and using the masks is definitely the way to go.

