
level10boy

Pixel Format Encodings?


I'm currently making a 2D sprite engine game and have two questions on the tip of my tongue that I just can't find answers for concerning pixel formats.

Q1. When setting the display mode to 16-bit RGB colour, I know that to find out how many bits are used per pixel for the RGB components you check the value of the dwRGBBitCount member of a DDPIXELFORMAT structure. My question is this: if the hardware is using 5.5.5 encoding for 16-bit RGB pixels, will dwRGBBitCount return 15 or 16?

Q2. When loading 16-bit .bmp files, how can you tell whether the pixels of the bitmap being loaded were encoded in a 5.5.5 or a 5.6.5 format?

Thanks,
level10boy

1. It returns 16. 5.5.5 is just a 16-bit mode; the last bit is either unused or used for alpha.
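
In case it helps, here is a minimal sketch of verifying this yourself, assuming you already have an IDirectDrawSurface7 pointer for the primary surface (surface creation and error handling omitted). It queries the pixel format and dumps dwRGBBitCount together with the channel masks:

#include <windows.h>
#include <ddraw.h>
#include <stdio.h>

// Minimal sketch: dump the pixel format of an existing surface.
// pSurface is assumed to be a valid, already-created surface.
void PrintPixelFormat(IDirectDrawSurface7 *pSurface)
{
    DDPIXELFORMAT ddpf;
    ZeroMemory(&ddpf, sizeof(ddpf));
    ddpf.dwSize = sizeof(ddpf);   // DirectDraw rejects the call without this

    if (SUCCEEDED(pSurface->GetPixelFormat(&ddpf)))
    {
        // dwRGBBitCount is the full pixel width: 16 even in a 5.5.5 mode
        printf("bits per pixel: %lu\n", ddpf.dwRGBBitCount);
        printf("R mask 0x%08lX  G mask 0x%08lX  B mask 0x%08lX\n",
               ddpf.dwRBitMask, ddpf.dwGBitMask, ddpf.dwBBitMask);
    }
}

On a 5.5.5 display you would see 16 for the bit count but only five bits set in each mask.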

2. You can't, as to my knowledge there are no official specifications for 16-bit BMPs, just 8 or 24 bit. But if there were, I'd bet my car it would be 5.6.5, as that gives the bitmap more colour without increasing the file size.
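
For what it's worth, the Windows bitmap headers do let biBitCount be 16: with biCompression set to BI_RGB the pixel data is defined to be 5.5.5, and with BI_BITFIELDS three DWORD colour masks follow the BITMAPINFOHEADER, so the green mask tells you which encoding you have. A rough sketch of that check (the Bmp16Format helper is my own name, and it assumes a little-endian machine and the standard Windows header layout):

#include <windows.h>
#include <stdio.h>

// Rough sketch: return 555, 565, or 0 if the file isn't a readable 16-bit BMP.
int Bmp16Format(const char *filename)
{
    FILE *f = fopen(filename, "rb");
    if (!f) return 0;

    BITMAPFILEHEADER bfh;
    BITMAPINFOHEADER bih;
    int result = 0;

    if (fread(&bfh, sizeof(bfh), 1, f) == 1 &&
        fread(&bih, sizeof(bih), 1, f) == 1 &&
        bih.biBitCount == 16)
    {
        if (bih.biCompression == BI_RGB)
        {
            result = 555;                // uncompressed 16bpp defaults to 5.5.5
        }
        else if (bih.biCompression == BI_BITFIELDS)
        {
            DWORD masks[3];              // R, G, B masks follow the info header
            if (fread(masks, sizeof(masks), 1, f) == 1)
                result = (masks[1] == 0x07E0) ? 565 : 555;   // green mask test
        }
    }
    fclose(f);
    return result;
}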

There's another member of the pixel format structure called dwGBitMask.

If it's a 555 card there'll be 5 bits set in it, or 6 for 565.
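
Concretely, a tiny sketch of that test (ddpf here is assumed to have been filled in by a GetPixelFormat call like the one sketched earlier):

#include <windows.h>

// Count the bits set in a channel mask.
int CountBits(DWORD mask)
{
    int n = 0;
    while (mask) { n += (int)(mask & 1); mask >>= 1; }
    return n;
}

// After GetPixelFormat has filled ddpf:
//   CountBits(ddpf.dwGBitMask) == 5  ->  5.5.5
//   CountBits(ddpf.dwGBitMask) == 6  ->  5.6.5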
