Rufe

Converting 24 bit Bitmaps to 16/32 bit Bitmaps

If I'm not mistaken, bitmaps only come in monochrome, 16 colors, 256 colors, and 24-bit color. I'm working on a DirectX game that I would prefer to run in either 16-bit or 32-bit color, so for some time now I have been trying to convert a 24-bit bitmap into... anything that looks right.

I wrote file I/O code to load the bitmap, read the header, and extract the binary image itself. All of the information about the bitmap fills out correctly, and I can see an almost-proper image on the screen (the shapes are right but the colors are blaring and wrong), so I'm almost positive the file I/O is fine and my color depth conversion is what's screwed up. My understanding of the formats:

24-bit color: 8 bits red, 8 bits green, 8 bits blue (rrrrrrrrggggggggbbbbbbbb)
16-bit color: 5 bits red, 6 bits green, 5 bits blue (rrrrrggggggbbbbb)
32-bit color: 8 bits reserved, 8 bits red, 8 bits green, 8 bits blue (xxxxxxxxrrrrrrrrggggggggbbbbbbbb)

I have been leaving the reserved bits = 0. If I'm not mistaken, this should mean I can take a 24-bit bitmap and move r, g, and b into the proper positions on a 32-bit DirectX surface. However, my picture comes out with bad shapes and bad colors. If I instead take the 24-bit bitmap and cut off 3 bits from red and blue and 2 bits from green, I get the proper shapes with the wrong colors. It would seem whacking 2-3 bits off each color is a bad idea that would distort the color, so 32 bits seemed like the solution; however, my 32-bit conversion only screws things up more.

Is it OK to leave the reserved bits = 0? Am I mistaken about the binary layout? Any info would help, including info on the binary layout or simply a better way to do bitmaps. Thanks!

-Alan
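For reference, here is a minimal sketch of the packing described above, assuming the 24-bit pixel data has already been read into memory (the names Pack565 and Pack8888 are made up for illustration, not from any library). Two BMP details worth checking while debugging "right shapes, wrong colors": the file stores each pixel as B, G, R rather than R, G, B, and each row is padded to a multiple of 4 bytes.

    #include <cstdint>

    // 24-bit channels -> 16-bit 5-6-5 (rrrrrggggggbbbbb):
    // keep the top 5/6/5 bits of each 8-bit channel.
    inline uint16_t Pack565(uint8_t r, uint8_t g, uint8_t b)
    {
        return static_cast<uint16_t>(((r >> 3) << 11) |
                                     ((g >> 2) << 5)  |
                                      (b >> 3));
    }

    // 24-bit channels -> 32-bit XRGB (xxxxxxxxrrrrrrrrggggggggbbbbbbbb).
    // Leaving the reserved byte 0 is fine on a plain RGB surface.
    inline uint32_t Pack8888(uint8_t r, uint8_t g, uint8_t b)
    {
        return (static_cast<uint32_t>(r) << 16) |
               (static_cast<uint32_t>(g) << 8)  |
                static_cast<uint32_t>(b);
    }

Note that the truncation in Pack565 only loses the low bits of each channel, which shows up as slight banding, not blaring wrong colors; wildly wrong colors usually point to a byte-order, stride, or casting problem rather than the bit depth itself.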

One thing you should remember is that while a lot of modern (well...modern by computer technology standards) video cards use the 5-6-5 method, there is also the 5-5-5 method. Look up GetPixelFormat() in the DirectX help file for info on how to find out which method a video card uses.
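A sketch of that check, assuming a DirectDraw surface pointer (lpSurface here is just a placeholder name): GetPixelFormat() fills in a DDPIXELFORMAT structure whose bit masks tell you whether a 16-bit surface is 5-6-5 or 5-5-5.

    #include <windows.h>
    #include <ddraw.h>

    // Returns true if a 16-bit surface uses the 5-6-5 layout.
    bool IsSurface565(LPDIRECTDRAWSURFACE7 lpSurface)
    {
        DDPIXELFORMAT ddpf;
        ZeroMemory(&ddpf, sizeof(ddpf));
        ddpf.dwSize = sizeof(ddpf);   // DirectDraw structures require dwSize

        if (FAILED(lpSurface->GetPixelFormat(&ddpf)))
            return false;             // handle the error however suits you

        // 5-6-5 has a 6-bit green mask (0x07E0); 5-5-5 has 0x03E0.
        return ddpf.dwRGBBitCount == 16 && ddpf.dwGBitMask == 0x07E0;
    }

Rather than testing for the two known layouts, you can also compute shift amounts directly from dwRBitMask, dwGBitMask, and dwBBitMask, which handles any RGB format the card reports.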

Problem solved! I was doing some bad type casting on the binary values. Thanks for the info on the 5-5-5, 5-6-5 thing, I was unaware of the difference in video cards!
-Alan
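(The post above doesn't say which cast was at fault, but purely as an illustration for anyone hitting the same symptoms, here is one classic mistake when widening 24-bit pixels to 32-bit; CopyWrong and CopyRight are made-up names.)

    #include <cstdint>
    #include <cstddef>

    // BUG: reading 3-byte pixels through a 32-bit pointer steps 4 bytes
    // per source pixel, so both the shapes and the colors drift.
    void CopyWrong(const uint8_t* src, uint32_t* dst, size_t pixelCount)
    {
        for (size_t i = 0; i < pixelCount; ++i)
            dst[i] = ((const uint32_t*)src)[i];
    }

    // Advance exactly 3 bytes per source pixel and pack explicitly
    // (BMP stores pixels as B, G, R).
    void CopyRight(const uint8_t* src, uint32_t* dst, size_t pixelCount)
    {
        for (size_t i = 0; i < pixelCount; ++i, src += 3)
            dst[i] = (uint32_t(src[2]) << 16) |  // red
                     (uint32_t(src[1]) << 8)  |  // green
                      uint32_t(src[0]);          // blue
    }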
