This topic is now archived and is closed to further replies.


Converting 24bpp to 16bpp


Recommended Posts

Hey, I've got this little(?) problem. First I set up an exclusive fullscreen DirectX display at 24bpp. Then I loaded a 24bpp bitmap into a char* buffer and copied the pixels across by copying three chars at a time. That worked fine, but now I want to change the 24bpp display mode to 16bpp (I'm guessing that will improve performance a lot). How do I convert the 24bpp bitmap (made in Windows Paint) into the 16bpp format of my DirectX back buffer? Thanks in advance! Gr, BoRReL
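For reference, the 24bpp copy described here might look like the sketch below. The function name and parameters are hypothetical; `pitch` stands for the surface's bytes-per-row (the `lPitch` value DirectDraw returns from `Lock()`), which is often larger than `width * 3`:

```cpp
#include <cstdint>

// Hypothetical sketch of the 24bpp copy the poster describes:
// three bytes per pixel, row by row, honoring the surface pitch.
void Blit24(const uint8_t* bmp, uint8_t* surface,
            int width, int height, int pitch)
{
    for (int y = 0; y < height; ++y)
    {
        const uint8_t* srcRow = bmp + y * width * 3;  // bitmap rows are tightly packed
        uint8_t* dstRow = surface + y * pitch;        // surface rows may be padded
        for (int x = 0; x < width; ++x)
        {
            dstRow[x * 3 + 0] = srcRow[x * 3 + 0];
            dstRow[x * 3 + 1] = srcRow[x * 3 + 1];
            dstRow[x * 3 + 2] = srcRow[x * 3 + 2];
        }
    }
}
```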

You need a simple loop over each pixel: split out the RGB components, shift them down to the new bit depth (get the shifts and masks from the DDPIXELFORMAT structure, or whatever your API provides; just remember, they may be anything), then recombine them and you have your 16-bit pixel.
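The loop described above could be sketched like this. The 5-6-5 bit layout is an assumption for illustration; in real code you should derive the shifts and masks from the format your API reports (e.g. `dwRBitMask` and friends in DDPIXELFORMAT), since the back buffer may just as well be 5-5-5:

```cpp
#include <cstdint>
#include <cstddef>

// Convert a 24bpp BGR buffer (byte order as in a Windows BMP) to 16bpp.
// ASSUMPTION: the 16bpp target is R5 G6 B5; query the real masks from
// DDPIXELFORMAT (or your API's equivalent) instead of hard-coding them.
void Convert24To16(const uint8_t* src, uint16_t* dst, size_t pixelCount)
{
    for (size_t i = 0; i < pixelCount; ++i)
    {
        uint8_t b = src[i * 3 + 0];  // BMP pixels store B, G, R
        uint8_t g = src[i * 3 + 1];
        uint8_t r = src[i * 3 + 2];

        // Drop the low bits of each component, then pack as R5 G6 B5.
        dst[i] = static_cast<uint16_t>(((r >> 3) << 11) |
                                       ((g >> 2) << 5)  |
                                        (b >> 3));
    }
}
```

Doing this once at load time (rather than per frame) keeps the blit itself a plain memory copy.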

[edit] Remembered not everyone uses DX [/edit]

Edited by - mr_jrt on June 25, 2001 8:53:51 AM

