Bitmap 16bit

Just convert them to 24-bit using PSP or another editing tool, then convert them to 16-bit at runtime.

Or write a tool to convert them (or use TGA, which comes in many different bit depths, including 16-bit; of course, that means you'll have to rewrite some parts of the GPDUMB engine).
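If you go the runtime route, the conversion loop itself is short. A minimal sketch in C, assuming the target mode is 565 and that the 24-bit data is in BMP's usual BGR byte order (function and buffer names are mine, not from any engine):

```c
#include <stddef.h>
#include <stdint.h>

/* Convert a buffer of 24-bit BGR pixels to 16-bit 565 at load time.
   For 555 you would shift green by 3 and red bits up by 10 instead. */
void convert_bgr24_to_565(const uint8_t *src, uint16_t *dst, size_t npixels)
{
    for (size_t i = 0; i < npixels; i++) {
        uint8_t b = src[i * 3 + 0];
        uint8_t g = src[i * 3 + 1];
        uint8_t r = src[i * 3 + 2];

        /* drop the low bits of each channel, then pack */
        dst[i] = (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
    }
}
```

You'd run this once right after loading the 24-bit file, then blit from the 16-bit buffer.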

Hope this helps

Ok, I can convert to 24-bit, but I don't know how to "convert them to 16-bit at runtime". I've seen some code on this forum... it looks very hard! And yeah, I know making the engine is the boring part of programming, and once it's done we can create the game... but now I need to modify an engine that I didn't write and only half understand... can you help me? And how do the 16-bit test programs on the CD work? They ARE 16-bit bitmaps! I modified all my code to load 16-bit bitmaps like in that example, but my bitmaps are 8-bit, so the program opens and closes by itself...

It's your lucky day... I made a little program to convert 24-bit bitmaps to 16-bit not long ago. You can get it at:
www.xpress.se/~jagi1216/bmp16.exe

"Paranoia is the belief in a hidden order behind the visible." - Anonymous

Looks like the bitmap really does convert to 16-bit, but what are 555 and 565? Well, I tried both formats and the program opens and closes by itself again.

555 and 565 are 16-bit modes.
555 = 5 bits for red, 5 bits for green and 5 bits for blue
565 = 5 bits for red, 6 bits for green and 5 bits for blue

It depends on your video card: 555 is used by older video cards and 565 by newer ones.
I think there's an article about this on GameDev.
Anyway, read about it in the DX7 SDK; there's some info about it over there.
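To see where the extra green bit goes, here's a quick sketch packing both formats (function names are mine, not from the SDK):

```c
#include <stdint.h>

/* 555: 0RRRRRGGGGGBBBBB -- 5 bits per channel, top bit unused. */
uint16_t pack555(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 10) | ((g >> 3) << 5) | (b >> 3));
}

/* 565: RRRRRGGGGGGBBBBB -- the spare bit gives green 6 bits. */
uint16_t pack565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}
```

Packing full-intensity green gives 0x03E0 in 555 and 0x07E0 in 565, which is exactly the green bit mask the video card reports for each mode.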



The road to success is always under construction

Just thought I'd throw this out there... to determine whether the video card in the computer your program is running on uses 565 or 555 format, you can do something like this:

    
DDPIXELFORMAT pixel;  // structure holding pixel format data
int color_depth;      // set to 15 for 555 format, or 16 for 565

// zero out structure
ZeroMemory(&pixel, sizeof(pixel));
pixel.dwSize = sizeof(pixel);

// get pixel format of the primary surface
lpdd_primary->GetPixelFormat(&pixel);

// dwGBitMask will be 0x07E0 for 565, or 0x03E0 for 555
if (pixel.dwGBitMask == 0x07E0)
    color_depth = 16;
else
    color_depth = 15;


The value dwGBitMask is a bit mask that you logically AND with a 16-bit pixel value in order to extract the bits representing the green component of the color.
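A sketch of that extraction, written generically so the same code works whether the card reports the 565 mask (0x07E0) or the 555 mask (0x03E0) (the helper name is hypothetical):

```c
#include <stdint.h>

/* Extract the green component of a 16-bit pixel using the green bit
   mask the card reports (e.g. in dwGBitMask): AND to isolate the
   green bits, then shift them down to start at bit 0. */
uint16_t green_bits(uint16_t pixel, uint16_t gmask)
{
    uint16_t masked = pixel & gmask;   /* keep only the green bits */
    while (gmask && !(gmask & 1)) {    /* shift down past the blue bits */
        masked >>= 1;
        gmask >>= 1;
    }
    return masked;
}
```

For a white pixel this yields 63 (6 bits) with the 565 mask and 31 (5 bits) with the 555 mask, so you can also use the mask width itself to detect the format.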

-Ironblayde
 Aeon Software

Edited by - Ironblayde on August 29, 2000 9:50:13 PM
