Bitmap 16bit

Started by
8 comments, last by Dunge 23 years, 7 months ago
I use the GPDUMB engine to display 16bit bitmaps, but how can I convert 8bit bitmaps to 16bit?
Fire up Paint Shop and convert from 8bit to 16bit



Goblineye Entertainment
The road to success is always under construction
PSP has 16-color and 24-bit, but no 16-bit.
Just convert them to 24-bit using PSP or another editing tool, then convert them to 16-bit at runtime.

Or write a tool to convert them (or use TGA, it comes in many different bit depths, including 16-bit. Of course it means that you'll have to rewrite some parts of the GPDUMB engine).
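For what it's worth, here's a rough sketch of what the runtime conversion could look like, assuming a 565 target and made-up names (Convert24To16, src24, dest16) rather than anything from GPDUMB:

    // Convert a 24-bit BGR buffer to 16-bit 565 pixels.
    // src24 points at width*height*3 bytes, dest16 at width*height 16-bit pixels.
    void Convert24To16(unsigned char *src24, unsigned short *dest16,
                       int width, int height)
    {
        for (int i = 0; i < width * height; i++)
        {
            unsigned char b = src24[i * 3 + 0];  // BMP stores each pixel as B, G, R
            unsigned char g = src24[i * 3 + 1];
            unsigned char r = src24[i * 3 + 2];

            // drop the low bits of each component and pack as 5.6.5
            dest16[i] = (unsigned short)(((r >> 3) << 11) |
                                         ((g >> 2) << 5)  |
                                          (b >> 3));
        }
    }

Whether to pack as 555 or 565 depends on the video card, which comes up further down the thread.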

Hope this helps
Oops, meant 24bit
Goblineye Entertainment
The road to success is always under construction
OK, I can convert to 24-bit, but I don't know how to "convert them to 16-bit at runtime", hehe. I've seen some code on this forum... argh, it looks very hard! And yeah, I know making the engine is the boring part of programming, but once it's done we can create the game... and now I need to modify an engine that I didn't write and only half understand... argh, can you help me? Hehe. And how the hell do the 16-bit test programs on the CD work? They ARE 16-bit bitmaps! I modified all my code to use 16-bit bitmaps like that example, but my bitmaps are 8-bit, so the program opens and closes automatically...
It's your lucky day... I made a little program to convert 24-bit bitmaps to 16-bit not long ago... You can get it at:
www.xpress.se/~jagi1216/bmp16.exe

"Paranoia is the belief in a hidden order behind the visible." - Anonymous
Looks like the bitmap really converts to 16-bit, but what are 555 and 565? Well, I tried both formats and the program opens and closes automatically again.
555 and 565 are 16-bit modes.
555 = 5 bits for red, 5 bits for green and 5 bits for blue
565 = 5 bits for red, 6 bits for green and 5 bits for blue

It depends on your video card: 555 is used by older video cards and 565 by newer ones.
I think there's an article about this on GameDev.
Anyway, read about it in the DX7 SDK; there's some info about it over there.
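To make the layouts concrete, here is a small sketch of how each format packs 8-bit r, g, b values into one 16-bit word (the macro names are illustrative, not GPDUMB's):

    // 565: red in bits 11-15, green in bits 5-10, blue in bits 0-4
    #define PACK_RGB565(r, g, b) \
        ((unsigned short)((((r) >> 3) << 11) | (((g) >> 2) << 5) | ((b) >> 3)))

    // 555: five bits per component, the top bit is unused
    #define PACK_RGB555(r, g, b) \
        ((unsigned short)((((r) >> 3) << 10) | (((g) >> 3) << 5) | ((b) >> 3)))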



Goblineye Entertainment
The road to success is always under construction
Just thought I'd throw this out there... to determine whether the video card in the computer your program is running on uses 565 or 555 format, you can do something like this:

    DDPIXELFORMAT pixel;  // structure holding pixel format data
    int color_depth;      // set to 15 for 555 format, or 16 for 565

    // zero out structure
    ZeroMemory(&pixel, sizeof(pixel));
    pixel.dwSize = sizeof(pixel);

    // get pixel format
    lpdd_primary->GetPixelFormat(&pixel);

    // dwGBitMask will be 0x07E0 for 565, or 0x03E0 for 555
    if (pixel.dwGBitMask == 0x07E0)
        color_depth = 16;
    else
        color_depth = 15;


The value dwGBitMask is a bit mask that you logically AND with a 16-bit pixel value in order to extract the bits representing the green component of the color.
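For instance, a quick sketch of pulling the green component back out of a pixel with that mask (the shift of 5 holds for both formats; the scale of 2 assumes 565 and would be 3 for 555; the pixel variable here is just a stand-in):

    unsigned short pixel16 = some_16bit_pixel;                   // a pixel read from the surface
    unsigned short green6  = (pixel16 & pixel.dwGBitMask) >> 5;  // masked green bits, 0-63 in 565
    unsigned char  green8  = (unsigned char)(green6 << 2);       // scaled back up to roughly 0-255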

-Ironblayde
 Aeon Software

Edited by - Ironblayde on August 29, 2000 9:50:13 PM
"Your superior intellect is no match for our puny weapons!"

