#### Archived

This topic is now archived and is closed to further replies.

# 16-bit color and... a palette?


## Recommended Posts

I recently switched from good ol' 8-bit color (with the palette) to 16-bit RGB mode. I'm assuming that since it's in RGB mode, not palettized mode, it has no palette. If my assumption is correct, then I won't have to read the palette out of my image files, just load the pixel information? Or is there a new trick to 16-bit mode? Thanks! -Sponge99

##### Share on other sites
Your palette probably stores its colors as a 0-63 range per channel, i.e. 6 bits per channel. 16-bit color is RGB with a 5-6-5 layout (6 bits for green, 5 each for red and blue), so you may wish to convert your palette (which is 6-6-6 bits per channel) to 16-bit colors, then simply copy those colors into a 16-bit bitmap in the order determined by your palettized image.
Hope this helps/is what you wanted to know.
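The conversion described above can be sketched roughly like this (a minimal illustration, assuming 0-63 VGA-style palette entries and a 5-6-5 target; the function names are made up for this example):

```c
#include <stdint.h>

/* Convert one VGA palette entry (each channel 0-63, i.e. 6 bits)
   to a 16-bit 5-6-5 pixel: red and blue drop their lowest bit
   (6 -> 5 bits), green keeps all 6 bits. */
uint16_t vga_pal_to_565(uint8_t r6, uint8_t g6, uint8_t b6)
{
    uint16_t r = (r6 >> 1) & 0x1F;  /* 6 bits -> 5 bits */
    uint16_t g = g6 & 0x3F;         /* 6 bits kept as-is */
    uint16_t b = (b6 >> 1) & 0x1F;  /* 6 bits -> 5 bits */
    return (uint16_t)((r << 11) | (g << 5) | b);
}

/* Expand an 8-bit palettized image into a 16-bit buffer using a
   precomputed 256-entry table of converted colors. */
void expand_palettized(const uint8_t *src, uint16_t *dst, int npixels,
                       const uint16_t *pal565)
{
    for (int i = 0; i < npixels; i++)
        dst[i] = pal565[src[i]];
}
```

You would run `vga_pal_to_565` once per palette entry at load time, then the per-pixel work is just a table lookup.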

##### Share on other sites
If your source images are in 8-bit palette mode, then you will need to examine the palette at load time and write "hard" values into every pixel, making sure to convert them to 16-bit colors, like dmounty said.

If your source images are in 24-bit mode, then you will need to load the image into a temporary buffer and convert each pixel to 16-bit color in a new buffer.
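For the 24-bit case, the per-pixel conversion amounts to dropping the low bits of each 8-bit channel and packing; a small sketch, assuming R,G,B byte order and a 5-6-5 target (names are illustrative, not from any library):

```c
#include <stdint.h>

/* Pack three 8-bit channels into a 16-bit 5-6-5 pixel by truncating
   each channel to its target width. */
uint16_t pack565(uint8_t r8, uint8_t g8, uint8_t b8)
{
    return (uint16_t)(((r8 >> 3) << 11) | ((g8 >> 2) << 5) | (b8 >> 3));
}

/* Convert a 24-bit buffer to a 16-bit buffer, pixel by pixel.
   Assumes tightly packed R,G,B triples; real bitmap files may be
   stored B,G,R and row-padded, so adjust for your format. */
void convert_24_to_16(const uint8_t *rgb24, uint16_t *dst, int npixels)
{
    for (int i = 0; i < npixels; i++) {
        const uint8_t *p = rgb24 + 3 * i;
        dst[i] = pack565(p[0], p[1], p[2]);
    }
}
```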

##### Share on other sites
It's worth noting that in 16-bit color mode, the 5-6-5 format isn't guaranteed. 5-5-5 is possible too, so make sure to actually check what the system has before assuming.
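One way to avoid hard-coding either layout is to derive the shift and bit width of each channel from the masks the driver reports (in DirectDraw these live in the `DDPIXELFORMAT` structure as `dwRBitMask` and friends). The helper below is generic C, not a DirectDraw call, and works for both 5-5-5 and 5-6-5:

```c
#include <stdint.h>

/* Given a contiguous channel bit mask (e.g. 0xF800 for red in 5-6-5),
   compute how far the channel is shifted and how many bits wide it is. */
void mask_to_layout(uint32_t mask, int *shift, int *bits)
{
    *shift = 0;
    *bits = 0;
    if (mask == 0)
        return;
    while (!(mask & 1)) { mask >>= 1; (*shift)++; }  /* count trailing zeros */
    while (mask & 1)    { mask >>= 1; (*bits)++;  }  /* count set bits */
}
```

For example, the red mask 0xF800 yields shift 11, width 5, while the 5-5-5 red mask 0x7C00 yields shift 10, width 5.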

##### Share on other sites
If your images are already in 24-bit mode, then why don't you just use 24- or 32-bit color modes? It makes the whole process a lot shorter.

##### Share on other sites
Yes, I'd choose a 32-bit mode; it is simpler. However, refrain from using the 24-bit mode, because not many graphics cards support it.

##### Share on other sites
Assuming you're using DirectDraw, working in 16-bit color isn't difficult at all. My code for writing directly to the surface in 16- and 32-bit modes is almost identical, same with loading bitmaps manually. If you're having trouble with 16-bit color, look through the DirectDraw surface-locking sample in the SDK. For your own purposes, you'll want to use something faster than the equation they use for writing, though. Personally, here's what I've been doing:

  wRed   = (WORD)((ColorValue >> dwBands[0]) << dwBitShift[0]);
  wGreen = (WORD)((ColorValue >> dwBands[1]) << dwBitShift[1]);
  wBlue  = (WORD)((ColorValue >> dwBands[2]) << dwBitShift[2]);
  *(WORD *)(pSurfaceMem + SurfaceOffset) = (WORD)(wRed | wGreen | wBlue);

I'm not saying this is the fastest or greatest way, especially since I more or less came up with it myself, but it seems to work. The great thing about this is that ColorValue, regardless of the RGB mode you're in, is based on 256 shades, which makes loading 24-bit bitmaps into surfaces with the above code really simple. Now to explain it:

ColorValue: shade of a color channel, based on 256 shades.
dwBands: how many bits you have to shift a 256-shade value right so it fits within the bits you are given to work with. This value is precomputed for speed, using the following algorithm to generate it.

  for( i = 0; i < 3; i++ )
  {
      dwBands[i] = 256 / pow(2, dwBits[i]);
      tmp = dwBands[i];
      dwBands[i] = 0;
      while( tmp != 1 )
      {
          tmp = tmp >> 1;
          dwBands[i]++;
      }
  }

Note that dwBits is how many bits are allocated for that color in a pixel. In 16-bit mode this will be either 5 for each color, or 5 for red, 6 for green, and 5 for blue.

dwBitShift: how far you have to shift a color value to get it to its proper place in the WORD or DWORD for the pixel.

The rest is pretty self-explanatory. Anyone have comments about this technique? It seems reasonably fast for plotting pixels one by one, but as I said before, I figured it out on my own, so it is questionable.
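Putting the fragments from this post together, a self-contained sketch might look like the following (assuming a 5-6-5 surface; the array names mirror the post, the function names are made up here). Note that dwBands[i] works out to 8 - dwBits[i], the number of low bits to drop from an 8-bit shade:

```c
#include <stdint.h>

static uint32_t dwBits[3]     = { 5, 6, 5 };   /* red, green, blue widths */
static uint32_t dwBitShift[3] = { 11, 5, 0 };  /* channel positions */
static uint32_t dwBands[3];                    /* right-shift per channel */

/* Precompute dwBands: 256 / 2^bits gives the divisor, and the while
   loop takes its log2, yielding the shift count 8 - bits. */
void init_bands(void)
{
    for (int i = 0; i < 3; i++) {
        uint32_t tmp = 256 >> dwBits[i];  /* same as 256 / pow(2, bits) */
        dwBands[i] = 0;
        while (tmp != 1) { tmp >>= 1; dwBands[i]++; }
    }
}

/* Pack three 256-shade channel values into one 16-bit pixel, exactly
   as the wRed/wGreen/wBlue lines above do. */
uint16_t plot_color(uint8_t r, uint8_t g, uint8_t b)
{
    uint16_t wRed   = (uint16_t)((r >> dwBands[0]) << dwBitShift[0]);
    uint16_t wGreen = (uint16_t)((g >> dwBands[1]) << dwBitShift[1]);
    uint16_t wBlue  = (uint16_t)((b >> dwBands[2]) << dwBitShift[2]);
    return (uint16_t)(wRed | wGreen | wBlue);
}
```

For a 5-5-5 surface you would change dwBits to { 5, 5, 5 } and dwBitShift to { 10, 5, 0 }, ideally derived from the surface's reported bit masks rather than hard-coded.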

Edited by - LordElectro on July 30, 2001 12:36:18 PM


##### Share on other sites
In response to parklife, there are some adapters which don't support 32 bpp. Therefore, I'd suggest that you have your app try 32 bpp first. If that fails, try 24 bpp. If that fails, try 16 bpp. If that fails, then your app fails (I can't think of a single case where this would happen, unless you tried it on some old VGA card).

##### Share on other sites
Of course you could try to set each mode until one succeeds, but I think the best way to do this is to get the current desktop's color depth. If it's 8-bit, quit with an "Unsupported graphics mode" error. If it's higher than 8-bit, set your bit depth to the desktop's bit depth. This guarantees that the mode is supported, and you don't have to try to create the surface four times before knowing.

##### Share on other sites
Programs that check your desktop's depth are annoying. What I run my desktop at shouldn't matter to the program. A good program will give the user a choice.