16-bit color and... a palette?


I recently switched from good ol' 8-bit color (with the palette) to 16-bit RGB mode. I'm assuming that since it's in RGB mode, not palettized mode, it has no palette. If my assumption is correct, then I won't have to read the palette out of my image files, just load the pixel information? Or is there a new trick to 16-bit mode? Thanks! -Sponge99

Your palette probably has its colors stored in a 0-63 range per channel... that's 6 bits per channel. 16-bit color is RGB with a 5-6-5 layout (the 6 bits are for green, 5 each for red and blue)... so you may wish to convert your palette (which is 6-6-6 bits per channel) to one 16-bit value per color, then simply copy the colors from that converted palette into a 16-bit bitmap, in the order determined by your palettized image.
Hope this helps/is what you wanted to know.
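A minimal sketch of that conversion step (the VGA-DAC-style 0-63 channel values and the helper name are my assumptions, not from the post above):

```c
#include <stdint.h>

/* Pack one 6-6-6 palette entry (0-63 per channel) into a 16-bit
   5-6-5 pixel: drop the low bit of red and blue, keep all 6 bits
   of green. */
uint16_t pal666_to_565(uint8_t r6, uint8_t g6, uint8_t b6)
{
    uint16_t r5 = r6 >> 1;   /* 6 bits -> 5 bits */
    uint16_t g  = g6;        /* 6 bits fit as-is */
    uint16_t b5 = b6 >> 1;   /* 6 bits -> 5 bits */
    return (uint16_t)((r5 << 11) | (g << 5) | b5);
}
```

You'd run each of the 256 palette entries through this once at load time, giving a lookup table of ready-made 16-bit pixels.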

If your source images are in 8-bit palette mode, then you will need to examine the palette at start-of-load, and insert "hard" values into every pixel, making sure to convert them to 16-bit colours, like dmounty said.

If your source images are in 24-bit mode, then you will need to load the image into a temporary buffer, and convert each pixel to 16-bit colour in a new buffer.
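Both cases boil down to a per-pixel loop; here's a rough sketch (the buffer layouts, R-G-B byte order, and function names are assumptions for illustration):

```c
#include <stdint.h>
#include <stddef.h>

/* Case 1: 8-bit paletted source. pal565[] is the image's palette
   already converted to 16-bit 5-6-5 entries; each source byte is
   just an index into it. */
void expand_paletted(const uint8_t *src, uint16_t *dst, size_t pixels,
                     const uint16_t pal565[256])
{
    for (size_t i = 0; i < pixels; i++)
        dst[i] = pal565[src[i]];
}

/* Case 2: 24-bit source, 8 bits per channel. Keep the top
   5-6-5 bits of each channel. */
void convert_24_to_565(const uint8_t *src, uint16_t *dst, size_t pixels)
{
    for (size_t i = 0; i < pixels; i++) {
        uint16_t r = src[i * 3 + 0] >> 3;   /* 8 bits -> 5 */
        uint16_t g = src[i * 3 + 1] >> 2;   /* 8 bits -> 6 */
        uint16_t b = src[i * 3 + 2] >> 3;   /* 8 bits -> 5 */
        dst[i] = (uint16_t)((r << 11) | (g << 5) | b);
    }
}
```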

If your images are already in 24-bit mode, then why don't you just use a 24- or 32-bit color mode? It makes the whole process a lot shorter.

Yes, I'd choose a 32-bit mode; it's simpler. However, refrain from using 24-bit mode, because not many graphics cards support it.

Assuming you're using DirectDraw, working in 16-bit color isn't difficult at all. My code for writing directly to the surface in 16- and 32-bit modes is almost identical, same with loading bitmaps manually. If you're having trouble with 16-bit color, look through the DirectDraw surface-locking sample in the SDK. For your own purposes you'll want to use something faster than the equation they use for writing, though. Personally, here's what I've been doing:

wRed   = (ColorValue >> dwBands[0]) << dwBitShift[0];
wGreen = (ColorValue >> dwBands[1]) << dwBitShift[1];
wBlue  = (ColorValue >> dwBands[2]) << dwBitShift[2];
*(WORD *)(pSurfaceMem + SurfaceOffset) = (WORD)( wRed | wGreen | wBlue );



I'm not saying this is the fastest or greatest way, especially since I more or less came up with it myself, but it seems to work. The great thing about it is that ColorValue, regardless of the RGB mode you're in, is based on 256 shades, which makes loading 24-bit bitmaps into surfaces with the above code really simple. Now to explain it:

ColorValue: shade of a color channel, based on 256 shades.
dwBands: how much you have to divide 256 by in order to fit within the bits you are given to work with, stored as a shift count. This value is precomputed for speed, using the following algorithm:

  

for( i=0; i<3; i++ )
{
    dwBands[i] = 256 / pow(2, dwBits[i]);
    tmp = dwBands[i];
    dwBands[i] = 0;

    while( tmp != 1 )
    {
        tmp = tmp >> 1;
        dwBands[i]++;
    }
}

Note that dwBits is how many bits are allocated for that color channel in a pixel. In 16-bit mode this will be either 5 for each color (5-5-5), or 5 for red, 6 for green, and 5 for blue (5-6-5).


dwBitShift: how far you have to shift a color value to get it to its proper place in the WORD or DWORD for the pixel.
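In DirectDraw both tables can be derived from the surface's channel masks (DDPIXELFORMAT's dwRBitMask and friends). Here's one way that scan might look; the function name is made up, and dwBands above would then just be 8 minus the bit count:

```c
#include <stdint.h>

/* Given a channel bit mask (e.g. 0xF800 for red in 5-6-5), find the
   shift to the channel's position (dwBitShift) and the number of
   bits it occupies (dwBits). */
void mask_to_shift_bits(uint32_t mask, int *shift, int *bits)
{
    *shift = 0;
    *bits = 0;
    while (mask && !(mask & 1)) {   /* count trailing zero bits */
        mask >>= 1;
        (*shift)++;
    }
    while (mask & 1) {              /* count the set bits of the channel */
        mask >>= 1;
        (*bits)++;
    }
}
```

Doing it from the masks means the same loader works unchanged in 5-5-5, 5-6-5, and 32-bit modes.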

The rest is pretty self-explanatory. Anyone have comments about this technique? It seems reasonably fast for plotting pixels one by one, but as I said before, I figured it out on my own, so it is questionable.


Edited by - LordElectro on July 30, 2001 12:36:18 PM


In response to parklife, there are some adapters which don't support 32 bpp. Therefore, I'd suggest that you have your app try 32 bpp first. If that fails, then try 24 bpp. If that fails, try 16 bpp. If that fails, then your app fails (I can't think of a single case where this would happen, unless you tried it on some old VGA card).

Of course you could try to set each mode until failure, but I think the best way to do this is to get the current desktop's color depth. If it's 8-bit, quit out with an "Unsupported graphics mode" error. If it's higher than 8-bit, set your bit depth to the desktop's bit depth. This guarantees that the mode is supported, and you don't have to try to create the surface four times before knowing.
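That selection rule is tiny once you have the desktop depth (on Win32 that would come from something like GetDeviceCaps(hdc, BITSPIXEL)); the function name here is made up for illustration:

```c
/* Pick the bit depth to run at, given the desktop's current depth.
   Returns 0 for "unsupported graphics mode" (palettized desktops). */
int pick_bit_depth(int desktop_bpp)
{
    if (desktop_bpp <= 8)
        return 0;            /* 8 bpp or lower: bail out with an error */
    return desktop_bpp;      /* guaranteed-supported: just match it */
}
```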

If I'm not mistaken, we're talking about display modes here.

Checking each one is the best method. Certainly you can just use the current desktop color depth. The problem with that is that the app might not be using the best possible color depth for the mode it'll use. For example, the desktop resolution might be 1024x768 at 16 bpp, and it might be at 16 bpp because the adapter can't handle any higher at that resolution (or maybe the user just wants it that way). So, let's say our app is going to use 640x480. Chances are that 24 or 32 bpp are perfectly valid. So why limit the app to 16 bpp?

Check it at runtime using something like this (this is the Win32 way to change display modes):
    
DEVMODE dm;

...
dm.dmBitsPerPel = 32; // we'll start by trying 32 bpp; NOTE: you can replace this with some user-selected value so your app tries what the user specifies, and everything below that

while(dm.dmBitsPerPel >= 16)
{
    // try changing the display mode
    if(ChangeDisplaySettings(&dm, CDS_FULLSCREEN) != DISP_CHANGE_SUCCESSFUL)
    {
        if(dm.dmBitsPerPel >= 32)       // if it failed with 32 bpp...
            dm.dmBitsPerPel = 24;       // ...try 24
        else if(dm.dmBitsPerPel >= 24)  // if it failed with 24 bpp...
            dm.dmBitsPerPel = 16;       // ...try 16
        else                            // if it failed at 16 (or whatever else)...
            return false;               // ...we can't do it
    }
    else        // if the display mode change succeeded...
        break;  // ...we're done trying
}

I put this code here to show how trivially simple it is to check at runtime which mode can be used.

Edited by - merlin9x9 on July 31, 2001 2:52:36 PM
