24-bit -> 16-bit conversion


You can load it with GDI, and use GDI to blit it to the surface.
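
Something along these lines works as a rough sketch, assuming the image comes from a .BMP file on disk and the DirectDraw surface already exists (the function and variable names here are my own):

code:

#include <windows.h>
#include <ddraw.h>

//Rough sketch: load a .BMP with GDI and let GDI convert it to the
//surface's format during the blit. Assumes "filename" is a 24-bit .BMP
//on disk and "lpSurface" is a valid, already-created surface.
BOOL BlitBMPToSurface (const char *filename, LPDIRECTDRAWSURFACE lpSurface)
{
    HBITMAP hbm = (HBITMAP)LoadImage(NULL, filename, IMAGE_BITMAP,
                                     0, 0, LR_LOADFROMFILE | LR_CREATEDIBSECTION);
    if (!hbm)
        return FALSE;

    BITMAP bm;
    GetObject(hbm, sizeof(bm), &bm);

    HDC hdcImage = CreateCompatibleDC(NULL);
    HGDIOBJ hOld = SelectObject(hdcImage, hbm);

    HDC  hdcSurface;
    BOOL ok = FALSE;
    if (lpSurface->GetDC(&hdcSurface) == DD_OK)   //GDI DC on the surface
    {
        //GDI converts from the bitmap's depth to the surface's depth here
        ok = BitBlt(hdcSurface, 0, 0, bm.bmWidth, bm.bmHeight,
                    hdcImage, 0, 0, SRCCOPY);
        lpSurface->ReleaseDC(hdcSurface);
    }

    SelectObject(hdcImage, hOld);
    DeleteDC(hdcImage);
    DeleteObject(hbm);
    return ok;
}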

Well, here's the thing:

First, I have never been too fond of GDI.

But more importantly, the bitmap data itself is going to be in memory, and not necessarily from a file. I see no way of using GDI's LoadImage() function to grab the bitmap data from memory, only from a resource or a file.

So, I need a way to convert from 24-bit RGB source images - because I like true color - to a 16-bit format (presumably the primary display adaptor's 16-bit mode) in order to allow 16-bit display modes to be used with my game.

- Splat

quote:
Also, is it possible to just drop the source bitmap into a temporary surface at 24-bits, then Blit it to a 16-bit surface, and let DirectDraw handle the conversion?

I have written this up a couple of times before, so oh well, here it comes again.

code:

#define IMAGEDEPTH_IN_BYTES 3   //24-bit source image
#define COLORDEPTH_IN_BYTES 2   //16-bit destination surface

BOOL CopyBMPDataTo1555Surface (BITMAPINFOHEADER *BMPh, unsigned char *pdata, LPDIRECTDRAWSURFACE surface)
{
    DDSURFACEDESC ddsd;

    //Get a valid pointer to the texture surface.
    memset (&ddsd, 0, sizeof (ddsd));
    ddsd.dwSize = sizeof (ddsd);

    if (surface->Lock(NULL, &ddsd, DDLOCK_WAIT, NULL) == DD_OK)
    {
        //Copy the 888 .BMP image to the 1555 DirectDraw texture surface in
        //video memory, converting as we go. BMP data is stored bottom to top
        //(i.e., left to right along a row, but the rows are stored bottom to
        //top in memory). For applications that support multiple texture
        //formats, this code fragment can be broken out into a separate
        //function, then copied and modified for each supported texture format.
        unsigned char r, g, b;
        unsigned char *oldpdata;
        unsigned short color1555;
        unsigned short *texSurfBase;
        int x, y;
        int skip;               //actual vs. asked-for surface width, in pixels
        int imageWidthInBytes;

        oldpdata = pdata;
        texSurfBase = (unsigned short *)ddsd.lpSurface;
        imageWidthInBytes = BMPh->biWidth*IMAGEDEPTH_IN_BYTES;

        //lPitch is in bytes; texSurfBase steps in 16-bit pixels
        skip = ddsd.lPitch/COLORDEPTH_IN_BYTES - BMPh->biWidth;

        //start at the beginning of the last row of the image
        pdata += BMPh->biHeight*imageWidthInBytes - imageWidthInBytes;

        for (y = 0; y < BMPh->biHeight; y++)
        {
            for (x = 0; x < BMPh->biWidth; x++)
            {
                b = *pdata++;
                g = *pdata++;
                r = *pdata++;

                color1555 = (unsigned short)( ((r >> 3) << 10) |
                                              ((g >> 3) << 5)  |
                                               (b >> 3) );
                *texSurfBase++ = color1555;
            }

            //go to the next row of the texture surface
            texSurfBase += skip;

            //go to the start of the previous row (i.e., go back two rows'
            //worth of bytes: the one just completed, and the one we want
            //to get to the start of)
            pdata -= imageWidthInBytes*2;
        }

        pdata = oldpdata;       //restore pdata's original address

        surface->Unlock (ddsd.lpSurface);

        return TRUE;
    }
    else
    {
        //hWnd is assumed to be the application's window handle
        MessageBox(hWnd, "Surface Lock failed", "yeah whatnow?", MB_OK);
        return FALSE;
    }
}


Of course, you need a pointer to the place where the BGR data begins... then this'll do the rest.
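
If the whole .BMP file is already sitting in memory, getting that pointer is just a matter of reading bfOffBits out of the file header. Sketch only; "fileData" is assumed to be the raw file image:

code:

//Sketch: wire the function above up to a .BMP file image that has already
//been read into memory. The BITMAPINFOHEADER sits right after the
//BITMAPFILEHEADER, and bfOffBits is the offset of the pixel data from the
//start of the file.
BOOL LoadBMPFromMemory (unsigned char *fileData, LPDIRECTDRAWSURFACE surface)
{
    BITMAPFILEHEADER *bfh = (BITMAPFILEHEADER *)fileData;
    BITMAPINFOHEADER *bih = (BITMAPINFOHEADER *)(fileData + sizeof(BITMAPFILEHEADER));
    unsigned char *pixels = fileData + bfh->bfOffBits;

    return CopyBMPDataTo1555Surface (bih, pixels, surface);
}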

------------------
Dance with me......

That code snippet is good, but what I want to avoid is the plain 3-bit shift, which costs image quality.

Maybe I should rephrase my question: if I had unlimited processing time, how would I convert a 24-bit bitmap to a 16-bit bitmap that looks as good as 16-bit can get?
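
For illustration, the kind of thing I have in mind is error diffusion along the lines of Floyd-Steinberg: quantize each pixel, then push its error onto the neighbours that haven't been converted yet, so banding turns into fine noise. A rough, untested sketch for a 565 target (the function name and the packed, top-down BGR layout are my own assumptions):

code:

#include <stdlib.h>
#include <string.h>

static int Clamp255 (int v) { return v < 0 ? 0 : (v > 255 ? 255 : v); }

//Rough sketch of Floyd-Steinberg error diffusion from packed 24-bit BGR
//(top-down, no row padding) to 565. Illustration only, not tested.
void DiffuseTo565 (const unsigned char *src, unsigned short *dst,
                   int width, int height)
{
    //two rows of per-channel error, (width + 2) entries each so the
    //x-1 / x+1 taps never run off the ends; stored as error*weight
    int *err = (int *)calloc(2 * (width + 2) * 3, sizeof(int));
    int *cur = err;                       //errors for this row
    int *nxt = err + (width + 2) * 3;     //errors for the row below

    for (int y = 0; y < height; y++)
    {
        memset(nxt, 0, (width + 2) * 3 * sizeof(int));

        for (int x = 0; x < width; x++)
        {
            int q[3];

            for (int i = 0; i < 3; i++)   //b, g, r
            {
                //add the error diffused onto this pixel (weights sum to 16)
                int c = Clamp255(src[(y * width + x) * 3 + i]
                                 + cur[(x + 1) * 3 + i] / 16);

                //quantize to 5 bits (6 for green), then expand back to
                //8 bits to measure the error actually made
                int bits = (i == 1) ? 6 : 5;
                q[i] = c >> (8 - bits);
                int back = (q[i] << (8 - bits)) | (q[i] >> (2 * bits - 8));
                int e = c - back;

                //Floyd-Steinberg weights: 7/16 right, 3/16 below-left,
                //5/16 below, 1/16 below-right
                cur[(x + 2) * 3 + i] += e * 7;
                nxt[(x + 0) * 3 + i] += e * 3;
                nxt[(x + 1) * 3 + i] += e * 5;
                nxt[(x + 2) * 3 + i] += e * 1;
            }

            dst[y * width + x] = (unsigned short)((q[2] << 11) | (q[1] << 5) | q[0]);
        }

        { int *t = cur; cur = nxt; nxt = t; }   //roll the error rows down
    }

    free(err);
}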

- Splat

You CAN make a bitmap from memory; it's called "CreateBitmapIndirect".
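
Something like this, as a rough sketch ("pixels", "width" and "height" are whatever you already have; rows are assumed to be 24bpp and WORD-aligned):

code:

#include <windows.h>

//Sketch only: wrap 24-bit pixel data that is already sitting in memory
//in a BITMAP structure and hand it to CreateBitmapIndirect().
HBITMAP MakeBitmapFromMemory (void *pixels, int width, int height)
{
    BITMAP bm;

    ZeroMemory(&bm, sizeof(bm));
    bm.bmType       = 0;
    bm.bmWidth      = width;
    bm.bmHeight     = height;
    bm.bmWidthBytes = (width * 3 + 1) & ~1;   //24bpp rows, WORD aligned
    bm.bmPlanes     = 1;
    bm.bmBitsPixel  = 24;
    bm.bmBits       = pixels;

    return CreateBitmapIndirect(&bm);
}


Then select the HBITMAP into a memory DC and BitBlt it to a DC you get from the 16-bit surface with GetDC(); GDI does the depth conversion. If CreateBitmapIndirect turns out to be picky about color formats on your card, CreateDIBSection plus a memcpy of your bits is the other route.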

Guest Anonymous Poster
Nice Gamasutra article, but where's the source code?

I was thinking: what is the best way to convert a 24-bit source bitmap into a 16-bit bitmap (either 555 or 565) inside your game? I have heard of and seen banding problems from simply lopping off the least significant 3 bits of each color component. Is there a dithering algorithm that can handle this without losing image quality?

Also, is it possible to just drop the source bitmap into a temporary surface at 24-bits, then Blit it to a 16-bit surface, and let DirectDraw handle the conversion?
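
By the second idea I mean something like this (pure guesswork on my part, untested): wrap the 24-bit data in a system-memory surface with an explicit pixel format and let Blt() do the conversion.

code:

#include <string.h>
#include <ddraw.h>

//Sketch only: create a 24-bit system-memory surface, copy the packed BGR
//image into it row by row, then Blt it to the 16-bit target and let
//DirectDraw convert (assuming the driver/HEL supports that source format).
BOOL BlitViaTempSurface (LPDIRECTDRAW lpDD, unsigned char *src,
                         int width, int height,
                         LPDIRECTDRAWSURFACE target)
{
    DDSURFACEDESC ddsd;
    LPDIRECTDRAWSURFACE temp;

    memset(&ddsd, 0, sizeof(ddsd));
    ddsd.dwSize   = sizeof(ddsd);
    ddsd.dwFlags  = DDSD_CAPS | DDSD_WIDTH | DDSD_HEIGHT | DDSD_PIXELFORMAT;
    ddsd.dwWidth  = width;
    ddsd.dwHeight = height;
    ddsd.ddsCaps.dwCaps = DDSCAPS_OFFSCREENPLAIN | DDSCAPS_SYSTEMMEMORY;
    ddsd.ddpfPixelFormat.dwSize        = sizeof(DDPIXELFORMAT);
    ddsd.ddpfPixelFormat.dwFlags       = DDPF_RGB;
    ddsd.ddpfPixelFormat.dwRGBBitCount = 24;
    ddsd.ddpfPixelFormat.dwRBitMask    = 0x00FF0000;
    ddsd.ddpfPixelFormat.dwGBitMask    = 0x0000FF00;
    ddsd.ddpfPixelFormat.dwBBitMask    = 0x000000FF;

    if (lpDD->CreateSurface(&ddsd, &temp, NULL) != DD_OK)
        return FALSE;

    //copy the image in row by row (lPitch may be wider than width*3)
    DDSURFACEDESC locked;
    memset(&locked, 0, sizeof(locked));
    locked.dwSize = sizeof(locked);
    if (temp->Lock(NULL, &locked, DDLOCK_WAIT, NULL) == DD_OK)
    {
        unsigned char *dst = (unsigned char *)locked.lpSurface;
        for (int y = 0; y < height; y++)
            memcpy(dst + y * locked.lPitch, src + y * width * 3, width * 3);
        temp->Unlock(locked.lpSurface);
    }

    //DirectDraw does the 24 -> 16 conversion during this blit
    HRESULT hr = target->Blt(NULL, temp, NULL, DDBLT_WAIT, NULL);
    temp->Release();
    return (hr == DD_OK);
}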

- Splat

I can't believe I missed that. Thanks, TANSTAAFL. I will try CreateBitmapIndirect first. Hopefully its results will look nice, and that will save me the trouble of writing my own code (which I may do anyway, since I want as much portability and quality in my code as possible).

- Splat
