I'm working on implementing 16-bit color in my 2D game. The only problem is that if I use this code to build a 16-bit color value,
#define _RGB16BIT(r,g,b) ((b%32) + ((g%32) << 5) + ((r%32) << 10)) // 5.5.5: green in bits 5-9
I get bad color corruption on cards that use 6 bits for the green value. The code above works fine on cards that use 5 bits for each RGB channel (like the ATI Rage Pro). So to fix it for cards that use 6 bits for green (like the TNT), the macro becomes:
#define _RGB16BIT(r,g,b) ((b%32) + ((g%32) << 6) + ((r%32) << 11)) // 5.6.5: the 5-bit green lands in the top 5 bits of the 6-bit field
The problem is that if I use the new macro on cards that only give green 5 bits, the colors will presumably get scrambled there instead. For example, pure green (0,31,0) packs to 0x03E0 under 5.5.5 but to 0x07C0 under 5.6.5, so each macro only looks right on its matching card.
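What I'd really like is to pick between the two macros at runtime. Something like this untested sketch is what I have in mind (g_is_565 is a hypothetical flag I'd set once I can actually detect the format):

#define _RGB16BIT555(r,g,b) ((b%32) + ((g%32) << 5) + ((r%32) << 10))
#define _RGB16BIT565(r,g,b) ((b%32) + ((g%32) << 6) + ((r%32) << 11))

int g_is_565 = 0; /* set once at startup after querying the card */

/* Pack a pixel using whichever layout the card reports. Inputs stay 5-bit (0..31). */
unsigned short BuildPixel16(int r, int g, int b)
{
    return (unsigned short)(g_is_565 ? _RGB16BIT565(r, g, b)
                                     : _RGB16BIT555(r, g, b));
}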
Does anybody know of a way to find out, in code, what bit layout a particular video card uses, so that I can build the correct RGB values at runtime? Some Win32/DirectX/VESA query possibly?
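On the DirectX side, I'm guessing something like IDirectDrawSurface7::GetPixelFormat on the primary surface would expose the channel masks, and counting the bits in the green mask would tell the two formats apart. An untested sketch (lpddsPrimary is assumed to be an already-created primary surface):

#include <windows.h>
#include <ddraw.h>

/* Count the set bits in a channel mask to get that channel's width. */
static int CountBits(DWORD mask)
{
    int n = 0;
    for (; mask; mask >>= 1)
        n += (int)(mask & 1);
    return n;
}

int GreenBits(LPDIRECTDRAWSURFACE7 lpddsPrimary)
{
    DDPIXELFORMAT ddpf;
    ZeroMemory(&ddpf, sizeof(ddpf));
    ddpf.dwSize = sizeof(ddpf); /* DirectDraw requires dwSize to be filled in */

    if (FAILED(lpddsPrimary->GetPixelFormat(&ddpf)))
        return 5; /* fall back to assuming 5.5.5 */

    /* dwGBitMask should be 0x03E0 for 5.5.5 and 0x07E0 for 5.6.5 */
    return CountBits(ddpf.dwGBitMask);
}

Then g_is_565 from the sketch above would just be (GreenBits(lpddsPrimary) == 6). Is that the right approach, or is there a cleaner query?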