I'm trying to load a 16-bit texture generated by the GetDIBits() function in OpenGL. The problem is it only shows in black and white... I know the code works, since I can load 24- and 32-bit textures using the same code.
So, the docs say:
The bitmap has a maximum of 2^16 colors. If the biCompression member of the BITMAPINFOHEADER is BI_RGB, the bmiColors member of BITMAPINFO is NULL. Each WORD in the bitmap array represents a single pixel. The relative intensities of red, green, and blue are represented with five bits for each color component. The value for blue is in the least significant five bits, followed by five bits each for green and red. The most significant bit is not used. The bmiColors color table is used for optimizing colors used on palette-based devices, and must contain the number of entries specified by the biClrUsed member of the BITMAPINFOHEADER.
Now, I tried to load the texture using:
//(bpp is equal to 16)
glTexImage2D(GL_TEXTURE_2D, 0, bpp, w, h, 0, GL_BGRA, GL_RGB5_A1, pTex);
glTexImage2D(GL_TEXTURE_2D, 0, bpp, w, h, 0, GL_BGRA, GL_UNSIGNED_SHORT_1_5_5_5_REV, pTex);
... and also GL_RGBA instead of GL_BGRA, and some other combinations, but all I get is a black-and-white texture every time.
Any idea what I'm doing wrong?
Note: I know I could convert to 24 bits by hand and make this work easily, but I want to use the 16-bit texture directly to avoid a conversion.
Edited by Vortez, 17 December 2013 - 09:45 PM.