Trying to load a 16-bit texture, it only shows in black and white...

4 comments, last by Vortez 10 years, 4 months ago

I'm trying to load a 16-bit texture, generated by the GetDIBits() function, into OpenGL. The problem is it only shows in black and white... I know the code works, since I can load 24- and 32-bit textures using the same code.

So, the docs say:

The bitmap has a maximum of 2^16 colors. If the biCompression member of the BITMAPINFOHEADER is BI_RGB, the bmiColors member of BITMAPINFO is NULL. Each WORD in the bitmap array represents a single pixel. The relative intensities of red, green, and blue are represented with five bits for each color component. The value for blue is in the least significant five bits, followed by five bits each for green and red. The most significant bit is not used. The bmiColors color table is used for optimizing colors used on palette-based devices, and must contain the number of entries specified by the biClrUsed member of the BITMAPINFOHEADER.

Now, I tried to load the texture using

//(bpp is equal to 16)

glTexImage2D(GL_TEXTURE_2D, 0, bpp, w, h, 0, GL_BGRA, GL_RGB5_A1, pTex);

glTexImage2D(GL_TEXTURE_2D, 0, bpp, w, h, 0, GL_BGRA, GL_UNSIGNED_SHORT_1_5_5_5_REV, pTex);

... and also GL_RGBA instead of GL_BGRA, and some other combinations, but all I get is a black and white texture every time.

Any idea what I am doing wrong?

Note: I know I could convert to 24 bits by hand and make this work easily, but I want to use the 16-bit texture directly to avoid a conversion.


To be sure the data was fine before passing it to OpenGL, I made a 16-to-24-bit converter and can confirm that the 16-bit image is fine.
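Such a converter boils down to expanding each 5-bit channel to 8 bits; a sketch (the function names are illustrative, and `(c << 3) | (c >> 2)` is the usual bit-replication that maps 0..31 onto the full 0..255 range):

```c
#include <stdint.h>
#include <stddef.h>

/* Expand a 5-bit channel value (0..31) to 8 bits (0..255). */
static uint8_t expand5(uint16_t c)
{
    return (uint8_t)((c << 3) | (c >> 2));
}

/* Convert n X1R5G5B5 pixels to packed 24-bit BGR (DIB byte order). */
static void convert16to24(const uint16_t *src, uint8_t *dst, size_t n)
{
    for (size_t i = 0; i < n; ++i) {
        uint16_t px = src[i];
        dst[i * 3 + 0] = expand5( px        & 0x1F); /* B */
        dst[i * 3 + 1] = expand5((px >> 5)  & 0x1F); /* G */
        dst[i * 3 + 2] = expand5((px >> 10) & 0x1F); /* R */
    }
}
```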

The format, in bits, as stated above is ARRRRRGGGGGBBBBB (the top bit being the unused/alpha bit).

Now all I need is to find how to load that texture format into OpenGL...

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_SHORT_5_5_5_1, data);

Hey, thanks. I was about to try changing the 3rd argument; that was the problem. The correct call is

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0, GL_BGRA, GL_UNSIGNED_SHORT_1_5_5_5_REV, pTex);

It's working fine now.

I do 99% of my work in GL ES, but in GL ES the 3rd and 7th parameters have to match. If they don't, it could cause color issues for you in normal desktop GL too.

Well, in the docs they say it could be 1, 2, 3 or 4, but those #define values are way higher, so I suppose it does the conversion internally (in the glTexImage function).

I.e. GL_RGBA is 4 components, same as GL_BGRA.

But I get your point.

