Vortez

OpenGL: Trying to load a 16-bit texture, but it only shows in black and white...


I'm trying to load a 16-bit texture generated by the GetDIBits() function into OpenGL. The problem is that it only shows in black and white... I know the code works, since I can load 24- and 32-bit textures using the same code.
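For context, the pixel data comes from a call along these lines (a sketch; hdcMem, hBitmap, w, h and pTex are my placeholder names):

// Request a 16-bit BI_RGB DIB; per the docs this yields X1R5G5B5 pixels.
BITMAPINFO bmi = {0};
bmi.bmiHeader.biSize        = sizeof(BITMAPINFOHEADER);
bmi.bmiHeader.biWidth       = w;
bmi.bmiHeader.biHeight      = h;   // positive height = bottom-up rows, which matches GL
bmi.bmiHeader.biPlanes      = 1;
bmi.bmiHeader.biBitCount    = 16;
bmi.bmiHeader.biCompression = BI_RGB;
// Each scanline is padded to a DWORD boundary, which happens to match
// OpenGL's default GL_UNPACK_ALIGNMENT of 4.
GetDIBits(hdcMem, hBitmap, 0, h, pTex, &bmi, DIB_RGB_COLORS);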

 

So, the docs say:

 

 

The bitmap has a maximum of 2^16 colors. If the biCompression member of the BITMAPINFOHEADER is BI_RGB, the bmiColors member of BITMAPINFO is NULL. Each WORD in the bitmap array represents a single pixel. The relative intensities of red, green, and blue are represented with five bits for each color component. The value for blue is in the least significant five bits, followed by five bits each for green and red. The most significant bit is not used. The bmiColors color table is used for optimizing colors used on palette-based devices, and must contain the number of entries specified by the biClrUsed member of the BITMAPINFOHEADER.
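In other words, each 16-bit WORD decodes like this (a quick sketch of the layout just quoted; px and i are placeholder names):

// X1R5G5B5: top bit unused, then 5 bits each of red, green, blue (blue lowest).
unsigned short px = ((unsigned short *)pTex)[i];
unsigned char r = (px >> 10) & 0x1F;   // 0..31
unsigned char g = (px >> 5)  & 0x1F;
unsigned char b =  px        & 0x1F;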

 

Now, I tried to load the texture using:

 

//(bpp is equal to 16)
glTexImage2D(GL_TEXTURE_2D, 0, bpp, w, h, 0, GL_BGRA, GL_RGB5_A1, pTex);
glTexImage2D(GL_TEXTURE_2D, 0, bpp, w, h, 0, GL_BGRA, GL_UNSIGNED_SHORT_1_5_5_5_REV, pTex);

 

... and also GL_RGBA instead of GL_BGRA, and some other combinations, but all I got was a black and white texture every time.

 

 

Any idea what I am doing wrong?

 

Note: I know I could convert to 24 bits by hand and make this work easily, but I want to use the 16-bit texture directly to avoid a conversion.


To be sure the data was fine before passing it to OpenGL, I made a converter from 16 to 24 bits and can confirm that the 16-bit image is fine.

 

The format, in bits, as stated above, is ARRRRRGGGGGBBBBB.
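For reference, such a converter boils down to something like this per pixel (a sketch; the function name is mine):

// Expand one A1R5G5B5 pixel to 24-bit BGR. (c << 3) | (c >> 2) spreads a
// 5-bit value evenly over the full 0..255 range (0 -> 0, 31 -> 255).
void Pixel16To24(unsigned short px, unsigned char *bgr)
{
    unsigned char r = (px >> 10) & 0x1F;
    unsigned char g = (px >> 5)  & 0x1F;
    unsigned char b =  px        & 0x1F;
    bgr[0] = (unsigned char)((b << 3) | (b >> 2));
    bgr[1] = (unsigned char)((g << 3) | (g >> 2));
    bgr[2] = (unsigned char)((r << 3) | (r >> 2));
}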

 

Now all I need is to find out how to load that texture format into OpenGL...


Hey, thanks! I was about to try changing the 3rd argument; that was the problem. The correct call is:

 

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0, GL_BGRA, GL_UNSIGNED_SHORT_1_5_5_5_REV, pTex);

 

It's working fine now.
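For completeness, the whole upload then looks something like this (a sketch; the filtering parameters are my own choice):

GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
// No mipmaps are uploaded, so the min filter must not be a mipmap mode.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
// 3rd arg: internal format (GL_RGBA, or the sized GL_RGB5_A1);
// 7th/8th args: the layout of the client data, i.e. the
// ARRRRRGGGGGBBBBB words that GetDIBits produced.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0,
             GL_BGRA, GL_UNSIGNED_SHORT_1_5_5_5_REV, pTex);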
 


I do 99% of my work in GL ES, but in GL ES the 3rd and 7th parameters have to match; it could cause color issues for you in normal desktop GL too if they don't.
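For comparison, a GL ES 2.0 upload of the same image would look like this (a sketch; core ES has no GL_BGRA format or _REV types, so the pixels would first have to be repacked into RGBA 5551 order, assumed here to be in pEsTex):

// GL ES 2.0: internal format (3rd) and format (7th) must be identical.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0,
             GL_RGBA, GL_UNSIGNED_SHORT_5_5_5_1, pEsTex);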


Well, the docs say the 3rd argument can be 1, 2, 3 or 4, but those #define values are way higher, so I suppose the conversion is done internally (in the glTexImage function).

 

i.e. GL_RGBA is 4 components, same as GL_BGRA.

 

But I get your point.
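For what it's worth, legacy desktop GL accepts both spellings of the internal format (a sketch):

// The old component count (1..4) and the symbolic name are equivalent here;
// either way the driver chooses the actual internal storage format.
glTexImage2D(GL_TEXTURE_2D, 0, 4,       w, h, 0, GL_BGRA, GL_UNSIGNED_SHORT_1_5_5_5_REV, pTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0, GL_BGRA, GL_UNSIGNED_SHORT_1_5_5_5_REV, pTex);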
