Trying to load a 16-bit texture, but it only shows in black and white...


5 replies to this topic

#1 Vortez   Crossbones+   -  Reputation: 2698


Posted 17 December 2013 - 09:41 PM

I'm trying to load a 16-bit texture generated by the GetDIBits() function into OpenGL. The problem is it only shows in black and white... I know the code works, since I can load 24- and 32-bit textures using the same code.

 

So, the docs say:

 

 

The bitmap has a maximum of 2^16 colors. If the biCompression member of the BITMAPINFOHEADER is BI_RGB, the bmiColors member of BITMAPINFO is NULL. Each WORD in the bitmap array represents a single pixel. The relative intensities of red, green, and blue are represented with five bits for each color component. The value for blue is in the least significant five bits, followed by five bits each for green and red. The most significant bit is not used. The bmiColors color table is used for optimizing colors used on palette-based devices, and must contain the number of entries specified by the biClrUsed member of the BITMAPINFOHEADER.
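For reference, here is a minimal sketch of how one of those WORDs could be unpacked on the CPU, following the layout quoted above (blue in the lowest five bits, then green, then red, top bit unused). The function and variable names are just for illustration, not from the thread:

// Expand one 16-bit BI_RGB pixel (X1R5G5B5) into 8-bit R, G, B.
void UnpackX1R5G5B5(unsigned short px, unsigned char *r, unsigned char *g, unsigned char *b)
{
    unsigned char r5 = (px >> 10) & 0x1F;  // bits 10..14 = red
    unsigned char g5 = (px >> 5)  & 0x1F;  // bits 5..9   = green
    unsigned char b5 =  px        & 0x1F;  // bits 0..4   = blue

    // Scale each 5-bit value (0..31) up to 8 bits (0..255).
    *r = (unsigned char)(r5 * 255 / 31);
    *g = (unsigned char)(g5 * 255 / 31);
    *b = (unsigned char)(b5 * 255 / 31);
}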

 

Now, I tried to load the texture using

 

//(bpp is equal to 16)
glTexImage2D(GL_TEXTURE_2D, 0, bpp, w, h, 0, GL_BGRA, GL_RGB5_A1, pTex);
glTexImage2D(GL_TEXTURE_2D, 0, bpp, w, h, 0, GL_BGRA, GL_UNSIGNED_SHORT_1_5_5_5_REV, pTex);

 

... and also GL_RGBA instead of GL_BGRA, and some other combinations, but all I got was a black and white texture every time.

 

 

Any idea what I am doing wrong?

 

Note: I know I could convert to 24 bits by hand and make this work easily, but I want to use the 16-bit texture directly to avoid a conversion.


Edited by Vortez, 17 December 2013 - 09:45 PM.



#2 Vortez   Crossbones+   -  Reputation: 2698


Posted 17 December 2013 - 11:31 PM

To be sure the data was fine before passing it to OpenGL, I just made a converter from 16 to 24 bits and can confirm that the 16-bit image is fine.

 

The format, in bits, as stated above is ARRRRRGGGGGBBBBB.

 

Now all I need is to figure out how to load that texture format into OpenGL...


Edited by Vortez, 17 December 2013 - 11:31 PM.


#3 hdxpete   Members   -  Reputation: 470


Posted 18 December 2013 - 12:06 AM

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_SHORT_5_5_5_1, data);



#4 Vortez   Crossbones+   -  Reputation: 2698


Posted 18 December 2013 - 12:32 AM

Hey, thanks, I was about to try changing the 3rd argument; that was the problem. The correct call is

 

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0, GL_BGRA, GL_UNSIGNED_SHORT_1_5_5_5_REV, pTex);

 

It's working fine now.
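For anyone reading this later, here is a minimal sketch of the whole upload around that call, assuming pTex points at the 16-bit pixels returned by GetDIBits(); the texture object and filtering choices are mine, not from the thread:

GLuint tex = 0;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);

// GetDIBits() rows are DWORD-aligned, which matches the default unpack alignment of 4.
glPixelStorei(GL_UNPACK_ALIGNMENT, 4);

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

// X1R5G5B5 data from the DIB, uploaded as BGRA with the packed 1_5_5_5_REV type.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0, GL_BGRA, GL_UNSIGNED_SHORT_1_5_5_5_REV, pTex);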
 



#5 hdxpete   Members   -  Reputation: 470


Posted 18 December 2013 - 01:05 AM

I do 99% of my work in GL ES, where the 3rd and 7th parameters have to match. On normal desktop GL it could still cause color issues for you if they don't.
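For comparison, an ES 2.0-style call would keep the 3rd and 7th arguments identical; a sketch (note that GL_BGRA and GL_UNSIGNED_SHORT_1_5_5_5_REV are not part of ES 2.0, so the DIB pixels would first have to be repacked into RGBA 5_5_5_1 order on the CPU):

// GL ES 2.0: internalformat must equal format, and the BGRA / _REV combination
// used above is not available, so the data needs repacking before this call.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0, GL_RGBA, GL_UNSIGNED_SHORT_5_5_5_1, pTex);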



#6 Vortez   Crossbones+   -  Reputation: 2698


Posted 18 December 2013 - 02:03 AM

Well, in the docs they say it could be 1, 2, 3 or 4, but those #define values are way higher, so I suppose it does the conversion internally (in the glTexImage function).

 

i.e. GL_RGBA is 4 components, same as GL_BGRA.

 

But I get your point.


Edited by Vortez, 18 December 2013 - 02:11 AM.
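To illustrate that last point: legacy desktop GL accepts a bare component count (1, 2, 3 or 4) as the internal format, while core profiles require a named enum. Both of the calls below request four components (a sketch, same arguments as the working call above):

// Legacy form: internal format given as a component count (compatibility profile only).
glTexImage2D(GL_TEXTURE_2D, 0, 4, w, h, 0, GL_BGRA, GL_UNSIGNED_SHORT_1_5_5_5_REV, pTex);

// Same request with a named enum; also valid in core profiles.
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0, GL_BGRA, GL_UNSIGNED_SHORT_1_5_5_5_REV, pTex);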







