RexHunter99

OpenGL 16 bit Texture loading


Okay, I'm new to gamedev.net, but not a n00b. I've been working on a tool that loads a model format used by a game I like. The format also contains a 16-bit TGA image as the texture. I've extracted the texture and saved it externally (and in memory), but I can't bind it as a texture in OpenGL. I can bind 24/32-bit images, so I made a few temporary ones in place of the 16-bit one, but since the model format only supports 16-bit TGA... I'm rather stuck.

I googled around for an answer and found little that helped. There were bogus examples claiming you could load the data with GL_RGB5... but they failed. My data is composed of 2-byte values in place of the standard 3 bytes (RGB). For example: the first pixel in the 24-bit TGA is 16,16,0 (RGB), whereas in the 16-bit version the bytes are 64,8.

I first tried to get OpenGL to load it directly as 16-bit, but couldn't. Next I tried converting the pixels manually and loading the data as a 24-bit image... but none of the bitshifts I tried worked (I've never been good with bits). Supposedly, the format is:

byte1 = ARRRRRGG
byte2 = GGGBBBBB

but due to lo-hi byte order (or was it hi-lo?) the second byte comes first in the file (GGGBBBBB then ARRRRRGG).

I know it's not the API itself and it's me doing something wrong... but I can't figure it out. Help would be greatly appreciated here :) If this has been answered before, please link me to it (or possibly dumb it down, or give me a solution that doesn't force me to read for a few weeks and try to learn bits, which I simply do not understand). Thanks in advance.

Try this:

glTexImage2D( GL_TEXTURE_2D, 0, GL_RGB5_A1, width, height, 0, GL_RGBA, GL_UNSIGNED_SHORT_5_5_5_1, data );

Thanks, I never expected a reply so soon :D

Sadly that didn't work. At least it does something. It loads the data fine, I can even make out the texture... under all that garbled junk.

http://img66.imageshack.us/img66/993/c3ditscreen1bp9.png

this is what the texture actually is meant to look like (This is the 24-bit texture btw):

http://img177.imageshack.us/img177/5576/c3ditscreen0mx4.png

Any idea how to fix this? I get the feeling we're close now.
We're getting somewhere at last! :) Thanks so far.

EDIT:
Sheesh, I never realized that this forum doesn't support BBCode >.> ahh well.

EDIT2: Never mind :D HTML code works instead.

[Edited by - RexHunter99 on January 16, 2009 9:28:17 PM]

Are you using C/C++? If you are, get DevIL. If you are using GCC, use reimp to convert the COFF libraries rather than compiling it yourself. This is a bit of the code I use. Also, in my experience, most 16-bit images are coded as 5-6-5 or 5-5-5-1, the 1 being a useless bit.

bool GLTexture::LoadFromImage(LPSTR strFilename)
{
    UINT nImage;
    char strBuffer[1024];
    bool bEnabled; // Used to restore the state of GL_TEXTURE_2D at the end of this function.

    ilInit();
    ilGenImages(1, &nImage);
    ilBindImage(nImage);
    if(!ilLoadImage(strFilename))
    {
        sprintf(strBuffer, "SOURCE: GLTexture::LoadFromImage()\nERROR: Could not load file %s. File likely does not exist.", strFilename);
        MessageBox(0, strBuffer, "Error! Application must terminate.", MB_ICONERROR);
        ilBindImage(0);
        ilDeleteImages(1, &nImage);
        return false;
    }
    ilConvertImage(IL_RGB, IL_UNSIGNED_BYTE);

    if(!(bEnabled = glIsEnabled(GL_TEXTURE_2D))) glEnable(GL_TEXTURE_2D);
    glGenTextures(1, &m_nTexture);
    glBindTexture(GL_TEXTURE_2D, m_nTexture);
    gluBuild2DMipmaps(GL_TEXTURE_2D, ilGetInteger(IL_IMAGE_BYTES_PER_PIXEL),
                      ilGetInteger(IL_IMAGE_WIDTH), ilGetInteger(IL_IMAGE_HEIGHT),
                      ilGetInteger(IL_IMAGE_FORMAT), GL_UNSIGNED_BYTE, ilGetData());
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glBindTexture(GL_TEXTURE_2D, 0);
    if(!bEnabled) glDisable(GL_TEXTURE_2D); // If GL_TEXTURE_2D was originally disabled, disable it again.

    // Delete the DevIL image
    ilBindImage(0);
    ilDeleteImages(1, &nImage);

    return true;
}



I'd rather not compile this with a library that I'll only use for a few texture-related functions... Not to mention DevIL doesn't work on my PC at all.

Looks like the R and B channels need to be swapped. There doesn't seem to be a GL_BGR5_1 format, so you might want to try plain old GL_BGR. If that doesn't work you'll just have to do some bitshifts.

Edit: You may also have to change GL_RGBA to GL_BGRA.

So I want to try something like this?

glTexImage2D( GL_TEXTURE_2D, 0, GL_RGB5_A1, width, height, 0, GL_BGRA, GL_UNSIGNED_SHORT_5_5_5_1, data );

I was just checking the game that these models go with (a few other people and I have its source code), and the data is loaded into a structure with the following:
WORD Texture[256*256];

(The game actually allows for non-uniform textures as long as they are 256 pixels wide, so the actual array size can be larger than that, but that's basically what it is 90% of the time.)

But since the game doesn't use D3D or Glide like most normal games do, I can't exactly work out how it loads the textures... >.>

If I was to manually bitshift them into RGB (or to whatever I need) how would I do that? I've tried to bitshift a bit today... but no luck.

EDIT:
I just checked the documentation... and thought that maybe I could try this:
GL_UNSIGNED_SHORT_5_5_5_1_REV
But doesn't that reverse the component order? e.g. RGBA becomes ABGR.

OMG I think I just figured out how to convert a WORD into 3 RGB values!
I've been toying around with bits in GameMaker 6.1... (Yeah, don't even start about how lame GM is) and I may have gotten it! (So far all my testing has come up perfectly.)

Here's the little code I use to get the RGB from a BGRA WORD
B=((WORD>>0) & 31)*8
G=((WORD>>5) & 31)*8
R=((WORD>>10) & 31)*8

I used that on a WORD value of 2112 from the 16-bit texture. Then I checked the 24-bit texture and found that the value was indeed R=16 G=16 B=0.
Then I ran that WORD into the code above:
B=((2112>>0) & 31)*8
G=((2112>>5) & 31)*8
R=((2112>>10) & 31)*8

And it returned 16,16,0 (RGB respectively)
I basically reverse-engineered that from a jumble of code in the game's source that made almost no logical sense. I think it originally bit-shifted right by 8 instead of multiplying... but this does exactly what I wanted!

Theoretically, you could use this:
A=((2112>>15) & 1)
To get the alpha... Alpha only takes up 1 bit of the 16 (so the mask is 1, not 31), and I won't need it anyway.

If I encounter any more problems, or if my method fails me, I'll come back. But for now this seems to be a case solved :D

EDIT:
I didn't double post, kittycat posted before me with some help but deleted their post (I think because I posted before they edited their post with some really helpful info)
Thanks kittycat for contributing to this :)

[Edited by - RexHunter99 on January 16, 2009 11:11:35 PM]

Okay, so far I've had no problems. But I suck with bits... can someone help me reverse the code I posted above, so it converts 3 RGB values (with alpha) into a WORD?
