mark-w

glTexImage2D with GL_RGBA


I have an image buffer with 4 components per pixel (RGBA):

RGBA_buf = new unsigned char[image_width * image_height * 4];

I'm trying to map it onto a quad with:

glTexImage2D( GL_TEXTURE_2D, 0, GL_RGBA, image_width, image_height, 0, GL_RGBA, GL_UNSIGNED_BYTE, RGBA_buf );

What am I supposed to pass as the 7th parameter? With GL_RGBA the image data appears corrupted. I can get it to display normally with GL_RGB, but I can't figure out how to get this to work with the alpha values. Any ideas?

Brother Bob

Well, assuming RGBA_buf really is a sequence of RGBA color values, with the color components in that exact order, then GL_RGBA is the constant you should pass.

How is it not working for you? What do you get? In what way is it corrupted? Are you sure the buffer is of the correct format?
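
To make that concrete, this is the memory layout that GL_RGBA with GL_UNSIGNED_BYTE assumes; a minimal sketch (the helper putPixelRGBA is just a made-up name for illustration):

// For pixel (x, y) in a tightly packed buffer, the four bytes are
// R, G, B, A, in that order, row by row.
inline void putPixelRGBA( unsigned char* buf, int width, int x, int y,
                          unsigned char r, unsigned char g,
                          unsigned char b, unsigned char a )
{
    unsigned char* p = buf + ( y * width + x ) * 4; // 4 bytes per pixel
    p[0] = r; // red first
    p[1] = g; // green
    p[2] = b; // blue
    p[3] = a; // alpha last
}

If your buffer is actually in some other order (BGRA is common on Windows), GL_RGBA will read the channels shuffled and the result will look corrupted.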

blizzard999

You have probably not set the alpha channel (the 4th byte in every pixel pack).
If you see strange colors, you probably have to swap red and blue, but that depends on how your buffer was created.
The border parameter is specified correctly, and because it's a 32 bpp image there is also no problem with row alignment.
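
If the channels really are swapped, one option is to tell OpenGL the source order instead of reshuffling the buffer yourself. A sketch, assuming your headers expose GL_BGRA (core since OpenGL 1.2; GL_BGRA_EXT on older ones) and a hypothetical BGRA-ordered buffer BGRA_buf:

// Upload a buffer whose bytes are ordered B, G, R, A per pixel.
glPixelStorei( GL_UNPACK_ALIGNMENT, 1 ); // redundant for 4-byte pixels, but harmless
glTexImage2D( GL_TEXTURE_2D, 0, GL_RGBA,
              image_width, image_height, 0,
              GL_BGRA,           // source channel order
              GL_UNSIGNED_BYTE,
              BGRA_buf );        // hypothetical BGRA-ordered buffer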

This is just a note in addition to Brother Bob's and blizzard999's posts, and not actually your problem...

It's a good idea to set the internal format (3rd param) to a sized format with the specific bit depth you want. For example, if you want a 32-bit RGBA texture with 8 bits per channel, you should use GL_RGBA8 for the internal format instead of GL_RGBA. With only a generic GL_RGBA request, some drivers may choose a lower bit depth than you want, and you may get banding or other related artifacts.
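
In code that's just a different 3rd argument; a sketch using the same call as above:

// Ask the driver for an explicit 8-bits-per-channel texture instead of
// letting it pick a bit depth on its own (e.g. RGBA4, which causes banding).
glTexImage2D( GL_TEXTURE_2D, 0, GL_RGBA8,
              image_width, image_height, 0,
              GL_RGBA, GL_UNSIGNED_BYTE, RGBA_buf );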

mark-w

I think I'm setting the alpha channel correctly...

int image_width  = 256;
int image_height = 256;

unsigned char * RGBA_buf = new unsigned char[image_width * image_height * 4];

for( int y = 0; y < image_height; y++ ) {
    for( int x = 0; x < image_width; x++ ) {
        int pixIdx = y * image_width + x;
        RGBA_buf[pixIdx * 4 + 0] = 255; // red
        RGBA_buf[pixIdx * 4 + 1] = 0;   // green
        RGBA_buf[pixIdx * 4 + 2] = 0;   // blue
        RGBA_buf[pixIdx * 4 + 3] = 255; // alpha - no transparency, opaque
    }
}

Does this look alright?

You have to tell us what's wrong, or we can't help you fix it. So far, you have only told us how you do it, not what result you get, in what way that result is not what you want, and what result you actually want.
