colorkey + blending in OpenGL

1 comment, last by Grain 17 years, 7 months ago
Greetings, I was wondering if anyone could clue me in on using alpha channel data from a bitmap with blending modes to render certain pixels 100% transparent. I've tried all different combinations, such as glBlendFunc(GL_SRC_ALPHA, GL_ONE) and glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA), but none of them seems to actually use the alpha channel data. I'd appreciate a little assistance. Thanks in advance. -jsloan
Bitmap with an alpha channel? Is that even possible?
If it is possible, how are you loading the image?

Assuming a BMP can have an alpha channel, when you call glTexImage2D you need to be sure you're loading the image with GL_RGBA as the format.

If you provide code, I can better help you.
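
For reference, the upload might look something like this (a minimal sketch; texture, width, height, and imageData are placeholder names, not from the original post):

    // Minimal sketch: upload width * height * 4 bytes of RGBA pixel data
    // as an OpenGL texture using the fixed-function API.
    GLuint texture;
    glGenTextures( 1, &texture );
    glBindTexture( GL_TEXTURE_2D, texture );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );
    glTexImage2D( GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                  GL_RGBA, GL_UNSIGNED_BYTE, imageData );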
Well, since he mentions a color key, I assume he wants to convert a specific color value to transparency.

When loading your image from file, you will have to do some work on it to get it into the form you want. First, you will have to allocate 4/3 of the memory the image normally takes up: that's width * height * 4 bytes instead of width * height * 3 bytes. Then, when reading the data, check each pixel to see if its color matches your key color. If it does, set the alpha byte for that pixel to 0; otherwise set it to 255.

It might look something like this:

    // Assumes RGB is an unsigned char array holding the RGB image data,
    // RGBA is an unsigned char array whose size is width * height * 4,
    // and SizeRGB is width * height * 3.
    for( int i = 0, j = 0; i < SizeRGB; i += 3, j += 4 )
    {
        if( RGB[i]   == keyColor[0] &&
            RGB[i+1] == keyColor[1] &&
            RGB[i+2] == keyColor[2] )
        {
            RGBA[j+3] = 0;    // key color: fully transparent
        }
        else
        {
            RGBA[j+3] = 255;  // everything else: fully opaque
        }
        RGBA[j]   = RGB[i];
        RGBA[j+1] = RGB[i+1];
        RGBA[j+2] = RGB[i+2];
    }
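
Once the alpha channel is in place, you also need the right render state at draw time. A sketch of the two usual approaches (standard fixed-function calls, not taken from the poster's code):

    // Option 1: alpha blending -- keyed texels are drawn fully transparent.
    glEnable( GL_BLEND );
    glBlendFunc( GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA );

    // Option 2: alpha testing -- keyed texels are discarded outright,
    // which sidesteps depth-sorting problems with transparent fragments.
    glEnable( GL_ALPHA_TEST );
    glAlphaFunc( GL_GREATER, 0.5f );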

