
Utilizing the alpha channel on an image with OpenGL


Old topic!
Guest, the last post of this topic is over 60 days old and at this point you may not reply in this topic. If you wish to continue this conversation start a new topic.

16 replies to this topic

#1 The Magical Pot   Members   -  Reputation: 178


Posted 17 July 2013 - 12:17 PM

Hi!

 

I'm making a game using SDL and OpenGL and I've been thinking about how I should handle transparency in images. I don't think color keying is a good option, so the method I'm going to use is the alpha channel on images. If I've understood it right, you have to tell OpenGL how to interpret the alpha channel, so what I want to do is tell it that the alpha channel represents transparency. When I try to use the alpha channel for transparency, every pixel that doesn't have 100% opacity is given the color FFFFFF.

 

My question is simply: how do I set the alpha channel to transparency with OpenGL?




#2 FLeBlanc   Crossbones+   -  Reputation: 3085


Posted 17 July 2013 - 01:17 PM

glBlendFunc

#3 The Magical Pot   Members   -  Reputation: 178


Posted 17 July 2013 - 02:22 PM

Thanks for the link :) At the start of my application I enable blending with glEnable( GL_BLEND ) and then call glBlendFunc( GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA ), but every texture I draw fails to show up on the screen even though all of them are completely opaque. I tried all of the argument combinations provided under "Examples", but I get the same issue. The equations are a bit cryptic to me, so I'm having trouble solving this on my own. Any suggestions?
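[Editor's note: the equation glBlendFunc configures is just a per-channel weighted sum. A minimal sketch in plain C++ (no OpenGL needed) of what GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA computes; the function name is illustrative, not part of any API:]

```cpp
#include <cassert>
#include <cmath>

// What glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) computes per
// channel: out = src * srcAlpha + dst * (1 - srcAlpha).
// Channels are floats in [0, 1]; this sketches the math, not OpenGL's
// exact fixed-point rounding.
float blendChannel(float src, float dst, float srcAlpha)
{
    return src * srcAlpha + dst * (1.0f - srcAlpha);
}
```

With srcAlpha = 1 the source fully replaces the destination, and with srcAlpha = 0 the destination shows through untouched, which is why an all-zero alpha channel makes textures vanish entirely.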



#4 FLeBlanc   Crossbones+   -  Reputation: 3085


Posted 17 July 2013 - 02:57 PM

In order for us to help with your problem, you need to provide specifics: post the relevant code (setup, rendering, etc.) and perhaps a screenshot of the actual result on an image-sharing site such as imgur.com, along with a description of what you expect to happen so we can compare it with what is actually happening. As you might imagine, a single sentence like "every texture I draw doesn't show up on the screen" isn't much to debug from.



#5 MarekKnows.com   Members   -  Reputation: 446


Posted 19 July 2013 - 07:30 AM

Thanks for the link :) At the start of my application I enable blending with glEnable( GL_BLEND ) and then call glBlendFunc( GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA ), but every texture I draw fails to show up on the screen even though all of them are completely opaque. I tried all of the argument combinations provided under "Examples", but I get the same issue. The equations are a bit cryptic to me, so I'm having trouble solving this on my own. Any suggestions?

 

Did your texture show on the screen before you enabled GL_BLEND? Post some source code for us to see.


---
Free C++, OpenGL, and Game Development Video Tutorials @
www.MarekKnows.com
Play my free games: Ghost Toast, Zing, Jewel Thief


#6 The Magical Pot   Members   -  Reputation: 178


Posted 22 July 2013 - 04:14 AM

I use SDL for everything except for rendering, which I use OpenGL for. I initialize OpenGL with this function (this is with blending enabled):

bool initGL()
{
    //Set the viewport
    glViewport( 0, 0, SCREEN_WIDTH, SCREEN_HEIGHT );

    //Initialize Projection Matrix
    glMatrixMode( GL_PROJECTION );
    glLoadIdentity();
    glOrtho( 0.0, SCREEN_WIDTH, SCREEN_HEIGHT, 0.0, 1.0, -1.0 );

    //Initialize Modelview Matrix
    glMatrixMode( GL_MODELVIEW );
    glLoadIdentity();

    //Initialize clear color
    glClearColor( 0.f, 0.f, 0.f, 1.f );

    //Enable texturing
    glEnable( GL_TEXTURE_2D );

    //Enable blending
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    glEnable( GL_BLEND );

    //Check for error
    GLenum error = glGetError();
    if( error != GL_NO_ERROR )
    {
        printf( "Error initializing OpenGL!\n" );
        return false;
    }

    return true;
}

The image in this post shows what the application renders without blending enabled on the left side; the right side is what it renders with blending enabled. The blue window within the application is built from pieces of textures for the frame, and the center is a colored quad. So when I render textures, they're not shown, but when I render geometric shapes without textures, they are shown.

 

The code for rendering that blue window is quite large, so I'm just going to show the rendering of one piece of texture and the rendering of a non-textured quad:

 

Non-texture quad rendering:

        /******Window fill******/

        glLoadIdentity();            //Remove any previous transformations
        glDisable( GL_TEXTURE_2D );  //Temporarily disable textures so that the colors won't get messed up

        glTranslatef( box.x + ThemeTemplate->quadWidth, box.y + ThemeTemplate->quadHeight, 0.f ); //Move to rendering point

        //Render the window fill
        glColor3ub( 0, 17, 34 );
        glBegin( GL_QUADS );
            glVertex2f( 0.f, 0.f );
            glVertex2f( box.w - ( ThemeTemplate->quadWidth * 2 ), 0.f );
            glVertex2f( box.w - ( ThemeTemplate->quadWidth * 2 ), box.h - ( ThemeTemplate->quadHeight * 2 ) );
            glVertex2f( 0.f, box.h - ( ThemeTemplate->quadHeight * 2 ) );
        glEnd();

        glEnable( GL_TEXTURE_2D );   //Re-enable textures
        glColor3f( 1.f, 1.f, 1.f );  //Reset the color

        /******------End------******/

I basically just render a quad at a preset location with the color R0 G17 B34 and then reset the color to FFFFFF so that the next thing rendered won't have its colors tinted by this quad's color.

 

Texture rendering:

/******Top edge******/

glLoadIdentity();		//Remove any previous transformations

//Texture coordinates
ThemeTemplate->texLeft = 16.f / TexWidth;
ThemeTemplate->texRight = 18.f / TexWidth;
ThemeTemplate->texTop = 0.f;
ThemeTemplate->texBottom = 8.f / TexHeight;

//Set translation point
TranslateX = box.x + ThemeTemplate->quadWidth;
TranslateY = box.y;

glTranslatef( TranslateX, TranslateY, 0.f );		//Move to rendering point

//Fill the whole space
for ( int positionX = 0; positionX < ( box.w - ( ThemeTemplate->quadWidth * 2 ) ); positionX += 2 )
{
	//Render textured quad
	glBegin( GL_QUADS );
	    glTexCoord2f( ThemeTemplate->texLeft,  ThemeTemplate->texTop );    glVertex2f( 0.f, 0.f );
	    glTexCoord2f( ThemeTemplate->texRight, ThemeTemplate->texTop );    glVertex2f( 2.f, 0.f );
	    glTexCoord2f( ThemeTemplate->texRight, ThemeTemplate->texBottom ); glVertex2f( 2.f, ThemeTemplate->quadHeight );
	    glTexCoord2f( ThemeTemplate->texLeft,  ThemeTemplate->texBottom ); glVertex2f( 0.f, ThemeTemplate->quadHeight );
	glEnd();

	glTranslatef( 2, 0, 0.f );	//Update rendering point
}

/******------End------******/

This is the top edge of the window's frame. I first select a piece of a sprite sheet and then set the rendering point. The frame is built up from small textures lined up next to each other, so the for loop just makes sure the textures are rendered side by side. I then simply make a quad and render a texture taken from a section of a sprite sheet; this section is set with 'ThemeTemplate->texLeft' etc.
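[Editor's note: the texLeft/texRight/texTop/texBottom values above are just pixel coordinates divided by the sheet dimensions. A small standalone sketch of that conversion; the struct and function names are hypothetical, not from the poster's code:]

```cpp
#include <cassert>

// Hypothetical helper: convert a pixel rectangle on a sprite sheet into
// the normalized [0, 1] texture coordinates that glTexCoord2f expects.
struct TexRect { float left, right, top, bottom; };

TexRect normalizedRect(float pxLeft, float pxRight, float pxTop, float pxBottom,
                       float sheetWidth, float sheetHeight)
{
    return { pxLeft / sheetWidth,  pxRight / sheetWidth,
             pxTop / sheetHeight, pxBottom / sheetHeight };
}
```

For the 16..18 by 0..8 pixel strip above on a 32x32 sheet, this yields left = 0.5, right = 0.5625, top = 0, bottom = 0.25.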

 

I'm quite confused when it comes to the source of the problem as I'm new to OpenGL, but perhaps the problem lies in my loading of textures?

 

(Sorry for the code dump)

bool Texture::load_texture( SDL_Surface *image )
{
    // Check that the image's width is a power of 2
    if ( (image->w & (image->w - 1)) != 0 )
        fprintf( stderr, "warning: image.bmp's width is not a power of 2\n" );

    // Also check if the height is a power of 2
    if ( (image->h & (image->h - 1)) != 0 )
        fprintf( stderr, "warning: image.bmp's height is not a power of 2\n" );

    //Get the number of channels in the SDL_Surface
    nOfColors = image->format->BytesPerPixel;
    if ( nOfColors == 4 )
    {
        fprintf( stderr, "4 channels on texture\n" );
        if ( image->format->Rmask == 0x000000ff )
            texFormat = GL_RGBA;
        else
            texFormat = GL_BGRA;
    }
    else if ( nOfColors == 3 )
    {
        fprintf( stderr, "3 channels on texture\n" );
        if ( image->format->Rmask == 0x000000ff )
            texFormat = GL_RGB;
        else
            texFormat = GL_BGR;
    }
    else
    {
        fprintf( stderr, "warning: the image is not truecolor\n" );
        //This error should not go unhandled
    }

    // Have OpenGL generate a texture object handle for us
    glGenTextures( 1, &data );

    // Bind the texture object
    glBindTexture( GL_TEXTURE_2D, data );

    // Upload the SDL_Surface's pixel data into the texture object
    glTexImage2D( GL_TEXTURE_2D, 0, nOfColors, image->w, image->h, 0, texFormat, GL_UNSIGNED_BYTE, image->pixels );

    // Set the texture's stretching properties
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );

    // Unbind texture (0 is the "no texture" name; NULL is a pointer constant)
    glBindTexture( GL_TEXTURE_2D, 0 );

    // Save the dimensions of the texture
    TextureWidth = (float)image->w;
    TextureHeight = (float)image->h;

    return true;
}
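[Editor's note: the power-of-two check in load_texture relies on a bit trick worth spelling out. A power of two has exactly one bit set, so `w & (w - 1)` clears that bit and yields zero only for powers of two. A standalone sketch, with a guard for 0 that the original code omits:]

```cpp
#include <cassert>

// A power of two has exactly one bit set, so clearing the lowest set bit
// with w & (w - 1) leaves zero only for powers of two. The w > 0 guard
// keeps 0 (and negatives) from passing the test.
bool isPowerOfTwo(int w)
{
    return w > 0 && (w & (w - 1)) == 0;
}
```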

The output to stderr is "4 channels on texture" for every texture, so all of the textures in my application have 4 channels. (I'm loading .png files, so they should have 4 channels.)
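[Editor's note: the Rmask test in load_texture is what distinguishes RGBA from BGRA byte order. A standalone sketch of that decision, returning strings instead of GL enums so it runs without OpenGL headers; the function name is illustrative:]

```cpp
#include <cassert>
#include <cstdint>
#include <string>

// Sketch of load_texture's format selection: a surface whose red mask is
// 0x000000ff stores bytes as R,G,B(,A) in memory, matching GL_RGB/GL_RGBA;
// otherwise the bytes are reversed and GL_BGR/GL_BGRA is the matching
// upload format.
std::string pickFormat(int bytesPerPixel, uint32_t rmask)
{
    if (bytesPerPixel == 4)
        return rmask == 0x000000ff ? "GL_RGBA" : "GL_BGRA";
    if (bytesPerPixel == 3)
        return rmask == 0x000000ff ? "GL_RGB" : "GL_BGR";
    return "unsupported";
}
```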

 

Let me know if there's something else you need to know.



#7 MarekKnows.com   Members   -  Reputation: 446


Posted 22 July 2013 - 10:19 AM

You need to call glBindTexture( GL_TEXTURE_2D, yourTextureID ); before you call glBegin( GL_QUADS ) for your textured quads.  Are you doing this? 




#8 The Magical Pot   Members   -  Reputation: 178


Posted 22 July 2013 - 10:30 AM

You need to call glBindTexture( GL_TEXTURE_2D, yourTextureID ); before you call glBegin( GL_QUADS ) for your textured quads.  Are you doing this? 

Yes, there's a line in the last code snippet 'bool Texture::load_texture( SDL_Surface *image )' where I call that function

// Bind the texture object
glBindTexture( GL_TEXTURE_2D, data );

This line is executed before I draw anything.

 

I also forgot to show the picture, so here it is: https://twitter.com/TheMagicalPot/status/359349302939754496

 

"The Image in this post shows what the application renders without the enabling of blending on the left side, the right side is what it renders when I enable blending. The blue window within the application is made out of pieces of textures for the frame and the center is a colored quad, so when I render textures, they're not shown, but when I render geometrical shapes without textures, they're shown."



#9 MarekKnows.com   Members   -  Reputation: 446


Posted 22 July 2013 - 10:40 AM

 

You need to call glBindTexture( GL_TEXTURE_2D, yourTextureID ); before you call glBegin( GL_QUADS ) for your textured quads.  Are you doing this? 

Yes, there's a line in the last code snippet 'bool Texture::load_texture( SDL_Surface *image )' where I call that function

 

I see that you bind a texture when you are creating the texture, but at the end you also unbind glBindTexture( GL_TEXTURE_2D, NULL );

 

This means that you have no texture active.  If you go and try to render a texture but you don't have a texture bound, then you are not going to see anything.  

 

When you are calling glTexCoord2f, you need to make sure that you have a bound texture.  I don't see a glBindTexture call in that part of your code.

 

As a simple test, delete glBindTexture( GL_TEXTURE_2D, NULL )




#10 The Magical Pot   Members   -  Reputation: 178


Posted 22 July 2013 - 12:31 PM


As a simple test, delete glBindTexture( GL_TEXTURE_2D, NULL )

It didn't solve the problem. I forgot to add a piece of code, I bind the texture right before I render it:

glBindTexture( GL_TEXTURE_2D, ThemeTemplate->TemplateSheet->get_data() );		//Set texture ID

I really don't know what the problem is, but maybe it has something to do with the arguments passed when I enable blending?



#11 MarekKnows.com   Members   -  Reputation: 446


Posted 22 July 2013 - 12:49 PM

Perhaps your texture file loader code is not working. Did you look in your debugger to see what values you have for:

 

image->w

image->h

image->pixels




#12 The Magical Pot   Members   -  Reputation: 178


Posted 22 July 2013 - 02:27 PM

The values for image->w and image->h are correct. I have no idea how to check image->pixels, which is a void pointer, but when I print image->pixels as an integer, the output for a 32x32 texture is "86048840", if that says anything.

 

I searched a bit more for other people who have the same problem, but the only solution I've found was to switch over to DevIL, which takes care of image loading.



#13 MarekKnows.com   Members   -  Reputation: 446


Posted 22 July 2013 - 02:37 PM

Your image->pixels value points to a bunch of bytes that represent all your pixel data. For example, if you are sending RGBA values then you will have 4 bytes per pixel. Each byte is just a value from 0 to 255 which represents the color or alpha value. (Printing the pointer itself as an integer only shows its address, not the pixel data.)




#14 The Magical Pot   Members   -  Reputation: 178


Posted 23 July 2013 - 04:27 AM

I still don't know how to inspect that information, but it seems like there shouldn't be anything wrong with the pixels, since the textures work fine when blending isn't enabled. This problem isn't really going anywhere, though, so I could try the DevIL image loading library instead of SDL's.



#15 Kaptein   Prime Members   -  Reputation: 1949


Posted 25 July 2013 - 02:17 AM

I'd check if the alpha channel is 0 for all pixels



#16 The Magical Pot   Members   -  Reputation: 178


Posted 25 July 2013 - 04:02 AM

Ok, I gave up on SDL's image loading and switched over to DevIL, and it worked perfectly. For the blending itself, I just added this piece of code right after initializing OpenGL:

glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glEnable( GL_BLEND );

This basically enables blending (see http://www.opengl.org/sdk/docs/man/xhtml/glBlendFunc.xml for more information about glBlendFunc).

 

To find out how to install and load images with DevIL into OpenGL textures, I'd recommend this link: http://lazyfoo.net/tutorials/OpenGL/06_loading_a_texture/index.php

 

Note: for Visual Studio users, you have to open up 'ilu.h' and replace the line:

#include <IL/il.h>

With:

#include <il.h>

Since Visual Studio doesn't include certain files in the same way that other IDEs do.

 

That worked for me, I hope it works for anyone else who's having the same problem :)



#17 The Magical Pot   Members   -  Reputation: 178


Posted 25 July 2013 - 04:13 AM

It might actually be the case that SDL's image loading only supports color keying and not alpha blending; that would explain it, since I've successfully used SDL for color keying while blending didn't work.





