ShinkaFudan

[SOLVED] [SDL] PNG Transparency with OpenGL?

Recommended Posts

ShinkaFudan    100
So, I checked every topic I could find on the matter, and none of them seems to resolve my issue. I've gone through the process of using SDL_image to load PNG files into SDL, and then converting them to OpenGL textures for quicker rendering.

I've enabled GL_BLEND and GL_ALPHA_TEST, but when I put either of the following lines into the code, everything disappears:

glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

glAlphaFunc(GL_GREATER, 0.9f);

I tried GL_ONE and GL_ONE for the glBlendFunc arguments, but that just made everything partially transparent and blended together, which is not what I need. The PNG I am loading and displaying (on both a 2D quad and a large quasi-quad made of many triangles) has a part that is transparent. None of the image is partially transparent; it is either 100% visible or fully transparent.

When the image is loaded, the part that should be transparent is black, as if it were a 24-bit image. The image is 1024x1024, and when loaded, the SDL_Surface has a pitch of 4096, confirming that it is a 32-bit image. In addition, I've loaded the file into an editor, and it has definitely been saved as a 32-bit image, because the transparency stays.

The code for loading the image is just this (it requires that you know about SDL_image):

/////
SDL_Surface* Surface = IMG_Load("TestImage.png");
Image = convert_surface(Surface);
SDL_FreeSurface(Surface);
/////

convert_surface is just a function that is supposed to convert an SDL_Surface to an OpenGL texture. "Image" is an unsigned integer that I use with OpenGL to keep track of the image.
convert_surface is this:

/////
int convert_surface( SDL_Surface *surface )
{
    // new dimensions
    int w = surface->w;
    int h = surface->h;

    SDL_PixelFormat *pixf = SDL_GetVideoSurface()->format;
    SDL_Surface *image = SDL_CreateRGBSurface( SDL_SWSURFACE, w, h, 32, pixf->Bmask, pixf->Gmask, pixf->Rmask, pixf->Amask );

    SDL_BlitSurface( surface, NULL, image, NULL );

    GLuint txid;
    glGenTextures( 1, &txid );
    glBindTexture( GL_TEXTURE_2D, txid );

    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
    glTexImage2D( GL_TEXTURE_2D, 0, GL_RGBA, image->w, image->h, 0, GL_RGBA, GL_UNSIGNED_BYTE, image->pixels );

    SDL_FreeSurface( image );
    return txid;
}
/////

I am assuming that the call to SDL_CreateRGBSurface may have something to do with it, as I would need something to create an RGBA surface, but as you can see, it has an argument for the alpha mask. I have tried feeding it SDL_SRCALPHA for the first argument, but that doesn't change anything. I've also tried using SDL_DisplayFormatAlpha after loading the image to a surface; that doesn't seem to do anything either.

I've uploaded the source (the full Visual Studio project, in case anyone needs it), and I would appreciate someone taking a look at it and perhaps tinkering with it to see where I've made my mistake.

The file is located here:
[FILE REMOVED DUE TO PROBLEM BEING SOLVED]

[Edited by - ShinkaFudan on February 1, 2009 11:37:53 PM]

Jack Sotac    528
To clarify: is the problem that the terrain disappears, or that the PNG image isn't being displayed with transparent areas?

Are you using Windows or Linux?

Have you tried disabling blending when drawing the map terrain?

Here's what I'm getting.
[screenshot previously hosted on ImageShack; image no longer available]

ShinkaFudan    100
That's what you get when you run it? It's working for you... strange.

I'm using Windows...

To clarify, the part that is transparent in your screenshot is not transparent in mine; it's just black. Now that I see it working for you, I'm a bit more confused, though at least I know it's not strictly a problem with my code.

Did you change anything to get that result? The uploaded project had glBlendFunc and glAlphaFunc commented out... Did you uncomment them? (As mentioned, for me this just makes everything disappear)

This is what I see:

[FILE REMOVED DUE TO PROBLEM BEING SOLVED]

and from another angle (W,A,S,D,Q,E, and the arrow keys move the camera)

[FILE REMOVED DUE TO PROBLEM BEING SOLVED]

[Edited by - ShinkaFudan on February 1, 2009 11:18:07 PM]

ShinkaFudan    100
You must have changed something. I had someone else test it and they had a black part as well. Could you please tell me what you changed?

Or, if you didn't change anything, could you tell me what graphics card you have? Perhaps it is a hardware issue. The only people I can find to test this have Nvidia cards; so do I, and so does the person who tested it earlier and got the same result as me.

[Edited by - ShinkaFudan on December 17, 2008 9:28:35 AM]

Jack Sotac    528
Yeah, I had to change some things to get it to run. For instance, I linked against the prebuilt SDL_image library instead of compiling it (which likely has nothing to do with the problem). I also un-commented the glBlendFunc() call but left glAlphaFunc() commented out, and I replaced image->pixels with surface->pixels in convert_surface(). That got things working, which pointed me to a problem in convert_surface().

In any case, I think I've managed to get it back to where you were having the problem. Try these things before doing the blit.


In convert_surface(), set the color mask explicitly (especially the alpha mask)
SDL_Surface *image = SDL_CreateRGBSurface( SDL_SWSURFACE, w, h, 32, BMASK, GMASK, RMASK, AMASK );

I haven't checked, but most likely the Amask value you got from the main video surface is zero, which messes things up when you really do need to create a surface with an alpha channel.


Also, turn off the SDL_SRCALPHA flag on the source surface when you want to do direct-copy blits:
SDL_SetAlpha(surface,0,0);
This allows all the alpha values from the source surface to be copied to the destination surface during blitting.


Look up SDL_SetAlpha() and SDL_BlitSurface() in SDL's docwiki and you can see why RGBA-to-RGBA blits have a few gotchas (they've bitten me more than a few times).

In conclusion, it looks like the alpha values weren't being copied to the texture surface, which is why nothing was showing up once the alpha blending mode was set.

Let me know if that fixes things. Good Luck.

By the way, I'm using Windows XP with an Nvidia card as well.

ShinkaFudan    100
It seems that, above all, the problem was that I had left out

SDL_SetAlpha(surface, 0, 0);

from the conversion function. I altered the rest as well, since those changes all make sense. I'm a bit lost in the documentation for those functions, but I think I understand what the problem was. In any case, it's working great now.

Thank you so much for your help. (+rating)

zer0sum    100
Quote:
Original post by ShinkaFudan
So, I checked into every topic I could find on the matter, and none of them seem to resolve my issue. I've gone through the process of using SDL_image to load in PNG files to SDL, and then convert them to OpenGL textures for quicker rendering.

I've enabled GL_BLEND and GL_ALPHA_TEST, but when I put either of the following lines into the code, everything disappears.

"glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);"

glAlphaFunc(GL_GREATER, 0.9f);

I tried putting in GL_ONE and GL_ONE for the glBlendFunc arguments, but that just made everything partially transparent and blending together, which is not what I need. The PNG I am loading in and displaying (on both a 2D quad and a large quasi-quad made of many triangles) has a part of it that is transparent. None of the image is partially transparent. It is either 100% visible or fully transparent.

When the image is loaded in, the part that should be transparent is black, as if it is a 24-bit image. The image is 1024x1024 and when loaded, the SDL_Surface has a pitch of 4096, confirming that it is a 32-bit image. In addition, I've loaded the file into an editor and it has definitely been saved as a 32-bit image because the transparency stays.

The code for loading the image is just this (It requires that you know about SDL_image)

/////
SDL_Surface* Surface = IMG_Load("TestImage.png");
Image = convert_surface(Surface);
SDL_FreeSurface(Surface);
//////


convert_surface is just a function that is supposed to convert an SDL_Surface to an OpenGL texture. "Image" is an unsigned integer that I use with OpenGL to keep track of the image. convert_surface is this:

/////
int convert_surface( SDL_Surface *surface )
{
    // new dimensions
    int w = surface->w;
    int h = surface->h;

    SDL_PixelFormat *pixf = SDL_GetVideoSurface()->format;
    SDL_Surface *image = SDL_CreateRGBSurface( SDL_SWSURFACE, w, h, 32, pixf->Bmask, pixf->Gmask, pixf->Rmask, pixf->Amask );

    SDL_BlitSurface( surface, NULL, image, NULL );

    GLuint txid;
    glGenTextures( 1, &txid );
    glBindTexture( GL_TEXTURE_2D, txid );

    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
    glTexImage2D( GL_TEXTURE_2D, 0, GL_RGBA, image->w, image->h, 0, GL_RGBA, GL_UNSIGNED_BYTE, image->pixels );

    SDL_FreeSurface( image );
    return txid;
}
/////


I am assuming that the call to SDL_CreateRGBSurface may have something to do with it, as I would need something to create an RGBA surface, but as you can see, it has an argument for the alpha value. I have tried feeding it SDL_SRCALPHA for the first argument, but that doesn't change anything. I've tried using SDL_DisplayFormatAlpha after loading the image to a surface, that doesn't seem to do anything either.

I've uploaded the source (the full Visual Studio project for if anyone needs it) and I would appreciate someone taking a look at this and perhaps tinkering with it to see what it is I've made my mistake on.


The file is located here:
[FILE REMOVED DUE TO PROBLEM BEING SOLVED]


Hi. I have a very similar problem and I'm trying to figure out where I went wrong.

Can I ask what blending you used in glBlendFunc to make this work?

Also, where and why did you use SDL_SetAlpha to make your conversion work?

