Can some OpenGL flag stop glAlphaFunc working?

Hi all, I've been working on OpenGL code for a few months now and all is working fine except for TGA transparency. First let me say I totally understand the theory; it's just that I can't get it to work with my own code. My guess is I'm setting some flag in my screen setup or in my model loader that won't allow the image to go transparent. My other theory is that I used SDL_image to load the TGA and that may cause a problem. I'm hoping someone here has experienced this and can give some pointers.

When I use glAlphaFunc(GL_GREATER, 0.1f) my object totally vanishes. When I use glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) the object vanishes. When I use glBlendFunc(GL_ONE, GL_ONE) the whole object goes transparent, as if the alpha channel has no masking effect. I know my TGA is OK as it's off a net tutorial.

So can anyone advise if:
1. some OpenGL flag has an effect
2. SDL_image does not load the TGA correctly (I don't think it's this, as I've tried loading with another method)
3. it's possible my mipmap filters have some effect

I'm using

glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_NEAREST );
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );

and

glTexEnvf( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE );
gluBuild2DMipmaps( GL_TEXTURE_2D, 3, surface->w, surface->h, GL_RGBA, GL_UNSIGNED_BYTE, surface->pixels );

Thanks in advance, this has me totally baffled.
I think you might need to change 3 to 4 in gluBuild... like so:

gluBuild2DMipmaps( GL_TEXTURE_2D, 4, surface->w, surface->h, GL_RGBA, GL_UNSIGNED_BYTE, surface->pixels );


I've never found an actual explanation of the 1, 2, 3, and 4 options for the internal format parameter, but I'm guessing they correspond to the number of colour channels, so 3 would effectively be GL_RGB (or whatever equivalent OpenGL chooses, perhaps based on the format of the data you put in), 4 would be GL_RGBA8 or equivalent, etc.

After that just using glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) (and not forgetting glEnable(GL_BLEND) as I always do) should work fine.

If that doesn't work it must be something else. :)

Oh, and the default value of GL_TEXTURE_ENV_MODE is GL_MODULATE, so you shouldn't need to change that.
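Putting it together, something like this (just a sketch; where exactly the enables go in your code is an assumption on my part, they only need to be set before you draw the transparent object):

// upload with 4 components so the alpha channel is kept in the texture
gluBuild2DMipmaps( GL_TEXTURE_2D, 4, surface->w, surface->h, GL_RGBA, GL_UNSIGNED_BYTE, surface->pixels );

// when drawing the transparent object
glEnable( GL_BLEND );
glBlendFunc( GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA );

// or, if you want hard-edged masking instead of blending
glEnable( GL_ALPHA_TEST );
glAlphaFunc( GL_GREATER, 0.1f );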
Thanks, tried it and all I get is a totally invisible object. But I must say I've learned something about the 3/4 parameter, so thanks :)
Putting a number as the second parameter to that function (and the glTexImage*() functions) is the old OpenGL 1.0 way of doing things; since GL 1.1 you can supply GL_RGB, GL_RGBA, etc. to that parameter to specify the internal format.

And yes, I know the tutorials online don't do that, but that's because tutorials suck and everyone learnt from old tutorials and just never updated their knowledge before making their own equally sucky tutorial [smile]
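For example, the same upload written the GL 1.1+ way (just a sketch, assuming your SDL surface really is 32-bit RGBA):

// symbolic internal format instead of a channel count
gluBuild2DMipmaps( GL_TEXTURE_2D, GL_RGBA, surface->w, surface->h, GL_RGBA, GL_UNSIGNED_BYTE, surface->pixels );
// or an explicitly sized one
gluBuild2DMipmaps( GL_TEXTURE_2D, GL_RGBA8, surface->w, surface->h, GL_RGBA, GL_UNSIGNED_BYTE, surface->pixels );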
Will NeHe ever get updated? I'm going to stop referring people to that bulldink. All their tutorials use glAux. Some authors fix their tutorials, but some never intend to come back to them.
Sig: http://glhlib.sourceforge.net
an open source GLU replacement library. Much more modern than GLU.
float matrix[16], inverse_matrix[16];
glhLoadIdentityf2(matrix);
glhTranslatef2(matrix, 0.0, 0.0, 5.0);
glhRotateAboutXf2(matrix, angleInRadians);
glhScalef2(matrix, 1.0, 1.0, -1.0);
glhQuickInvertMatrixf2(matrix, inverse_matrix);
glUniformMatrix4fv(uniformLocation1, 1, GL_FALSE, matrix);
glUniformMatrix4fv(uniformLocation2, 1, GL_FALSE, inverse_matrix);
Is there some way I can test a loaded texture in the OpenGL code to see if it has an alpha channel?
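One way to check is to query how many alpha bits OpenGL actually stored (a minimal sketch, assuming the texture in question is currently bound to GL_TEXTURE_2D):

GLint alphaBits = 0;
glGetTexLevelParameteriv( GL_TEXTURE_2D, 0, GL_TEXTURE_ALPHA_SIZE, &alphaBits );
// alphaBits == 0 means the texture was stored without an alpha channel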
Bloody hell, solved it, I never set the GL_COLOR_MATERIAL flag! I use 3DS models which give light/material properties, so I never use coloured quads. How come no texture tutorial tells you to set that flag!?
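For anyone who hits the same thing, the fix looks roughly like this when lighting is enabled (a sketch only; the glColorMaterial mode shown is an assumption, it just matches the default):

glEnable( GL_LIGHTING );
glEnable( GL_COLOR_MATERIAL );                                // let glColor feed the material
glColorMaterial( GL_FRONT_AND_BACK, GL_AMBIENT_AND_DIFFUSE );
glColor4f( 1.0f, 1.0f, 1.0f, 1.0f );                          // this alpha now makes it into the lit fragment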

