ShmeeBegek

Problem with Alpha channel using glCopyTexImage2D


Hello, I am having what seems to be a very simple (but very frustrating) problem with glCopyTexImage2D: it never seems to copy the alpha channel. I know that the alpha channel is working because I use it in other parts of my app, and I know that my alpha blending is set up properly because I can make the texture fade in and out by adjusting the alpha value passed to glColor4f. The RGB channels copy correctly, but even when I set the internal format to GL_ALPHA I still only get alpha values of 1. Here is the bit of code concerned:
[SOURCE]
glBindTexture(GL_TEXTURE_2D, op->tsqr[(loc.y * op->nsqx) + loc.x]);

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 0, 0, 64, 64, 0);

[/SOURCE]
Thanks for any help, ~SPH

Use glTexSubImage2D.
I'm not 100% sure, but perhaps the window needs dst_alpha for this to work; you can specify this when you create the window.
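For what it's worth, here is a minimal sketch of the sub-image approach: allocate the texture storage once with an explicit RGBA internal format, then copy into it each frame with glCopyTexSubImage2D instead of glCopyTexImage2D (which redefines the internal format on every call). The texture name `tex` and the 64x64 size are just illustrative; and note that if the framebuffer itself has no alpha bits, the copied alpha will still come back as 1.

```c
/* One-time setup: allocate 64x64 RGBA storage so later copies
 * have an alpha channel to fill. */
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 64, 64, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

/* Per frame: copy the framebuffer into the existing storage.
 * Unlike glCopyTexImage2D, this leaves the internal format alone. */
glBindTexture(GL_TEXTURE_2D, tex);
glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, 64, 64);
```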


Thanks for the reply, but I couldn't make much sense of it.

What does glTexSubImage2D (I assume that's what you were referring to) have to do with the problem? And what does "dst_alpha" have to do with it?

glCopyTexImage2D copies part of the framebuffer into a texture, in case you didn't know...

Thanks for all help, ~SPH

zedzeek was saying that your framebuffer needs an alpha channel for you to be able to retrieve one. Check the value of GL_ALPHA_BITS: if it's zero, you cannot get the alpha back from the framebuffer (using CopyTexImage, ReadPixels, or blending). You may have to explicitly request a destination with alpha when you create your window.

Quote:
Original post by ShmeeBegek
I know that the alpha channel is working because I use it in other parts of my app.

Beware: a "working alpha channel" in OpenGL can mean two things, a working alpha component in the framebuffer or a working alpha component of a texture. It seems you're talking about the latter, since you say alpha blending works. However, you also need the former to work, which can be checked roughly via GL_ALPHA_BITS, as Zongo pointed out.
[SOURCE]
/* once the GL context is valid: */
GLint alpha_bits;
glGetIntegerv(GL_ALPHA_BITS, &alpha_bits);
printf("The framebuffer uses %d bit(s) for the alpha component\n", alpha_bits);
fflush(stdout);
[/SOURCE]

Hello again,

I did this last night and did in fact find that I have 0 alpha bits. This is rather aggravating, because I am using GLUT, and the GLUT documentation states that RGBA should provide an alpha channel, with RGB being an alias for it. In fact it is the opposite: RGBA is an alias for RGB.

Does anyone know how to get an alpha channel with GLUT? Or should I resort to SDL or the Win32 API for my little test app?

Thanks, ~SPH

(NOTE: Perhaps I just have an old GLUT version?)
(EDIT: I have checked, and I have GLUT 3.7.6, which is the latest release and which is officially documented as having an RGBA mode as described above... but which does not in fact provide one.)
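If it helps anyone landing here later: GLUT also exposes a separate GLUT_ALPHA flag for glutInitDisplayMode, which requests an alpha plane explicitly (GLUT_RGBA really is an alias for GLUT_RGB, so it requests nothing extra). Whether you actually get the alpha bits still depends on the pixel formats your driver offers. A minimal sketch:

```c
#include <GL/glut.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    /* GLUT_RGBA is an alias for GLUT_RGB, so an alpha plane
     * must be requested separately with GLUT_ALPHA. */
    glutInitDisplayMode(GLUT_RGB | GLUT_ALPHA | GLUT_DOUBLE);
    glutCreateWindow("alpha test");

    /* Verify the framebuffer actually has alpha bits now. */
    GLint alpha_bits = 0;
    glGetIntegerv(GL_ALPHA_BITS, &alpha_bits);
    printf("alpha bits: %d\n", alpha_bits);

    return 0;
}
```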

