Lode

OpenGL glTexImage2D with GL_ALPHA


Recommended Posts

Hi, I've got a floating point buffer of pixel values, with only one float per pixel (so a single component instead of the 4 components of RGBA). My intention is to create a texture from this floating point buffer that OpenGL will display as if it were a completely white texture with a varying alpha channel, where the alpha values are those from the floating point buffer. The official documentation of glTexImage2D (http://www.opengl.org/sdk/docs/man/xhtml/glTexImage2D.xml) says under "GL_ALPHA":
Quote:
GL_ALPHA Each element is a single alpha component. The GL converts it to floating point and assembles it into an RGBA element by attaching 0 for red, green, and blue. Each component is then multiplied by the signed scale factor GL_c_SCALE, added to the signed bias GL_c_BIAS, and clamped to the range [0,1] (see glPixelTransfer).
If I read that correctly, they're saying that R, G and B will be 0. But then the RGB would be black instead of white, which isn't what I want. Do you know a good method in OpenGL to get a texture the way I described? It might be that using the GL_c_BIAS values described above would do it, e.g. GL_GREEN_BIAS and so on.

BUT here is the very strange thing: it turns out that on my computer, even though GL_GREEN_BIAS and the others are 0, the texture already looks white, so it's already what I want! Since that doesn't make sense according to the official documentation, I think it might only be accidentally correct on my computer and could be wrong on others. What could cause this?

I set both internalFormat and format to GL_ALPHA, and type to GL_FLOAT. I checked with glGet and the bias values are 0. Thanks!
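For reference, the upload call I'm making is roughly this (a sketch; width, height and pixels stand in for my actual variable names):

GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
/* internalFormat and format both GL_ALPHA, type GL_FLOAT, one float per pixel */
glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA, width, height, 0, GL_ALPHA, GL_FLOAT, pixels);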

I just implemented this, but in OpenGL 3.2 where GL_ALPHA no longer exists. Basically what I did was load it as GL_RED and then do the swizzling in my fragment shader:


in vec2 ex_TexCoord;
uniform sampler2D my_texture_id;

void main()
{
    vec4 fc;
    fc = texture2D(my_texture_id, ex_TexCoord);
    gl_FragColor = vec4(1.0, 1.0, 1.0, fc.r);
}
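The matching upload on the C side would be something along these lines (a sketch; GL_R32F needs GL 3.0 / ARB_texture_rg, and width, height, pixels are placeholders):

/* one float per pixel, stored in the red channel; the shader swizzles it into alpha */
glTexImage2D(GL_TEXTURE_2D, 0, GL_R32F, width, height, 0, GL_RED, GL_FLOAT, pixels);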



-me

I'm using plain old OpenGL 1.X.

My video card doesn't even support OpenGL 3 afaik (OpenGL version returns "2.1.2 NVIDIA 190.53").

Any idea how it works there and how exactly this GL_ALPHA is used?

Well, you could still use a shader. Otherwise, I don't know. OpenGL 3.x is nice because all the "how the hell does this work again" stuff was deprecated [smile]. Nothing that I posted, however, is specifically OpenGL 3.2. It should work with whatever drivers you currently have for whatever video card you have.

-me

2 things.

1.) "The GL converts it to floating point and assembles it into an RGBA element" sounds like it will still be stored as RBGA, so the obvious is if you want color with alpha, send 1,1,1, alpha....

2.) What are you trying to do, then? Black and white are colors, and you said you only care about alpha. What type of blending are you trying to do?

Quote:
Original post by dpadam450
2 things.

1.) "The GL converts it to floating point and assembles it into an RGBA element" sounds like it will still be stored as RBGA, so the obvious is if you want color with alpha, send 1,1,1, alpha....

2.) What are you trying to do then, black/white are colors, you said you only care about alpha. What type of blending are you trying to do?


I can't send it 1,1,1,alpha, because the buffer I send contains 1 value per pixel, not 4...

I'm trying to draw a colored mask over the screen. So the texture should be white with alpha, and I can then use glColor to draw it with any color. If it were black instead of white, then glColor would have no effect; it would always look black.

I think I have a theory about why it actually seems to work for me: maybe OpenGL ignores the RGB of a GL_ALPHA texture, so it doesn't matter that they're 0?
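To make the intended use concrete, the draw code I have in mind is roughly this (fixed-function; tex and the quad coordinates are placeholders):

glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, tex);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glColor4f(1.0f, 0.0f, 0.0f, 1.0f); /* e.g. tint the mask red */
glBegin(GL_QUADS);
    glTexCoord2f(0.0f, 0.0f); glVertex2f(x0, y0);
    glTexCoord2f(1.0f, 0.0f); glVertex2f(x1, y0);
    glTexCoord2f(1.0f, 1.0f); glVertex2f(x1, y1);
    glTexCoord2f(0.0f, 1.0f); glVertex2f(x0, y1);
glEnd();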

How about rendering it to a texture? The old way: render it to the blank framebuffer, then copy the whole RGBA into an RGBA texture.

Or maybe I misunderstood something. (Since you're updating it constantly, you can add this to the beginning of the render code.)

Quote:
I can't send it 1,1,1,alpha, because the buffer I send contains 1 value per pixel, not 4...


Well, to make it as easy as possible, put those values in there: make a temporary buffer and fill it with 1,1,1,alpha.
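Something like this (a sketch; src is your single-float-per-pixel buffer, and it assumes <stdlib.h> for malloc/free):

float *rgba = malloc(width * height * 4 * sizeof(float));
for (int i = 0; i < width * height; ++i)
{
    rgba[i*4 + 0] = 1.0f;   /* R */
    rgba[i*4 + 1] = 1.0f;   /* G */
    rgba[i*4 + 2] = 1.0f;   /* B */
    rgba[i*4 + 3] = src[i]; /* A comes from your buffer */
}
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_FLOAT, rgba);
free(rgba);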

You could also try changing the glTexEnv mode to GL_REPLACE instead of GL_MODULATE. That might bypass the texture color and use the glColor instead.
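i.e. something along the lines of (untested):

glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);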

I think what you want is GL_LUMINANCE. The luminance value will be replicated for RGB and the alpha will be 1.0.

Second, you said that your data is float, so you should not use plain GL_ALPHA or GL_LUMINANCE, because then your float data gets converted to 8-bit integers. You need to use GL_ALPHA32F_ARB from the extension GL_ARB_texture_float.

Your call would be:
glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA32F_ARB, width, height, 0, GL_ALPHA, GL_FLOAT, pixels);
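If you want to guard it behind an extension check, a rough sketch (the substring test is simplistic; GL_ALPHA32F_ARB comes from glext.h, and strstr needs <string.h>):

const char *ext = (const char *)glGetString(GL_EXTENSIONS);
/* fall back to plain GL_ALPHA if the float-texture extension isn't reported */
GLint internal = (ext && strstr(ext, "GL_ARB_texture_float")) ? GL_ALPHA32F_ARB : GL_ALPHA;
glTexImage2D(GL_TEXTURE_2D, 0, internal, width, height, 0, GL_ALPHA, GL_FLOAT, pixels);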
