dpadam450

[Solved] Targa image with alpha


I tried to do blending with glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA), but it only uses the alpha from glColor4f(). My image is RGBA and I'm not sure how to make use of its alpha channel. I also tried alpha testing. [Edited by - dpadam450 on May 16, 2007 11:06:12 AM]
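For reference, a minimal sketch of the fixed-function state being described (assuming the bound texture was uploaded with an RGBA format and the default GL_MODULATE texture environment; these are standard GL 1.x calls):

```c
glEnable(GL_TEXTURE_2D);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

/* With GL_MODULATE, the fragment alpha is (texture alpha) * (vertex alpha),
 * so leave the vertex color at full alpha to let the texture's
 * alpha channel drive the blend: */
glColor4f(1.0f, 1.0f, 1.0f, 1.0f);
```

If the texture itself has no alpha channel, its alpha samples as 1.0 and only the glColor4f() alpha has any effect — which matches the symptom described above.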

Are you sure you're loading and setting up the image correctly?
Use GLIntercept to check this.

Are you asking what is wrong with the alpha channel of your .tga? Is the .tga loaded correctly, and how are you loading it? Is the color ordering correct — RGBA or BGRA? Are you trying to do alpha testing, blending, or both?

Need more info.
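One quick way to rule out the file itself: inspect the 18-byte TGA header. A sketch of such a check (hypothetical helper name; the field offsets — byte 2 image type, byte 16 pixel depth, byte 17 image descriptor — follow the TGA file-format layout):

```c
#include <stdint.h>

/* Returns 1 if the header describes an uncompressed true-color TGA
 * with a real 8-bit alpha channel, 0 otherwise. */
static int tga_has_alpha(const uint8_t header[18])
{
    uint8_t image_type = header[2];         /* 2 = uncompressed true-color */
    uint8_t bpp        = header[16];        /* bits per pixel */
    uint8_t alpha_bits = header[17] & 0x0F; /* low nibble of image descriptor */
    return image_type == 2 && bpp == 32 && alpha_bits == 8;
}
```

If this reports no alpha for your file, the problem is upstream of OpenGL — the exporter wrote a 24-bit TGA.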

Right now the image displays correctly, but the background, which is supposed to have 0 alpha, displays as if it had 1.0 alpha. The color data is fine, but the alpha doesn't show. I'm wondering if I have to set something up, or does GL_RGBA already take care of the image for rasterization?

**I guess I should have just asked: I'm going to put this into a shader, and it seems that you can only grab a vec3 (RGB) from a texture lookup. True?

[Edited by - dpadam450 on May 15, 2007 9:55:30 PM]

There are 4-vector types in pretty much every shading language. You can use one just like a 3-vector, except that it has the extra member, e.g. vec.a, vec.w, or vec[3] depending on context.

Your texture also needs to be set up with all 4 RGBA channels.

What happens when you set the alpha to 0.5?
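A minimal GLSL fragment shader along these lines (a sketch; the sampler uniform name `tex` is an assumption):

```glsl
uniform sampler2D tex;

void main()
{
    vec4 texel = texture2D(tex, gl_TexCoord[0].st); // full RGBA lookup
    gl_FragColor = texel;                           // texel.a carries the texture's alpha
}
```

With blending enabled, texel.a then drives the blend exactly as the texture's alpha channel dictates.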

Right now I've moved it to a shader, so something must be wrong when I buffer the data. I wrote all the alpha bits to 0 and test:

vec4 pixel = texture2D(tex, gl_TexCoord[0].st);
if (pixel.a > 0.9)
    discard;

It's discarding every pixel, so I'm assuming the alpha data never actually got uploaded and was just filled with 1.0? Either that, or I need to add some code to enable grabbing the alpha part.

Quote:
Original post by dpadam450
Right now I've moved it to a shader, so something must be wrong when I buffer the data. I wrote all the alpha bits to 0 and test:

vec4 pixel = texture2D(tex, gl_TexCoord[0].st);
if (pixel.a > 0.9)
    discard;

It's discarding every pixel, so I'm assuming the alpha data never actually got uploaded and was just filled with 1.0? Either that, or I need to add some code to enable grabbing the alpha part.


From what I can gather, the texture2D call and subsequent alpha comparison both look valid. Do you have reference material on hand?

http://www.lighthouse3d.com/opengl/glsl/index.php?textureMulti -- Perfect for visually debugging OpenGL texture code.

http://developer.3dlabs.com/downloads/index.htm -- OpenGL tools. GLSLvalidate is very useful.

Quote:
Original post by dpadam450
I'm wondering if I have to set something up or does GL_RGBA already take care of the image for rasterization?
GL_RGBA is just a constant, so what do you mean by that? If your image's alpha channel isn't getting sent to the texture I'd bet that there's a problem with your texture creation, so could you post that code? You are using GL_RGBA or one of the more specific 4-component formats for the internalFormat parameter as well as the format parameter, right?
Quote:
Original post by dpadam450
**I guess I should have just asked: I'm going to put this into a shader, and it seems that you can only grab a vec3 (RGB) from a texture lookup. True?
False. Every texture look-up function returns a vec4. I direct you to the GLSL specification.

It's fixed. I used NeHe a long time ago, and in their tutorial they just pass a number for the internal format: instead of the enum GL_RGB they put 3. So my external format was correct, but my internal format was still effectively set to GL_RGB.
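In other words, a sketch of the fix (variable names here are placeholders): passing the bare number 3 as internalFormat asks for a three-component texture, which silently drops the alpha channel, while a four-component internal format such as GL_RGBA preserves it:

```c
/* Broken: internalFormat = 3 means "three components" (RGB),
 * so the alpha channel of the source data is discarded on upload. */
glTexImage2D(GL_TEXTURE_2D, 0, 3, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixels);

/* Fixed: a 4-component internal format keeps the alpha channel. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixels);
```

This also explains the earlier shader test: with an RGB internal format, the texture's alpha always samples as 1.0, so `pixel.a > 0.9` discards everything.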
