ViperG

Damage Decals


Recommended Posts

Why does the following GL command ignore GL_DST_ALPHA?

glBlendFunc(GL_DST_ALPHA, GL_SRC_ALPHA);

For example, the destination alpha could be 0 across the entire buffer, yet it still draws. The source is a TGA whose alpha layer says what can be drawn, but I only want the visible part of the source to show up where the destination alpha is non-zero, and it won't work. Would a register combiner help me here? I need to multiply the destination alpha (the alpha buffer) by the result of the source alpha.

[Edited by - ViperG on August 17, 2005 3:07:32 PM]
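For reference, this is the arithmetic that blend state requests. A minimal sketch, assuming the framebuffer actually stores destination alpha; drawDecal() is a placeholder for whatever draws the TGA-textured geometry:

#include <GL/gl.h>

extern void drawDecal(void); /* placeholder: draws the decal with the TGA's alpha as the source alpha */

void drawDecalMaskedByDstAlpha(void)
{
    glEnable(GL_BLEND);
    /* glBlendFunc(GL_DST_ALPHA, GL_SRC_ALPHA) asks the blend stage to compute
     *   result.rgb = src.rgb * dst.alpha + dst.rgb * src.alpha
     * so fragments landing on pixels whose stored alpha is 0 should take
     * nothing from the incoming color. */
    glBlendFunc(GL_DST_ALPHA, GL_SRC_ALPHA);
    drawDecal();
}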

The simplest solution is to use an alpha texture on a quad rendered over the polygon (make sure to separate it a little from the surface and use GL_LEQUAL for the z test). Using GL_MODULATE as the texture environment, you can modulate the texture with the color you want (black in this case).
Remember to enable GL_BLEND with the 'classic' glBlendFunc(GL_ONE_MINUS_SRC_ALPHA, GL_ONE);
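A minimal sketch of that quad approach (not the poster's exact code: drawDecalQuad() and decalAlphaTexture are placeholders, polygon offset is just one way to separate the quad from the surface, and I've used the standard src-alpha blend instead):

#include <GL/gl.h>

extern GLuint decalAlphaTexture; /* placeholder: GL_ALPHA texture holding the decal shape */
extern void drawDecalQuad(void); /* placeholder: quad positioned just above the damaged surface */

void drawBlackDecal(void)
{
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, decalAlphaTexture);
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE); /* with a GL_ALPHA texture: color = vertex color, alpha = vertex alpha * texture alpha */

    glDepthFunc(GL_LEQUAL);                /* let the decal pass the z test against the surface */
    glEnable(GL_POLYGON_OFFSET_FILL);
    glPolygonOffset(-1.0f, -1.0f);         /* nudge the decal towards the viewer to avoid z-fighting */

    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    glColor3f(0.0f, 0.0f, 0.0f);           /* black scorch color, faded in by the decal's alpha */
    drawDecalQuad();

    glDisable(GL_POLYGON_OFFSET_FILL);
    glDisable(GL_BLEND);
}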

Well, I'm trying to do per-pixel blending here by drawing only what I want blended into the texture, by writing it to the alpha buffer first.

Then I only want to draw black on the texture.

So by using glBlendFunc(GL_DST_ALPHA, GL_SRC_ALPHA);

you would think it would work, but it draws the texture as if I were doing

glBlendFunc(GL_ONE, GL_SRC_ALPHA);

And it's making me angry.

Basically, (GL_DST_ALPHA, GL_SRC_ALPHA) = (GL_ONE, GL_SRC_ALPHA),

which makes me go WTF.

You can write anything into the alpha layer or write nothing at all, and it still treats GL_DST_ALPHA as GL_ONE.

At least that is what it's doing in my code.
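For what it's worth, here is a sketch of the two-pass idea being described, assuming the context really does have destination alpha bits; the helper names are placeholders:

#include <GL/gl.h>

extern void drawDamageMask(void);   /* placeholder: pass whose alpha marks where damage may appear */
extern void drawBlackOverlay(void); /* placeholder: the black pass to be masked by the stored alpha */

void damagePass(void)
{
    /* Pass 1: write the mask into destination alpha only. */
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_TRUE);
    drawDamageMask();

    /* Pass 2: draw black, scaled by whatever alpha pass 1 left in the framebuffer. */
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glEnable(GL_BLEND);
    glBlendFunc(GL_DST_ALPHA, GL_SRC_ALPHA);
    drawBlackOverlay();
    glDisable(GL_BLEND);
}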

GL_DST_ALPHA will source alpha from the framebuffer. If your framebuffer does not have an alpha channel then GL_DST_ALPHA will always be 1.

Enigma
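A quick way to check that from code (a sketch, not from the original reply; call it with a current GL context):

#include <stdio.h>
#include <GL/gl.h>

/* If this prints 0, the pixel format has no destination alpha and
 * GL_DST_ALPHA will always evaluate to 1.0, exactly as described above. */
void reportAlphaBits(void)
{
    GLint alphaBits = 0;
    glGetIntegerv(GL_ALPHA_BITS, &alphaBits);
    printf("framebuffer alpha bits: %d\n", alphaBits);
}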


PIXELFORMATDESCRIPTOR pfd =
{
    sizeof(PIXELFORMATDESCRIPTOR), // Size Of This Pixel Format Descriptor
    1,                             // Version Number
    PFD_DRAW_TO_WINDOW |           // Format Must Support Window
    PFD_SUPPORT_OPENGL |           // Format Must Support OpenGL
    PFD_DOUBLEBUFFER,              // Must Support Double Buffering
    PFD_TYPE_RGBA,                 // Request An RGBA Format
    bits,                          // Select Our Color Depth
    0, 0, 0, 0, 0, 0,              // Color Bits Ignored
    1,                             // Request 1 Alpha Bit (destination alpha)
    0,                             // Shift Bit Ignored
    0,                             // No Accumulation Buffer
    0, 0, 0, 0,                    // Accumulation Bits Ignored
    16,                            // 16-Bit Z-Buffer (Depth Buffer)
    8,                             // 8-Bit Stencil Buffer
    0,                             // No Auxiliary Buffer
    PFD_MAIN_PLANE,                // Main Drawing Layer
    0,                             // Reserved
    0, 0, 0                        // Layer Masks Ignored
};





glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE); // allow writes to all channels, including alpha
glClear(GL_COLOR_BUFFER_BIT);                    // clear the color buffer (and its alpha)






For example, if I do glBlendFunc(GL_DST_ALPHA, GL_ONE); or GL_SRC_COLOR,

it works fine: you only see the parts that are over pixels with an alpha higher than 0.

But if I change it to glBlendFunc(GL_DST_ALPHA, GL_ZERO); or glBlendFunc(GL_DST_ALPHA, GL_SRC_ALPHA); it no longer takes the alpha buffer into account.
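Not from the thread, but one thing worth checking with that descriptor: ChoosePixelFormat treats the requested fields only as hints, so asking for 1 alpha bit can still hand back a format with 0 alpha bits, and it may be worth requesting 8 instead. A sketch of reading back what was actually granted (hdc and pixelFormat are assumed to come from the existing window setup):

#include <windows.h>
#include <stdio.h>

void reportChosenFormat(HDC hdc, int pixelFormat)
{
    PIXELFORMATDESCRIPTOR chosen;
    DescribePixelFormat(hdc, pixelFormat, sizeof(chosen), &chosen);
    printf("color bits %d, alpha bits %d, depth %d, stencil %d\n",
           chosen.cColorBits, chosen.cAlphaBits, chosen.cDepthBits, chosen.cStencilBits);
}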
