alpha question



I want to achieve an effect where a black image covering the whole screen slowly fades away, revealing what's behind it, by decreasing its alpha value in its fragment shader.

The way I do it is to create a quad that covers the whole screen, textured with a completely black image. In the fragment shader I declared a variable a initialized to 1.0, and a uniform dt to which I send the delta time (the difference between the last frame and the current frame) every frame. I then subtract dt from a and use a as the final color's alpha value. But the result is that the image stays completely black, so I guess a is still 1.0. Is there a proper way to fade the black image away?

const GLchar* TextFsource2 = R"(#version 330
uniform float dt;
in vec2 textuv;
uniform sampler2D textsampler;
out vec4 color;
float a = 1.0;

void main()
{
    a -= dt;
    color = vec4(texture(textsampler, textuv).rgb, a);
}
)";


a is reset to 1.0 every time the fragment shader runs, so it never accumulates the dt you subtract; shader invocations don't keep any state between frames.
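Since the shader can't carry state across frames, the counter has to live on the CPU and be uploaded as a uniform each frame. A minimal sketch of that idea (the name alphaLocation and the surrounding render loop are assumptions, not code from the thread):

```cpp
#include <algorithm> // std::max

// Per-frame fade update, kept on the CPU side so the value actually
// persists between frames. The GPU only ever sees the current value.
float updateFadeAlpha(float alpha, float deltaTime) {
    alpha -= deltaTime;            // accumulates across frames, unlike a shader-local variable
    return std::max(alpha, 0.0f);  // clamp so the overlay never goes negative
}

// In the render loop you would then do something like:
//   alpha = updateFadeAlpha(alpha, deltaTime);
//   glUniform1f(alphaLocation, alpha); // 'alphaLocation' from glGetUniformLocation
```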

Edited by Lactose!


I have tried doing a -= delta_time in the main application instead, but no matter what factor I multiply by, it always decreases too quickly, e.g. a -= delta_time * 0.00001f, with a starting at 1.0f.

uniform float alpha; // This needs to come from the main application
in vec2 textuv;
uniform sampler2D textsampler;
out vec4 color;

void main()
{
    color = vec4(texture(textsampler, textuv).rgb, alpha);
}


As mentioned earlier, you shouldn't do this calculation in the shader at all; I just fixed up your shader so you can see what it would look like.
If you do want to compute the fade in the shader, don't send the per-frame dt. Send the total elapsed time since the effect started, accumulated on the CPU rather than the GPU, and do the subtraction from that elapsed time in the shader.

To change the speed of the fade, you need to take into account how fast your application updates, and perhaps multiply by a speed factor to get the effect you want.
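That elapsed-time variant could be sketched like this (the uniform name elapsedTime and the fadeSpeed factor are my own illustrative choices): the CPU accumulates total time and the shader derives alpha from it.

```cpp
#include <algorithm> // std::clamp

// CPU side: accumulate total time since the effect started.
// This value would be uploaded each frame, e.g. glUniform1f(elapsedLoc, elapsed).
float accumulateElapsed(float elapsed, float deltaTime) {
    return elapsed + deltaTime;
}

// CPU mirror of what the shader would compute from the uniform:
//   uniform float elapsedTime;
//   float a = clamp(1.0 - elapsedTime * fadeSpeed, 0.0, 1.0);
float fadeAlphaFromElapsed(float elapsedTime, float fadeSpeed) {
    return std::clamp(1.0f - elapsedTime * fadeSpeed, 0.0f, 1.0f);
}
```

The key difference from the original code: elapsedTime keeps growing frame to frame, so subtracting from 1.0 in the shader is valid, because the shader recomputes the whole expression from scratch each invocation instead of trying to accumulate state.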
Edited by NightCreature83


I have tried doing a -= delta_time in the main application instead, but no matter what factor I multiply by, it always decreases too quickly, e.g. a -= delta_time * 0.00001f, with a starting at 1.0f.

When you want the effect to start, set 'a' to 1.0f.

a = 1.0f;

Every tick, you do:

a -= (deltaTime / DurationOfEffectInSeconds); //'DurationOfEffectInSeconds' is a constant-like variable.
if(a < 0.0f) a = 0.0f;

This uses math to get the exact result you want.
If you want it to last five seconds, set 'DurationOfEffectInSeconds' to 5.0f. If you want it to last half a second, set it to '0.5f'.
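The per-tick update above can be sketched as a small helper (the function name and loop are illustrative, not from the thread):

```cpp
#include <algorithm> // std::max

// One fade tick: subtract the fraction of the total effect duration
// that this frame represents, clamping at fully transparent.
float fadeTick(float a, float deltaTime, float durationInSeconds) {
    a -= deltaTime / durationInSeconds;
    return std::max(a, 0.0f);
}
```

Because each tick removes deltaTime / duration, the subtractions sum to 1.0 after exactly durationInSeconds of wall-clock time, so the fade takes the requested duration regardless of frame rate.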

Another way to do it is by having a timer and subtracting time from it. You then calculate the alpha by how much time is remaining.

When the effect begins, you do:

fadeTimeRemaining = DurationOfEffectInSeconds;

Every tick, you do:

fadeTimeRemaining -= deltaTime;
if(fadeTimeRemaining < 0.0f) fadeTimeRemaining = 0.0f;

To calculate the alpha you do:

float alpha = (fadeTimeRemaining / DurationOfEffectInSeconds);
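Putting the timer variant together as a pair of helpers (the names are mine, the logic is exactly the steps above):

```cpp
#include <algorithm> // std::max

// Count down the time remaining on the fade, clamped at zero.
float tickFadeTimer(float fadeTimeRemaining, float deltaTime) {
    return std::max(fadeTimeRemaining - deltaTime, 0.0f);
}

// Alpha is simply the fraction of the effect still remaining.
float fadeAlpha(float fadeTimeRemaining, float durationInSeconds) {
    return fadeTimeRemaining / durationInSeconds;
}
```

One advantage of this formulation is that the remaining time is useful on its own (e.g. for skipping the fade or syncing other events), and the alpha is always derived consistently from it rather than drifting in a separately-updated variable.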

Edited by Servant of the Lord