
Fading A Texture To Black Over Time


Recommended Posts

Ok, I've been working on a visualization for Windows Media Player that uses Direct3D, point sprite particles, and HLSL for some really cool effects. Here's my problem: because my particles use additive blending (DESTBLEND = SRCBLEND = ONE) and I never clear the render texture (which gives a smear effect), the texture eventually washes out to white and nothing can be seen.

Right now, the particles get rendered to a texture and then again directly to the back buffer. After the particles are rendered to the back buffer, I also draw a full-screen quad with the rendered texture applied. This lets me see the particles smeared on the background while still seeing them "above" it. But because the background is all white after a while, I can't even see the back-buffer particles that well.

My question: is there a way to slowly fade all the color in the texture to black, so that the wash-out effect lasts only a few seconds at most? I have been working on this for a few days and can't seem to get a good method to work.

I'm more of an OGL person, but here's my 2 cents/idea:

Create an algorithm to calculate how much color you want to subtract from each pixel every frame.

You would accomplish this by:

Retrieve the pixel's RGB values.

Say you have a pixel with the values 128, 255, 128.

You want it to dissolve to black over, let's say, 5 seconds, and you're running at 60 FPS.

5 x 60 = 300

So, over 300 frames, you want to take 128,255,128 and have 0,0,0.

So looking at this logically, you would have 3 "deceleration" variables.

128 / 300, 255 / 300, 128 / 300

Every frame, you subtract those 3 values from the current color, and after 300 frames you should be at black. I don't know if DX allows fractional color values, so it might take a little adjusting (e.g. accumulating in floating point).

Well, the problem is, I'm running a pixel shader that takes the current pixel from the texture and multiplies it by 0.33 (an arbitrary value for now) so that the pixel keeps getting closer to black, but the pixel always comes out at 33% of the original value. I.e., if my particles are white, they come out the same dull gray (0.33, or (85, 85, 85)) every time. For whatever reason unknown to me, the effect is not cumulative...

The reason the effect is not cumulative is that pixel shaders don't save their results unless you manually render to a texture. So if you have some source texture and multiply all its pixels by 0.33, the result gets written to the screen, but the source texture is unchanged.

That's what I was figuring, but for the life of me I can't figure out how to get the texture to store the results. I've been trying to render the particles to the first texture, then copy that to another texture with my shader multiplying by 0.33, and then render to the screen, but that's really no different.

Any ideas how to use the current texture you're rendering to as the texture you're reading from?

Basically you have to do this: change the render target to a texture, render your scene with your pixel shader, then change the render target back to the back buffer and render your new texture as a quad with no pixel shader. I'm not sure if you can render to the texture you are sampling from, so you may have to alternate between two textures, using the result of the last render as the input texture of the next.

OK, so I've got my particles and texture B rendering to texture A. Texture A is then rendered to texture B. Then texture B is rendered to the back buffer and presented.

So overall my render loop looks like this:


Clear(); //Clear back buffer

/////////////////////////
// Render to Texture A
/////////////////////////
SetRenderTarget(0, pRenderSurface);

Clear(); //Clear render surface 1 (texture A)

BeginScene();
RenderParticles();
RenderFromTexture2(); //Draws a quad with texture B applied
EndScene();

/////////////////////////
// Render to Texture B
/////////////////////////
SetRenderTarget(0, pRenderSurface2);

Clear(); //Clear render surface 2 (texture B)

BeginScene();
RenderFromTexture1(); //Draws a quad with texture A applied
EndScene();

/////////////////////////
// Render to Back Buffer
/////////////////////////
SetRenderTarget(0, pBackBuffer);

BeginScene();
RenderFromTexture2();
EndScene();

Present();



Now, when I run it, I have it draw a single static particle in the middle (for test purposes), and it seems that between texture renders something is getting stretched, because the particle image gets "expanded" or pulled along the bottom-right quadrant. I can use pixel shaders to center the texture coords so that it doesn't expand only to the right and down, but then it just blooms from the center out. Any ideas?
