Fade the Display

I would like to fade my display to black, and I really don't want to use a gamma fade. Is there an easy way to dump the back buffer to a vertex buffer and then manipulate the alpha values? I could use additive blending, but I was thinking that it would be better to capture the current back buffer? Thanks.

PS: Using DirectX 9.

_______________________________________
Understanding is a three edged sword...

Independent Games compete Head to Head ...
www.FreelanceGames.com

Guest Anonymous Poster
What about just drawing a quad right in front of the near clip plane and fading it to black?
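(A minimal sketch of that suggestion, assuming a D3D9 device in pDevice and a back buffer of width x height pixels; all names here are hypothetical. With pre-transformed XYZRHW vertices the quad is specified directly in screen space, which amounts to the same thing as a quad just in front of the near plane.)

struct FADEVERTEX
{
    float x, y, z, rhw;   // pre-transformed (screen-space) position
    DWORD color;          // diffuse color carrying the fade alpha
};
#define D3DFVF_FADEVERTEX (D3DFVF_XYZRHW | D3DFVF_DIFFUSE)

void DrawFadeQuad(IDirect3DDevice9* pDevice, float width, float height, BYTE alpha)
{
    DWORD color = D3DCOLOR_RGBA(0, 0, 0, alpha);
    FADEVERTEX quad[4] =
    {
        { 0.0f,  0.0f,   0.0f, 1.0f, color },
        { width, 0.0f,   0.0f, 1.0f, color },
        { 0.0f,  height, 0.0f, 1.0f, color },
        { width, height, 0.0f, 1.0f, color },
    };

    pDevice->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
    pDevice->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_SRCALPHA);
    pDevice->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);
    pDevice->SetRenderState(D3DRS_ZENABLE, FALSE);       // draw over everything
    pDevice->SetRenderState(D3DRS_LIGHTING, FALSE);
    pDevice->SetTexture(0, NULL);                        // plain diffuse color
    pDevice->SetFVF(D3DFVF_FADEVERTEX);
    pDevice->DrawPrimitiveUP(D3DPT_TRIANGLESTRIP, 2, quad, sizeof(FADEVERTEX));
}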

You've already been given the best (I think) general answer. Another easy solution is to fade your lights away, assuming you're rendering normal, lit geometry.

I am alpha blending a quad at the front of the scene. However, since the effect of the alpha blend is cumulative and I haven't captured the back buffer, the speed of the fade is processor-specific. My quad is not using a texture, just a simple diffuse color. How can I get more control over the blending rate?


dwColor = D3DCOLOR_RGBA(0, 0, 0, 9);   // alpha of only 9/255 per pass
pVertices[0].dwColor = pVertices[1].dwColor = pVertices[2].dwColor = pVertices[3].dwColor = dwColor;

// Alpha blending must also be enabled for the blend states to take effect
m_pD3DDevice->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
m_pD3DDevice->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_SRCALPHA);
m_pD3DDevice->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);



_______________________________________
Understanding is a three edged sword...

Independent Games compete Head to Head ...
www.FreelanceGames.com

Like you should probably be doing with all your game objects, update the alpha of the diffuse black fade polygons on a fixed time scale so that it is as processor-independent as possible.

Yes, but since the effect is cumulative, I can only call render 256 times before the screen has completely faded to black. I could write code to adjust when I call render, but that seems like a hack. Isn't there a way I can apply the fade effect with more precision? Ideally, I would like to pass the elapsed time and use it as a multiplier for the alpha channel; however, since the effect is cumulative, I cannot.

Your effect should NOT be cumulative. Each frame, render a black quad in front of the WHOLE SCENE with the alpha of the vertices increasing at some non-CPU-dependent rate. That is, always render the scene at full brightness, and then each frame your black quad becomes stronger and stronger based on TIME, not FRAME.
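(A minimal sketch of that time-based approach, assuming a hypothetical timer value elapsedSeconds measured from the start of the fade, and reusing the DrawFadeQuad helper sketched earlier in the thread.)

const float FADE_SECONDS = 2.0f;                 // chosen fade duration

float t = elapsedSeconds / FADE_SECONDS;         // 0.0 at start, 1.0 when done
if (t > 1.0f) t = 1.0f;
BYTE alpha = (BYTE)(t * 255.0f);                 // alpha grows with TIME, not FRAME

// ...render the whole scene at full brightness, then:
DrawFadeQuad(pDevice, width, height, alpha);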

Keep in mind that I am not clearing the screen, rendering the scene, and then rendering the quad. I am only rendering the quad over the existing scene. Maybe I have my alpha blending settings wrong, but I am seeing a cumulative effect.

Should this be the case?

I keep rendering the scene while I fade; I see what you mean now if you don't. Sounds like you are back to locking the back buffer and changing the pixel values if you can't continue to render the scene.

I have not heard of this programming model where you do not have a game loop that draws everything every frame between d3dDevice.Present calls. I'm not saying it's totally invalid, but it's certainly not very popular...

I guess if you're not re-rendering the scene each frame, you're going to have accuracy problems. It boils down to the fact that the pixels we see each frame represent an instantaneous sampling of the game world. To further sample those samples (and then those samples...) is going to be difficult to implement with any cross-CPU consistency for ANY type of effect. Pixels are just our once-per-frame window into the world!

Why can't you wait until the fade is finished to stop rendering?

quote:
I have not heard of this programming model where you do not have a game loop that draws everything every frame between d3dDevice.Present calls. I'm not saying it's totally invalid, but certainly not very popular...


It is called the State pattern:

Design Patterns: Elements of Reusable Object-Oriented Software, page 305 (GoF)

And you're probably correct; it is likely not very popular among game developers. That said, it simplifies transitions by allowing your code to be as cohesive as possible. However, the downside of the additional cohesiveness is the loss of the data (except for the back buffer) from the previous state.
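(For readers unfamiliar with the pattern, a rough sketch of how a fade might live as its own state in a GoF-style state machine; every class and member name here is hypothetical, not code from this thread, and the fade quad reuses the DrawFadeQuad helper sketched earlier.)

class GameState
{
public:
    virtual ~GameState() {}
    virtual void Update(float dt) = 0;
    virtual void Render(IDirect3DDevice9* pDevice) = 0;
};

// A transition state that keeps rendering the previous state underneath
// an increasingly opaque black quad
class FadeOutState : public GameState
{
    GameState* m_pPrevious;   // the state being faded out; still updated and drawn
    float      m_elapsed;
public:
    FadeOutState(GameState* pPrevious) : m_pPrevious(pPrevious), m_elapsed(0.0f) {}

    void Update(float dt)
    {
        m_elapsed += dt;
        m_pPrevious->Update(dt);          // the scene keeps animating
    }

    void Render(IDirect3DDevice9* pDevice)
    {
        m_pPrevious->Render(pDevice);     // full-brightness scene
        float t = m_elapsed / 2.0f;       // 2-second fade
        if (t > 1.0f) t = 1.0f;
        DrawFadeQuad(pDevice, 800.0f, 600.0f, (BYTE)(t * 255.0f));  // assumed 800x600
    }
};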

Why can't I just lock the back buffer, copy the data to a vertex buffer, and manipulate the vertices? Does anyone have sample code for this?


[edited by - Sean Doherty on March 5, 2004 6:47:29 PM]

> Why can't I just lock the back buffer, copy the data to a vertex buffer, and manipulate the vertices? Does anyone have sample code for this?

The back buffer should just contain pixel data, not vertices. However, you could lock the back buffer and push the data into a texture (or several textures), essentially taking a snapshot of the scene. In my own experience, locking the back buffer on a hardware device and reading the pixel data back can be slow, so it depends on whether you mind a slight delay before the fade begins (certainly no more than a second, I'd estimate, and you only need to do it once).

Or, if your device supports it, you can render straight to texture(s).

In either case, once you have some textures that represent the current scene that you want to fade over, you could:

a. draw the "screenshot" in ortho mode
b. draw the alpha-blended quad on top of it
c. render

and repeat these steps, darkening the alpha blend in step (b) each time.

Just one approach, and not at all elegant, but it allows you to use the rendered back-buffer data while maintaining some level of hardware acceleration.
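(A minimal sketch of taking that snapshot; instead of locking the back buffer, this uses StretchRect to copy it into a render-target texture, which keeps the data on the GPU. Names are hypothetical and error handling is omitted.)

IDirect3DTexture9* pSnapshot = NULL;
IDirect3DSurface9* pBackBuffer = NULL;
IDirect3DSurface9* pSnapshotSurface = NULL;

// Render-target texture matching the back-buffer size
pDevice->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET,
                       D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, &pSnapshot, NULL);
pDevice->GetBackBuffer(0, 0, D3DBACKBUFFER_TYPE_MONO, &pBackBuffer);
pSnapshot->GetSurfaceLevel(0, &pSnapshotSurface);

// Copy the rendered scene into the texture (no CPU read-back)
pDevice->StretchRect(pBackBuffer, NULL, pSnapshotSurface, NULL, D3DTEXF_NONE);

pBackBuffer->Release();
pSnapshotSurface->Release();

// ...each frame: draw pSnapshot on a full-screen quad (step a), then the
// black alpha-blended quad on top (step b), darkening it over time.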

Would it be possible to keep rendering your game scene every frame and then pop an alpha-blended quad on top of it? It seems like you could avoid some headaches this way, unless you're rendering something on top of the alpha fade that's going to eat up more cycles, like a complex menu or something. In that case, this approach (or another) *may* be worthwhile, but I'd check it on different GPUs/hardware to avoid the types of problems mentioned in MasterWorks_Software's last post.

quote:
The back buffer should just contain pixel data, not vertices


Well, of course it contains pixels and not vertices; why would anyone even think it contained vertices? On an unrelated note, please ignore that part of my post.

I actually decided to take the path of least resistance and move the fading quad to the previous state. This allows me to keep rendering the scene while I fade the quad, and it is processor-independent.

quote:
Why not use the gammacontrol to do a fade?


I didn't want to use gamma because there are a number of conflicting posts on whether it is stable.

Lastly, the way I am doing it now allows me to continue to animate the scene while the fade is being executed.



[edited by - Sean Doherty on March 6, 2004 1:07:24 PM]

As far as I know, DX gamma (as well as OpenGL gamma) is perfectly stable.
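(For completeness, a minimal sketch of a gamma-ramp fade using IDirect3DDevice9::SetGammaRamp; the fade factor level is a hypothetical parameter, with 1.0 meaning normal brightness and 0.0 meaning black. Gamma ramps generally require a full-screen device, and behavior can vary by driver.)

void SetGammaFade(IDirect3DDevice9* pDevice, float level)
{
    D3DGAMMARAMP ramp;
    for (int i = 0; i < 256; ++i)
    {
        // The identity ramp maps i to i * 257 (0..65535); scale it down to fade
        WORD value = (WORD)(i * 257 * level);
        ramp.red[i] = ramp.green[i] = ramp.blue[i] = value;
    }
    pDevice->SetGammaRamp(0, D3DSGR_NO_CALIBRATION, &ramp);
}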

