OpenGL Simulating CRT persistence?

magicstix
Hi all,
I want to simulate CRT persistence in a render-to-texture effect. Essentially I'm looking to simulate an old CRT screen, like an analog oscilloscope or a radar screen. If I were using OpenGL, I figure the best way to do this would be to use an accumulation buffer, but DirectX lacks such a capability.

So then, what would be the best way to achieve this effect with hardware acceleration in D3D11?

Hodgman
You can make an "accumulation buffer" just by creating a new render target (texture) and accumulating values into it.

e.g. to keep 10% of the previous frame around ([i]and 1% of the frame before that, and 0.1% of the frame before that...[/i])[code]Render scene to target #1.
Blend target #1 into target #2 with 90% alpha.
Display target #2 to screen.[/code]

magicstix
Is there a way to blend the two targets in a blit-style approach? The only way I know to do it would require me to render two quads, one into the other, and I assume that's not best practice.

Hodgman
You only need one quad -- bind the "bottom" layer as the current render-target, then draw a quad textured with the "top" layer.
Rendering quads is indeed the standard way to do it -- it's what GPUs are designed to be good at. Most specialized 2D blit operations have been dropped from the hardware these days.

Actually, it's often done with a single triangle that's large enough to just cover the screen, e.g. if the screen is the box:
[code]|\
| \
|__\
| |\
|__|_\[/code]but drawing quads is easier to think about ;)

magicstix
I can't quite get my blending to work right on this. The image gives a nice trail, but never quite fades out completely:
[img]http://s13.postimage.org/3ydewknsn/badblend.png[/img]

I have my blending set up as follows:
[CODE]
rtbd.BlendEnable = true;
rtbd.SrcBlend = D3D11_BLEND_SRC_ALPHA;
rtbd.DestBlend = D3D11_BLEND_SRC_ALPHA;
rtbd.BlendOp = D3D11_BLEND_OP_ADD;
rtbd.SrcBlendAlpha = D3D11_BLEND_ONE;
rtbd.DestBlendAlpha = D3D11_BLEND_ONE;
rtbd.BlendOpAlpha = D3D11_BLEND_OP_ADD;
rtbd.RenderTargetWriteMask = D3D11_COLOR_WRITE_ENABLE_ALL;
[/CODE]

I've tried other blend settings but this is the only one that gives a trail. Others will remove the trail completely and leave me with just the dot. I'm not clearing the 2nd render target between frames (which in this case happens to be the back buffer) but I am clearing the first RTV between frames (the texture for the screen-sized quad). The dot itself is rendered as a small quad with exponential alpha fall-off from the center.

Any ideas on what I'm doing wrong?

magicstix
[quote name='Such1' timestamp='1354491221' post='5006437']
I think you are not clearing the buffers after you used them.
[/quote]

Like I said in the post, I'm not clearing the back buffer. This is intended because it gives the accumulated trail in the first place. The problem is the trail never reaches zero.

Such1
You have two buffers; you should do something like this:
clear both buffers
loop:
render buffer 1 onto buffer 2 at 90%
clear buffer 1
render what you want onto buffer 2
swap buffer 1 and buffer 2
your image is now in buffer 1

CryZe
Do what Such1 said. Also your blend state description should look like this:
[CODE]
rtbd.BlendEnable = true;
rtbd.SrcBlend = D3D11_BLEND_SRC_ALPHA;
rtbd.DestBlend = D3D11_BLEND_INV_SRC_ALPHA;
rtbd.BlendOp = D3D11_BLEND_OP_ADD;
rtbd.SrcBlendAlpha = D3D11_BLEND_ONE;
rtbd.DestBlendAlpha = D3D11_BLEND_ZERO;
rtbd.BlendOpAlpha = D3D11_BLEND_OP_ADD;
rtbd.RenderTargetWriteMask = D3D11_COLOR_WRITE_ENABLE_ALL;
[/CODE]

magicstix
[quote name='unbird' timestamp='1354568386' post='5006774']
This might actually be a precision problem. Are you using low-color-resolution rendertargets/backbuffer/textures (8 bit per channel) ?
[/quote]

I'm using 32-bit color for the backbuffer (R8G8B8A8) but 32-bit float for the texture render target. I didn't know your backbuffer could go higher than 32-bit (8 bits per channel) color... When I try R32G32B32A32_FLOAT for the back buffer, I get a failure when setting up the swap chain.

Maybe I need to accumulate in a second texture render target instead of the back buffer?


-- Edit --

I forgot to mention I've changed my blending a bit. I'm using a blend factor now instead of a straight alpha blend, but I'm still seeing the same problem: it never fades completely to zero.

Here are my current settings:
[CODE]
rtbd.BlendEnable = true;
rtbd.SrcBlend = D3D11_BLEND_SRC_COLOR;
rtbd.DestBlend = D3D11_BLEND_BLEND_FACTOR;
rtbd.BlendOp = D3D11_BLEND_OP_ADD;
rtbd.SrcBlendAlpha = D3D11_BLEND_ONE;
rtbd.DestBlendAlpha = D3D11_BLEND_ONE;
rtbd.BlendOpAlpha = D3D11_BLEND_OP_ADD;
rtbd.RenderTargetWriteMask = D3D11_COLOR_WRITE_ENABLE_ALL;

/* .... */
float blendFactors[] = {.99, .97, .9, 0};
g_pImmediateContext->OMSetBlendState(g_pTexBlendState, blendFactors, 0xFFFFFFFF);
[/CODE]

If I understand this correctly, it should eventually fade to completely black, since the blend factor makes the destination slightly darker every frame, yet I'm still left with the not-quite-black trail.

Such1
Why do you have this: float blendFactors[] = {.99, .97, .9, 0};
Shouldn't it be something like:
float blendFactors[] = {.9, .9, .9, .9};
And no, it will never fade completely (theoretically), but it should get really close.

Hodgman
Did you try CryZe's blend mode, AKA "alpha blending"?
[quote name='Such1' timestamp='1354585556' post='5006898']it will never fade completely(theoretically), but it should get really close.[/quote]You've got to keep the 8-bit quantization in mind with regards to this.
If the background is 1/255, then when you multiply by 0.99, you still end up with 1/255 -- e.g. [font=courier new,courier,monospace]intOutput = round( 255 * ((intInput/255)*0.99) )[/font]

Instead of directly blending the previous contents and the current image, there's other approaches you could try.
e.g. you could render the previous contents into a new buffer using a shader that [i]subtracts[/i] a value from it, and then add the current image into that buffer. This way you'll definitely reach zero, even in theory [img]http://public.gamedev.net//public/style_emoticons/default/wink.png[/img]

magicstix
[quote name='Hodgman' timestamp='1354590401' post='5006939']
Did you try CryZe's blend mode, AKA "alpha blending"?
[quote name='Such1' timestamp='1354585556' post='5006898']it will never fade completely(theoretically), but it should get really close.[/quote]You've got to keep the 8-bit quantization in mind with regards to this.
If the background is 1/255, then when you multiply by 0.99, you still end up with 1/255 -- e.g. [font=courier new,courier,monospace]intOutput = round( 255 * ((intInput/255)*0.99) )[/font]

Instead of directly blending the previous contents and the current image, there's other approaches you could try.
e.g. you could render the previous contents into a new buffer using a shader that [i]subtracts[/i] a value from it, and then add the current image into that buffer. This way you'll definitely reach zero, even in theory [img]http://public.gamedev.net//public/style_emoticons/default/wink.png[/img]
[/quote]

Yes, I tried CryZe's recommendation, but it didn't look right either. I like how color blending looks over pure alpha blending anyway, since I can fade the individual channels separately and get a "warmer" fade that looks even more like a CRT. I see your point about the dynamic range, and I agree that subtracting would be best, except that when you subtract 1 from 0 you still clamp at zero, so the accumulation buffer's dark bits would block out where the "new" accumulated yellow bits should go.

I think I'll try to get around the dynamic range issue by rendering into a second texture, one that's 32-bit float, instead of using the backbuffer. This is how it'd be used in practice anyway, so using the backbuffer for this test probably isn't representative of the technique. Hopefully the greater dynamic range will let the accumulation eventually settle at zero.

Here's what I mean by the "warmer" look of using color blending instead of alpha, it looks a lot more phosphor-like:
[img]http://s12.postimage.org/cklbi8jlp/warmblend.png[/img]
