Theory of FBOs

Hi!

I'm trying to get a basic post-processing shader system working. When I draw directly to the screen, I get around 300 fps just drawing one quad. When I add a basic bloom filter, I run into several issues.

First, the frame rate drops to around 30-60. Basically, I draw the scene to an FBO, then draw that FBO on a quad onto yet another FBO while shading it, and I repeat this four times: the first stage collects the full-resolution scene image; this is then downscaled and blurred horizontally into another FBO; next, it's blurred vertically into another FBO; finally, the full-resolution image and the blurred one are passed to a final fragment shader which composites the two onto a quad on the viewport. I could do it in one pass, but I wanted a really thick blur without far too many samples, and I intend to extend this system to do more complex things like HDR etc. This is kind of my first shot at OpenGL - I'm using C#/OpenTK for it.
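For reference, each render target is set up roughly like this, and the passes run in the order the comments describe (a simplified sketch - the sizes, formats and helper names are placeholders rather than my exact code):

[code]
using System;
using OpenTK.Graphics.OpenGL;

static class PostFx
{
    // Creates one render target: a colour texture attached to an FBO.
    public static int CreateRenderTarget(int width, int height, out int texture)
    {
        texture = GL.GenTexture();
        GL.BindTexture(TextureTarget.Texture2D, texture);
        GL.TexImage2D(TextureTarget.Texture2D, 0, PixelInternalFormat.Rgba8,
                      width, height, 0, PixelFormat.Rgba, PixelType.UnsignedByte, IntPtr.Zero);
        GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMinFilter, (int)TextureMinFilter.Linear);
        GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMagFilter, (int)TextureMagFilter.Linear);

        int fbo = GL.GenFramebuffer();
        GL.BindFramebuffer(FramebufferTarget.Framebuffer, fbo);
        GL.FramebufferTexture2D(FramebufferTarget.Framebuffer, FramebufferAttachment.ColorAttachment0,
                                TextureTarget.Texture2D, texture, 0);

        if (GL.CheckFramebufferStatus(FramebufferTarget.Framebuffer) != FramebufferErrorCode.FramebufferComplete)
            throw new InvalidOperationException("FBO is not complete");

        GL.BindFramebuffer(FramebufferTarget.Framebuffer, 0);
        return fbo;
    }

    // Per frame, the passes run in this order:
    //   1. bind sceneFbo (full size), render the scene
    //   2. bind blurFboA (smaller), draw a quad sampling the scene texture with the horizontal blur shader
    //   3. bind blurFboB (smaller), draw a quad sampling blurFboA's texture with the vertical blur shader
    //   4. bind the default framebuffer, draw a quad compositing the scene texture and blurFboB's texture
}
[/code]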

Also, when I downsize the FBO texture it results in moiré patterns when there are a lot of lines on screen. Can you use mipmaps with FBOs? What do you normally do in this situation? Do you draw the scene twice, to two differently sized FBOs?

Basically, I'm sure my research has gone terribly wrong somewhere since the framerate drops that much. Am I doing it (basically) right?
you probably are, a cheap alternative is simply glGenerateMipmap
and the average luminance of the scene will be texture2DLod(..., ..., 1000); just put as many zeroes as you feel like :)
it will sample the highest mipmap level (1px)
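roughly like this, for example (not exact code - sceneTexture and the GLSL names are just placeholders):

[code]
using OpenTK.Graphics.OpenGL;

static class AverageLuminance
{
    // GLSL snippet (embedded as a string for the shader compiler): an absurdly high LOD
    // clamps to the top 1x1 mip, which holds the average colour of the whole frame.
    // sceneTex is the scene colour texture (a uniform sampler2D); on newer GLSL versions
    // texture2DLod is spelled textureLod.
    public const string Snippet = @"
        vec3 avg = texture2DLod(sceneTex, vec2(0.5), 1000.0).rgb;
        float averageLuminance = dot(avg, vec3(0.2126, 0.7152, 0.0722));
    ";

    // Call after the frame has been rendered into sceneTexture. The min filter must be a
    // mipmap filter, otherwise the lookup never reaches the small levels.
    public static void RebuildMips(int sceneTexture)
    {
        GL.BindTexture(TextureTarget.Texture2D, sceneTexture);
        GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMinFilter,
                        (int)TextureMinFilter.LinearMipmapLinear);
        GL.GenerateMipmap(GenerateMipmapTarget.Texture2D);
    }
}
[/code]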
works well for me.. unfortunately i don't like bloom, and feel it doesn't add anything to my game except brightness, and problems...
you can't have white textures anymore, for example

note that you can use the same scheme to get lens flare and cheap godrays :)
just a little bit more complex..

has anyone actually used bloom successfully, ever?
http://registry.gimp.org/files/sample1bl.JPG
this is just a regular photo, with bloom filter, which looks ok
but in a game... bloom ruined my clouds :P
Each time you draw to an FBO you're writing to memory. So, if you draw a full-resolution pass to an FBO and then draw a screen-space quad to get it on-screen, you've done double the number of writes compared to a normal render. Using large FBOs is one of the easiest ways your performance can become fillrate-bound.

I think you're on the right track with the idea of multiple FBOs. I'd try to draw your scene to a low-res FBO first, storing only the data you need for your luminance calculation, and blurring that. Assuming your bloom FBO is drawn with a viewport of w/2 by h/2, you're drawing (w*h)/4 pixels - a quarter of what you were drawing before per pass (at 1280x720 that's 230,400 pixels instead of 921,600). Drawing and blurring at that size, followed by drawing the scene at full resolution, can therefore be done in under double the number of writes (assuming no overdraw). For sampling, you could try doing it as you are, or just build it into the pass that draws your scene at full resolution (though if you're using different shaders for your skybox, etc., that may be messier).

Regarding the Moire patterns - Shrinking your texture should eliminate these as well. You'll probably need to try sampling the blurred texture different ways to get your bloom data to not look blocky, though.
So Kaptein, I can add mipmaps to an FBO texture, and then every time level 0 is updated the texture's highest mipmap (1px x 1px) gets updated too, so I can just sample that for the average brightness?

So since the FBO uses mipmaps, can I somehow force it to use, say, its second mipmap when redrawing to another FBO, to downscale it?

Koehler, so basically you suggest just drawing the scene twice to a smaller and then a larger FBO. I'll give it a shot, thanks.
OK, so I tried "GL.GenerateMipmap()" for the FBO texture, and although it fixes my problems with the ugly patterns, running it every time the FBO is updated turns the whole thing into a slideshow. Badly.

If I run it once, the mipmaps are generated from whatever is in level 0 at that moment. So when the camera moves, the old image is displayed until I call GL.GenerateMipmap() again. I have a feeling that GenerateMipmap is something to be done when loading textures, not every damn frame. Any ideas?
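In case it matters, the per-frame order I'm using is roughly this (a simplified sketch - the handles and the drawScene callback are placeholders, not my exact code):

[code]
using System;
using OpenTK.Graphics.OpenGL;

static class FrameLoop
{
    // sceneFbo/sceneTexture are whatever handles the FBO setup returned;
    // drawScene stands in for the normal scene rendering.
    public static void RenderFrame(int sceneFbo, int sceneTexture, int width, int height, Action drawScene)
    {
        GL.BindFramebuffer(FramebufferTarget.Framebuffer, sceneFbo);
        GL.Viewport(0, 0, width, height);
        drawScene();

        // The mip chain only reflects the new frame if it is rebuilt *after* the scene pass, every frame:
        GL.BindTexture(TextureTarget.Texture2D, sceneTexture);
        GL.GenerateMipmap(GenerateMipmapTarget.Texture2D);

        // ...then the downscale / blur / composite passes sample sceneTexture as before.
    }
}
[/code]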
really? generatemipmaps is very fast for me :P
but yes, you probably want to render to a very small FBO using blur horiz / vert
then, perhaps you can generate mipmaps for that just to get the 1px version
i can't tell you why glGenerateMipmap is so slow for you, all things considered -- openGL is very happy when it can read from mipmaps or compressed images or both
perhaps just bad luck.. personally i generate 3 mipmaps per frame (but not that many levels)

not sure i understand your earlier post, but you should sample using texture2DLod( , , non-zero number);
if you sample level 0, then openGL will be unhappy and quickly fillrate bound
if you try to compress the images, fillrate is less of an issue but you will probably get a slideshow again
there aren't that many choices here... mostly just rendering to a really small FBO
[quote name='Kaptein']
i can't tell you why glGenerateMipmap is so slow for you
[/quote]
In some OpenGL implementations, it runs on the CPU, which means it's not particularly fast - plus it incurs two PCIe transfer overheads. In other implementations, it runs on the GPU and is lightning fast. Unfortunately, you have no way of knowing which you'll get.

One strategy that works for a downscale to 1/2 in each direction is sampling between texels with linear filtering (offset everything by half a pixel). This samples and averages 4 texels at the cost of a single fetch, using the texture filtering hardware.
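A sketch of what that looks like in the downscale pass (the uniform/varying names are assumed, and the GLSL is embedded as a string the way OpenTK shaders usually are):

[code]
static class DownscalePass
{
    public const string FragmentShaderSource = @"
        uniform sampler2D source;    // full-resolution texture, with GL_LINEAR filtering
        uniform vec2 sourceSize;     // e.g. vec2(1280.0, 720.0)
        varying vec2 uv;             // texcoord of this half-resolution fragment

        void main()
        {
            // Place the sample point on the corner shared by a 2x2 block of source texels;
            // the bilinear filter then returns the average of all four in a single fetch.
            // (With an exact half-size viewport, plain 'uv' already lands on these corners -
            // the snap just makes the half-texel alignment explicit.)
            vec2 texel = 1.0 / sourceSize;
            vec2 corner = (2.0 * floor(uv * sourceSize * 0.5) + 1.0) * texel;
            gl_FragColor = texture2D(source, corner);
        }";
}
[/code]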
Another strategy, if you want to scale down much further, is to use "aniso decimation". Google that term - there's an NVIDIA whitepaper on the technique. The trick exploits the anisotropic filtering hardware for a similar effect.
i see, that's very unfortunate :/
in my case, i will do what i always do: whoever has an implementation that generates mipmaps on the cpu will simply have to get a computer that can play games...
i know it's a hard stance, and probably very unpopular, but i make games for myself =)
i have recently gone the fbo route with one of the generatemipmap instances, which was for blurring the depth end of my scene
mostly because i got to write my own blurring, which is miles better, and yes using hardware bilinear interpolation always helps =)
it's worth noting that, performance-wise, using an fbo to blur vertically and horizontally actually lowered my fps slightly compared to the glGenerateMipmap solution (even with the hardware bilinear tricks, and only 1 step in each direction!)
and with mipmaps i generated 2 levels, which i could interpolate between using texture2DLod(..)
i have to say, i like the mipmap version better, but unfortunately, it doesn't look quite as good!

[quote name='Kaptein']
unfortunately i don't like bloom, and feel it doesn't add anything to my game except brightness, and problems...
you can't have white textures anymore, for example
[/quote]

Then you definitely did something wrong. Luminance isn't about absolute values, but about relative values. There's no such thing as "white": gray and black are just as white as white is, they are simply darker, and there's no limit to how bright white can get.
What bothers me even more is that you are rendering your albedo values directly to your frame buffer. Albedo is the complement of a material's absorption factor - the fraction of incoming light that is reflected rather than absorbed. But in your case there's no light at all: just 100% albedo (0% absorption) being rendered directly to the frame buffer as some white value with a given brightness (probably 1.0f).

Also, bloom is not just a cheap, non-realistic effect. It simulates how light gets scattered while travelling through the lens, so the resulting image isn't actually any brighter overall. Bloom is really part of the lens flare simulation - just the simplest part of it.
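In practice that means rendering into a floating-point colour attachment and tone-mapping afterwards. As a rough illustration (the format and names here are just an example, not anything prescribed above):

[code]
using System;
using OpenTK.Graphics.OpenGL;

static class HdrTarget
{
    // An HDR colour attachment: values above 1.0 survive until the tone-mapping pass.
    public static int CreateHdrColorTexture(int width, int height)
    {
        int texture = GL.GenTexture();
        GL.BindTexture(TextureTarget.Texture2D, texture);
        GL.TexImage2D(TextureTarget.Texture2D, 0, PixelInternalFormat.Rgba16f,
                      width, height, 0, PixelFormat.Rgba, PixelType.HalfFloat, IntPtr.Zero);
        GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMinFilter, (int)TextureMinFilter.Linear);
        GL.TexParameter(TextureTarget.Texture2D, TextureParameterName.TextureMagFilter, (int)TextureMagFilter.Linear);
        return texture;   // attach to an FBO with GL.FramebufferTexture2D as usual
    }
}
[/code]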

[quote]
Then you definitely did something wrong. Luminance isn't about absolute values, but about relative values. [...] Bloom is really part of the lens flare simulation - just the simplest part of it.
[/quote]

yes, i have no clue on the matter
my own implementation is probably not bloom in any sense of what you'd expect
however, my complaint was about other games' overuse of it - or perhaps there's a cheaper variant that does the same thing but simply increases brightness on everything
my experience of bloom isn't really with my own usage of it, since i tried it only once for myself (and HDR rendering is very new to me)
i don't know.. i once saw a bloom example of the crysis engine which i thought looked reasonable, or good, if you will
the crysis example also had a comparison image without bloom, and it looked very stale =) so, there's at least a way to do this right

one of these days i will take a look at this again, but for now i'm hopelessly struggling against fps =)
thanks for the tips
