Tone mapping + blur
Hi!
I'm trying to write a demo that implements different tone mapping operators in real time. I'm just at the very beginning and have only implemented one so far (the one from jollyjeffers' demo).
I've got a small problem: when exactly is the blurred texture added - before or after tone mapping? In this article there's a picture of the HDR pipeline which suggests it's added before tone mapping, yet both jollyjeffers' demo and the HDRFormats sample from the DX SDK apply the blurred texture after tone mapping.
I tried both methods and they both produce some results, but I'm not sure which one is correct (if there even is a "correct" way to do it).
PS. Expect many more HDR and tone mapping related questions soon. :)
Thanks in advance
This is up for debate. Personally, I put my tone mapping pass first (then bright pass, h+v blooms, and then combine) so that the actual bloom effect isn't included in the tone mapping operation. I've had some success with this - here are some quick screenies (don't mind the black spots, they are just from crappy lightmaps):
Here is a short video I recorded:
I also do my luminance calculation a bit differently. Instead of downsampling the scene every frame, I only do it 10 times per second, then lock the final 1x1 RT and cache the results. This lets me do two things:
(1) Save a lot of fillrate by not doing all of the downsampling every frame
(2) Get a nice timed exposure effect, since I can keep the luminance results from the previous samples.
It does require you to lock a render-target, but since the slowest part of that is the data transfer, it works out OK (it's just a 1x1 64-bit texture). I also use the Reinhard auto-exposure equation, presented in one of his later papers:
//
// Calculate exposure, of the form:
//
//                 (2 * log(avgLum) - log(minLum) - log(maxLum))
//                 ---------------------------------------------
//                          log(maxLum) - log(minLum)
// exp = 0.18 * 4^
//
// Note that all logs are base 2.
//
This seems to work well, and can approximate a pretty good exposure level for an arbitrary scene. It certainly beats having to estimate one yourself for every different scene you work with.
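The comment above drops straight into code; here's a minimal Python sketch of the same equation (the function name and the 0.18 "key" default just mirror the comment):

```python
import math

def auto_exposure(avg_lum, min_lum, max_lum, key=0.18):
    """Reinhard-style auto-exposure; all logs are base 2, as noted above."""
    log_avg = math.log2(avg_lum)
    log_min = math.log2(min_lum)
    log_max = math.log2(max_lum)
    # Where the average sits between the extremes, in log space (-1..1).
    t = (2.0 * log_avg - log_min - log_max) / (log_max - log_min)
    return key * 4.0 ** t

# When the average is exactly halfway between min and max in log space,
# the exponent is 0 and the exposure collapses to the key value:
print(auto_exposure(avg_lum=1.0, min_lum=0.25, max_lum=4.0))  # 0.18
```

Brighter averages push the exponent toward +1 (exposure up to 4x the key); darker averages push it toward -1 (down to a quarter of the key).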
Thanks!
Unfortunately I'm a newbie (started playing with DirectX just two weeks ago), so I've got a few more questions, as the results from my (in fact jollyjeffers' :) ) demo are quite strange. Even though I'll probably just use 2D .hdr images rendered on a full-screen quad to show the results of different tone mapping operators (I think that will show the differences better than a dynamic 3D scene), I'd like to solve all the problems before moving on.
Here are 2 pictures:
bloom added before tone mapping:
bloom added after tone mapping:
You can notice two things: white regions appear along the borders of the image (in both cases), and in the second image the box and the skybox blend together.
Jollyjeffers uses a quadratic curve distribution in his demo instead of a Gaussian distribution. Maybe this is why it looks so strange?
Also, I'm not sure about the bloom texture format. Right now, since I was adding it before tone mapping, it was A16B16G16R16F. This way the bloom result influences tone mapping, because more pixels are bright and the average luminance increases a bit (and I'm not sure if that's a good thing). If I added bloom after tone mapping as an 8-bit-per-channel texture, maybe it would look better (there would be no bloom influence on luminance). However, I'd then be performing the bright pass and v+h blur on a normal, non-HDR, 8-bit texture. Which values should I reject in the bright pass in that case, since they only have a range of 0-255?
And the last thing regarding bloom: why is the bright pass done on a downscaled texture (the original texture is 640x480, the one used for the bright pass is 320x240, and the ones used for filtering are 80x60)? Is it because it's faster this way? To downscale the texture to 320x240 you need to take 4 samples per pixel anyway.
Is there any reason to store luminance values in 32 bits (precision?)? Right now I'm storing them in G32R32F, but if I added auto-exposure calculations I'd also need to store the minimum luminance somewhere, and I don't know whether I should use 32 or 16 bits per channel. It would probably also be better to calculate the exposure outside the pixel shader, since it's the same for all pixels (at least for this tone mapping algorithm). However, if I decided to calculate it in the pixel shader (which would be much easier - just a few extra lines), is there any way to get the calculated exposure value out of the pixel shader so I could display it on screen?
Generally the results suck at the moment (especially compared to yours :) ). It seems I can't set a proper exposure - e.g. when looking at the sky it's very overexposed, while the box and the building are underexposed, even without any bloom.
[Edited by - g0nzo on July 24, 2006 4:11:45 PM]
Quote:Original post by g0nzo
Jollyjeffers uses a quadratic curve distribution in his demo instead of a Gaussian distribution. Maybe this is why it looks so strange?
I noticed that my distribution really changed the output of the bloom blurs, so yea, that could be it.
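For comparison, normalized Gaussian taps for a separable (h+v) blur can be generated like this - a Python sketch, with the radius and sigma picked purely for illustration:

```python
import math

def gaussian_weights(radius, sigma):
    """1D Gaussian kernel weights for one pass of a separable blur."""
    raw = [math.exp(-(i * i) / (2.0 * sigma * sigma))
           for i in range(-radius, radius + 1)]
    total = sum(raw)
    # Normalize so the blur neither brightens nor darkens the image.
    return [w / total for w in raw]

weights = gaussian_weights(radius=7, sigma=3.0)
print(len(weights))  # 15 taps
```

A quadratic falloff distributes weight differently across the outer taps than a Gaussian of comparable radius, which changes the shape of the bloom halo.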
Quote:Also, I'm not sure about the bloom texture format. Right now, since I was adding it before tone mapping, it was A16B16G16R16F. This way the bloom result influences tone mapping, because more pixels are bright and the average luminance increases a bit (and I'm not sure if that's a good thing).
Yea, this is the reason that I do tone mapping first. That way, the bloom itself doesn't screw it up.
Quote:If I added bloom after tone mapping as an 8-bit-per-channel texture, maybe it would look better (there would be no bloom influence on luminance). However, I'd then be performing the bright pass and v+h blur on a normal, non-HDR, 8-bit texture.
You want to be using HDR floating-point textures the whole way through. Start by rendering the geometry into an FP texture, then use FP textures for the bright pass, blooms, etc. This way, you use a true 64- or 128-bit pipeline, and you don't get that washed-out look.
Quote:And the last thing regarding bloom: why is the bright pass done on a downscaled texture (the original texture is 640x480, the one used for the bright pass is 320x240, and the ones used for filtering are 80x60)? Is it because it's faster this way? To downscale the texture to 320x240 you need to take 4 samples per pixel anyway.
Yea, it is faster to do the downsample first, then use that smaller texture for the bright pass and blurs. It may be 4 or more samples per pixel, but you save so many more iterations through the PS. Fillrate can creep up on you pretty fast.
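A quick back-of-the-envelope count (Python, using the resolutions from the question and a hypothetical 15-tap kernel) shows why the downsample pays for itself:

```python
taps = 15                            # hypothetical taps per blur pass
full_res  = 640 * 480 * taps * 2     # h+v blur at full resolution
small_res = 80 * 60 * taps * 2       # h+v blur on the 80x60 chain
downsample = 640 * 480               # ~one read per source pixel to get there

# Even counting the downsample itself, the small chain needs roughly
# 20x less work than blurring at full resolution.
print(full_res // (small_res + downsample))  # 20
```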
Quote:Is there any reason to store luminance values in 32 bits (precision?)? Right now I'm storing them in G32R32F, but if I added auto-exposure calculations I'd also need to store the minimum luminance somewhere, and I don't know whether I should use 32 or 16 bits per channel.
I just always have a minimum luminance of 0, to avoid having an additional full-blown FP render-target. In most scenes, there is usually at least one dark spot, so I found that this works ok.
Quote:It would probably also be better to calculate the exposure outside the pixel shader, since it's the same for all pixels (at least for this tone mapping algorithm).
Yea, I calculate it outside of the pixel shader, along with many of the other luminance calculations. I can do this since I sample & cache the luminance texture at 10fps or so.
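One common way to get that timed exposure feel is to blend the cached luminance toward each new 1x1 sample with a frame-rate-independent exponential; a sketch (the adaptation rate is a made-up tuning constant):

```python
import math

def adapt_luminance(current, target, dt, rate=1.0):
    """Move the cached luminance toward the newly sampled average.
    The exp() form makes the blend frame-rate independent: many small
    steps and one large step over the same span land in the same place."""
    blend = 1.0 - math.exp(-dt * rate)
    return current + (target - current) * blend

# After a very long time we converge on the target; after zero time we don't move.
print(round(adapt_luminance(0.0, 1.0, dt=100.0), 6))  # 1.0
print(adapt_luminance(0.5, 1.0, dt=0.0))              # 0.5
```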
Thank you very much!
One last question (for now [smile]): how does it work that the bloom texture is A16B16G16R16F, the scene texture after tone mapping is A8B8G8R8, and I can add the bloom to the scene without tone mapping the bloom texture?
Quote:Original post by g0nzo
One last question (for now [smile]): how does it work that the bloom texture is A16B16G16R16F, the scene texture after tone mapping is A8B8G8R8, and I can add the bloom to the scene without tone mapping the bloom texture?
Good question... at that point you do lose some data, but it seems to be OK, since you're going for that 'overbright' look anyway. I guess the best way to determine the effects of things like this would be to directly compare tone mapping before bloom and tone mapping after bloom.
Thanks again! Hope you're not getting bored [smile]
How do I read the average and maximum values from a 1x1 G32R32F luminance texture (I use MDX, but in "normal" DX it should be similar)? I've found out that I'm supposed to lock the texture/surface (with Texture.LockRectangle?). It returns a GraphicsStream object, which has a Read method, but I'm not sure how to read the two 32-bit FP values from it.
I'll try to implement a Gaussian distribution tomorrow and check whether it helps with my blooming problems.
[EDIT]
I've noticed one more thing regarding bloom. Since I'm creating the bloom texture from the HDR scene texture, which isn't influenced by exposure (not before tone mapping), changes to exposure don't influence the bloom texture in any way. I'm not sure if I'm doing something wrong or if that's just how it works when you add bloom after tone mapping (though I'm not sure adding bloom before tone mapping would change anything), but it looks strange, e.g.:
low exposure:
high exposure:
Notice that there's absolutely no change in the bright pass texture (upper-left). I've compared it with the rthdribl demo, and there exposure influences bloom like you'd expect. Am I doing something wrong, or is this a totally different way of doing bloom?
BTW. Thanks for the RHCP concert [smile]
[Edited by - g0nzo on July 25, 2006 1:46:36 PM]
What I do is apply bloom after tone mapping (so that it doesn't influence the tone mapping), but I use post-tonemapped luminance for the bloom. So basically, in your initial bright pass (before blurring), calculate what the final color after tone mapping would be before applying the bright-pass calculation. The idea is that areas that are bright after tone mapping will bloom, and the bloom changes with the tone mapping.
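That idea can be sketched like this (Python, per channel, with a simple x/(1+x) stand-in for the tone mapping operator - your actual operator goes in its place, and the threshold is illustrative):

```python
def bright_pass(hdr_pixel, exposure, threshold=0.8):
    """Bright pass driven by the *tonemapped* color: tonemap first,
    then keep only what exceeds the threshold, so bloom follows exposure."""
    out = []
    for c in hdr_pixel:
        scaled = c * exposure
        ldr = scaled / (1.0 + scaled)   # stand-in tone mapping operator
        out.append(max(ldr - threshold, 0.0))
    return out

# A very bright HDR value survives the pass, dim ones are rejected,
# and raising the exposure lets more of the scene through:
print(bright_pass([10.0, 0.2, 0.05], exposure=1.0))
```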
Quote:Original post by g0nzo
How do I read the average and maximum values from a 1x1 G32R32F luminance texture (I use MDX, but in "normal" DX it should be similar)? I've found out that I'm supposed to lock the texture/surface (with Texture.LockRectangle?). It returns a GraphicsStream object, which has a Read method, but I'm not sure how to read the two 32-bit FP values from it.
Actually, you should be using IDirect3DDevice9::GetRenderTargetData(), since this is a render target and can't be locked through the texture interface. Then cast your data stream to float* (since a G32R32F texel is just two 32-bit floats) and read them that way.
If you use G16R16F, it's a bit trickier. There are D3DX functions to convert a 16-bit float into a 32-bit one, but I couldn't get them to work.
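For what it's worth, the decode itself is trivial once the raw bytes are in hand; Python's struct module has a half-float format character that shows the idea (in C++ you'd use the D3DX half-float helpers or unpack the bits yourself):

```python
import struct

def halves_to_floats(raw):
    """Decode a run of little-endian 16-bit floats (e.g. a G16R16F texel)."""
    count = len(raw) // 2
    return list(struct.unpack('<%de' % count, raw))

# Round-trip a fake 1x1 G16R16F texel; 0.5 and 1.0 are exactly
# representable as half floats, so they come back unchanged.
raw = struct.pack('<2e', 0.5, 1.0)
print(halves_to_floats(raw))  # [0.5, 1.0]
```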
Quote:I've noticed one more thing regarding bloom. Since I'm creating the bloom texture from the HDR scene texture, which isn't influenced by exposure (not before tone mapping), changes to exposure don't influence the bloom texture in any way. I'm not sure if I'm doing something wrong or if that's just how it works when you add bloom after tone mapping (though I'm not sure adding bloom before tone mapping would change anything)
Yea, that is odd, because the bloom passes should be fed from the tone mapping pass (which uses the exposure value). I'd check to make sure everything is correct there - perhaps your bloom passes aren't using the correct input render target.
Quote:BTW. Thanks for the RHCP concert [smile]
Yea, it was a fun show [wink]
I'm not sure I understand. So you guys are saying that I should perform tone mapping on the HDR scene texture, which gives me an LDR scene texture; perform the bright pass on that LDR texture (which is influenced by exposure); blur the result; and add it back onto the LDR texture? If that's correct, then why use 16-bit FP textures for the bright pass and blurring steps if the data being processed is 8-bit (the LDR scene texture after tone mapping) and the result is also added back to that same tone-mapped texture?
Also does anyone know MDX?
I tried the following code to read luminance values from the texture:
Surface luminanceSurface = device.CreateOffscreenPlainSurface(1, 1, Format.G32R32F, Pool.SystemMemory);
SurfaceLoader.FromSurface(luminanceSurface, luminanceTextures[0].GetSurfaceLevel(0), Filter.None, 0);
GraphicsStream gs = luminanceSurface.LockRectangle(LockFlags.ReadOnly);
float[] luminanceValues = new float[2];
Marshal.Copy(gs.InternalData, luminanceValues, 0, 2);
luminanceSurface.UnlockRectangle();
It works, but I get values like -3.123 and 0.3123. Also, the length of the GraphicsStream (gs.Length) in bytes is (long)2147483647. Quite strange...
[Edited by - g0nzo on July 26, 2006 3:49:38 AM]