
HDR adaptation, avg lum isn't calculated properly


Old topic!
Guest, the last post of this topic is over 60 days old and at this point you may not reply in this topic. If you wish to continue this conversation start a new topic.

7 replies to this topic

#1 Juliean   GDNet+   -  Reputation: 2749


Posted 07 November 2013 - 05:35 AM

Hello,

 

After starting over with implementing HDR in my new engine, I'm experiencing an issue I hit last time too: the scene's adaptation reacts almost entirely to the luminance in the center of the screen, not to the averaged value. Take a look at the two screenshots. The first one shows a fully adapted scene. The second one is almost exactly the same, just moved slightly to the right. The shadowed part of that object is now in the middle, and it seems to drag my calculated "average" luminance down to an extremely low value, so the scene flashes all white. If I move the camera slightly to the right, left, up or down, it becomes "normal" again. This happens everywhere: whenever there is a very dark or very bright spot in the middle of the screen, the adaptation goes nuts. So I deduce that the average luminance calculation is broken. I've tried multiple different setups, and all of them produce the same behaviour:

 

- DirectX11, shader:

#include "../../../Base3D/Effects/Vertex.afx"

sampler InputSampler : register(s0);

Texture2D <float4> Luminance  : register(t0);

float4 mainPS(VS_OUTPUT i) : SV_TARGET0
{
	float4 inTex = Luminance.Sample(InputSampler, i.vTex0);
	
	return inTex;
}

I'm using this shader with a linear sampler to downsample the scene's luminance by a factor of 2 per pass, until only a 1x1 target is left.
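As a reference point (a numeric sketch, not the engine's code), this is what the 2x downsampling chain is meant to compute: repeated 2x2 box averages. A single bilinear tap per output texel is only equivalent to a 2x2 box average when every pass halves exactly and each UV lands precisely between a 2x2 block of texels; if the taps land elsewhere, some texels are weighted unevenly, which would bias the "average".

```python
def downsample_2x(lum):
    """Average non-overlapping 2x2 blocks; assumes even dimensions."""
    h, w = len(lum), len(lum[0])
    return [[(lum[y][x] + lum[y][x + 1] + lum[y + 1][x] + lum[y + 1][x + 1]) / 4.0
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

# For a square power-of-two grid, every texel is weighted equally, so the
# final 1x1 value equals the true average of the whole grid.
grid = [[float(x + y * 4) for x in range(4)] for y in range(4)]
result = grid
while len(result) > 1:
    result = downsample_2x(result)

true_avg = sum(sum(row) for row in grid) / 16.0  # 7.5 for this grid
```

If the render-target chain doesn't halve exactly (e.g. odd dimensions along the way), the precondition in the comment above no longer holds, and that is exactly the kind of setup where a bilinear-tap downsample stops matching the true mean.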

 

- DirectX11 auto mipmap generation:

 

Since the manual downsampling was what first produced the issue, I decided to try auto mipmap generation. It still produces exactly the same effect: the luminance in the middle of the screen almost entirely determines the average luminance.

#include "../../../Base3D/Effects/Vertex.afx"

cbuffer instance : register(b2)
{
	float2 params; // x = delta, y = miplevel
}

sampler InputSampler : register(s0);

Texture2D <float> CurrentLum  : register(t0);
Texture2D <float> PreviousLum  : register(t1);

float4 mainPS(VS_OUTPUT i) : SV_TARGET0
{
	float fAdaptedLum = PreviousLum.Sample(InputSampler, i.vTex0);
	float fCurrentLum = CurrentLum.SampleLevel(InputSampler, i.vTex0, (int)params.y);
	
	const float fTau = 0.5f;
    float fNewAdaptation = fAdaptedLum + (fCurrentLum - fAdaptedLum) * (1 - exp(-params.x * fTau));

	return float4(fNewAdaptation, 0.0f, 0.0, 1.0);
}

- DirectX9, shader (almost identical to the DX11 one): also the same result.

 

Now, is there anything I'm missing? The way I've been doing this is taken from an old NVIDIA sample, but that sample doesn't even get gamma correction right, so I doubt it's accurate...

 

 

Attached Thumbnails

  • HDRAdaptation.jpg
  • HDRAdaptation2.jpg



#2 skarab   Members   -  Reputation: 500


Posted 07 November 2013 - 06:31 AM

Some cards don't handle mipmap generation for float textures, and some do it on the CPU side (even for non-float formats); you need to generate the mipmaps with your own shaders.

 

(Oh, and auto generation doesn't work for render targets. I haven't looked through your code.)


Edited by skarab, 07 November 2013 - 06:41 AM.


#3 Juliean   GDNet+   -  Reputation: 2749


Posted 07 November 2013 - 06:57 AM


Some cards don't handle mipmap generation for float textures, and some do it on the CPU side (even for non-float formats); you need to generate the mipmaps with your own shaders.

(Oh, and auto generation doesn't work for render targets. I haven't looked through your code.)

 

DirectX11 at least should be able to handle auto generation for render targets; in fact that's its only use. http://msdn.microsoft.com/en-us/library/windows/desktop/ff476426%28v=vs.85%29.aspx See the remarks about needing both RENDER_TARGET and SHADER_RESOURCE set. I did check the result, and the mip levels are at least being generated, but it appears the sampling isn't right...



#4 skarab   Members   -  Reputation: 500


Posted 07 November 2013 - 07:05 AM

Have you checked the mipmap visually? (Maybe there is no bilinear support either; I don't know this DirectX 11 thing, sorry.)



#5 N.I.B.   Members   -  Reputation: 1213


Posted 07 November 2013 - 07:32 AM

When you sample the Luminance texture, you need to sample the lowest mip level, since this is the level which holds the average value.

You call Luminance.Sample(), so unless you create the SRV with only the last mip level, this will almost certainly not sample from there. You also don't need the texCoord from the VS; since the final mip level is 1 pixel, you can just use (0.5, 0.5).

 

The old NVIDIA sample probably did gamma correction inside the shader. Nowadays, if you create your resources as SRGB, it's done automatically in the HW.



#6 mhagain   Crossbones+   -  Reputation: 8285


Posted 07 November 2013 - 12:50 PM

Using texture.SampleLevel, you can just pass in an absurdly high value for the LOD (something like 666) and it will clamp to the smallest mip level, which should be your 1x1 level. I'm not too certain about sampling from a luminance render target, however; last time I did this I sampled from an RGBA one and dotted the float4 with (0.3, 0.59, 0.11, 0.0) to get the luminance.
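That dot product is just a weighted sum with Rec. 601-style luma weights; a minimal sketch (illustrative, not anyone's engine code):

```python
def luminance(rgba):
    """Luma from an RGBA sample via the classic (0.3, 0.59, 0.11, 0.0) weights."""
    weights = (0.3, 0.59, 0.11, 0.0)
    return sum(c * w for c, w in zip(rgba, weights))

white = luminance((1.0, 1.0, 1.0, 1.0))  # weights sum to 1, so white -> 1.0
green = luminance((0.0, 1.0, 0.0, 1.0))  # green contributes the most luma
```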

 

The other thing I found is that for certain game types, such as an FPS, you move through the world so fast that the average scene luminance changes rapidly, which leads to a flickering/strobing effect. To resolve that I updated the average luminance at a well-defined interval (0.1 seconds worked well for me; feel free to experiment) and interpolated between it and the previous average based on elapsed time. That worked quite well and gave a good illusion of eye adaptation over time: not physically correct (in real life it takes a lot longer), but good enough to show off the effect in a reasonably convincing manner.
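The interval-plus-interpolation scheme described above might look like this (a sketch with illustrative names, not the poster's actual code): re-measure the average every `interval` seconds and blend between the previous and newest measurement by elapsed time.

```python
class LuminanceSmoother:
    """Re-measure average luminance every `interval` seconds and lerp
    between the previous and newest measurement by elapsed time."""

    def __init__(self, interval=0.1):
        self.interval = interval
        self.prev = None
        self.next = None
        self.elapsed = 0.0

    def update(self, measured, dt):
        self.elapsed += dt
        if self.next is None:                 # first sample: no history yet
            self.prev = self.next = measured
        elif self.elapsed >= self.interval:   # time to take a new measurement
            self.prev, self.next = self.next, measured
            self.elapsed = 0.0
        t = min(self.elapsed / self.interval, 1.0)
        return self.prev + (self.next - self.prev) * t

s = LuminanceSmoother(interval=0.1)
a = s.update(1.0, 0.0)   # fully at the initial measurement
b = s.update(5.0, 0.05)  # sudden jump not yet sampled: still 1.0
c = s.update(5.0, 0.05)  # interval elapsed, jump sampled, blend restarts
d = s.update(5.0, 0.05)  # halfway through the blend: 3.0
```

The point is that a one-frame luminance spike never reaches the tonemapper directly; it is first held for up to one interval and then faded in.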


It appears that the gentleman thought C++ was extremely difficult and he was overjoyed that the machine was absorbing it; he understood that good C++ is difficult but the best C++ is well-nigh unintelligible.


#7 Matias Goldberg   Crossbones+   -  Reputation: 3723


Posted 07 November 2013 - 04:02 PM

 


Some cards don't handle mipmap generation for float textures, and some do it on the CPU side (even for non-float formats); you need to generate the mipmaps with your own shaders.

(Oh, and auto generation doesn't work for render targets. I haven't looked through your code.)

 

DirectX11 at least should be able to handle auto generation for render targets; in fact that's its only use. http://msdn.microsoft.com/en-us/library/windows/desktop/ff476426%28v=vs.85%29.aspx See the remarks about needing both RENDER_TARGET and SHADER_RESOURCE set. I did check the result, and the mip levels are at least being generated, but it appears the sampling isn't right...

 

Have you checked that you create the RTs with D3D11_RESOURCE_MISC_GENERATE_MIPS?

Furthermore, check that your sampler states are correct.

 

I wouldn't trust mipmapping to do this work, though. Drivers often allow overriding mip settings to trade quality for performance, or to avoid blurry textures, and that could break your HDR.



#8 Juliean   GDNet+   -  Reputation: 2749


Posted 08 November 2013 - 02:44 AM


Have you checked the mipmap visually? (Maybe there is no bilinear support either; I don't know this DirectX 11 thing, sorry.)


Have you checked that you create the RTs with D3D11_RESOURCE_MISC_GENERATE_MIPS?

 

Regarding this: I did check that my render target is created with the GENERATE_MIPS flag, and the mips are indeed being generated. I don't have any screenshots, and I'm not sure I could tell whether the filtering is correct just by looking, so I'll try to get a capture of the mipmaps that some of you could look at...


When you sample the Luminance texture, you need to sample the lowest mip level, since this is the level which holds the average value.

You call Luminance.Sample(), so unless you create the SRV with only the last mip level, this will almost certainly not sample from there. You also don't need the texCoord from the VS; since the final mip level is 1 pixel, you can just use (0.5, 0.5).

 

No, no: I'm calling Sample on the 1x1 luminance target output by the last pass, and I'm using SampleLevel with the correct mip level (11 for 1378x768) on the scene's luminance texture. Thanks too for the hint about the texture coordinates; it didn't fix anything, but at least my shader is a little simpler now.
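One small aside on the mip indexing (a sketch; since SampleLevel clamps out-of-range LODs, as noted above, this by itself wouldn't explain the bug): the mip chain of a 1378x768 texture can be enumerated by halving with floor, the way D3D11 does, to see which level index actually holds the 1x1 result.

```python
def mip_chain(width, height):
    """Dimensions of the full mip chain, halving with floor down to 1x1."""
    dims = [(width, height)]
    while dims[-1] != (1, 1):
        w, h = dims[-1]
        dims.append((max(w // 2, 1), max(h // 2, 1)))
    return dims

chain = mip_chain(1378, 768)
# 11 levels in total (indices 0..10), so the 1x1 level is index 10, not 11.
```

Note also the odd intermediate sizes (689, 43, 21, 5, ...): those are the passes where a 2x bilinear downsample cannot cover every texel with equal weight.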

 


The old NVIDIA sample probably did gamma correction inside the shader. Nowadays, if you create your resources as SRGB, it's done automatically in the HW.

 

I'm pretty sure it didn't do it anywhere: there was no pow() to be seen, and SRGB was explicitly set to false in the effects. Maybe someone screwed up before posting the sample, or something...

 


Using texture.SampleLevel, you can just pass in an absurdly high value for the LOD (something like 666) and it will clamp to the smallest mip level, which should be your 1x1 level. I'm not too certain about sampling from a luminance render target, however; last time I did this I sampled from an RGBA one and dotted the float4 with (0.3, 0.59, 0.11, 0.0) to get the luminance.

 

Ah, that's good to know, and it saves me some complexity :D I'm doing the RGB-to-luminance conversion before calculating the average luminance, so that should be fine :D


The other thing I found is that for certain game types, such as an FPS, you move through the world so fast that the average scene luminance changes rapidly, which leads to a flickering/strobing effect. To resolve that I updated the average luminance at a well-defined interval (0.1 seconds worked well for me; feel free to experiment) and interpolated between it and the previous average based on elapsed time. That worked quite well and gave a good illusion of eye adaptation over time: not physically correct (in real life it takes a lot longer), but good enough to show off the effect in a reasonably convincing manner.

 

I thought that could be a reason too, and I guess I'll need a solution for it at some point. But in my case, even a subtle camera move of one screen pixel, like here, takes the scene from averagely dark to unrealistically bright in half a second, so it must be something else...

 


I wouldn't trust mipmapping to do this work, though. Drivers often allow overriding mip settings to trade quality for performance, or to avoid blurry textures, and that could break your HDR.

 

Well, I only moved to auto-mipmapping because my first downsampling-shader approach failed with the same result. Is there anything wrong with my pass-through shader, the first one I posted? I made sure the filter for the scene luminance is set to linear, so I guess I need to do something else inside the shader to get it to work?





