HDR Rendering (Average Luminance)

Started by
30 comments, last by Medo Mex 8 years, 4 months ago

@Adam Miles:

1. When I set the viewport and texture size to a smaller resolution, the GPU will handle the downsampling for me. Should I let the GPU handle all of the downsamples instead of writing a Downscale2x2 function in the pixel shader, or should I write Downscale2x2 in the pixel shader? What is the difference?

2. Why can't I just downsample to 1x1 in a single pass?


2. Why can't I just downsample to 1x1 in a single pass?


You theoretically could, but pixel shaders run mostly independently of each other, so you would have one thread doing all the work, since each thread can only output one pixel. It would be like doing the whole process on the CPU, except slower.

Your GPU has tons of threads it can use. Unfortunately, this means the last half dozen or so downsamples aren't using it to its full potential.

1) If by "the GPU will handle downsampling for me" you mean that you simply perform one texture read per pixel with bilinear sampling turned on, then yes. A 2x2 downsample is just one texture read at the interpolated UV coordinate of a full-screen quad; the hardware takes care of reading all 4 texels in the quad and averaging them.

2) Because then the pixel shader would run only one thread, and that thread would have to read 2,073,600 texels from the original texture. The GPU is a massively parallel machine that only gets anywhere close to its peak performance when you make use of thousands of threads at once. By downsampling in a single step, performance would probably be 1000x worse than with the approach we've proposed.

Adam Miles - Principal Software Development Engineer - Microsoft Xbox Advanced Technology Group
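The two answers above can be put in numbers. Here is a toy Python sketch (not GPU code, just the arithmetic): it counts how many chained 2x2 passes a 1920x1080 target needs to reach 1x1, versus how many texels a single thread would have to read if the whole reduction were done in one pass.

```python
# Sketch: compare the two downsampling strategies for a 1920x1080 source.
# Chained 2x2 passes: each pass halves both dimensions (rounding up), and
# every output pixel does only one bilinear read (4 texels averaged by HW).
def passes_to_1x1(w, h):
    passes = 0
    while w > 1 or h > 1:
        w, h = max(1, (w + 1) // 2), max(1, (h + 1) // 2)
        passes += 1
    return passes

w, h = 1920, 1080
print(passes_to_1x1(w, h))   # -> 11 passes, each one massively parallel
print(w * h)                 # -> 2073600 texels a SINGLE thread would read
                             #    if we reduced to 1x1 in one pass
```

Eleven fully parallel passes versus one thread grinding through two million reads is the whole trade-off in a nutshell.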

Now, I have the scene average luminance.

What is the next step? Should I do tone mapping now or calculate the adapted luminance and then do tone mapping?

I found this code for tone mapping and tried to test it, but I'm not getting the expected results. After using the following code, the scene sometimes gets bright and sometimes dark (depending on where the camera is looking):


// Tex[0] = Scene texture
// Tex[1] = Scene average luminance

float WHITE = pow( 1, 2 );
float ALPHA = 0.72;
float4 color = Tex[0].Sample( SampleType, IN.TexCoord );

float lum = dot( color.rgb, float3( 0.2126, 0.7152, 0.0722 ));  // Pixel luminance
lum = (lum * (1 + lum / WHITE)) / (1 + lum);                    // Reinhard tonemapping
lum *= ALPHA / (Tex[1].Sample( SampleType, float2( 0.5, 0.5 )).r);     // scale by average luminance (Tex[1], per the comments above)

color *= lum;
return saturate( color );
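For comparison, here is a sketch of the same math in Python (not HLSL), using the more common ordering in which the pixel luminance is first scaled by the key/average-luminance ratio and then run through the extended Reinhard curve; this ordering differs from the snippet above. `avg_lum` stands in for the 1x1 average-luminance texture read, and `white`/`alpha` mirror the WHITE/ALPHA constants:

```python
def tonemap(rgb, avg_lum, white=1.0, alpha=0.72):
    # Pixel luminance (Rec. 709 weights, matching the dot product above).
    lum = 0.2126 * rgb[0] + 0.7152 * rgb[1] + 0.0722 * rgb[2]
    # Key scaling first: bring the pixel into a middle-grey range
    # relative to the scene's average luminance...
    scaled = lum * alpha / avg_lum
    # ...then apply the extended Reinhard curve.
    mapped = scaled * (1.0 + scaled / (white * white)) / (1.0 + scaled)
    # Rescale the colour by the luminance ratio and clamp to [0, 1].
    ratio = mapped / lum if lum > 0.0 else 0.0
    return tuple(min(1.0, max(0.0, c * ratio)) for c in rgb)
```

One thing worth noticing in the snippet above: with WHITE = pow(1, 2) = 1, the expression lum * (1 + lum / WHITE) / (1 + lum) simplifies to lum, so the Reinhard step is effectively a no-op and the output is just the colour scaled by ALPHA / average luminance.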

Why don’t you follow the steps demonstrated clearly in the HDR sample included in the DirectX SDK?


L. Spiro

I restore Nintendo 64 video-game OST’s into HD! https://www.youtube.com/channel/UCCtX_wedtZ5BoyQBXEhnVZw/playlists?view=1&sort=lad&flow=grid

@L. Spiro: Because I'm trying to understand exactly how it works, and I have been looking at different HLSL samples with different approaches.

I'm asking why the above code is not working as expected. Is it because I'm using the average luminance instead of calculating the adapted luminance, or is it something else?

Yeah, it's because you're not using adapted luminance. The goal of adapted luminance is to smooth out transitions by gradually converging to the current average luminance over time.
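That convergence can be sketched with a toy Python simulation (assumed numbers; the 1 - 0.98^(30 * dt) blend factor is the one used in the DXSDK-style formula quoted later in the thread):

```python
# Toy simulation: adapted luminance smoothly converging toward a new
# average luminance. The blend factor 1 - 0.98**(30 * dt) closes about 2%
# of the remaining gap per frame at 30 fps; expressing it as a power of
# dt (in seconds) keeps the adaptation speed frame-rate independent.
def adapt(adapted, current, dt):
    return adapted + (current - adapted) * (1.0 - 0.98 ** (30.0 * dt))

adapted = 0.1                          # start in a dark scene
for _ in range(60):                    # one second of frames at 60 fps
    adapted = adapt(adapted, 1.0, 1.0 / 60.0)
print(round(adapted, 3))               # -> 0.509, about halfway to 1.0
```

Running the same second at 30 fps (30 frames with dt = 1/30) lands on the same value, which is exactly the property a fixed per-frame blend factor would lose.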

I'm now doing an adapted luminance calculation pass, but the scene keeps flashing to a brighter color.

When passing the time elapsed to CalculateAdaptedLuminancePS shader, if I change the line:


shaderConstBuffer.timeElapsed = TimeElapsed;

To any static number (for example):


shaderConstBuffer.timeElapsed = 2.0f;

It doesn't flash anymore.

If I try to output the adapted luminance on the screen to see what it looks like, I can see that it's flashing as well.

Any idea what could be causing that problem?

What does TimeElapsed represent? What does your shader look like?

@Styves:

CurrentTime = timeGetTime();

TimeElapsed = CurrentTime - LastTime;

Pixel Shader:


float4 CalculateAdaptedLuminance(PS_IN input): SV_TARGET
{
    float fAdaptedLum = Tex[0].Sample(SampleType, float2(0.5f, 0.5f)).r;
    float fCurrentLum = Tex[1].Sample(SampleType, float2(0.5f, 0.5f)).r;
    
    // The user's adapted luminance level is simulated by closing the gap between
    // adapted luminance and current luminance by 2% every frame, based on a
    // 30 fps rate. This is not an accurate model of human adaptation, which can
    // take longer than half an hour.
    float fNewAdaptation = fAdaptedLum + (fCurrentLum - fAdaptedLum) * ( 1 - pow( 0.98f, 30 * elapsedTime ) );
    return float4(fNewAdaptation, fNewAdaptation, fNewAdaptation, 1.0f);
}
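One thing worth double-checking with this formula (an observation about the host code above, not something confirmed in the thread): timeGetTime() returns milliseconds, while a pow(0.98, 30 * elapsedTime) term expects elapsedTime in seconds. A quick Python sketch of what happens to the blend factor in each case:

```python
# If TimeElapsed is taken straight from timeGetTime(), it is in milliseconds.
# Feeding milliseconds into a formula that expects seconds saturates the
# blend factor at ~1.0, so the "adapted" value snaps to the current average
# every frame -- which looks exactly like flashing.
dt_ms = 16.0                                          # ~16 ms/frame at 60 fps
blend_wrong = 1.0 - 0.98 ** (30.0 * dt_ms)            # dt in milliseconds
blend_right = 1.0 - 0.98 ** (30.0 * dt_ms / 1000.0)   # dt converted to seconds
print(blend_wrong)    # ~0.99994 -> adaptation is effectively instantaneous
print(blend_right)    # ~0.0097  -> gentle convergence over many frames
```

This would also explain why a hard-coded value stops the flashing: any constant makes the blend factor fixed rather than saturated.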

I'm guessing that the problem could be with how I pass the last adapted luminance to the above shader, which is Tex[0].

In the first frame, I have no last adapted luminance calculated yet, what should I pass to Tex[0] in the first frame?
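The thread does not settle the first-frame question, but a common approach is to ping-pong two 1x1 adapted-luminance render targets and seed the very first one before any adaptation pass runs, either with a neutral constant (e.g. 0.5, giving a brief fade-in from middle grey) or with the first measured average luminance (no fade-in at all). A hypothetical sketch of the host-side bookkeeping in Python:

```python
# Hypothetical sketch of ping-pong adapted-luminance bookkeeping. "prev"
# stands in for the 1x1 render target bound as Tex[0]; on the real GPU
# you would clear that target once, before the first adaptation pass.
class AdaptationBuffers:
    def __init__(self, seed):
        self.prev = seed               # first-frame value for Tex[0]

    def step(self, current_avg, dt):
        # Same convergence formula as the shader above.
        blend = 1.0 - 0.98 ** (30.0 * dt)
        self.prev += (current_avg - self.prev) * blend
        return self.prev               # becomes next frame's Tex[0]
```

Either seed works; seeding with the first measured average simply skips the initial adaptation ramp.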

This topic is closed to new replies.
