DX11 HDR implementation


I want to implement HDR in my engine but I'm not sure how and documentation appears to be scarce. I think I need to use format DXGI_FORMAT_R32G32B32A32_FLOAT when creating the device and swap chain:


	// Initialize the swap chain description.
	ZeroMemory(&swapChainDesc, sizeof(swapChainDesc));

	// Set to a single back buffer.
	swapChainDesc.BufferCount = 1;

	// Set the width and height of the back buffer.
	swapChainDesc.BufferDesc.Width = (UINT)SCREEN_WIDTH;
	swapChainDesc.BufferDesc.Height = (UINT)SCREEN_HEIGHT;

	// Set regular 32-bit surface for the back buffer.
	swapChainDesc.BufferDesc.Format = DXGI_FORMAT_R32G32B32A32_FLOAT;

	// Set the refresh rate of the back buffer.
	if(m_vsync_enabled)
	{
		swapChainDesc.BufferDesc.RefreshRate.Numerator = numerator;
		swapChainDesc.BufferDesc.RefreshRate.Denominator = denominator;
	}
	else
	{
		swapChainDesc.BufferDesc.RefreshRate.Numerator = 0;
		swapChainDesc.BufferDesc.RefreshRate.Denominator = 1;
	}

	// Set the usage of the back buffer.
	swapChainDesc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;

	// Set the handle for the window to render to.
	swapChainDesc.OutputWindow = hWnd;

	// Turn multisampling off.
	swapChainDesc.SampleDesc.Count = 1;
	swapChainDesc.SampleDesc.Quality = 0;

	// Set to full screen or windowed mode.
	swapChainDesc.Windowed = true;	// will be full-screen windowed later

	// Set the scan line ordering and scaling to unspecified.
	swapChainDesc.BufferDesc.ScanlineOrdering = DXGI_MODE_SCANLINE_ORDER_UNSPECIFIED;
	swapChainDesc.BufferDesc.Scaling = DXGI_MODE_SCALING_UNSPECIFIED;

	// Discard the back buffer contents after presenting.
	swapChainDesc.SwapEffect = DXGI_SWAP_EFFECT_DISCARD;

	// Don't set the advanced flags.
	swapChainDesc.Flags = 0;

	// Set the feature level to DirectX 11.
	featureLevel = D3D_FEATURE_LEVEL_11_0;

	// Create the swap chain, Direct3D device, and Direct3D device context.
	result = D3D11CreateDeviceAndSwapChain(NULL, D3D_DRIVER_TYPE_HARDWARE, NULL, 0, &featureLevel, 1, 
					       D3D11_SDK_VERSION, &swapChainDesc, &swapchain, &dev, NULL, &devcon);
	if(FAILED(result))
	{
		return false;
	}

This fails. I must have another setting wrong, or perhaps I can't use that format at all.

You still want to create an 8_UNORM back buffer to be displayed on the monitor, but then also make a 16_Float render target for rendering the scene (which you'll then copy over to the 8-bit one via tonemapping).
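Roughly like this, assuming dev, SCREEN_WIDTH and SCREEN_HEIGHT are the ones from your code above (the hdr* names here are just placeholders):

	// The swap chain stays at DXGI_FORMAT_R8G8B8A8_UNORM; the scene is rendered
	// into this separate 16-bit float texture instead.
	D3D11_TEXTURE2D_DESC hdrDesc;
	ZeroMemory(&hdrDesc, sizeof(hdrDesc));
	hdrDesc.Width = (UINT)SCREEN_WIDTH;
	hdrDesc.Height = (UINT)SCREEN_HEIGHT;
	hdrDesc.MipLevels = 1;
	hdrDesc.ArraySize = 1;
	hdrDesc.Format = DXGI_FORMAT_R16G16B16A16_FLOAT;
	hdrDesc.SampleDesc.Count = 1;
	hdrDesc.Usage = D3D11_USAGE_DEFAULT;
	hdrDesc.BindFlags = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;

	ID3D11Texture2D* hdrTexture = NULL;
	ID3D11RenderTargetView* hdrRTV = NULL;
	ID3D11ShaderResourceView* hdrSRV = NULL;

	if(FAILED(dev->CreateTexture2D(&hdrDesc, NULL, &hdrTexture)))
	{
		return false;
	}
	if(FAILED(dev->CreateRenderTargetView(hdrTexture, NULL, &hdrRTV)))
	{
		return false;
	}
	if(FAILED(dev->CreateShaderResourceView(hdrTexture, NULL, &hdrSRV)))
	{
		return false;
	}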

Does it have to be 16_Float?

Walk me through the process. Is it:

1. Create the device and swap chain like above, but with 8_UNORM.
2. Create a separate render target with 16_Float.
3. Render the scene to the created render target.
4. Render a quad sized to the screen with the render target as its texture?

Or do you mean something different?


	// Set regular 32-bit surface for the back buffer.
	swapChainDesc.BufferDesc.Format = DXGI_FORMAT_R32G32B32A32_FLOAT;

That's not a regular 32-bit surface, that's a 128-bit floating-point surface ;)


Does it have to be 16_Float?

It does not, but using fewer bits saves bandwidth, and for most applications you will likely not need full 32-bit float precision for scene luminance.


and then render a quad that is sized to the screen with the render target as its texture? Or do you mean something different?

That's the basic process, yes.
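In terms of the D3D11 calls, a frame roughly looks like the sketch below. This assumes the devcon/swapchain from your code above, a backBufferRTV for the 8_UNORM back buffer, and the hdrRTV/hdrSRV views on the 16_Float target; the shader and quad setup is whatever you already use for full-screen passes.

	// 1) Render the scene into the 16_Float target instead of the back buffer.
	float clearColor[4] = { 0.0f, 0.0f, 0.0f, 1.0f };
	devcon->OMSetRenderTargets(1, &hdrRTV, depthStencilView); // depthStencilView = whatever depth buffer you already use
	devcon->ClearRenderTargetView(hdrRTV, clearColor);
	// ... draw the scene as usual; lighting results can go above 1.0 here ...

	// 2) Tonemap: draw a full-screen quad into the 8_UNORM back buffer,
	//    sampling the HDR texture and remapping it into the 0..1 range.
	devcon->OMSetRenderTargets(1, &backBufferRTV, NULL);
	devcon->PSSetShaderResources(0, 1, &hdrSRV);
	// ... set the tonemapping shaders and draw the full-screen quad ...

	// 3) Unbind the SRV so the HDR texture can be a render target again next frame.
	ID3D11ShaderResourceView* nullSRV = NULL;
	devcon->PSSetShaderResources(0, 1, &nullSRV);

	// 4) Present the back buffer.
	swapchain->Present(m_vsync_enabled ? 1 : 0, 0);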

The main reason I want to use HDR is to reduce blending issues. If I have a texture with a single color on a highly detailed sphere, and I have a light shining on it, there will be a gradient along the sphere as the light direction becomes perpendicular to the normal of each face (it fades in intensity from facing the light to facing away from it).

In my old engine, when the camera was close to the surface of the sphere, this gradient was very apparent. I had distinct shades of gray (when the color was gray/white for the texture).

Will using HDR reduce this? It sounds to me like it wouldn't, because the final result is still 8-bit channels.

Your output will always be 8 bits per channel on current displays. You don't get more colors by using a wider display surface. What you *do* get is higher precision in the math you are performing. But the final result displayed on your screen will still be 8 bits per channel.

HDR rendering allows you to go beyond the 0..1 range in your lighting, but *not* on your display.

It sounds like you would be more interested in tone mapping, which HDR pipelines use to remap the high dynamic range back into the displayable 0..1 range. Whilst typically used with HDR, it can still be applied to non-HDR rendering also.

John Hable (who used to work at Naughty Dog) wrote some articles you may find of use:

http://filmicgames.com/archives/75

http://www.slideshare.net/ozlael/hable-john-uncharted2-hdr-lighting
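For what it's worth, the curve those links describe is usually quoted with the constants below. This is just a C++ sketch of the math (in practice it lives in the tonemapping pixel shader, applied per channel), and the exposure parameter is something you pick or compute from average scene luminance:

	#include <cmath>

	// Constants as commonly quoted for the Uncharted 2 filmic curve.
	static const float A = 0.15f; // shoulder strength
	static const float B = 0.50f; // linear strength
	static const float C = 0.10f; // linear angle
	static const float D = 0.20f; // toe strength
	static const float E = 0.02f; // toe numerator
	static const float F = 0.30f; // toe denominator
	static const float W = 11.2f; // linear white point

	static float Uncharted2Curve(float x)
	{
		return ((x * (A * x + C * B) + D * E) / (x * (A * x + B) + D * F)) - E / F;
	}

	// Maps one HDR channel value into 0..1; 'exposure' is a scene-dependent scale factor.
	float FilmicTonemap(float hdrValue, float exposure)
	{
		float curved = Uncharted2Curve(hdrValue * exposure);
		float whiteScale = 1.0f / Uncharted2Curve(W);
		return std::pow(curved * whiteScale, 1.0f / 2.2f); // gamma correction for the 8-bit back buffer
	}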

n!

You might just have a cheap monitor that's causing the colour banding to appear. Many LCDs are only 6 bits per channel, with the expensive ones being 8 bits and the unaffordable ones being 10 bits ;)

I have a full 1080p LED display.

I'm going to read those articles and see if I can understand them (and implement).

EDIT:

The first one is easy to read and to the point. Thanks. It seems fairly simple to implement, so I'll try that.

I'm thinking the problem I'm trying to fix won't be fixed by this, but I want to use it anyway. The problem may be in the shader's sampler state... I don't know.

