Forward and Deferred Buffers


My current setup for Forward and Deferred (Forward does not use a GBuffer):

  • GBuffer
  1. Base Color buffer (4 bytes/texel)
  2. Material buffer (4 bytes/texel)
  3. Normal buffer (4 bytes/texel)
  4. Depth buffer (4 bytes/texel)
  • Ping-pong HDR buffers (post-processing)
  1. HDR 1 (16 bytes/texel)
  2. HDR 2 (16 bytes/texel)
  • Swap Chain
  1. Back buffer (4 bytes/texel)
  2. Depth buffer (4 bytes/texel)
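
For concreteness, those per-texel sizes map onto DXGI formats roughly like this (the exact format picks below are just illustrative):

```cpp
#include <dxgiformat.h>
#include <cstdint>

// Illustrative mapping of the per-texel sizes above to DXGI formats
// (these particular picks are assumptions, not necessarily the exact formats in use).
struct BufferFormat { DXGI_FORMAT format; uint32_t bytes_per_texel; };

constexpr BufferFormat g_buffer_formats[] = {
    { DXGI_FORMAT_R8G8B8A8_UNORM,     4 },  // base color / material / back buffer
    { DXGI_FORMAT_R8G8B8A8_SNORM,     4 },  // packed normals
    { DXGI_FORMAT_D32_FLOAT,          4 },  // depth
    { DXGI_FORMAT_R32G32B32A32_FLOAT, 16 }, // HDR ping-pong buffers as listed (16 bytes/texel)
    { DXGI_FORMAT_R16G16B16A16_FLOAT, 8 },  // cheaper HDR alternative
};
```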

Currently none of these buffers is shared. I wonder if it would be beneficial to share some of them to reduce the number of buffers:

  • GBuffer depth buffer + Swap Chain depth buffer
  • GBuffer base color buffer + HDR 1
  • GBuffer material buffer + Swap Chain back buffer (or should one directly use an sRGB texture for the Swap Chain back buffer?)

 

🧙

1 hour ago, matt77hias said:

Currently none of these buffers is shared. I wonder if it would be beneficial to share some of them to reduce the number of buffers:

In some of the games I've worked on, we've used a render-target-pool. When doing a pass that temporarily needs some buffers, you ask the pool for them (e.g. bloom might request 1x ping-pong with HDR bit-depth, 1x half-resolution with 8-bit bit-depth, and 2x quarter resolution with 8-bit bit-depth), then when you're finished using them, you return them to the pool so that subsequent rendering passes can possibly reuse them.
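
A minimal sketch of what I mean, assuming D3D11 and a made-up RenderTargetPool class (names and details are purely illustrative, not code from any of those games):

```cpp
#include <d3d11.h>
#include <wrl/client.h>
#include <memory>
#include <vector>

using Microsoft::WRL::ComPtr;

// Hypothetical pooled render target: the texture plus the views most passes need.
struct PooledTarget {
    ComPtr<ID3D11Texture2D>          texture;
    ComPtr<ID3D11RenderTargetView>   rtv;
    ComPtr<ID3D11ShaderResourceView> srv;
    D3D11_TEXTURE2D_DESC             desc = {};
    bool                             in_use = false;
};

// Very small render-target-pool sketch: acquire a target matching a description,
// reusing a free one if it exists, otherwise creating a new one.
// (Error checking omitted for brevity.)
class RenderTargetPool {
public:
    explicit RenderTargetPool(ID3D11Device* device) : m_device(device) {}

    PooledTarget* Acquire(UINT width, UINT height, DXGI_FORMAT format) {
        for (auto& t : m_targets) {
            if (!t->in_use && t->desc.Width == width && t->desc.Height == height
                && t->desc.Format == format) {
                t->in_use = true;
                return t.get();
            }
        }
        // No free match: create a new texture with RTV + SRV capability.
        auto t = std::make_unique<PooledTarget>();
        t->desc.Width            = width;
        t->desc.Height           = height;
        t->desc.MipLevels        = 1;
        t->desc.ArraySize        = 1;
        t->desc.Format           = format;
        t->desc.SampleDesc.Count = 1;
        t->desc.Usage            = D3D11_USAGE_DEFAULT;
        t->desc.BindFlags        = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;
        m_device->CreateTexture2D(&t->desc, nullptr, &t->texture);
        m_device->CreateRenderTargetView(t->texture.Get(), nullptr, &t->rtv);
        m_device->CreateShaderResourceView(t->texture.Get(), nullptr, &t->srv);
        t->in_use = true;
        m_targets.push_back(std::move(t));
        return m_targets.back().get();
    }

    void Release(PooledTarget* target) { target->in_use = false; }

private:
    ID3D11Device* m_device;
    std::vector<std::unique_ptr<PooledTarget>> m_targets; // unique_ptr keeps handed-out pointers stable
};
```

A bloom pass would then Acquire its half- and quarter-resolution targets at the start of the pass and Release them at the end, so that later passes can reuse the same textures.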

In D3D12 and on consoles, you can actually allocate multiple textures over the top of each other. E.g. maybe 2x 8-bit textures and 1x 16-bit texture in the same bit of memory. You can then start the frame by using the 2x 8-bit textures, and when you're finished with them, you can mark them as invalid, mark the 16-bit texture as valid, and then start using that 16-bit texture for ping-ponging, etc... This kind of thing is not possible on older APIs though.
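
On D3D12 that overlapping is done with placed resources in a shared heap plus aliasing barriers. A rough sketch, with the formats and sizes picked arbitrarily for illustration:

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <algorithm>

using Microsoft::WRL::ComPtr;

// Sketch: place two 8-bit-per-channel textures and one 16-bit-per-channel texture over
// the same heap memory and switch between them with an aliasing barrier.
// (D3D12 only; error checks omitted; format choices are just examples.)
void CreateAliasedTargets(ID3D12Device* device, UINT width, UINT height,
                          ComPtr<ID3D12Resource>& ldrA,
                          ComPtr<ID3D12Resource>& ldrB,
                          ComPtr<ID3D12Resource>& hdr,
                          ComPtr<ID3D12Heap>& heap) {
    D3D12_RESOURCE_DESC ldrDesc = {};
    ldrDesc.Dimension        = D3D12_RESOURCE_DIMENSION_TEXTURE2D;
    ldrDesc.Width            = width;
    ldrDesc.Height           = height;
    ldrDesc.DepthOrArraySize = 1;
    ldrDesc.MipLevels        = 1;
    ldrDesc.Format           = DXGI_FORMAT_R8G8B8A8_UNORM;      // the "8-bit" textures
    ldrDesc.SampleDesc.Count = 1;
    ldrDesc.Layout           = D3D12_TEXTURE_LAYOUT_UNKNOWN;
    ldrDesc.Flags            = D3D12_RESOURCE_FLAG_ALLOW_RENDER_TARGET;

    D3D12_RESOURCE_DESC hdrDesc = ldrDesc;
    hdrDesc.Format = DXGI_FORMAT_R16G16B16A16_FLOAT;            // the "16-bit" texture

    const D3D12_RESOURCE_ALLOCATION_INFO ldrInfo = device->GetResourceAllocationInfo(0, 1, &ldrDesc);
    const D3D12_RESOURCE_ALLOCATION_INFO hdrInfo = device->GetResourceAllocationInfo(0, 1, &hdrDesc);
    const UINT64 align   = std::max(ldrInfo.Alignment, hdrInfo.Alignment);
    const UINT64 ldrSlot = (ldrInfo.SizeInBytes + align - 1) & ~(align - 1); // round up to alignment

    // One heap big enough for whichever layout is larger: 2x LDR side by side, or 1x HDR.
    D3D12_HEAP_DESC heapDesc = {};
    heapDesc.SizeInBytes     = std::max(2 * ldrSlot, hdrInfo.SizeInBytes);
    heapDesc.Alignment       = align;
    heapDesc.Properties.Type = D3D12_HEAP_TYPE_DEFAULT;
    heapDesc.Flags           = D3D12_HEAP_FLAG_ALLOW_ONLY_RT_DS_TEXTURES;
    device->CreateHeap(&heapDesc, IID_PPV_ARGS(&heap));

    // The two LDR textures and the HDR texture overlap the same region of the heap.
    device->CreatePlacedResource(heap.Get(), 0,       &ldrDesc, D3D12_RESOURCE_STATE_RENDER_TARGET, nullptr, IID_PPV_ARGS(&ldrA));
    device->CreatePlacedResource(heap.Get(), ldrSlot, &ldrDesc, D3D12_RESOURCE_STATE_RENDER_TARGET, nullptr, IID_PPV_ARGS(&ldrB));
    device->CreatePlacedResource(heap.Get(), 0,       &hdrDesc, D3D12_RESOURCE_STATE_RENDER_TARGET, nullptr, IID_PPV_ARGS(&hdr));
}

// Later in the frame, when the 8-bit pair is done and the 16-bit texture becomes "valid":
void SwitchToHdr(ID3D12GraphicsCommandList* cmd, ID3D12Resource* hdr) {
    D3D12_RESOURCE_BARRIER barrier = {};
    barrier.Type                     = D3D12_RESOURCE_BARRIER_TYPE_ALIASING;
    barrier.Aliasing.pResourceBefore = nullptr;  // or the specific LDR resource being retired
    barrier.Aliasing.pResourceAfter  = hdr;
    cmd->ResourceBarrier(1, &barrier);
    // The newly activated resource must be fully cleared/overwritten before it is read.
}
```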

1 hour ago, matt77hias said:
  • GBuffer base color buffer + HDR 1

This doubles the size of your colour data in the GBuffer. Often GBuffer passes are bottlenecked by memory bandwidth, so you want to keep the GBuffer size as low as possible. :(

1 hour ago, matt77hias said:

GBuffer material buffer + Swap Chain back buffer

In some APIs, reading from the swap-chain is disallowed. Make sure that it's allowed in the APIs that you're using :)

1 hour ago, matt77hias said:

GBuffer depth buffer + Swap Chain depth buffer

You don't even need a swap-chain depth buffer at all. The monitor doesn't display depth ;)

13 minutes ago, Hodgman said:

This doubles the size of your colour data in the GBuffer. Often GBuffer passes are bottlenecked by memory bandwidth, so you want to keep the GBuffer size as low as possible.

The idea was to have some more precision when working in linear space. But I am going to convert to gamma instead.

On the other hand, one of the HDR buffers will be idle until lighting is finished, so I thought about using it earlier. But you're right, bandwidth is also important (I seem a bit biased toward memory consumption).

13 minutes ago, Hodgman said:

You don't even need a swap-chain depth buffer at all. The monitor doesn't display depth

:o It is actually an artifact from the time I started d3d programming. It is the depth buffer for forward rendering; I didn't really find a good class to store that one in :P . So I dropped it in my Renderer class, which only owns the adapter, output, device, device context, swap chain, back buffer RTV, DSV and display configuration, but doesn't really do any actual "rendering".

🧙

22 minutes ago, Hodgman said:

render-target-pool

And all of these resources had an SRV, RTV and UAV attached?

🧙

8 hours ago, matt77hias said:
  • Ping-pong HDR buffers (post-processing)
  1. HDR 1 (16 bytes/texel)
  2. HDR 2 (16 bytes/texel)

I hope you mean 16-bits/channel here rather than 16 bytes/texel! Using R32G32B32A32_FLOAT for the HDR buffers would be practically unheard of. R16G16B16A16_FLOAT is fine for almost everything you would want to do and R11G11B10_FLOAT is commonly used for even most AAA titles.

Adam Miles - Principal Software Development Engineer - Microsoft Xbox Advanced Technology Group

53 minutes ago, ajmiles said:

I hope you mean 16-bits/channel here rather than 16 bytes/texel! Using R32G32B32A32_FLOAT for the HDR buffers would be practically unheard of. R16G16B16A16_FLOAT is fine for almost everything you would want to do and R11G11B10_FLOAT is commonly used for even most AAA titles.

I actually meant 16 bytes/texel. You can use half floats instead, but I wonder how HDR that really is? In my naive understanding, full floats just come at the cost of four of the 4-byte/texel textures you'd use somewhere else in a game anyway? Or, for full HD, something like 2^25 bytes = 32 MB.
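
A quick back-of-the-envelope check of those sizes at 1080p, per format:

```cpp
#include <cstdint>
#include <cstdio>

// Back-of-the-envelope memory footprint of one full-HD HDR buffer per format.
int main() {
    const uint64_t texels    = 1920ull * 1080ull;  // full HD
    const uint64_t rgba32f   = texels * 16;        // R32G32B32A32_FLOAT
    const uint64_t rgba16f   = texels * 8;         // R16G16B16A16_FLOAT
    const uint64_t r11g11b10 = texels * 4;         // R11G11B10_FLOAT
    std::printf("RGBA32F:   %llu MB\n", (unsigned long long)(rgba32f   >> 20)); // ~31 MB
    std::printf("RGBA16F:   %llu MB\n", (unsigned long long)(rgba16f   >> 20)); // ~15 MB
    std::printf("R11G11B10: %llu MB\n", (unsigned long long)(r11g11b10 >> 20)); // ~7 MB
}
```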

🧙

Half floats were literally invented for HDR :)

As above, using full floats is unheard of as doubling your memory bandwidth really will impact performance a lot. 

Halves go from 0 to around 65k (65504), which is a pretty massive dynamic range. You can exceed the maximum value quite easily with a lot of bright lights, which results in 'inf' being stored in that pixel, so you do need to check for that / clamp at the maximum.

If you need more range, you can move the multiply by 'exposure' out of your tone mapping shader and into every forward/lighting shader instead. As above, this is how people manage to get away with using even R11G11B10_FLOAT.
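
Conceptually it's just moving one multiply. A tiny C++-style sketch of the idea (function names made up, shader specifics omitted):

```cpp
// Conceptual sketch of "pre-exposed" lighting (names are illustrative, not from any engine):
// instead of writing raw radiance into the HDR target and multiplying by exposure in the
// tonemapper, multiply by exposure in every lighting/forward shader so the values stored
// in a half / 11-bit float target stay in a comfortable range.

struct RGB { float r, g, b; };

// What the lighting pass would output per pixel.
RGB ShadePixel(RGB radiance, float exposure) {
    // Pre-expose before the value is written to the (low-precision) HDR target.
    return { radiance.r * exposure, radiance.g * exposure, radiance.b * exposure };
}

// The tonemapper then no longer applies exposure; it only maps [0, inf) -> [0, 1].
float TonemapChannel(float preExposed) {
    // Simple Reinhard curve as a placeholder for whatever operator is actually used.
    return preExposed / (1.0f + preExposed);
}
```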

 

18 hours ago, matt77hias said:

and all of these resources had a SRV, RTV and UAV attached?

They had different flags. Making a resource UAV-compatible may affect the memory layout compared to making it just SRV-compatible, so you should use the minimum set of flags during resource creation. Likewise, some had mips while others didn't, etc.

When allocating from the pool, you'd specify size, format, usage, capabilities, etc.
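
In D3D11 terms, the pool request might boil down to building a texture description with only the capabilities the caller asked for; a small illustrative helper (not actual engine code):

```cpp
#include <d3d11.h>

// Only request UAV capability when a pass actually needs it; the extra flag can
// change how the driver lays out / compresses the texture.
D3D11_TEXTURE2D_DESC MakePooledDesc(UINT width, UINT height, DXGI_FORMAT format,
                                    bool needsUAV, UINT mipLevels) {
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width            = width;
    desc.Height           = height;
    desc.MipLevels        = mipLevels;   // some pooled targets have mips, others don't
    desc.ArraySize        = 1;
    desc.Format           = format;
    desc.SampleDesc.Count = 1;
    desc.Usage            = D3D11_USAGE_DEFAULT;
    desc.BindFlags        = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;
    if (needsUAV) {
        desc.BindFlags |= D3D11_BIND_UNORDERED_ACCESS;
    }
    return desc;
}
```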

12 hours ago, Hodgman said:

exposure

How do you define "exposure"?

🧙

1 hour ago, matt77hias said:

How do you define "exposure"?

In real life, it's basically the amount of light that the sensor/film is exposed to in order to create the image.

  • Increasing the aperture (the size of the hole that lets in light) will increase the amount of light coming in, but will also increase the strength of depth of field effects / narrow the depth of focus.
  • Increasing the shutter time (the time that the hole is open) will increase the amount of light coming in, but will also increase the strength of motion blur effects.
  • Increasing the ISO value (the sensitivity of the sensor/film) will increase the amount of light that is captured, but will also increase the strength of film-grain / noise effects.

Many engines are now trying to model real cameras, so that people trained with real-world tools will be immediately comfortable in engine, and also so that in-game post-processing effects look more natural / more like cinema.

If you're not trying to emulate a real camera though, then "exposure" is just an arbitrary multiplier that you use to rescale the input data as the first step of a tonemapper. You can either pick an exposure value manually for each scene in your game, or use an auto-exposure technique where you look at the average brightness of the lighting buffer (or more commonly - the geometric mean of the luminance in the lighting buffer) and use that average value to pick a suitable exposure value where the final image won't be too bright or too dark.
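
A rough sketch of that auto-exposure step (shown CPU-side for clarity; in practice the reduction runs on the GPU, and the 'key' value here is an arbitrary illustrative choice):

```cpp
#include <cmath>
#include <vector>

// Sketch of auto-exposure from the geometric mean of luminance.
// In a real renderer this reduction happens on the GPU (e.g. by repeatedly downsampling
// a log-luminance buffer); the "key" value below is an arbitrary illustrative choice.
float ComputeAutoExposure(const std::vector<float>& luminance) {
    const float kEpsilon = 1e-4f;   // avoid log(0) on black pixels
    double logSum = 0.0;
    for (float l : luminance) {
        logSum += std::log(kEpsilon + l);
    }
    // Geometric mean of the luminance = exp(average of the logs).
    const float avgLuminance =
        static_cast<float>(std::exp(logSum / static_cast<double>(luminance.size())));

    // Scale the scene so that the average luminance maps to a chosen "key" value,
    // keeping the final image from coming out too bright or too dark.
    const float key = 0.18f;        // middle grey, a common but arbitrary choice
    return key / avgLuminance;
}
```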

5 minutes ago, Hodgman said:

In real life, it's basically the amount of light that the sensor/film is exposed to in order to create the image.

So radiance?

🧙

