

ajmiles

Member Since 09 Jul 2013
Offline Last Active Yesterday, 03:02 PM

Posts I've Made

In Topic: [D3D12] Array of cbuffers?

25 August 2015 - 02:55 PM

You can't, but you should really stick to a StructuredBuffer for what you're trying to do.

 

A wave/warp of threads is almost certainly going to access your array in a divergent manner and that's not a good thing to do with a constant buffer on some GPUs. Trying to access an array of constant buffers using a feature of Shader Model 5.1 is taking divergence to a whole new level and will not run well at all.


In Topic: [D3D12] Array of cbuffers?

25 August 2015 - 10:43 AM

Are you sure you want an array of two constant buffers and not just a single constant buffer with an array of two matrices?


In Topic: sRGB on diffuse textures or gbuffer color texture?

25 August 2015 - 10:31 AM

 


tonemgub, on 25 Aug 2015 - 5:09 PM, said:

Those light buffers do not store any color information AFAIK - they just store light intensities, which are always linear,


Lights have colour, don't forget; the intensity of each channel isn't necessarily equal.

The per-channel intensities are still linear though. You could use an sRGB texture to store them, but what's the point, if all your calculations are in linear space?

Because sRGB targets can be used to store whatever you want, linear or non-linear. The curve applied to the values before reads and writes is a good, cheap way of retaining precision in smaller values at the expense of larger ones. Any value that contributes to the final perceptual colour (either the textures themselves or the per-pixel lighting values) is a good candidate, because the human eye is significantly better at perceiving small differences between closely matched dark colours than between similarly spaced bright colours.

There's probably a good evolutionary reason for being better at spotting the predator in the shadows than at telling apart very bright colours that are already plainly visible. So if we don't have as many bits of storage as we'd like, it's better to dedicate them to dark colours/intensities than to those at the other end of the scale.


In Topic: sRGB on diffuse textures or gbuffer color texture?

25 August 2015 - 08:28 AM

Those light buffers do not store any color information AFAIK - they just store light intensities, which are always linear,

Lights have colour, don't forget; the intensity of each channel isn't necessarily equal.


First of all, you're going to want a lighting buffer format that lets you store values > 1.0, so R16G16B16A16_UNORM is not what you're after. That same format as _FLOAT would do the trick, but DXGI_FORMAT_R11G11B10_FLOAT would also work well.

 

Why would you want values larger than 1.0? Don't the same RGB values, normalized to [0.0, 1.0], give the same result?

Also, does this not mess up lighting? The diffuse textures are UNORM, so when sampled in the shader aren't they still in the [0.0, 1.0] range (since there is no equivalent float format for an SRV)? Wouldn't you then only output colour at the very low end, making everything very dark as a result?

If you can't store values > 1 in your lighting buffer, you're going to lose information in areas of the scene where the total light intensity exceeds 1. Light intensity isn't stored on a scale from "black" to "white"; there's no limit to how bright something can appear with enough light (or lights) shining at it. There is no maximum light intensity that you can map to the value 1; you need a scale where 0 is "no light" and the upper limit is, within reason, unbounded.

 

Imagine a light of intensity '1.0' shining directly at a white wall. Where dot(N, L) is 1, the resulting colour is "White * 1.0f", which is still 1.0f. Now add a second light shining at the same area of the wall; you now have twice the illumination because there are two lights. Logically the light intensity on this area of the wall is now 2.0 rather than 1.0, so you need a texture format that can store values that exceed 1.

 

Your textures are still UNORM, as they represent values between Black and White, not "no light" to some unbounded intensity (although an Emissive texture may want to be HDR). It's the calculation of texture [0->1] * light intensity [0->inf] that results in HDR values, not the diffuse textures themselves.


In Topic: sRGB on diffuse textures or gbuffer color texture?

25 August 2015 - 07:51 AM

So much misinformation!
 
First of all, you're going to want a lighting buffer format that lets you store values > 1.0, so R16G16B16A16_UNORM is not what you're after. That same format as _FLOAT would do the trick, but DXGI_FORMAT_R11G11B10_FLOAT would also work well.
 
Secondly, you really do want your diffuse textures sampled as sRGB. You may think that there's no point in going from sRGB -> Linear -> sRGB again, but once you factor in that you're bilinearly / anisotropically filtering these textures, it no longer makes mathematical sense to perform linear filtering on non-linear data. When the hardware performs sRGB -> Linear decoding it doesn't take an average of the 4 sRGB-encoded texels and then convert the result; it converts the 4 texels from sRGB to linear *and then* averages them. Work it out on paper with a few examples and you'll see that the two orders of operation are not equivalent. In areas of contrast in the texture this can make a big difference. sRGB decoding and encoding is free on all hardware that I know of in this era, so use it.
 
Thirdly, it is possible to have two SRVs / RTVs of the same texture in both SRGB and non-SRGB formats, you just have to create the underlying resource in a "TYPELESS" format, but that's beside the point as it shouldn't be necessary for what you're doing.
 
And finally, I'm not sure what this "in D3D all textures are floating point" is about. It reads like nonsense to me!
