matt77hias

  1. Gamma correction - sanity check

    I use texconv as well. You need to specify manually whether the input and/or output represents sRGB data via -srgb, -srgbi, or -srgbo: use -srgb if both the input and the output are in sRGB color space (i.e. gamma ~2.2), -srgbi if only the input is in sRGB, and -srgbo if only the output is in sRGB. But sometimes it is really trial and error. DirectXTex even has a force-sRGB flag for loading .dds files which do not have an explicit sRGB format but whose raw data should nevertheless be treated as sRGB values. So even the .dds format, which explicitly encodes the pixel format, can be misused. Nothing is certain anymore.
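
    With DirectXTex itself, the same effect can be obtained by relabeling the format after loading; a minimal sketch (not necessarily the exact flag mentioned above, and brick.dds is just a placeholder name):

        #include <DirectXTex.h>

        // Load a .dds whose raw data is sRGB but whose header lacks an explicit
        // *_SRGB format, and patch the format to its sRGB equivalent without
        // touching the pixel data.
        DirectX::TexMetadata  metadata = {};
        DirectX::ScratchImage image;
        const HRESULT hr = DirectX::LoadFromDDSFile(
            L"brick.dds", DirectX::DDS_FLAGS_NONE, &metadata, image);
        if (SUCCEEDED(hr)) {
            // MakeSRGB maps e.g. DXGI_FORMAT_BC1_UNORM to DXGI_FORMAT_BC1_UNORM_SRGB.
            image.OverrideFormat(DirectX::MakeSRGB(image.GetMetadata().format));
        }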
  2. Gamma correction - sanity check

    My apologies, it is a habit of mine to continuously submit, reread and edit.
  3. Gamma correction - sanity check

    Ok, I'll try to rephrase. Let's assume we have a sprite texture with a UNORM_SRGB format and a single stored color of [0.73, 0.73, 0.73, 1]. We now want to sample from that texture in a pixel shader which writes to the back buffer, which also has a UNORM_SRGB format. If we sample from the texture, the hardware knows that we are dealing with an sRGB texture and converts from sRGB to linear color space before filtering: [0.73, 0.73, 0.73, 1] -> [0.5, 0.5, 0.5, 1]. The sampling is then performed and we obtain [0.5, 0.5, 0.5, 1]. Next, we write that color to the back buffer. The hardware knows that we are dealing with an sRGB back buffer and converts from linear to sRGB color space (after blending): [0.5, 0.5, 0.5, 1] -> [0.73, 0.73, 0.73, 1]. The display then applies its response curve to that "signal", and we perceive it as halfway between pure black and pure white. If you write the image (e.g. a snapshot) to disk and open it in some image viewer, the viewer will report a raw value of [0.73, 0.73, 0.73, 1] for each texel.
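
    To make those numbers concrete, here is the round trip with the standard sRGB transfer functions (the hardware applies equivalent conversions in fixed function):

        #include <cmath>
        #include <cstdio>

        // Standard sRGB <-> linear transfer functions (IEC 61966-2-1).
        static float SRGBToLinear(float c) {
            return (c <= 0.04045f) ? c / 12.92f
                                   : std::pow((c + 0.055f) / 1.055f, 2.4f);
        }
        static float LinearToSRGB(float c) {
            return (c <= 0.0031308f) ? 12.92f * c
                                     : 1.055f * std::pow(c, 1.0f / 2.4f) - 0.055f;
        }

        int main() {
            const float stored  = 0.73f;                 // value in the sRGB texture
            const float linear  = SRGBToLinear(stored);  // ~0.49, what the shader works with
            const float written = LinearToSRGB(linear);  // ~0.73 again, what lands in the sRGB back buffer
            std::printf("%f -> %f -> %f\n", stored, linear, written);
        }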
  4. Gamma correction - sanity check

    I do not know the exact linear-to-sRGB function; a gamma of 2.2 is a (cheap) approximation. You will see 186 on your display, but 128 in your shader calculations (see the edit to my previous post).
  5. Gamma correction - sanity check

    Assume you have sRGB textures and an sRGB back buffer. You write 128 (linear color space). The hardware performs the sRGB conversion: ~(128/255)^(1/2.2) * 255 = 186 (sRGB color space). If you sample that texel, the hardware again gives you ~(186/255)^(2.2) * 255 = 128 (linear color space), but if you just look at your screen you will see the value 186 (sRGB color space). Stated differently: you want to see the middle grey intensity between black and white, but the intensity response of your display is non-linear, so you adapt your linear intensity value of 128 to 186, which will appear halfway between black and white on your display.
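
    The same arithmetic as a tiny check program (gamma 2.2 approximation of the sRGB curve):

        #include <cmath>
        #include <cstdio>

        int main() {
            const float linear8 = 128.0f;  // intended linear intensity on an 8-bit scale
            const float srgb8   = std::pow(linear8 / 255.0f, 1.0f / 2.2f) * 255.0f;  // ~186, what is stored/displayed
            const float back8   = std::pow(srgb8   / 255.0f, 2.2f)        * 255.0f;  // ~128, what the shader reads back
            std::printf("%.0f -> %.0f -> %.0f\n", linear8, srgb8, back8);
        }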
  6. Gamma correction - sanity check

    That would always be the case. The linear and sRGB-corrected curves have the same value at the lowest (0) and highest (1 or 255) value, but the curves differ everywhere in between. I think the easiest thing to verify first is a simple sprite: create a sprite texture with an sRGB format, load the sprite as a resource with an sRGB format, and render that sprite (no further operations) to a back buffer with an sRGB format. You should see the same result as in your (sRGB-aware) image viewer.
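
    A minimal sketch of the texture half of that test; device, width, height and initial_data (a D3D11_SUBRESOURCE_DATA) are assumed to exist, and the swap chain / render target view needs an *_SRGB format as well:

        #include <d3d11.h>

        // Create the sprite texture with an sRGB format, so the hardware decodes
        // sRGB -> linear when the pixel shader samples it.
        D3D11_TEXTURE2D_DESC desc = {};
        desc.Width            = width;
        desc.Height           = height;
        desc.MipLevels        = 1;
        desc.ArraySize        = 1;
        desc.Format           = DXGI_FORMAT_R8G8B8A8_UNORM_SRGB;
        desc.SampleDesc.Count = 1;
        desc.Usage            = D3D11_USAGE_IMMUTABLE;
        desc.BindFlags        = D3D11_BIND_SHADER_RESOURCE;

        ID3D11Texture2D* sprite = nullptr;
        const HRESULT hr = device->CreateTexture2D(&desc, &initial_data, &sprite);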
  7. Gamma correction - sanity check

    I don't know Vulkan, but do these map to the similarly named DXGI_FORMATs? It seems strange that you only have BGRA and no RGBA; I needed the latter to use my back buffer as a UAV (though I no longer reuse the back buffer for intermediate calculations). But I don't get the combination of LDR and Forward+. Do you use Forward+ just because you have lots of lights per view but few per tile? If you had lots of lights per pixel, the light contributions would accumulate considerably, making LDR insufficient?
  8. Gamma correction - sanity check

    It is an approximation which can be good or bad for certain regions. If I do it manually, I use Frostbite's approximation. Though, I only convert color coefficients, which are used as multipliers for textures, from sRGB to linear color space on the application side. This does not result in a loss of precision, since you would typically use a float4 (which has more precision than a UNORM texel) for transferring such coefficients to the GPU. The hardware performs all the encoding and decoding between sRGB and linear color space for textures. You cannot simply do this manually in the shader, because the encoding and decoding are non-linear operations: sampling + conversion != conversion + sampling. The hardware performs the conversion before filtering, which is the correct order; if you do the conversion yourself, it happens after filtering, which is only correct if you restrict yourself to point sampling. You can also consider using this for the back buffer, though the transfer function is different?
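
    A sketch of such an application-side conversion, using the exact sRGB curve rather than Frostbite's cheaper approximation (the XMFLOAT4 overload is just for illustration):

        #include <DirectXMath.h>
        #include <cmath>

        // Exact sRGB -> linear transfer function.
        static float SRGBToLinear(float c) {
            return (c <= 0.04045f) ? c / 12.92f
                                   : std::pow((c + 0.055f) / 1.055f, 2.4f);
        }

        // Convert an artist-authored sRGB color coefficient to linear space on the
        // CPU before uploading it in a float4 constant buffer; alpha is not gamma
        // encoded, so it is passed through unchanged.
        static DirectX::XMFLOAT4 SRGBToLinear(const DirectX::XMFLOAT4& c) {
            return { SRGBToLinear(c.x), SRGBToLinear(c.y), SRGBToLinear(c.z), c.w };
        }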
  9. Gamma correction - sanity check

    If you use sRGB formats, the hardware will do the decoding and encoding between sRGB and linear color space for you. But at the very last step, where you transfer the content of your non-back buffer to your back buffer, you probably want to customize the common gamma value of 2.2. Some games provide the option to adjust the brightness: "move the slider until the image is barely visible". I basically work with half floats (16 bits) per color channel for storing my "images". Only at the very end of my pipeline do I transfer the HDR content to the LDR back buffer, after applying my final operations: eye adaptation, tone mapping and custom gamma correction.
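
    As an illustration of that last step, per channel and written in C++ for readability (in the engine this lives in the final pixel or compute shader; Reinhard is only a stand-in for the actual tone mapping operator, and gamma is the user-adjusted value, not necessarily 2.2):

        #include <algorithm>
        #include <cmath>

        static float ToneMapAndGammaCorrect(float hdr, float exposure, float gamma) {
            const float exposed = hdr * exposure;              // eye adaptation / exposure
            const float ldr     = exposed / (1.0f + exposed);  // Reinhard tone mapping -> [0,1)
            return std::pow(std::clamp(ldr, 0.0f, 1.0f), 1.0f / gamma);  // custom gamma encoding
        }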
  10. Gamma correction - sanity check

    Why not use a non-sRGB format for the back buffer to support custom gamma correction (brightness adjustment)? Why not use a half float per channel for HDR support? The buffers of your GBuffer that contain sRGB colors will benefit from UNORM_SRGB: using only 1 byte per channel to store data that lives in sRGB color space as if it were linear color space does not give enough precision (you would need roughly 10 bits per channel).
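
    To make that concrete, one possible (certainly not the only) set of format choices along those lines:

        #include <d3d11.h>

        // Non-sRGB back buffer, so a custom gamma can be applied in the final pass.
        constexpr DXGI_FORMAT kBackBufferFormat = DXGI_FORMAT_R8G8B8A8_UNORM;
        // Half-float intermediate render targets for HDR content.
        constexpr DXGI_FORMAT kHDRBufferFormat  = DXGI_FORMAT_R16G16B16A16_FLOAT;
        // sRGB for GBuffer channels that store colors (e.g. base color): 8-bit
        // linear storage of color data loses too much precision in the darks.
        constexpr DXGI_FORMAT kBaseColorFormat  = DXGI_FORMAT_R8G8B8A8_UNORM_SRGB;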
  11. DX11 Rendering CS output to backbuffer [SOLVED]

    Small note: ZeroMemory reads as if it was introduced for C# programmers wanting to write C++; you can just write = {} (braced initialization) instead.
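
    For example, with an arbitrary D3D11 descriptor:

        #include <windows.h>
        #include <d3d11.h>

        // Both zero out every member of the descriptor.
        D3D11_BUFFER_DESC desc1;
        ZeroMemory(&desc1, sizeof(desc1));

        D3D11_BUFFER_DESC desc2 = {};  // braced (value) initialization, no macro, no sizeof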
  12. __declspec(selectany)

    I noticed lots of such cases in DirectXMath (a header-only API) for common vectors and a giant list of predefined colors. I also just noticed that both flags are defaults for the Release configurations in Visual Studio 2017.
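
    For illustration, a hypothetical header-only constant in that style (g_Pi is a made-up name; DirectXMath wraps the same pattern in its XMGLOBALCONST macro):

        // some_header.hpp, included from multiple translation units.
        // __declspec(selectany) gives the definition "pick any" (COMDAT) linkage, so
        // the linker keeps a single copy instead of reporting duplicate symbols.
        extern const __declspec(selectany) float g_Pi = 3.14159265f;

        // Since C++17 the portable alternative is an inline variable:
        // inline constexpr float g_Pi = 3.14159265f;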
  13. __declspec(selectany)

    Very informative post. I didn't know about the anonymous namespace vs. static and the int x vs. extern int x{} constructs. Thank you very much for pointing these out.
  14. __declspec(selectany)

    Thanks for the clarifications and the analogies between functions and variables.
  15. Yes. (This is of course not an issue for your users; I just noticed it in the Visual Studio files.) My apologies, I didn't know.