When is sRGB conversion being done in Vulkan?


I'm in the process of moving my old Direct3D11 graphics code to Vulkan, and figured that I might as well get gamma correctness right from the beginning, but it's not quite working.

 

The problem is that the resulting image is too bright. All that happens right now is that objects with an albedo map are rendered to a G-buffer, which is then blitted to the swapchain. Many things can influence the final result, and since that result contradicts my intuition, there must be something I don't understand. I hope somebody can verify what is right and explain what is wrong.

 

Previously (which actually looked fine) I was using the first fit for the swapchain format and color space when creating the swapchain, which in my case meant a UNORM format. Vulkan offers no color space other than sRGB on my setup, and since the specification says the color space defines how the presentation engine interprets the data, I figured I should use an sRGB format as well to be correct, because I don't do any manual conversions.
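Roughly, the format selection looks like this (a simplified sketch, not my exact engine code; `physicalDevice` and `surface` are the usual handles, and preferring B8G8R8A8_SRGB is just one reasonable choice):

```cpp
#include <vulkan/vulkan.h>
#include <vector>

// Pick a swapchain surface format, preferring an sRGB format paired with
// the sRGB non-linear color space, so the presentation engine interprets
// the stored bytes as sRGB-encoded data.
VkSurfaceFormatKHR chooseSurfaceFormat(VkPhysicalDevice physicalDevice,
                                       VkSurfaceKHR surface)
{
    uint32_t count = 0;
    vkGetPhysicalDeviceSurfaceFormatsKHR(physicalDevice, surface, &count, nullptr);
    std::vector<VkSurfaceFormatKHR> formats(count);
    vkGetPhysicalDeviceSurfaceFormatsKHR(physicalDevice, surface, &count, formats.data());

    for (const VkSurfaceFormatKHR& f : formats) {
        if (f.format == VK_FORMAT_B8G8R8A8_SRGB &&
            f.colorSpace == VK_COLOR_SPACE_SRGB_NONLINEAR_KHR)
            return f;
    }
    return formats[0]; // the "first fit" fallback described above
}
```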

 

This change made my image too bright, so my thought was that maybe the textures had been loaded wrongly the whole time. For DDS textures the library provides the format, for which I use the Vulkan equivalent. For PNG textures I have to specify the format myself, which is RGBA8_UNORM. I was surprised that the textures were not sRGB, but I double-checked by opening the images in Visual Studio and RenderDoc.
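In other words, the decision happens at image creation: the format passed there determines whether sampling decodes sRGB to linear. A simplified sketch (the helper and its flag are illustrative, not my actual loader code):

```cpp
#include <vulkan/vulkan.h>

// Color/albedo textures authored in sRGB (most PNGs) want an _SRGB
// format so the hardware decodes to linear on sampling; data textures
// (normal maps, masks) want _UNORM so their bytes pass through unchanged.
VkFormat pickTextureFormat(bool isColorData)
{
    return isColorData ? VK_FORMAT_R8G8B8A8_SRGB
                       : VK_FORMAT_R8G8B8A8_UNORM;
}
```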

 

My G-buffer is RGBA8_UNORM.

 

All in all, I render linear textures to a linear G-buffer and then blit that to an sRGB swapchain with an sRGB color space. If I understand correctly, the hardware should do sRGB conversions where needed, most notably when blitting the G-buffer. To me this seems like it should work. The funny thing is that changing the G-buffer to sRGB produces the same result.
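The blit itself is straightforward; a simplified version of what I do (handles, sizes, and layouts are placeholders, and my reading of the spec's blit conversion rules is that an sRGB destination format causes linear values to be encoded on write):

```cpp
#include <vulkan/vulkan.h>

// Blit the full G-buffer color image to a swapchain image. With matching
// sizes no filtering actually occurs; any format conversion (including
// sRGB encoding on an sRGB destination) is done by the blit.
void blitToSwapchain(VkCommandBuffer cmd, VkImage gbuffer, VkImage swapchainImage,
                     int32_t width, int32_t height)
{
    VkImageBlit region{};
    region.srcSubresource = { VK_IMAGE_ASPECT_COLOR_BIT, 0, 0, 1 };
    region.srcOffsets[1]  = { width, height, 1 };
    region.dstSubresource = { VK_IMAGE_ASPECT_COLOR_BIT, 0, 0, 1 };
    region.dstOffsets[1]  = { width, height, 1 };

    vkCmdBlitImage(cmd,
                   gbuffer,        VK_IMAGE_LAYOUT_TRANSFER_SRC_OPTIMAL,
                   swapchainImage, VK_IMAGE_LAYOUT_TRANSFER_DST_OPTIMAL,
                   1, &region, VK_FILTER_NEAREST);
}
```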

 

What about shaders? If I sample an sRGB texture (which I don't right now), will the data be linear in the shader? If I write from the fragment shader, will there be any conversion depending on the render target format or does it just write the data as is, assuming that I have correctly converted it myself?

Data sampled from an sRGB image view is always converted to linear when read by a shader. Similarly, writing to an sRGB image view will convert the linear values written by the shader to sRGB automatically. Blending is also performed in linear space for sRGB render targets.

 

Compared to OpenGL, Vulkan's sRGB handling is much simpler. sRGB is just an encoding for color data, one with more precision near 0.0 and less near 1.0. That's all you really need to know to use it. It's similar to how 16-bit floats, 10-bit floats, RGB9E5, and so on work: you don't have to care about the actual encoding, just read and write linear colors. The intermediate encoding only affects the precision.
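To make "it's just an encoding" concrete, here are the standard sRGB transfer functions; this is the textbook piecewise definition, not code from either of us:

```cpp
#include <cmath>

// Standard sRGB transfer functions (per channel, values in [0, 1]).
// These are the conversions the hardware applies at the points
// described above; shaders only ever see the linear values.
float srgbToLinear(float s)
{
    return (s <= 0.04045f) ? s / 12.92f
                           : std::pow((s + 0.055f) / 1.055f, 2.4f);
}

float linearToSrgb(float l)
{
    return (l <= 0.0031308f) ? l * 12.92f
                             : 1.055f * std::pow(l, 1.0f / 2.4f) - 0.055f;
}
```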


I would be surprised if sRGB behaved differently in Vulkan vs. OpenGL vs. Direct3D. As the previous post mentioned, it's just an encoding. Chances are the result you are seeing is correct, but because you've been used to the incorrect behavior all along, the correct behavior now looks like an anomaly.

That said, with sRGB there are two things to consider: the shader inputs (textures) and the target of the shader output (render target, swapchain, etc.), so changing just your textures is only part of the puzzle. sRGB texture samples are converted to linear when read in a shader. When you write fragment output, the output is assumed to be linear; if the render target has an sRGB format, the conversion from linear to sRGB happens when the fragment is written to it, as in the sketch below.
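On the output side, the attachment format is the only thing that matters; everything else in this sketch is placeholder setup, not from your code:

```cpp
#include <vulkan/vulkan.h>

// With an sRGB attachment format, the fragment shader keeps writing
// linear values; the encode to sRGB happens when the value is stored,
// and blending (if enabled) operates on the decoded linear values.
VkAttachmentDescription makeSrgbColorAttachment()
{
    VkAttachmentDescription a{};
    a.format         = VK_FORMAT_B8G8R8A8_SRGB; // this triggers encode-on-write
    a.samples        = VK_SAMPLE_COUNT_1_BIT;
    a.loadOp         = VK_ATTACHMENT_LOAD_OP_CLEAR;
    a.storeOp        = VK_ATTACHMENT_STORE_OP_STORE;
    a.stencilLoadOp  = VK_ATTACHMENT_LOAD_OP_DONT_CARE;
    a.stencilStoreOp = VK_ATTACHMENT_STORE_OP_DONT_CARE;
    a.initialLayout  = VK_IMAGE_LAYOUT_UNDEFINED;
    a.finalLayout    = VK_IMAGE_LAYOUT_PRESENT_SRC_KHR;
    return a;
}
```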

Quoting the original post: "For PNG textures I have to specify the format myself which is RGBA8_UNORM. I was surprised that the textures were not sRGB, but I double checked opening the images in VS and RenderDoc."

Why would you expect it to be sRGB when you are not uploading the texture as such? Is RGBA8_UNORM an sRGB format?

If the data in the texture is color data, chances are it's sRGB. I would suggest reading this post, http://http.developer.nvidia.com/GPUGems3/gpugems3_ch24.html, as it may clear up a few questions you have.


Nice to see that my assumptions seem to be correct. The question was probably a bit ambiguous: the Direct3D version was likely correct, and "previously" referred to what I went with when first getting Vulkan up and running, not to the Direct3D version. I'm aware of what sRGB implies; I just wanted to know the particular conversion rules in Vulkan.

 

I have also tested a fullscreen pass that simply outputs a gradient, and comparing against reference pictures of correct gamma indicates that something is indeed wrong. At the end of the day I'm none the wiser and have probably overlooked something. I will keep searching for the issue and check back here in case somebody chips in with more suggestions, or with sources for the conversion rules (in the spec I only found conversion rules for blitting, nothing concerning shaders).
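For what it's worth, the gradient test can be sanity-checked numerically: using the transfer function posted above, a fragment writing linear 0.5 to an sRGB target should store roughly 188/255, not 128/255. A small self-contained check (the sample points are arbitrary choices of mine) to compare against a RenderDoc readback:

```cpp
#include <cmath>
#include <cstdio>

// Print the byte values an sRGB render target should store for a few
// linear fragment outputs, for comparison against a readback.
int main()
{
    const float linearValues[] = { 0.0f, 0.25f, 0.5f, 0.75f, 1.0f };
    for (float l : linearValues) {
        float s = (l <= 0.0031308f) ? l * 12.92f
                                    : 1.055f * std::pow(l, 1.0f / 2.4f) - 0.055f;
        std::printf("linear %.2f -> stored byte %d\n",
                    l, (int)std::lround(s * 255.0f));
    }
    return 0; // linear 0.50 should print ~188
}
```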

 

Quoting the reply above: "Why would you expect it to be sRGB when you are not uploading the texture as such? Is RGBA8_UNORM an sRGB format?"

 

The format I use is just a remnant of getting things up and running. I expected the texture to be sRGB because many images are stored with that encoding, was surprised that it was not, and could conclude that the texture itself wasn't the problem. I included that statement in the question for completeness, to show what I considered while trying to figure out what might be wrong.

Share this post


Link to post
Share on other sites

Your texture is almost certainly sRGB.

 

An image file normally contains a sequence of RGB triplets defining the colour of each pixel. However, this doesn't tell us exactly what 'shade' the colours are. A colour profile steps in here to define the spectral properties of the RGB values in the image; without a profile the numbers are meaningless.

 

Fortunately, sRGB is the colour space of the internet, so it's now pretty much universally accepted that any untagged image files (i.e. those with no embedded profile) can safely be assumed to be sRGB.
