sRGB -> Linear Space

Started by L. Spiro. 13 comments, last by InvalidPointer 12 years ago.
I had a conversation about this with my coworker today. Apparently the policy on this is not exactly clear.

If I convert a texture to linear space for proper lighting, then at the end, when all of the compositing is done, I convert back to sRGB space via sqrt() rather than pow( x, 1.0f / 2.2f ). This tidbit is not related.
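(For concreteness, the two encodes being compared, as a minimal C++ sketch; the helper names are mine:)

```cpp
#include <cmath>

// Cheap gamma-2.0 encode: a single sqrt.
float encode_gamma2(float linear) { return std::sqrt(linear); }

// The gamma-2.2 encode it approximates: a full pow.
float encode_gamma22(float linear) { return std::pow(linear, 1.0f / 2.2f); }
```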


My policy is that if you set the clear color to [0.5, 0.5, 0.5], and that clear color shows up in the final render, it should be shown as [0.5, 0.5, 0.5]; so you have to convert it to linear space before actually clearing the screen.
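(i.e., roughly this, sketched in C++ with a hypothetical GL-style clear; the decode assumes the gamma-2.0 scheme above:)

```cpp
#include <cmath>

// Decode a visually chosen colour into linear space before clearing, so that
// the sqrt() encode at the end of the frame round-trips it back to 0.5.
float decode_gamma2(float encoded) { return encoded * encoded; }

void clear_to_mid_grey() {
    const float g = decode_gamma2(0.5f); // 0.25 in linear space
    // glClearColor(g, g, g, 1.0f);      // hypothetical GL usage
    // glClear(GL_COLOR_BUFFER_BIT);
    (void)g;
}
```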

He agreed with that logic, but then he asked the killer question: how do I handle luminance?
Mine shifts up and down dynamically over 30 seconds, which means that even if I stick to this policy it is unlikely the final result will be [0.5, 0.5, 0.5] anyway.

But I consider dynamic luminance to be just another effect. Assuming it wasn’t there, the goal would be to map to [0.5, 0.5, 0.5] on output, and the same for the colors of lights, materials, etc. All of those should be converted to linear space in order to produce the output that we would assume the artists expected when they chose those colors.


How do dynamic luminance and tone mapping throw a kink into things, and what is the most bestest way to handle this?



L. Spiro

I restore Nintendo 64 video-game OST’s into HD! https://www.youtube.com/channel/UCCtX_wedtZ5BoyQBXEhnVZw/playlists?view=1&sort=lad&flow=grid

Do you actually use the clear-colour for artistic purposes? Does it matter if it changes over time?

If you're dynamically mapping luminance values differently over time, and your clear-colour is being treated as a linear luminance value, then surely it's correct for the clear-colour to appear as a different result at different times.
Moreover, if you're not treating the clear-colour as a linear luminance value (i.e. it's a "colour", not a "quantity of light"), then it's incorrect for you to place that value into a render-target that is being treated as if it holds linear light values.

If you really want a static, non-changing background colour, I would render your HDR scene over the top of a black clear-colour, and then composite the final result (re-mapped for dynamic luminance) over the top of a new background.
e.g. original render target:
    clear( 0, 0, 0, 0 );
    render linear content...
    tone-map into gamma-space

new render target:
    clear( 0.5, 0.5, 0.5 );
    pre-multiplied alpha blend( input = original render target );
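In D3D11 terms, the blend state for that final composite pass could look something like this sketch (assuming the tone-mapped target carries premultiplied alpha; not taken from any particular engine):

```cpp
#include <d3d11.h>

// Premultiplied-alpha composite: out = src + dest * (1 - src.a).
// Pixels the scene never touched (alpha 0) keep the grey clear colour.
D3D11_BLEND_DESC MakeCompositeBlendDesc() {
    D3D11_BLEND_DESC desc = {};
    desc.RenderTarget[0].BlendEnable           = TRUE;
    desc.RenderTarget[0].SrcBlend              = D3D11_BLEND_ONE;
    desc.RenderTarget[0].DestBlend             = D3D11_BLEND_INV_SRC_ALPHA;
    desc.RenderTarget[0].BlendOp               = D3D11_BLEND_OP_ADD;
    desc.RenderTarget[0].SrcBlendAlpha         = D3D11_BLEND_ONE;
    desc.RenderTarget[0].DestBlendAlpha        = D3D11_BLEND_INV_SRC_ALPHA;
    desc.RenderTarget[0].BlendOpAlpha          = D3D11_BLEND_OP_ADD;
    desc.RenderTarget[0].RenderTargetWriteMask = D3D11_COLOR_WRITE_ENABLE_ALL;
    return desc; // pass to device->CreateBlendState(...)
}
```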
If I convert a texture to linear space for proper lighting, then at the end, when all of the compositing is done, I convert back to sRGB space via sqrt() rather than pow( x, 1.0f / 2.2f ). This tidbit is not related.
On a side (not-related) note: technically, you're working with gamma-2.0 space, not sRGB. Also, gamma-2.2 space is an approximation of sRGB, but not quite the same either.
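To put rough numbers on the difference, here's a throwaway C++ comparison decoding a mid-grey of 0.5 with each curve:

```cpp
#include <cmath>
#include <cstdio>

int main() {
    const float c = 0.5f;
    const float gamma20 = c * c;                                 // 0.2500
    const float gamma22 = std::pow(c, 2.2f);                     // ~0.2176
    const float srgb    = (c <= 0.04045f)                        // exact sRGB decode
                        ? c / 12.92f
                        : std::pow((c + 0.055f) / 1.055f, 2.4f); // ~0.2140
    std::printf("gamma2.0=%f gamma2.2=%f sRGB=%f\n", gamma20, gamma22, srgb);
}
```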
Sorry, I made it seem as though I was focused on the clear color specifically. It was only the easiest example of a color an artist specifies as an input while expecting, under predictable circumstances (absent normal calculations, additions from other inputs, etc.), the same output.
This question is really about all colors that are not the texture itself. Materials, lights, everything.

As for the rest, one of the reasons I mentioned my conversion method was to indicate that I am using approximations. Going by the OpenGL documentation on how they convert from sRGB to linear space, I know that 2.2 is also an approximation, and I am fairly sure that even their more in-depth conversion is itself still an approximation. I have entertained the idea of using better approximations for my materials, lights, clear colors, etc., since they are only computed once per render rather than per pixel, but I am not sure there would be any gain, so that experiment is low priority.


For anyone curious, the person at the office I asked about this today is the taller of the two pictured here:
http://game.watch.impress.co.jp/docs/series/3dcg/20100401_358445.html
He is also the one mainly responsible for the graphics in this video: [embedded video]
The shorter of the two pictured gave a presentation at GDC last week. I hope anyone who saw it enjoyed it.

So basically the guy I asked knows his stuff, but I can’t shake the feeling that this needs more investigation.


L. Spiro


Yeah, it doesn't really matter which colour-space you work in (2.0, 2.2, sRGB, etc.), just as long as you're consistent.
If some places are encoding with pow( x, 1.0/2.2 ) while other places are decoding with pow( x, 2.0 ) (the inverse of a sqrt() encode), then you're in trouble.
---------

IMHO, if you're making a physically based renderer, then your artists have to learn that they're not painting colours any more, but are painting physical properties.
The main "colour texture" or "material colour" (diffuse colour) is actually the surface albedo. The "specular mask" isn't some arbitrary percentage, it's a function of the IOR's real-part. The "specular power" isn't some arbitrary number, it's an encoding of the roughness derived from averaging the microscopic normal map. The "light colour" isn't a colour, it's a spectrum of intensity values. etc, etc... All of these are then combined to create final light intensity values, which we then run through a simulation of a camera or eye-ball (tone-mapping) before displaying them.

This doesn't mean they can't use a traditional colour-picker to choose these values -- albedo is a percentage, and has a lot of dark values, so it's a good fit for being stored in sRGB space, "Light colours" can be represented as an sRGB colour + an intensity value, etc... -- but you should never really expect any of your input values to look exactly the same as your final results...
...unless you're rendering an extremely unrealistic material (an impossible IOR producing a pure diffuse/Lambertian surface) that is pointing directly at a light source with an intensity of 1.0 and no attenuation, while the tone-mapper simply encodes linear values into the same colour space that your artists' monitors are tuned to.
As a bit of an addendum, there's no real reason why you can't pick either sRGB or straight linear for any kind of texture. In fact, you can probably get better quality by making this decision at asset-cook time by way of a histogram: look at the values you're storing (this works best if you use a high-precision source format; I prefer 16-bit normalized integer or 32-bit floating point per channel, depending on the semantic type of the texture in question) and make the final decision based on what fraction of the total pixels sit above some threshold. Crysis 2 does something pretty similar, and the results are *very* impressive: with standard DXT1/DXT5 textures and an additional histogram denormalization you can get very, very good quality even at insane exposures.
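Something in the spirit of that cook-time decision, as a C++ sketch; the 0.5 threshold and 3/4 fraction are invented placeholders, not Crysis 2's actual numbers:

```cpp
#include <cstddef>

// Cook-time heuristic: if most of a texture's pixels are dark, the extra
// precision sRGB spends near black is worth it; otherwise store linear.
enum class Storage { Srgb, Linear };

Storage choose_storage(const float* pixels, std::size_t count) {
    std::size_t dark = 0;
    for (std::size_t i = 0; i < count; ++i)
        if (pixels[i] < 0.5f) ++dark; // made-up darkness threshold
    return (dark >= count * 3 / 4) ? Storage::Srgb : Storage::Linear;
}
```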
clb: At the end of 2012, the positions of jupiter, saturn, mercury, and deimos are aligned so as to cause a denormalized flush-to-zero bug when computing earth's gravitational force, slinging it to the sun.

As a bit of an addendum, there's no real reason why you can't pick either sRGB or straight linear for any kind of texture. In fact, you can probably get better quality by making this decision at asset-cook time by way of a histogram: look at the values you're storing (this works best if you use a high-precision source format; I prefer 16-bit normalized integer or 32-bit floating point per channel, depending on the semantic type of the texture in question) and make the final decision based on what fraction of the total pixels sit above some threshold. Crysis 2 does something pretty similar, and the results are *very* impressive: with standard DXT1/DXT5 textures and an additional histogram denormalization you can get very, very good quality even at insane exposures.

I plus-1’ed you because this is a very constructive comment; the information in it is what I consider to be a rare gem.
Unfortunately it is a gem I already knew, but I ++rep’ed you because it is a gem that should be more widely known.
My own engine has the histogram functions needed for this in place, but not the toolchain required to use them. I also mentioned this exact technique to the staff of Square Enix, but they were more interested in my knowledge of how “Megatextures” work (from id Software).
I do know how megatextures work, and I tried to caution them against the use of them. And failed the interview as a result. It was literally the turning point of the interview. I was applying to be their next next-gen game-engine programmer and had passed all previous interviews.

Unfortunately, due to non-disclosure agreements I can’t say much else, except that my current company is Square Enix’s best friend in the business, and whether they wanted me to or not, I am making the engines that are used in their Final Fantasy games and everything else they are currently doing.

Just something to get off my chest, I guess.


Anyway good to point out that little trick, but here it won’t help me. It should help in the future though.


L. Spiro

I restore Nintendo 64 video-game OST’s into HD! https://www.youtube.com/channel/UCCtX_wedtZ5BoyQBXEhnVZw/playlists?view=1&sort=lad&flow=grid


I do know how megatextures work, and I tried to caution them against the use of them. And failed the interview as a result. It was literally the turning point of the interview. I was applying to be their next next-gen game-engine programmer and had passed all previous interviews.

With respect to Squeenix, what a phenomenally stupid decision. I personally think MegaTextures are pretty neat (the new Radeons do them in hardware! :D) but were I in the interviewer's shoes I'd probably just get into a very heated debate :)

On-topic: So if I'm understanding this correctly, you're trying to make the actual, on-monitor clear color constant regardless of rendering settings? I'm still a little murky on the end goal here.

IMHO, if you're making a physically based renderer

I almost replied saying I’m not, but I guess that is not accurate, since I am working on the company’s game engine.
So technically I am making a physically-based renderer, but my own engine is not one.
I might shift it over to physically-based in the future, but for now it is fairly standard, with HDR recently added.



On-topic: So if I'm understanding this correctly, you're trying to make the actual, on-monitor clear color constant regardless of rendering settings? I'm still a little murky on the end goal here.

As mentioned above, the clear color was simply the easiest example I could provide of inputs matching outputs. All other kinds of values get modified by normals, combined with other inputs, etc.


So basically I know that tone-mapping will prevent the possibility of getting the same output as your input, which is my coworker’s reason for skipping the conversion to linear space when setting light colors and material colors. Note that their toolchain allows the artist to set a value to indicate whether it should be converted or not, but here I am simply exploring the concept.

Regardless of tone-mapping changes to the final result, my concept is that every pixel on the screen will go through sqrt() before being presented, and thus should go through pow( x, 2.0 ) before being input.
Regardless of whether it came from a texture (that was not already in linear space), a material, a light color, the clear color, etc.
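Per channel, that concept looks something like this sketch (light() is a hypothetical stand-in for the actual shading):

```cpp
#include <cmath>

// Stand-in for the real shading math, which must happen in linear space.
float light(float linearAlbedo) { return linearAlbedo * 0.8f; /* hypothetical */ }

float shade(float authoredColour) {
    const float linearIn  = authoredColour * authoredColour; // pow( x, 2.0 ): decode
    const float linearOut = light(linearIn);                 // all math in linear space
    return std::sqrt(linearOut);                             // sqrt(): encode for display
}
```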

And what I want to clear up is whether or not that is a valid line of logic and whether or not it is canceled out by tone-mapping.
That is, does tone-mapping implicitly perform a conversion to (or close to) sRGB while also performing its function of brightening/darkening the screen, thus making my sqrt() an incorrect addition to the pipeline?


L. Spiro


Regardless of tone-mapping changes to the final result, my concept is that every pixel on the screen will go through sqrt() before being presented, and thus should go through pow( x, 2.0 ) before being input.
Regardless of whether it came from a texture (that was not already in linear space), a material, a light color, the clear color, etc.
It's probably obvious, but to be explicit: the point of the pow( x ) decode and pow( 1/x ) encode is to make sure that all of your lighting math is done in linear space, and that your final output matches the colour-space of the user's monitor. That's all.

So, if your inputs (textures, materials, etc.) are linear values, then you don't need the pow( x ) decode.

If a value has been chosen visually -- e.g. an artist looked at their colour-picker GUI to choose the value -- then that value is encoded in whatever colour space the artist's monitor was tuned to (probably not linear).
The most common monitor colour-spaces are gamma-1.8, gamma-2.2 and sRGB, so any value that's chosen from a monitor is probably in one of these colour-spaces. In a professional studio, you should ensure that all of your artists have monitors that are calibrated to the same colour-space (preferably sRGB).

If your artists are authoring their data on monitors that are calibrated to the sRGB standard, but you want to use the computationally efficient gamma-2.0 colour-space instead (i.e. the x*x decode and sqrt() encode), then you should convert all of their values from sRGB to gamma-2.0 before using them in your engine:
e.g. gamma2 = sqrt( (srgb <= 0.04045) ? srgb / 12.92 : pow( (srgb + 0.055)/1.055, 2.4 ) );
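Spelled out with its inverse as plain C++, in case it's useful for tools (the 0.0031308 threshold is the standard linear-side sRGB constant):

```cpp
#include <cmath>

// sRGB-encoded value -> gamma-2.0-encoded value (decode exactly, re-encode cheaply).
float srgb_to_gamma2(float srgb) {
    const float linear = (srgb <= 0.04045f)
                       ? srgb / 12.92f
                       : std::pow((srgb + 0.055f) / 1.055f, 2.4f);
    return std::sqrt(linear);
}

// And back again, for round-tripping in a toolchain.
float gamma2_to_srgb(float g2) {
    const float linear = g2 * g2;
    return (linear <= 0.0031308f)
         ? linear * 12.92f
         : 1.055f * std::pow(linear, 1.0f / 2.4f) - 0.055f;
}
```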

So, yes, most colours will need to undergo the gamma-space to linear-space (e.g. pow( x )) conversion when being used, because most colours are chosen visually. However, not every value requires this treatment.

For example, I may want to clear the background of an image to a "glowing white" colour; in this case, I might want to call clear( 1000, 1000, 1000 ). This value is obviously a linear light value, and colour-space conversions do not apply to linear light values like this.
Or, another example: the results from a radiosity/light baker represent linear light values from 0 to infinity, which I might save in a floating-point texture. When using these values, you would not want to perform a pow( x, 2 ) on them!
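One way to keep that straight in code is to tag each authored value with its encoding; a trivial sketch:

```cpp
// Tag each input with its encoding so linear light values
// (HDR clear colours, baked radiosity) are never mistakenly decoded.
enum class Encoding { Gamma2, Linear };

float to_linear(float value, Encoding enc) {
    return (enc == Encoding::Gamma2) ? value * value : value;
}
```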

And what I want to clear up is whether or not that is a valid line of logic and whether or not it is canceled out by tone-mapping.
That is, does tone-mapping implicitly perform a conversion to (or close to) sRGB while also performing its function of brightening/darkening the screen, thus making my sqrt() an incorrect addition to the pipeline?

Most tone-mappers I've looked at take a 0-to-infinity linear value as input and produce a 0-1 linear value as output. However, I have seen some tone-mappers that produce a 0-1 sRGB value as output, so it depends on the tone-mapping algorithm.
If it's a simple linear conversion, e.g. output = saturate( input * exposure ), then it definitely doesn't account for sRGB encoding.

And what I want to clear up is whether or not that is a valid line of logic and whether or not it is canceled out by tone-mapping.
That is, does tone-mapping implicitly perform a conversion to (or close to) sRGB while also performing its function of brightening/darkening the screen, thus making my sqrt() an incorrect addition to the pipeline?

What Hodgman said. I know John Hable's ALU-based filmic tonemapper produces results in gamma-2.2 space (NOT the same as sRGB, but close), but the more boring stuff like Reinhard is linear. Considering that tone mapping itself isn't very rigorous in the 'physically plausible' sense, film simulation aside, you can probably get away with treating the result as either for artistic effect.
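For reference, the ALU curve in question is the widely circulated Hejl/Burgess-Dawson approximation from Hable's talk; note that the output is already roughly gamma-2.2 encoded, so no sqrt()/pow() afterwards:

```cpp
#include <algorithm>

// Hejl/Burgess-Dawson filmic approximation.
// Output is already roughly gamma-2.2 encoded; do NOT gamma-encode it again.
float filmic_tonemap(float hdr) {
    const float x = std::max(0.0f, hdr - 0.004f);
    return (x * (6.2f * x + 0.5f)) / (x * (6.2f * x + 1.7f) + 0.06f);
}
```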


The most common monitor colour-spaces are gamma-1.8, gamma-2.2 and sRGB, so any value that's chosen from a monitor is probably in one of these colour-spaces. In a professional studio, you should ensure that all of your artists have monitors that are calibrated to the same colour-space (preferably sRGB).

What's your stance on embedding a color-management utility into the editor and using that to convert to sRGB/linear? While I can certainly concede that proper calibration is a Very Good Idea™ and should probably be done anyway, that solution seems vastly more robust.

EDIT: I mean grab the current monitor calibration for conversion purposes. I think Windows will let you do this, and the more I talk about it, the more I get the urge to try to implement something like this.
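If anyone wants to try: on Windows, the current monitor's ICC profile path can be queried with GetICMProfile, roughly like this (error handling omitted, and I haven't battle-tested it):

```cpp
#include <windows.h>
#include <cstdio>

int main() {
    HDC dc = GetDC(nullptr);             // device context for the primary display
    WCHAR path[MAX_PATH];
    DWORD size = MAX_PATH;
    if (GetICMProfileW(dc, &size, path)) // ICC profile assigned to this monitor
        std::wprintf(L"%s\n", path);
    ReleaseDC(nullptr, dc);
}
```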

