You are wrong. sRGB is an application of standardization to RGB color space and it is defined by three primaries in CIE XYZ color space. The transformation between linear and non-linear color spaces is an entirely different topic. I've already said this before.
If you don't believe Wikipedia, check its references.
In this case, the sRGB standard is defined by IEC 61966-2-1:1999, which you can view a draft copy of for free here. Yes, it's defined by three XYZ primaries, and a non-linear transformation of those primaries (which is similar to a "gamma 2.2" adjustment).
The OP was specifically asking about sRGB in OpenGL; you can read their definitions of the sRGB transform here.
Those three documents describe the same non-linear transforms that appear on Wikipedia... but because I've linked to them on the internet instead of quoting an ISBN, you don't believe them?
So either you're saying these documents are wrong (and that when I sample from an sRGB texture in my fragment shader, no non-linear transform of the texture data will take place), or that these documents are wrong to call this colour space sRGB, and they've actually misappropriated the name.
If the former, you can be refuted by experiment; if the latter, then it's irrelevant, as the OP was asking about the "sRGB" space that's used by GL/D3D, which is also known as IEC 61966-2-1:1999.
I don't know what "sRGB" you're talking about, but what you've described is definitely not it.
Perhaps this whole time you've been describing the "linear RGB" space that's defined as an intermediate conversion between XYZ and sRGB, which sounds likely. The point is that we want to be doing our shading math in this linear RGB space (at a high bit depth), but usually our input and output formats are sRGB, so we require the non-linear conversion (which approximates a "gamma 2.2" correction, as mentioned in the specification).
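For reference, here's a quick sketch of that non-linear encode step (my own helper function, written from the piecewise curve in the spec; overall it's close to pow(c, 1.0/2.2), but not identical):

#include <math.h>

/* linear -> sRGB encode, per the piecewise definition in the spec.
   Inputs/outputs are normalised to the 0.0 - 1.0 range. */
float linear_to_srgb(float c)
{
    if (c <= 0.0031308f)
        return 12.92f * c;                        /* linear segment near black */
    return 1.055f * powf(c, 1.0f/2.4f) - 0.055f;  /* "gamma 2.4" segment, scaled and offset */
}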
In fact, I think mentioning gamma in sRGB discussion is not relevant.
The fact is that in OpenGL and D3D, sRGB and gamma are related concepts, because, as described in the above specification, sRGB is similar to a gamma 2.2 curve...
When I read a texel from an OpenGL sRGB texture, the non-linear transform described on the Wikipedia page (which can be approximated as x^2.2) is applied to it automatically.
When I write a pixel to an OpenGL sRGB render-target, the non-linear transform described on the Wikipedia page (which can be approximated as x^(1/2.2)) is applied to it automatically.
So if I assume that my input textures were authored in the sRGB space (or on a CRT, and accept that CRTs are close enough to sRGB displays), and assume that the user's output display is an sRGB device, then by using sRGB textures and render-targets, I can perform all of my fragment-shading math in a linear colour space automatically (but still have non-linear inputs and outputs), thanks to GL natively supporting these transforms.
This is the purpose of sRGB formats in OpenGL, as shown by the above OpenGL specifications.
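As a rough sketch of what opting in to this looks like (assuming a GL 3.0+ context, or the EXT_texture_sRGB / ARB_framebuffer_sRGB extensions; pixels, w and h stand in for your own image data):

/* Tell GL that the texel bytes are sRGB-encoded; sampling the texture
   in a shader then returns linear values automatically. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_SRGB8_ALPHA8, w, h, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixels);

/* Ask GL to re-encode fragment outputs into sRGB when writing to an
   sRGB-capable framebuffer/render-target. */
glEnable(GL_FRAMEBUFFER_SRGB);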
you feel that you are right because you are a moderator
No, that's insulting. I'm quoting the sRGB standard, and you're saying I'm wrong, the standard is wrong, and nVidia, ATI and Microsoft are wrong too. That's pretty simple. Why are you so opposed to learning about sRGB?
P.S. you might want to read some earlier versions of Wikipedia sRGB entry.
That old version still describes the exact same linear transformation from XYZ followed by a non-linear transformation!!! How can you post this stuff up and still argue that it's a linear space? Now I think you're just trolling...
You have said that I'm wrong and failed to give any reasonable evidence to support your points.
The Wikipedia page that I linked to contains proper references, shown above. Where is your evidence that sRGB is just a linear transform from XYZ, with no non-linear part to it?
As well as making this obviously false claim, you've attacked an nVidia and Microsoft article without stating any actual points against them or providing evidence. You've also made claims about the purpose/usefulness of sRGB resources in GL/D3D without providing any evidence to back them up.
Read the sRGB specification already.
The end result is that when sRGB is viewed on CRT, the viewed gamma appears as 2.2,
The display has nothing to do with it -- arguing about what a signal looks like when plugged into a display of a different colour space is irrelevant.
e.g. "oh, I sent the HSV bytes of [0,0,100] to my RGB CRT and it came out blue!"
Yes, CRTs often work in a vague "RGB gamma 2.2" colour space -- however, this is actually a good approximation of the sRGB colour space (sRGB was inspired by CRTs), so sRGB images look almost correct when viewed on these displays... To display them truly correctly, in theory you should decode the sRGB signal and re-encode it in the monitor's colour space (though in practice, with 8-bit inputs, this will do more harm than good), but I assume you already know this -- e.g.
if( srgb <= 0.04045 )          /* decode sRGB -> linear, per the spec */
    linear = srgb / 12.92;
else
    linear = pow( (srgb + 0.055)/1.055, 2.4 );
CRT = pow( linear, 1.0/2.2 );  /* re-encode for a "gamma 2.2" CRT */
If you still don't believe that doing math in curved spaces gives wrong results, and that sRGB is a curved space, despite the specification saying so, try it for yourself:
* Pick any two XYZ colours, Axyz and Bxyz
* Convert them to "linear RGB" (as defined in the first part of the sRGB specification) to get Alinear and Blinear
* Compute their average, Clinear
* Convert Alinear and Blinear to sRGB following the full sRGB specification to get Asrgb and Bsrgb
* Compute their average, Csrgb
* Convert Csrgb back into "linear RGB" and compare against Clinear -- it will be very wrong in most cases.
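If you'd rather see it in code, here's a rough C sketch of that experiment. The matrix and curve constants are the commonly published sRGB ones, and the two XYZ input colours are arbitrary picks (chosen so they land inside the sRGB gamut):

#include <math.h>
#include <stdio.h>

/* XYZ -> linear sRGB, using the commonly published D65 matrix. */
static void xyz_to_linear(const double xyz[3], double rgb[3])
{
    rgb[0] =  3.2406*xyz[0] - 1.5372*xyz[1] - 0.4986*xyz[2];
    rgb[1] = -0.9689*xyz[0] + 1.8758*xyz[1] + 0.0415*xyz[2];
    rgb[2] =  0.0557*xyz[0] - 0.2040*xyz[1] + 1.0570*xyz[2];
}

/* The non-linear encode/decode pair from the spec (roughly "gamma 2.2"). */
static double to_srgb(double c)   { return c <= 0.0031308 ? 12.92*c : 1.055*pow(c, 1.0/2.4) - 0.055; }
static double to_linear(double c) { return c <= 0.04045   ? c/12.92 : pow((c + 0.055)/1.055, 2.4); }

int main(void)
{
    double Axyz[3] = { 0.25, 0.30, 0.35 };   /* arbitrary in-gamut colours */
    double Bxyz[3] = { 0.55, 0.50, 0.20 };
    double Alinear[3], Blinear[3];
    xyz_to_linear(Axyz, Alinear);
    xyz_to_linear(Bxyz, Blinear);

    for (int i = 0; i < 3; ++i)
    {
        double Clinear = 0.5*(Alinear[i] + Blinear[i]);                    /* average in linear space */
        double Csrgb   = 0.5*(to_srgb(Alinear[i]) + to_srgb(Blinear[i]));  /* average in sRGB space   */
        printf("channel %d: average in linear = %f, sRGB average decoded back = %f\n",
               i, Clinear, to_linear(Csrgb));
    }
    return 0;
}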
The reason the GPU hardware supports sRGB as a native colour space now is so that we can use sRGB data for storage and display while performing our math in a linear colour space, without having to pay the cost of transforming back and forth between the two colour spaces constantly (the hardware makes the conversion 'free'). This is a huge deal, because, as the checkerboard image from earlier shows, math in sRGB space makes no sense.