# Linear color space



### #1larspensjo  Members   -  Reputation: 1557

Posted 17 September 2012 - 02:34 AM

I am trying to learn about when to use, and when to not use, linear and non-linear color space. This is how I understand it, please correct me where I am wrong or have incomplete understanding:
• Many texture manipulations need to be done in linear space, e.g. anti-aliasing and lighting calculations.
• Many tools, like Photoshop, save pictures in non-linear format by default.
• You can specify sRGB as a bitmap format (e.g. GL_SRGB in OpenGL to glTexImage2D), and the graphics drivers (or the hardware?) will automatically transform the bitmap you sample from non-linear to linear.
• If you transform it yourself, you do that by raising each color component to the power 2.2 (c^2.2). This is an approximation of the sRGB transform.
• You can transform each color channel independently of the others.
• As a last step, when outputting the pixels to the screen, you need to transform them back into non-linear space, using c^(1/2.2) for each channel.
• The value 2.2 depends on the display you use. It looks like Apple uses 1.8.
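The approximate conversions in the list above can be sketched as follows (a hedged sketch; as discussed later in the thread, the exact sRGB transform is piecewise, not a pure power curve):

```python
def approx_srgb_to_linear(c, gamma=2.2):
    """Approximate decode: sRGB-encoded channel value (0..1) -> linear."""
    return c ** gamma

def approx_linear_to_srgb(c, gamma=2.2):
    """Approximate encode: linear channel value (0..1) -> sRGB."""
    return c ** (1.0 / gamma)

# Each channel is converted independently of the others.
pixel = (0.5, 0.25, 1.0)                  # sRGB-encoded RGB
linear = tuple(approx_srgb_to_linear(c) for c in pixel)
back = tuple(approx_linear_to_srgb(c) for c in linear)  # round-trips to the original
```

Decoding darkens values (an encoded 0.5 becomes roughly 0.22 linear), which is why skipping this step makes lighting math subtly wrong.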
I am not sure about the glGenerateMipmap() function. Will it take the linear/non-linear attribute (SRGB) into account when applying the filter functions?

Is the approximation above good enough for showing pixels on the screen? Or does the exact sRGB encoding need to be used?

Is there any automatic support in the hardware for the final pixel transformation to show on screen?

In most example and tutorials "out there", you don't find any gamma correction being done. So either I don't understand this, or there is a general lack of understanding elsewhere (or something in between :-).
Current project: Ephenation.
Sharing OpenGL experiences: http://ephenationopengl.blogspot.com/

### #2Hodgman  Moderators   -  Reputation: 31938

Posted 17 September 2012 - 05:02 AM

It sounds like you understand it correctly.

In most example and tutorials "out there", you don't find any gamma correction being done. So either I don't understand this, or there is a general lack of understanding elsewhere

Yes, "gamma-correct rendering" has only become popular in real-time graphics over the past 5 or so years (guessing), along with the growth in popularity of HDR and physically-based lighting. A lot of older real-time rendering literature ignores gamma issues.

I am trying to learn about when to use, and when to not use, linear and non-linear color space

You can think of sRGB as a "compression" format for the number of bits required to store an image without colour banding. Humans are good at differentiating between different colours, especially dark colours. The sRGB curve allocates "more bits" to representing darker colours, which means it allows 8-bit images to have less noticeable colour-banding in dark areas.
I've seen it stated that to get the same precision in dark areas, a linear image would need 10-16 bits.
The bad thing is that doing math in curved spaces is difficult -- we're used to flat spaces, like the number-line, or the cartesian plane, but "gamma spaces" (like sRGB, or "gamma 2.2") aren't flat, they're curved.
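One way to see the "more bits for dark colours" point is to count how many of the 256 8-bit codes land in the darkest part of the range under each encoding (a rough sketch using the exact sRGB decode):

```python
def srgb_to_linear(s):
    """Exact sRGB decode for a channel value in 0..1 (IEC 61966-2-1)."""
    return s / 12.92 if s <= 0.04045 else ((s + 0.055) / 1.055) ** 2.4

# Count the 8-bit codes whose value falls in the darkest 5% of linear intensity.
dark_linear = sum(1 for i in range(256) if i / 255.0 <= 0.05)
dark_srgb = sum(1 for i in range(256) if srgb_to_linear(i / 255.0) <= 0.05)

# A linear 8-bit encoding spends 13 codes on that range; sRGB spends 64,
# so dark gradients band far less in 8-bit sRGB.
```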

I am not sure about the glGenerateMipmap() function. Will it take the linear/non-linear attribute (SRGB) into account when applying the filter functions?

OpenGL and Dx10/11 should do this correctly (convert to linear, downsample, convert to sRGB), but DX9 does not.

You can specify SRGB as a bitmap format (e.g. GL_SRGB in OpenGL to glTexImage2D), and the graphic drivers (or the hardware?) will automatically transform the bitmap you sample from non-linear to linear.

When you sample from the texture, the texture-fetch hardware will apply the inverse sRGB curve to convert it from 8-bit sRGB to floating-point linear.

As a last step, outputting the pixels to the screen, you need to transform it back into non-linear space, using c^(1/2.2) for each channel.

If your render-target is created as an sRGB texture, then when you write to it, the hardware will perform the linear->sRGB conversion when writing values from your pixel shader automatically.
It's common to just assume that the user's monitor is an sRGB monitor (because, it's pretty much *the* standard), but yes, a lot of monitors aren't actually sRGB -- I've seen gamma 2.4 and gamma 1.8 monitors before. To get the correct appearance on these monitors, it would be better to manually convert to e.g. gamma 1.8 rather than to convert to sRGB.

Many tools, like Photoshop, save pictures in non-linear format by default

If you're painting a picture in an application that doesn't do any "gamma correction", then the data in your file is in the same "gamma space" as your monitor.
That is to say -- the image will only look the way that I saw it (while painting it), if it's displayed on another monitor with the same gamma-curve. If I paint my artwork on a gamma-1.8 screen, and then view it on an sRGB screen, it will look different (because the original data was painted in the "gamma 1.8 space").
For this reason, it's common for games studios to buy expensive calibration equipment to make sure that all of their artists' monitors are correctly calibrated to the sRGB curve.

Edited by Hodgman, 17 September 2012 - 05:12 AM.

### #3Lifepower  Members   -  Reputation: 118

Posted 17 September 2012 - 07:30 AM

Just wanted to clarify this one. For some reason, they have used "sRGB" to denote "linear color space" in DirectX and OpenGL, which are just two separate things.

Indeed, you can convert from linear to non-linear color spaces and vice-versa by using Gamma correction.

RGB color space by itself lacks any standard or definition, so sRGB was proposed as a standard, which is defined by specifying a white point and three chromaticities. For instance, there are also Wide Gamut RGB, Adobe RGB, and so on.

Now, for the conversion from one color space to another where the color gamut is different, you would need to convert your initial color space to CIE XYZ using a linear transformation and then to the desired color space.

This is why it is simply wrong to call sRGB "linear" and non-sRGB "non-linear" and do the conversion between both using gamma correction. In reality, both typical RGB and sRGB may or may not be linear.

In fact, typically, you can assume that your RGB color space is actually linear. You don't need to voluntarily apply any gamma correction there. Since it lacks a standard definition, you can simply assume that when you work with RGB, you work in sRGB, or in Adobe RGB - whatever your choice is. In order to properly standardize your color space, you would need to convert it to one of the (supposedly) perceptually uniform color spaces such as CIELAB, CIELUV, DIN99, ATD95, CIECAM, or at least CIE XYZ, which can actually represent all colors visible to the human eye, unlike RGB, which is limited to a triangle in the CIE diagram.

Now, the problem is that most LCD displays apply huge gamma correction to the input image. Not only that, they may also pre-process images and oversaturate them too. Why? To sell better, since higher contrast and crisper images appear prettier, but in the end you receive a very distorted image. This is not your problem, it is a problem of display manufacturers and vendors! You simply can't make an application that will predict all of the monitors out there, so it's their responsibility to generate the final image as accurately as possible.

I don't know why they introduced "sRGB" into DirectX and OpenGL - after all, supposedly, you are already working in sRGB and it's the display's job to properly represent input sRGB data so that the output strictly conforms to sRGB, or any other standard. If you do gamma correction in your application - well, you still don't know how the display is going to re-transform your image data, so in the end you may actually get less accurate results.

My guess is that they introduced so-called "sRGB" in APIs just for the hype of it, e.g.: "We can now store textures and front-buffer in gamma-adjusted format! WOW!" (like we couldn't do it back in 1969).

You may check some of the following bibliography to find out more about different color spaces (you can see from the dates that this is a well-studied topic, yet it seems that the people making changes in the DirectX/OpenGL standards regarding sRGB have never read them):

1. Poynton, Charles. Digital Video and HDTV Algorithms and Interfaces. Morgan Kaufmann, 2003.
2. Hill, Francis S. Computer Graphics using OpenGL. Prentice Hall, 2000.
3. Hearn, Donald, and Pauline M. Baker. Computer Graphics, C Version. Prentice Hall, 1996.
4. Luo, Ronnier M., Guihua Cui, and Changjun Li. "Uniform Colour Spaces Based on CIECAM02 Colour Appearance Model." Color Research & Application (Wiley InterScience) 31, no. 4 (June 2006): 320-330.
5. Lindbloom, Bruce J. "Accurate Color Reproduction for Computer Graphics Applications." Computer Graphics 23, no. 3 (July 1989): 117-126.
6. Brewer, C. A. "Color Use Guidelines for Data Representation." Proceedings of the Section on Statistical Graphics. Alexandria VA: American Statistical Association, 1999. 55-60.
7. MacAdam, David L. "Visual Sensitivities to Color Differences in Daylight." Journal of the Optical Society of America 32, no. 5 (May 1942): 247-273.
8. Schanda, Janos. Colorimetry: Understanding the CIE System. Wiley Interscience, 2007.
9. Pratt, William K. Digital Image Processing. 3rd Edition. Wiley-Interscience, 2001.
10. Jack, Keith. Video Demystified: A Handbook for the Digital Engineer. 5th Edition. Fremont, CA: Newnes, 2007.

### #4Hodgman  Moderators   -  Reputation: 31938

Posted 17 September 2012 - 07:51 AM

Just wanted to clarify this one. For some reason, they have used "sRGB" to denote "linear color space" in DirectX and OpenGL, which are just two separate things.

No, sRGB is not linear color space so your accusations levelled against DirectX/OpenGL are wildly inaccurate.

For why it was so important to add sRGB support to GPUs, read this primer: http://http.develope...gems3_ch24.html

Yes, the sRGB standard does define a standard linear RGB space (as an intermediate step) based on a standardized red/green/blue linear transform from CIE XYZ space...
...but sRGB is a curved "gamma corrected" transformation of these standard linear RGB values.
sRGB isn't even a simple "gamma correction" transform. It's similar to gamma correction of 2.2, but it's actually a piecewise transform with a linear toe at the bottom and a gamma of 2.4 at the top.
Linear RGB to sRGB = http://www.wolframal...31308 & x<=1}}]
sRGB to linear RGB = http://www.wolframal...04045 & x<=1}}]
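In code, those two piecewise transforms look like this (a sketch following the constants in the sRGB specification, IEC 61966-2-1):

```python
def linear_to_srgb(x):
    """Linear RGB (0..1) -> sRGB: a linear toe at the bottom, gamma 2.4 above it."""
    if x <= 0.0031308:
        return 12.92 * x
    return 1.055 * x ** (1.0 / 2.4) - 0.055

def srgb_to_linear(x):
    """sRGB (0..1) -> linear RGB: the inverse of the transform above."""
    if x <= 0.04045:
        return x / 12.92
    return ((x + 0.055) / 1.055) ** 2.4
```

Despite the 2.4 exponent, the combined toe-plus-power curve tracks a plain x^(1/2.2) curve closely over most of the range, which is why "gamma 2.2" works as an approximation.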

It's the standard color space for the WWW, and it's being pushed as a standard color space for TVs, computer monitors, cameras, etc... If a display performs sRGB "gamma correction" on the signal, then that's a good thing -- they're supposed to assume that the input signal is sRGB (~gamma 2.2) and adjust voltages accordingly to produce the appropriate perceptually linear luminance response.

Yes it's true that different monitors will do different, wacky things to the input signal, but the world is getting saner in this regard thanks to most manufacturers agreeing to adopt a single gamma standard. The right thing™ to do these days is to perform all of your math in a linear color space, and then output an sRGB signal, unless otherwise asked not to.

Edited by Hodgman, 17 September 2012 - 08:04 AM.

### #5Lifepower  Members   -  Reputation: 118

Posted 17 September 2012 - 08:37 AM

No, sRGB is not linear color space so your accusations levelled against DirectX/OpenGL are wildly inaccurate.

No, my accusations against sRGB in DirectX/OpenGL are based on the fact that the conversion between RGB and sRGB is thought of in terms of gamma correction, while in reality RGB and sRGB may actually be the same thing. In any case, you cannot convert between the two using gamma correction, so this SDK article, for instance, is misleading.

For why it was so important to add sRGB support to GPUs, read this primer: http://http.develope...gems3_ch24.html

Have you read the article yourself? The article you provided can be used as an exercise to find out logical fallacies. Begging the question and fallacy of composition are amongst the first ones visible.

Yes, the sRGB standard does define a standard linear RGB space (as an intermediate step) based on a standardized red/green/blue linear transform from CIE XYZ space...

So, you've repeated what I've said, then added the phrase "as an intermediate step" (to what, by the way?), and now you are saying that:

...but sRGB is a curved "gamma corrected" transformation of these standard linear RGB values.
sRGB isn't even a simple "gamma correction" transform. It's similar to gamma correction of 2.2, but it's actually a piecewise transform with a linear toe at the bottom and a gamma of 2.4 at the top.

Nonsense. sRGB is just a color space, nothing more. It's not "gamma correction of 2.2", let alone a "piecewise transform with linear toe at [gibberish]". Proof by verbosity is a logical fallacy (but you already know that), please don't do that.

It's the standard color space for the WWW, and it's being pushed as a standard color space for TVs, computer monitors, cameras, etc... If a display performs sRGB "gamma correction" on the signal, then that's a good thing -- they're supposed to assume that the input signal is sRGB (~gamma 2.2) and adjust voltages accordingly to produce the appropriate perceptually linear luminance response.

Two separate arguments. Yes, sRGB is a standard and popular color space. But starting from "display performs sRGB gamma correction" - just a senseless manipulation of words.

### #6Hodgman  Moderators   -  Reputation: 31938

Posted 17 September 2012 - 09:03 AM

So, you've repeated what I've said, then added phrase "as an intermediate step" (to what, by the way?) and now you are saying that
...
Nonsense. sRGB is just a color space, nothing more. It's not "gamma correction of 2.2",

No, you misread - the sRGB standard defines two colour spaces -- one is a linear RGB colour-space, which is used as an intermediate between CIE XYZ and sRGB-proper.
Once you've got colour data in this "linear RGB" space, you can perform the above transforms on it to get the values into the non-linear sRGB space.

Regarding sRGB being similar to "gamma 2.2" -- the above functions to convert to/from linear/sRGB (the "piecewise gibberish") can be approximated by x^2.2 and x^(1/2.2) (i.e. the regular "gamma correction" process with a power of 2.2).

Linear RGB colour spaces can be used to describe physical quantities of energy, not just colours. If I've got 100 "units" of photons at the "red" wavelength and send them through a half-silvered-mirror so I end up with only half of them, then I've now got "50" units of red photons. This kind of math does not work in non-linear spaces like sRGB.

Likewise if I've got a black/white checker pattern (0 & 255) and mathematically average it, I get a "50% grey" image (127). In a linear color-space, this value is exactly half as bright as the original white squares. However, in non-linear spaces this math doesn't work. For example, in sRGB 127 is ~21% as bright as 255. If you down-scale an sRGB image of a black/white checkerboard, the resulting colour should be ~187 (which corresponds to "half as bright as white").

e.g. the left half of this image is resized by performing the naive math (averaging sRGB values directly resulting in 127).
The right half performs the math correctly (convert sRGB values to linear, average to 127, then convert back to sRGB, resulting in 187).
If I squint at the image from a distance (to manually average the black/white pattern in my eye), the right hand side all looks almost the same brightness, but the left hand side is obviously too dark.
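The checkerboard numbers can be reproduced directly (a sketch using the exact sRGB transforms rather than the 2.2 approximation):

```python
def srgb_to_linear(x):
    return x / 12.92 if x <= 0.04045 else ((x + 0.055) / 1.055) ** 2.4

def linear_to_srgb(x):
    return 12.92 * x if x <= 0.0031308 else 1.055 * x ** (1.0 / 2.4) - 0.055

black, white = 0, 255

# Naive downscale: average the sRGB bytes directly.
naive = (black + white) // 2                       # 127 -- too dark

# Correct downscale: decode to linear, average, then re-encode.
avg_linear = (srgb_to_linear(black / 255) + srgb_to_linear(white / 255)) / 2
correct = round(linear_to_srgb(avg_linear) * 255)  # ~187-188 -- half as bright as white
```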

The linked article from nVidia shows the disastrous consequences from trying to perform math in a non-linear colour space.

while in reality RGB and sRGB may actually be the same thing

Yes, RGB is a loose term so it could mean anything.
But in rendering we deal with linear-RGB spaces, and non-linear RGB spaces (such as "gamma 2.2" and sRGB).
I posted the equations to transform between "linear RGB" and sRGB above, so when we talk about them in rendering they're definitely not the same - one is mathematically linear and one is not!

Have you read the article yourself? The article you provided can be used as an exercise to find out logical fallacies. Begging the question and fallacy of composition are amongst the first ones visible.

Wow. That article describes the basics for achieving physically correct math in a renderer. What is your problem with it?

you cannot convert between the two using gamma correction, so this SDK article, for instance, is misleading

What's your problem with that article?
Maybe if you're disagreeing with nVidia, Microsoft, Khronos, and the GL ARB (3Dlabs, Apple, ATI, Dell, IBM, Intel, SGI, Sun), the problem is actually that they know what they're doing and you're refusing to read Wikipedia to catch up? (the argumentum ad verecundiam fallacy, I know -- but while I'm here, what does make you an expert over them?)

Yes, sRGB is a standard and popular color space. But starting from "display performs sRGB gamma correction" - just a senseless manipulation of words.

The display has to calibrate its internal voltages so that when it receives a value of 255 it outputs at maximum luminosity, at 187 it outputs half-maximum luminosity, and at 127 it outputs at 21% of maximum luminosity. That's the sRGB correction that the monitor must perform.

Edited by Hodgman, 17 September 2012 - 09:39 AM.

### #7Lifepower  Members   -  Reputation: 118

Posted 17 September 2012 - 09:56 AM

Maybe if you're disagreeing with nVidia, Microsoft, Khronos, and the GL ARB (3Dlabs, Apple, ATI, Dell, IBM, Intel, SGI, Sun), the problem is actually that they know what they're doing and you're refusing to read Wikipedia to catch up?

The articles I've mentioned in my post are verified and have passed scientific review (by several council members), while you provide your opinions backed up by your own words, some stuff on the Internet and popular belief.

So you are saying that the companies you have randomly selected and mentioned had something to do with the decision-making regarding the misleading usage of the sRGB term? And with this, you automatically decide that I'm wrong and you are right?

(Appeal to authority fallacy, I know -- but while I'm here, what does make you an expert over them?)

Yes, it's appeal to authority. You are a moderator, so you are always right, and if there is something you don't like, you attack the person (appeal to the person fallacy) rather than provide sound arguments in discussion. While we're here - I don't consider myself an authority and there are many things in the world that I don't know or understand, and I'm humble about it. Yes, my master's and doctoral thesis works were regarding practical applications of color theory in mobile systems, and I have published 12 council-reviewed scientific works regarding different color spaces and applications, so this is why I have something to say about it. I could always be mistaken, as could the people who review and judge my work, but while I try to base my points on proven facts, you try to prove something by use of Wikipedia, popular folklore and your Moderator badge.

Please, I know your intentions in answering the OP's question were good, I just tried to clarify things, as something that made its way into an SDK does not necessarily mean it is correct. You don't have to defend something blindly just because I've pointed out a misconception in a Microsoft manual.

### #8Hodgman  Moderators   -  Reputation: 31938

Posted 17 September 2012 - 10:12 AM

You don't have to defend something blindly just because I've pointed out a misconception in a Microsoft manual.

You said "For some reason, they have used "sRGB" to denote "linear color space" in DirectX and OpenGL" which is absolutely 100% false, so is a statement that should be criticized. Which of your references backs up this statement of yours?

sRGB is defined as a non-linear transformation from a particular linear RGB colour space.

Also, you've said [brackets mine]:
Indeed, you can convert from linear [RGB] to non-linear [sRGB] color spaces and vice-versa by using Gamma correction.
And then:
In any case, you cannot convert between the two [RGB and sRGB] using gamma correction, so this SDK article, for instance, is misleading.

The above checkerboard image and the linked nVidia article explain why you cannot perform your shading math in curved spaces such as sRGB, and thus why sRGB values have to be decoded to linear values for shading (and possibly re-encoded to sRGB for display or storage).

Here's the short form of linear vs non-linear:
Math in sRGB: (0+1)/2=0.22
Math in any linear space: (0+1)/2=0.5

What errors or misleading statements are there in the Microsoft and nVidia links that you've accused?

So you are saying that the companies you have randomly selected and mentioned had something to do with the decision-making regarding the misleading usage of the sRGB term? And with this, you automatically decide that I'm wrong and you are right?

The companies I listed are responsible for the important nVidia page you've denounced and the design of D3D/GL, which you've denounced.

I decided that you're wrong because you're saying things that I know are wrong. I've worked on converting a lot of renderers from being "gamma ignorant" to performing proper gamma correction and linear-space lighting. You can't perform shading math in colour spaces like sRGB because of the non-linearity. This makes it fundamentally different from linear RGB colour spaces. This is the reason why it was so important to add them to D3D/GL.

You are a moderator, so you are always right, and if there is something you don't like, you attack the person

Moderators always being right is a ridiculous appeal to authority. We generally don't moderate threads that we've participated in either, so my ability to lock/hide abusive content is irrelevant.

I've explained where and why you were wrong, which you've brushed off as nonsense, gibberish and senseless words. I think I've been quite polite regarding such condescension.
Despite 'appeal to popularity' being a fallacy, you do have to consider that perhaps you're just wrong and you should go and re-read the sRGB wikipedia page -- Occam's razor and all that... but take it as a personal slight instead of reflecting on it if you must, or explain to me the flaw in the above math.

### #9Lifepower  Members   -  Reputation: 118

Posted 17 September 2012 - 11:00 AM

sRGB is defined as a non-linear transformation from a particular linear RGB colour space.

You are wrong. sRGB is an application of standardization to the RGB color space, and it is defined by three primaries in CIE XYZ color space. The transformation between linear and non-linear color spaces is an entirely different topic. I've already said this before.

What errors or misleading statements are there in the Microsoft and nVidia links that you've accused?

I've already said this in my earlier posts. The error is to mix gamma correction concepts along with RGB and sRGB color spaces together, trying to imply that at one point or another, when you "convert" or "transform" (or similar term) from one to another, you need to do gamma correction, or that at some point gamma correction is applied. My suggested correction is that these are separate topics, and that the introduction of the sRGB texture format is poorly founded and the sRGB color space name is misused.

I decided that you're wrong because you're saying things that I know are wrong.

Just because you think/decide/believe I'm wrong, it doesn't make you right. It just makes you superficial.

Moderators always being right is a ridiculous appeal to authority. We generally don't moderate threads that we've participated in either, so my ability to lock/hide abusive content is irrelevant.

I was not referring to actual thread moderation, but rather that you feel that you are right because you are a moderator. Perhaps I'm wrong and maybe there are other reasons why you think you are automatically right.

I've explained where and why you were wrong, which you've brushed off as nonsense, gibberish and senseless words.

You have said that I'm wrong and failed to give any reasonable evidence to support your points, other than referring to popular belief, your own belief, and mixing my phrases with new words, among other things. I wouldn't mind if you only posted your own points, but copying my text and then adding stuff of your own with the purpose of misguiding the discussion is just uncool. I think you just don't like being seen as wrong on forums where you moderate.

Despite 'appeal to popularity' being a fallacy, you do have to consider that perhaps you're just wrong and you should go and re-read the sRGB wikipedia page -- Occam's razor and all that... but take it as a personal slight instead of reflecting on it if you must.

P.S. you might want to read some earlier versions of the Wikipedia sRGB entry. The end result is that when sRGB is viewed on a CRT, the viewed gamma appears as 2.2, but again, this is a CRT/display issue, not the space itself. Coincidence and consequence are two different things. Just because gamma is mentioned, it doesn't mean (non-S)RGB has a different gamma. In fact, I think mentioning gamma in an sRGB discussion is not relevant.

### #10Hodgman  Moderators   -  Reputation: 31938

Posted 17 September 2012 - 09:55 PM

You are wrong. sRGB is an application of standardization to the RGB color space, and it is defined by three primaries in CIE XYZ color space. The transformation between linear and non-linear color spaces is an entirely different topic. I've already said this before.

If you don't believe Wikipedia, check its references.
In this case, the sRGB standard is defined by IEC 61966-2-1:1999, which you can view a draft copy of for free here. Yes, it's defined by three XYZ primaries, and a non-linear transformation of those primaries (which is similar to a "gamma 2.2" adjustment).

The OP was specifically asking about sRGB in OpenGL; you can read their definitions of the sRGB transform here and here.

Those three documents describe the same non-linear transforms that appear on Wikipedia... but because I've linked to them on the internet instead of quoting an ISBN, you don't believe them?

So either you're saying these documents are wrong (and that when I sample from an sRGB texture in my fragment shader, no non-linear transform of the texture data will take place), or that these documents are wrong to call this colour space sRGB, and they've actually misappropriated the name.
If the former, you can be refuted by experiment, if the latter, then it's irrelevant as the OP was asking about the "sRGB" space that's used by GL/D3D, which is also known as IEC 61966-2-1:1999.

I don't know what "sRGB" you're talking about, but what you've described is definitely not IEC 61966-2-1.
Perhaps this whole time you've been describing the "linear RGB" space that's defined as an intermediate conversion between XYZ and sRGB, which sounds likely. The point is that we want to be doing our shading math in this linear RGB space (at a high bit depth), but usually our input and output formats are sRGB, so we require the non-linear conversion (which is approximate to "gamma 2.2" correction, as mentioned in the specification).

In fact, I think mentioning gamma in sRGB discussion is not relevant.

The fact is that in OpenGL and D3D sRGB and gamma are related concepts, because as described in the above specification, sRGB is similar to a gamma 2.2 curve...

When I read a texel from an OpenGL sRGB texture, a non-linear transform described on the wikipedia page (which can be approximated as x^2.2) is applied to it automatically.
When I write a pixel to an OpenGL sRGB render-target, a non-linear transform described on the wikipedia page (which can be approximated as x^(1/2.2)) is applied to it automatically.

So if I assume that my input textures were authored in the sRGB space (or were authored on a CRT, and am happy that CRTs are close enough to sRGB displays) and assume that the user's output display is an sRGB device, then by using sRGB textures and render-targets, I can perform all of my fragment-shading math in a linear colour space automatically (but still have non-linear inputs and outputs), thanks to GL natively supporting these transforms.

This is the purpose of sRGB formats in OpenGL, as shown by the above OpenGL specifications.
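What GL does for free here can be simulated on the CPU (a sketch with hypothetical texel values; the GPU performs the decode at texture fetch and the encode at render-target write):

```python
def srgb_to_linear(x):
    return x / 12.92 if x <= 0.04045 else ((x + 0.055) / 1.055) ** 2.4

def linear_to_srgb(x):
    return 12.92 * x if x <= 0.0031308 else 1.055 * x ** (1.0 / 2.4) - 0.055

# Two sRGB-encoded texels, as an 8-bit sRGB texture would store them.
texel_a, texel_b = 64, 192

# "Texture fetch": hardware decodes the sRGB bytes to linear floats.
a = srgb_to_linear(texel_a / 255)
b = srgb_to_linear(texel_b / 255)

# The fragment shader's math then happens in linear space (here, a 50/50 blend).
blended = 0.5 * a + 0.5 * b

# "Render-target write": hardware re-encodes the linear result as sRGB.
out_byte = round(linear_to_srgb(blended) * 255)

# Blending the raw bytes instead would give 128 -- a visibly darker, wrong result.
```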

you feel that you are right because you are moderator

No, that's insulting. I'm quoting the sRGB standard, and you're saying I'm wrong, the standard is wrong, and nVidia, ATI and Microsoft are wrong too. That's pretty simple. Why are you so opposed to learning about sRGB?

P.S. you might want to read some earlier versions of Wikipedia sRGB entry.

That old version still describes the exact same linear transformation from XYZ followed by a non-linear transformation!!! How can you post this stuff up, and still argue that it's a linear space? Now I think you're just trolling...

You have said that I'm wrong and failed to give any reasonable evidence to support your points

The wikipedia page that I linked to contains proper references, shown above. Where is your evidence that sRGB is just a linear transform from XYZ with no non-linear part to it?
As well as making this obviously false claim, you've attacked nVidia and Microsoft articles without actually stating any points against them or providing evidence.
You've made claims about the purpose/usefulness of sRGB resources in GL/D3D without providing any evidence to back them up too.

The end result is that when sRGB is viewed on CRT, the viewed gamma appears as 2.2,

The display has nothing to do with it -- arguing about what a signal looks like when plugged into a display of a different colour space is irrelevant.
e.g. oh I sent the HSV bytes of [0,0,100] on my RGB CRT and it came out Blue!
Yes, CRTs often work in a vague "RGB gamma 2.2" colour space -- however, this is actually a good approximation of the sRGB colour space (sRGB was inspired by CRTs), so sRGB images look almost correct when viewed on these displays. However, to display it correctly, in theory you should decode the sRGB signal and re-encode it in the monitor's colour space (though in practice, with 8-bit inputs, this will do more harm than good), but I assume you already know this - e.g.
if (srgb < 0.04045)
    linear = srgb / 12.92;
else
    linear = pow((srgb + 0.055) / 1.055, 2.4);
CRT = pow(linear, 1.0 / 2.2);

If you still don't believe that you can't do math in curved spaces, and that sRGB is a curved space, despite the specification saying so, try it for yourself:
* Pick any two XYZ colours, Axyz and Bxyz.
* Convert them to "linear RGB" (as defined in the first part of sRGB specification) to get Alinear and Blinear.
* Compute their average Clinear.
* Convert Alinear and Blinear to sRGB following the full sRGB specification to get Asrgb and Bsrgb.
* Compute their average Csrgb.
* Convert Csrgb back into "linear RGB" and compare against Clinear -- It will be very wrong in most cases.
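The steps above can be sketched directly (hypothetical colour values; skipping the XYZ step and starting from linear RGB, which is where the disagreement actually shows up):

```python
def linear_to_srgb(x):
    return 12.92 * x if x <= 0.0031308 else 1.055 * x ** (1.0 / 2.4) - 0.055

def srgb_to_linear(x):
    return x / 12.92 if x <= 0.04045 else ((x + 0.055) / 1.055) ** 2.4

A_linear, B_linear = 0.1, 0.9

# Average in the linear space.
C_linear = (A_linear + B_linear) / 2                      # 0.5

# Average the sRGB-encoded values instead, then decode the result.
A_srgb, B_srgb = linear_to_srgb(A_linear), linear_to_srgb(B_linear)
C_from_srgb = srgb_to_linear((A_srgb + B_srgb) / 2)

error = abs(C_from_srgb - C_linear)  # large: averaging in sRGB space is very wrong
```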

The reason the GPU hardware supports sRGB as a native colour space now is so we can use sRGB data for storage and display, while performing our math in a linear colour space, without having to pay the cost of transforming back and forth between the two colour spaces constantly (the hardware makes the conversion 'free'). This is a huge deal, because as the checkerboard image from earlier shows -- math in sRGB space makes no sense.

Edited by Hodgman, 18 September 2012 - 06:27 AM.

### #11larspensjo  Members   -  Reputation: 1557


Posted 18 September 2012 - 06:55 AM

Thanks for the very detailed information!

If your render-target is created as an sRGB texture, then when you write to it, the hardware will perform the linear->sRGB conversion when writing values from your pixel shader automatically.

It seems to be easy to do this (looking at OpenGL specifically now): use a Frame Buffer Object with a target texture of format GL_SRGB8_ALPHA8 (which is a required format). The only caveat is that the transformation to sRGB colour space should be done last, and I would prefer not to have a dummy draw into an FBO just to get this transformation. I don't think it is possible to associate the attribute "SRGB" with the default frame buffer?

You can do glEnable(GL_FRAMEBUFFER_SRGB), but it only takes effect if GL_FRAMEBUFFER_ATTACHMENT_COLOR_ENCODING for the destination buffer is GL_SRGB.

Of course, there is always the possibility of doing the transformation yourself, in the shader. But automatic built-in functionality is sometimes more optimized.
Current project: Ephenation.
Sharing OpenGL experiences: http://ephenationopengl.blogspot.com/

### #12MJP  Moderators   -  Reputation: 11786


Posted 18 September 2012 - 02:33 PM

It's possible in DX, so it should be possible in GL as well. Although typically in the very last step you want to do the transformation yourself, so that you allow the user to tweak the curve slightly in order to compensate for the gamma of their display.

### #13larspensjo  Members   -  Reputation: 1557


Posted 19 September 2012 - 04:14 AM

CRT = pow(linear, 1/2.2);

This would be an approximation of the sRGB algorithm, which is
if (linear <= 0.0031308)
    CRT = linear * 12.92;
else
    CRT = 1.055 * pow(linear, 1.0 / 2.4) - 0.055;

For high values the two are close; for low values they start to diverge, especially at very low values. What is the reason for using this approximation? I suppose one reason could be shader performance.

Possibly the lower values simply won't be noticed in games, but that would seem to contradict the purpose of having extra resolution in the low-value interval of sRGB. Actually, my game has some banding problems when drawing the outer limits of a spherical fog in a dark environment (while still not taking sRGB color space into account).

Sampled values from 8-bit textures only have a resolution of 1/255 ≈ 0.0039, so the linear segment of the sRGB conversion (below 0.0031308) would never be reached unless some form of HDR is used.

### #14Hodgman  Moderators   -  Reputation: 31938


Posted 20 September 2012 - 09:59 AM

What is the reason for using this approximation?

In that code, I was assuming that the input values were in non-linear sRGB, but the desired output was "CRT RGB" (linear RGB with gamma 2.2) -- the code was decoding from sRGB to linear, and then re-encoding the linear values to "CRT RGB". i.e. it was an sRGB->CRT conversion function.
N.B. this isn't a very useful thing to do in practice, because if this operation is done with 8-bits inputs and outputs, you'll just be destroying information. If displaying 8-bit sRGB images on a CRT, it would be best to avoid doing the right thing™ (which is converting the data into the display's colour encoding) and just output the sRGB data unmodified, because "CRT RGB" and sRGB are so similar.

In a game, if you had sRGB textures and the user had a CRT monitor, the right thing to do would be to sRGB decode the textures to linear, do your shading in linear (at high precision, e.g. float/half) and then encode the result with x^(1/2.2) to compensate for the CRT's natural response of x^2.2.
However, most users don't have a CRT, so it's best to encode the final result with the standard sRGB curve instead of the CRT gamma-2.2 curve.

Edited by Hodgman, 20 September 2012 - 10:04 AM.

