ID3DX10Font and sRGB

Hi all!

I am currently adding Direct3D 10 support to a Direct3D 9 software I developed. The software is based on DXUT (window creation, message processing, and GUI).

Since DXGI_FORMAT_R8G8B8A8_UNORM_SRGB is the preferred DXUT Direct3D 10 format I've been using it. I coded the required parts for texture sampling in sRGB space, etc...

However, I can't get the colors to match for the ID3DX10Font interface. A DrawText call using the Direct3D 9 interface doesn't produce the same results as one using the Direct3D 10 interface (same input color).

If I use a non-sRGB format (DXGI_FORMAT_R8G8B8A8_UNORM) everything works fine. If I do a manual sRGB correction it works as well, but is this really the way to go?

So, can anyone explain the advantages of using an sRGB render target (i.e. why it is preferred by DXUT) and point me in the right direction so that I can get the correct colors for my text sprites?

Thanks!
Direct3D 9 has support for sRGB textures and render-targets as well -- the D3DRS_SRGBWRITEENABLE render-state determines if RGB->sRGB conversion is performed when writing to the current render-target, and the D3DSAMP_SRGBTEXTURE sampler state determines if a texture will perform sRGB->RGB conversion when read from.
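For reference, a minimal sketch of turning both conversions on in D3D9 (assuming 'device' is your IDirect3DDevice9* and the texture is bound to sampler 0):

device->SetSamplerState(0, D3DSAMP_SRGBTEXTURE, TRUE);   // decode sRGB texels to linear when sampling
device->SetRenderState(D3DRS_SRGBWRITEENABLE, TRUE);     // encode linear results back to sRGB on write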

If you're performing these conversions only in D3D10 and not in D3D9 (or vice versa), it follows that your colours won't match.


So if I am using an sRGB render target on D3D10, do I need to convert every color (e.g. for ID3DX10Font::DrawText) before using it?

To be more explicit: I need to draw a logo in Pantone 287 (RGB 0x00338D). If I use D3D9 (non-sRGB) and supply that color to the interfaces, the color matches the reference image. If I do it in D3D10 (sRGB) I get a much brighter, incorrect version.

So if I have to convert every color prior to using it what is the point of the sRGB render target? I think I'm missing something here...


Thanks for the help.
what is the point of the sRGB render target? I think I'm missing something here...[/quote]
If you're performing 3D rendering with lighting, gamma correction (the process of converting between sRGB and linear RGB) is very important.

sRGB is curved -- 127 is not half as bright as 255. It turns out that 186 is half as bright as 255!! Therefore, if you're ever doing math in sRGB-space, your results are going to be wrong (because of the curved nature of this numbering system).

However, most images are stored in sRGB, as it gives better precision for most images than linear-RGB does... so we want to store our textures in sRGB, but do our math in linear RGB.

That's why modern GPUs support sRGB textures and render-targets, and support automatic conversion between the two colour-spaces.
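The conversions are also easy to do by hand -- here's a minimal sketch of the standard sRGB transfer functions (many engines just use the cheaper pow(x, 2.2) / pow(x, 1/2.2) approximation instead):

#include <cmath>

// sRGB -> linear, both in [0,1]
float SrgbToLinear(float s)
{
    return (s <= 0.04045f) ? s / 12.92f
                           : std::pow((s + 0.055f) / 1.055f, 2.4f);
}

// linear -> sRGB, both in [0,1]
float LinearToSrgb(float l)
{
    return (l <= 0.0031308f) ? l * 12.92f
                             : 1.055f * std::pow(l, 1.0f / 2.4f) - 0.055f;
}

For example, SrgbToLinear(186/255.0f) comes out at roughly 0.49 (about half brightness), while SrgbToLinear(127/255.0f) is only about 0.21.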

Here's some info. N.B. "sRGB" is also called "Gamma space" and "Gamma 2.2".
http://http.develope...gems3_ch24.html
http://filmicgames.com/archives/299
http://altdevblogada...mma-correction/
http://en.wikipedia....amma_correction
To be more explicit: I need to draw a logo in Pantone 287 (RGB 0x00338D).[/quote]
Firstly, an RGB value like 0x00338D is completely meaningless without also knowing what color-space it's in -- it's like addressing a package to 34th street without mentioning the city.

Pantone have adopted the sRGB standard, so any RGB values they specify will be in this color space. That means to view the color as they intended it, you need to be viewing it on a monitor that's also properly calibrated to the sRGB standard (approx gamma = 2.2, white = 6500K). N.B. Most monitors are calibrated to display sRGB signals -- so when the monitor receives the signal "186", it knows you want to display a pixel that's half way in-between "255" and "0".

If you tell D3D that none of your textures/targets are sRGB, then no automatic conversions will occur anywhere, and your data will pass through the pipe unchanged.

Alternatively, make sure that both your textures and your render-targets are specified as sRGB, so that the textures are 'decoded' to linear, and then 'encoded' back to sRGB again.
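In D3D10/DXGI terms that just means using the _SRGB variant of the format in both places. A rough sketch (DXUT normally creates the swap chain for you, so in your case the texture half may be all that's left to do):

// Render target / back buffer as sRGB: linear shader output is encoded to sRGB on write.
DXGI_SWAP_CHAIN_DESC swapChainDesc = {};
swapChainDesc.BufferDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM_SRGB;
// ...fill in the rest of the swap-chain description as usual...

// Texture as sRGB: texels are decoded from sRGB to linear when the shader samples them.
D3DX10_IMAGE_LOAD_INFO loadInfo;
loadInfo.Format = DXGI_FORMAT_R8G8B8A8_UNORM_SRGB;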
So if I am using an sRGB render target on D3D10, do I need to convert every color (e.g. for ID3DX10Font::DrawText) before using it?[/quote]
No. If you're sampling from an sRGB texture, that conversion happens automatically, provided you've told D3D that it's an sRGB texture.

So if you use sRGB textures and sRGB render-targets, it looks like:
Texture (186) -> sample (converts from 186 sRGB to 127 linear RGB) -> shader code (operates on 127) -> output (converts from 127 linear RGB to 186 sRGB) -> Target (186)

If you don't use sRGB textures or sRGB render-targets, you get the same result (assuming your shader code doesn't do any mathematical operations on the colours):
Texture (186) -> sample -> shader code (operates on 186) -> output ->target (186)

From the sounds of it, you're only using an sRGB target, but not sRGB textures, which means your pipe looks like:
Texture (186) -> sample -> shader code (operates on 186) -> output (converts from 186 linear RGB to 220 sRGB) -> Target (220)
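To put numbers on that last pipe, here's a quick self-contained check (using the gamma-2.2 approximation rather than the exact sRGB curve, so the integers are approximate):

#include <cmath>
#include <cstdio>

int main()
{
    float srgb   = 186.0f / 255.0f;
    float linear = std::pow(srgb, 2.2f);           // sample: sRGB 186 -> linear, ~127/255
    float back   = std::pow(linear, 1.0f / 2.2f);  // output: linear -> sRGB, back to 186/255
    float wrong  = std::pow(srgb, 1.0f / 2.2f);    // 186 treated as if already linear, then encoded: ~221/255
    std::printf("%.0f %.0f %.0f\n", linear * 255, back * 255, wrong * 255);
    return 0;
}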
Thanks for the gamma correction articles. :)


If you tell D3D that none of your textures/targets are sRGB, then no automatic conversions will occur anywhere, and your data will pass through the pipe unchanged.

Alternatively, make sure that both your textures and your render-targets are specified as sRGB, so that the textures are 'decoded' to linear, and then 'encoded' back to sRGB again.[/quote]

Regarding textures, I'm using this:


D3DX10_IMAGE_INFO imageInfo;
D3DX10GetImageInfoFromFile(filename, nullptr, &imageInfo, nullptr);

D3DX10_IMAGE_LOAD_INFO loadInfo;
loadInfo.BindFlags = D3D10_BIND_SHADER_RESOURCE;
loadInfo.Format = imageInfo.Format;   // keep whatever format the file reports
loadInfo.pSrcInfo = &imageInfo;

ID3D10Resource* pTexture = nullptr;
HRESULT hr = D3DX10CreateTextureFromFileW(pd3dDevice, filename, &loadInfo, nullptr, &pTexture, nullptr);

// Create shader resource view from texture...


This creates a bright (incorrect) output as you can see below:

[screenshot: 33y2nva.png -- the rendered output, noticeably brighter than the reference]

If I specify an sRGB format, the output is the same:

loadInfo.Format = DXGI_FORMAT_R8G8B8A8_UNORM_SRGB;


If I use any sRGB filter flags I get a D3DERR_INVALIDCALL:

loadInfo.Filter = D3DX10_FILTER_SRGB_IN;
loadInfo.Filter = D3DX10_FILTER_SRGB_OUT;
loadInfo.Filter = D3DX10_FILTER_SRGB;


In order to match the reference image (as I see it in any image editor) I need to do the following (rough sketch after the list):

  1. Create a new texture with the same parameters as the original, but with the corresponding typeless format.
  2. Copy the original texture resource to the new texture.
  3. Create a shader resource view over the new texture, using the corresponding sRGB format.
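A rough sketch of those steps (pOriginalTex and the other new names are illustrative; I'm assuming pOriginalTex is the ID3D10Texture2D interface of the texture loaded above, pd3dDevice is the same device, and error handling is omitted):

D3D10_TEXTURE2D_DESC desc;
pOriginalTex->GetDesc(&desc);                            // same parameters as the original...
desc.Format = DXGI_FORMAT_R8G8B8A8_TYPELESS;             // ...but with the typeless format

ID3D10Texture2D* pTypelessTex = nullptr;
pd3dDevice->CreateTexture2D(&desc, nullptr, &pTypelessTex);
pd3dDevice->CopyResource(pTypelessTex, pOriginalTex);    // copy the original texels across

D3D10_SHADER_RESOURCE_VIEW_DESC srvDesc = {};
srvDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM_SRGB;        // the view reinterprets the data as sRGB
srvDesc.ViewDimension = D3D10_SRV_DIMENSION_TEXTURE2D;
srvDesc.Texture2D.MipLevels = desc.MipLevels;

ID3D10ShaderResourceView* pSRV = nullptr;
pd3dDevice->CreateShaderResourceView(pTypelessTex, &srvDesc, &pSRV);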
This seems terribly inefficient...

Regarding ID3DX10Font (my original problem)... if I set my text color to an RGB value, render it with ID3DX10Font::DrawText(), take a screenshot of the output, paste it into an image editor and use a color picker, the RGB value I get is not the one I asked for. Is this the correct behaviour?

