Texture rendering colors look a little "off"


Hello all. I'm doing something very simple: mapping a texture to a quad using MDX(9) with transformed coordinates and rendering it to the screen. The textured quad appears in the correct screen space and everything looks good... except the texture itself seems to render incorrectly. It's as if the texture image's histogram has been modified and it's drawn to the screen in a strangely sharpened/contrasty state.

I'm loading the texture with values from a byte array (which initially corresponds to an 8-bit grayscale image), so I wrote the array out to a bitmap, saved it, opened it and looked at it. It appears exactly as I would expect, so I know the source array contains the correct values. The texture is properly positioned and everything; it's just the appearance of the texture itself that looks off by a not-insignificant degree. The byte array has exactly the same dimensions as the quad, so there should be a straight 1:1 mapping with no interpolation.

Here's the simple source code I'm using... can anyone see what I've done wrong? Thanks in advance!

                // Values have been loaded into pixelBuffer[,]. Save to a bitmap just
                // to make sure the values look correct:

                Bitmap checkImage = new Bitmap(pixelBuffer.GetLength(1), pixelBuffer.GetLength(0), System.Drawing.Imaging.PixelFormat.Format32bppArgb);

                for (int x = 0; x < pixelBuffer.GetLength(0); x++)
                {
                    for (int y = 0; y < pixelBuffer.GetLength(1); y++)
                    {
                        byte grayVal = pixelBuffer[x,y];
                        checkImage.SetPixel(y, x, Color.FromArgb(grayVal, grayVal, grayVal));
                    }
                }

                checkImage.Save("C:\\Users\\Me\\Check Image.bmp");

                // Image looks good.

            // Instantiate a Texture, write to it:
            Bitmap texBitmap = new Bitmap(renderRectangle.Width, renderRectangle.Height, System.Drawing.Imaging.PixelFormat.Format32bppArgb);
            backgroundTexture = new Texture(device, texBitmap, 0, Pool.Managed);
            Surface texSurface = backgroundTexture.GetSurfaceLevel(0);
            SurfaceDescription description = backgroundTexture.GetLevelDescription(0);


	// Note: "PixelColor" is a struct containing 4 bytes (a,r,g,b):
            PixelColor[] textureData = (PixelColor[])texSurface.LockRectangle(typeof(PixelColor), LockFlags.None, description.Width * description.Height);

            int CurrentTexturePosition = 0;
            for (int row = 0; row < renderRectangle.Height; row++)
            {
                for (int column = 0; column < renderRectangle.Width; column++)
                {
                    textureData[CurrentTexturePosition].a = 255;
                    textureData[CurrentTexturePosition].r = pixelBuffer[row, column];
                    textureData[CurrentTexturePosition].g = pixelBuffer[row, column];
                    textureData[CurrentTexturePosition].b = pixelBuffer[row, column];
                    CurrentTexturePosition++;
                }
            }

            // Unlock surface:
            texSurface.UnlockRectangle();



            // Set up the device and presentation parameters:

            presentParams            = new PresentParameters();
            presentParams.Windowed   = true;
            presentParams.SwapEffect = SwapEffect.Discard;

            
            device =
            new Device(0, DeviceType.Hardware, ParentForm, CreateFlags.HardwareVertexProcessing, presentParams);

            device.RenderState.Lighting         = false;
            device.RenderState.AlphaBlendEnable = true;
            device.RenderState.SourceBlend      = Blend.SourceColor;
            device.RenderState.DestinationBlend = Blend.InvSourceAlpha;



            // Initialize the vertices used for the triangles that form
            // the background quad. Uses transformed coordinates.
            // "renderRectangle" is exactly the same dimensions
            // as the "pixelBuffer" array:

            float Left   = (float)renderRectangle.Left;
            float Top    = (float)renderRectangle.Top;
            float Right  = (float)renderRectangle.Right;
            float Bottom = (float)renderRectangle.Bottom;
	
            // Use default (clockwise) winding order:
            backgroundVertices[0].Position = new Vector4(Left, Top, 1.0f, 1.0f);
            backgroundVertices[0].Tu = 0.0f;
            backgroundVertices[0].Tv = 0.0f;

            backgroundVertices[1].Position = new Vector4(Right, Top, 1.0f, 1.0f);
            backgroundVertices[1].Tu = 1.0f;
            backgroundVertices[1].Tv = 0.0f;

            backgroundVertices[2].Position = new Vector4(Right, Bottom, 1.0f, 1.0f);
            backgroundVertices[2].Tu = 1.0f;
            backgroundVertices[2].Tv = 1.0f;

            backgroundVertices[3].Position = new Vector4(Left, Top, 1.0f, 1.0f);
            backgroundVertices[3].Tu = 0.0f;
            backgroundVertices[3].Tv = 0.0f;

            backgroundVertices[4].Position = new Vector4(Right, Bottom, 1.0f, 1.0f);
            backgroundVertices[4].Tu = 1.0f;
            backgroundVertices[4].Tv = 1.0f;

            backgroundVertices[5].Position = new Vector4(Left, Bottom, 1.0f, 1.0f);
            backgroundVertices[5].Tu = 0.0f;
            backgroundVertices[5].Tv = 1.0f;

            // Now loop through the vertices and offset by -0.5 to correctly map the
            // texels with the pixels:
            for (int x = 0; x < 6; x++)
            {
                backgroundVertices[x].X -= 0.5f;
                backgroundVertices[x].Y -= 0.5f;
            }



            // Clear the DirectX device and render:
            device.Clear(ClearFlags.Target, Color.Black, 1.0f, 0);
            device.BeginScene();
            device.SetTexture(0, backgroundTexture);
            device.VertexFormat = CustomVertex.TransformedTextured.Format;
            device.DrawUserPrimitives(PrimitiveType.TriangleList, 2, backgroundVertices);
            device.EndScene();
            device.Present();


Thank you for the reply. This certainly sounds like it may be the problem. Where is the gamma correction applied (i.e. is it a device property that needs to be set)? How does one check to see if it is being applied?

Thanks again for the help, everyone.

-L

I ran through the link regarding gamma correction...very interesting. I'm a little confused now though. In the code I posted above, is "checkImage" linear or sRGB?

Judging by the examples I saw in the presentation, my original pixelBuffer[,] looked like it was gamma corrected, whereas my rendered texture looked like it wasn't gamma corrected (primary values were being pushed towards their extremes, black areas had lost most of the detail, white areas looked like they were saturating).

I need to guarantee that the values I'm writing to the texture in my above program are indeed the exact same values that my pixel shader "sees" when it samples the texture. Does gamma correction do this?

Quote:
Original post by LTKeene
I ran through the link regarding gamma correction...very interesting. I'm a little confused now though. In the code I posted above, is "checkImage" linear or sRGB?

Here's one little tip: if the original image looks fine on your monitor, it's in sRGB (of course, you can't use this check if you generate your own values).

Gamma correction is needed if you apply some math to the image.
For example, if you do (pixel * 0.5f), one would expect to see a half-brightness pixel, but if "pixel" is in sRGB, you'll get around 1/4 brightness!

Applying gamma should be done when reading the image, and then when displaying:

Read pixel -> Linearize -> pixel * 0.5f -> De-Linearize -> Pixel on monitor

Usually a 2.2 gamma is assumed; a pixel shader would do the conversion this way:
return pow( pow( pixel, 2.2f ) * 0.5f, 1.0f / 2.2f );

Gamma correction on the texture means the texture gets linearized for the pixel shader.
Gamma correction on the backbuffer means the output from the pixel shader gets de-linearized again.
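
To make that concrete, here is a minimal CPU-side sketch of the same round trip in C# (using the rough 2.2 power approximation rather than the exact piecewise sRGB curve; the helper name is just for illustration):

    // Halve the brightness of an sRGB-encoded byte value by doing the math in linear space.
    // Uses the approximate 2.2 gamma curve, not the exact piecewise sRGB formula.
    static byte HalveBrightness(byte srgbValue)
    {
        double srgb = srgbValue / 255.0;

        double linear = Math.Pow(srgb, 2.2);          // linearize: sRGB -> linear light
        linear *= 0.5;                                 // do the math in linear space
        double outSrgb = Math.Pow(linear, 1.0 / 2.2);  // de-linearize: linear -> sRGB for display

        return (byte)Math.Round(outSrgb * 255.0);
    }

If you skip the linearize/de-linearize steps and just scale the sRGB value directly, the displayed brightness ends up around 0.5^2.2 ≈ 0.22 of the original instead of 0.5, which is the roughly 1/4 brightness mentioned above.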

Quote:
Original post by LTKeene
I need to guarantee that the values I'm writing to the texture in my above program are indeed the exact same values that my pixel shader "sees" when it samples the texture. Does gamma correction do this?

Then disable gamma correction on the texture, and enable it on the backbuffer.
A reason to do this is, for example, when you generate your own values (e.g. a Perlin noise texture) that have nothing to do with a photo or scanned image.

You see your image too dark because the monitor darkens it (that's the display gamma), not because of the shader.
To fix this, you can either enable HW gamma correction (D3DRS_SRGBWRITEENABLE) or return pow( result, 1.0f / 2.2f );
Note that not all HW supports gamma correction, but a lot of HW supports pixel shaders nowadays...
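
In MDX terms, that would look roughly like the sketch below. The underlying native D3D9 states are D3DSAMP_SRGBTEXTURE (per-sampler) and D3DRS_SRGBWRITEENABLE (render state); the managed member names I use here are assumptions, so check what Microsoft.DirectX.Direct3D actually exposes in your version:

    // Sketch only -- the managed member names are assumed, not verified.
    device.SamplerState[0].SrgbTexture = false;  // sample the texture as-is (no linearization on read)
    device.RenderState.SrgbWriteEnable = true;   // de-linearize shader output when writing the back buffer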

Cheers
Dark Sylinc

Edit: Changed coefficients

First of all, thanks to everyone who's taking the time to read/reply, it's a big help!

I need to be clearer about what it is I'm doing exactly:

1) I load the contents of a camera buffer into a byte[,] and do some processing on it on the CPU, not the GPU. The results of this processing are 32-bit floats and are written to an R32F texture which is set aside for step 3.

2) Next, I again load the contents of the camera's buffer (it's a little video camera that's constantly updating its internal buffer at a low frame rate of approx. 15 fps) into a byte[,] and transfer that into a texture. This texture is mapped onto my simple quad in transformed coordinates. At this point, when rendered to the screen it looks uncorrected for gamma, since the rendered image looks darker than the image portrayed by the byte[,] values. Therefore, I need to enable gamma correction in the device's render state to correct for this. However...

3) What I didn't mention (and probably should have) was that after doing the above, I'm sequentially taking images from the camera's buffer, loading them into a texture and running my pixel shader, which samples it. The pixel shader acquires the values via the sampler (1:1 mapping, texture size = render size) and does a bunch of calculations. Based on the result of these calculations and a comparison with the results in step 1 that were written to the float texture, the output color is chosen.

So, to summarize... I'm comparing results obtained by operating on sRGB data with results obtained by operating on texture data fed to the pixel shader. The results just aren't what I'm used to when doing everything on the CPU, and with the help of people here it sure looks like I'm comparing apples with oranges (gamma-corrected vs. non-gamma-corrected pixel data). What I have to do is make sure the GPU achieves the same result as the CPU when operating on these image frames.

As a final clarification, let's say I were to load a camera frame into a byte[,] array and perform some mathematical operation on the values in the array. This operation is performed on the CPU and I get some result. Then, I take the same byte[,] but this time I write the values to a texture using the approach in my original post and pass it to the pixel shader. The pixel shader then does the exact same operation and generates a value at each pixel. If I were to somehow compare the results of the pixel shader to those from the CPU, would they be the same or would they be different (since the pixel shader is operating on un-gamma-corrected data)? If different, do I compensate by disabling gamma correction and enabling it on the back-buffer as Matias suggested?
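
Put as a sketch (assuming an A8R8G8B8 texture, point filtering, a 1:1 texel-to-pixel mapping and no sRGB conversion anywhere; F() is not my real processing, just a hypothetical placeholder):

    // Hypothetical placeholder for the real per-pixel math (not my actual processing).
    static float F(float v) { return v * v + 0.25f; }

    // CPU side: operate directly on the normalized byte value.
    static float CpuResult(byte[,] pixelBuffer, int row, int column)
    {
        byte raw = pixelBuffer[row, column];
        return F(raw / 255.0f);
    }

    // GPU side (pixel shader, conceptually):
    //     float gpuResult = F(tex2D(sampler0, uv).r);
    // The question: does the sampler hand the shader exactly raw / 255.0f here,
    // so that gpuResult matches CpuResult(...) up to float precision?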

Sorry for the long post!

Quote:
Original post by LTKeene
I ran through the link regarding gamma correction...very interesting. I'm a little confused now though. In the code I posted above, is "checkImage" linear or sRGB?

Judging by the examples I saw in the presentation, my original pixelBuffer[,] looked like it was gamma corrected, whereas my rendered texture looked like it wasn't gamma corrected (primary values were being pushed towards their extremes, black areas had lost most of the detail, white areas looked like they were saturating).

I need to guarantee that the values I'm writing to the texture in my above program are indeed the exact same values that my pixel shader "sees" when it samples the texture. Does gamma correction do this?


Since you generated the Bitmap for checkImage yourself, it should be linear. I don't believe GDI+ will convert to sRGB when saving an image. However, most image-editing applications will save images in sRGB, and it's when working with those images that you need to apply gamma correction in the texture sampler.

In your case the values you get in the pixel shader should already be linear, so that shouldn't be a problem. However, when you write the pixels to the back buffer you'll want to apply gamma correction so that the back buffer is in sRGB space, which will cause the image to look correct on your monitor.
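
If you want to double-check the "GDI+ doesn't convert on save" part, a quick round trip is enough of a test (a sketch based on the code in the first post; the path and array come from there):

    // Reload the saved check image and compare it to the source values.
    Bitmap reloaded = new Bitmap("C:\\Users\\Me\\Check Image.bmp");
    bool identical = true;
    for (int x = 0; x < pixelBuffer.GetLength(0) && identical; x++)
    {
        for (int y = 0; y < pixelBuffer.GetLength(1); y++)
        {
            // SetPixel was called as (column, row), so read it back the same way.
            if (reloaded.GetPixel(y, x).R != pixelBuffer[x, y])
            {
                identical = false;
                break;
            }
        }
    }
    // If 'identical' stays true, the save/load path didn't touch the values.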
