Blurred Texture Sampling Minification

I am writing a program in C++ using DirectX 11. As part of my user interface, I want to display buttons whose background texture is a rectangle with rounded corners. However, I want to be able to make these buttons at various sizes without having to maintain the aspect ratio of the source texture, and I also don't want the corners to be squished or stretched when the size changes. Below is my original source image that I'm using for my texture/shader resource: an opaque black border with an opaque white fill. The small area outside the border's corners is transparent.

[attachment=9819:Button Background Template.png]

To address this, instead of using a simple quad of four vertices and mapping the texture directly to the four corners, I created a surface consisting of a 4x4 grid of vertices. The idea is that as I change the size of my button, I can keep the four corners at their original dimensions and just stretch or shrink the edge and center areas.

I've tried to give a representation of what I'm talking about in the image below. The values on the left are texture 'Y' coordinates (the 'X' coordinates would be the same going from left to right). I'm also showing that I've given a color to the vertices as well (because in my pixel shader I am blending the vertex color with the sampled texture color).

[attachment=9822:Button Background Template Description.png]
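
For illustration, here is roughly how the grid's positions and texture coordinates could be computed (a sketch only; the corner size, the 100-pixel texture dimension, and the names are placeholders, not my exact code):
[source lang="cpp"]// Build the 4x4 vertex grid: the corner cells keep the source texture's corner
// size in pixels, while the middle row/column bands stretch with the button.
struct Vertex { float x, y, u, v; };

void BuildNineSliceGrid(float buttonW, float buttonH,
                        float cornerPx,   // corner size in the source image (e.g. 20)
                        float texSize,    // source texture dimension (100)
                        Vertex grid[16])
{
    // Positions: the outer rows/columns stay cornerPx wide regardless of button size.
    float xs[4] = { 0.0f, cornerPx, buttonW - cornerPx, buttonW };
    float ys[4] = { 0.0f, cornerPx, buttonH - cornerPx, buttonH };

    // Texture coordinates: identical spacing in U and V for a square source texture.
    float ts[4] = { 0.0f, cornerPx / texSize, 1.0f - cornerPx / texSize, 1.0f };

    for (int row = 0; row < 4; ++row)
        for (int col = 0; col < 4; ++col)
            grid[row * 4 + col] = { xs[col], ys[row], ts[col], ts[row] };
}[/source]
The 16 vertices are then indexed as nine quads (18 triangles), so only the inner bands change size.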

As you can see from the images below, my plan works great when I expand the size of the button. However, if I shrink the button relative to the original texture, the corners look correct, but the edge portions of the texture come out blurred. (Ignore that the gradient isn't linear from top to bottom - I am aware of the reason for this. My focus is on the blurring.)
(The original texture shown above is 100x100 pixels. The expanded button's dimensions are 400x200; the shrunk button's dimensions are 75x42).

[attachment=9820:Expanded.png]
[attachment=9821:Shrunk.png]

I am confident that the issue is not with the blending that I'm doing - I removed that portion from the shader so I was doing a simple texture sample, but the blurred edges were still there.

I am fairly new to texture sampling, so I'm thinking maybe I have something wrong in my sampler state that is causing the blur.

Here is my current sampler setup:
[source lang="cpp"]// Linear filtering for minification, magnification, and between mip levels.
D3D11_SAMPLER_DESC samplerDescription;
ZeroMemory(&samplerDescription, sizeof(samplerDescription));
samplerDescription.Filter = D3D11_FILTER_MIN_MAG_MIP_LINEAR;
samplerDescription.AddressU = D3D11_TEXTURE_ADDRESS_CLAMP;   // clamp coordinates outside [0, 1]
samplerDescription.AddressV = D3D11_TEXTURE_ADDRESS_CLAMP;
samplerDescription.AddressW = D3D11_TEXTURE_ADDRESS_CLAMP;
samplerDescription.ComparisonFunc = D3D11_COMPARISON_NEVER;
samplerDescription.MinLOD = 0;                               // allow the full mip range
samplerDescription.MaxLOD = D3D11_FLOAT32_MAX;[/source]
And if it helps, here is my Pixel Shader:
[source lang="cpp"]float4 PixelShaderTextureBlendHUD(PixelShaderInput pixelIn) : SV_Target
{
    // Sample the texture for a color at the interpolated texture coordinate.
    float4 outColor = DiffuseMap.Sample(SampleLinear, pixelIn.TextureCoordinate);

    // Do multiplicative blending of the interpolated vertex color with the sampled color.
    outColor *= pixelIn.Diffuse;

    // If the constant buffer 'IsGrayscale' flag is set, convert the color to grayscale.
    if(IsGrayscale > 0)
    {
        float luminance = outColor.x * 0.3f + outColor.y * 0.59f + outColor.z * 0.11f;
        outColor = float4(luminance, luminance, luminance, outColor.w);
    }

    return outColor;
}[/source]

What might be causing the sampler to blur the image this way when the display area is smaller than the area of the texture being sampled? I tried using a Point sampler instead of Linear - that looked slightly different, but just as bad.

Any tips would be greatly appreciated!
The problem is likely caused by the fact that the screen-space texture-coordinate derivatives are different in the x and y directions. The GPU picks the mip level from the larger derivative, so it samples a lower-resolution (blurrier) mip than the dimension with the smaller derivative actually needs. In some cases, you can configure the behavior with the "texture quality" slider in your GPU driver's property pages - higher quality tends to favour the larger texture levels.
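
Roughly speaking, the selection works like this (a simplified model of trilinear mip selection, not the exact rule any particular GPU uses):
[source lang="cpp"]#include <algorithm>
#include <cmath>

// Estimate the mip LOD from the per-pixel texture-coordinate derivatives.
// The larger of the two screen-space footprints drives the result, so a quad
// that is squashed in only one direction still drops to a blurrier mip.
float EstimateMipLod(float dudx, float dvdx, float dudy, float dvdy,
                     float texWidth, float texHeight)
{
    float sx = dudx * texWidth, sy = dvdx * texHeight;
    float tx = dudy * texWidth, ty = dvdy * texHeight;
    float lenX = std::sqrt(sx * sx + sy * sy);   // texel footprint along screen x
    float lenY = std::sqrt(tx * tx + ty * ty);   // texel footprint along screen y
    return std::max(0.0f, std::log2(std::max(lenX, lenY)));   // 0 = full-size mip
}[/source]
In the shrunk button's edge strips, one derivative stays near 1:1 while the other becomes large, so the max() drags the whole strip down to a smaller, blurrier mip, even though the corners (1:1 in both directions) keep sampling mip 0.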

If your boxes have a 1:1 relation to screen pixels, you could try to disable mip filtering altogether.
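
For example, starting from the sampler description you already posted, something like this keeps sampling pinned to the top-level mip (a sketch, not tested against your code):
[source lang="cpp"]// A point mip filter plus a MaxLOD of 0 forces the sampler to stay on mip 0.
samplerDescription.Filter = D3D11_FILTER_MIN_MAG_LINEAR_MIP_POINT;
samplerDescription.MinLOD = 0;
samplerDescription.MaxLOD = 0;[/source]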

If you do need filtering, try to enable anisotropic mip filtering. Anisotropic filtering takes several samples along the vector defined by the texcoord derivatives and blends them so that you effectively get the actual maximum detail level at each pixel.

The maximum number of anisotropic samples is configurable through the sampler state; if you use a lot of anisotropic samples, you get more accurate results at the expense of effective fillrate. The maximum number of samples is a hint to the GPU; it can still use fewer samples if it determines that the maximum amount is not needed (this is commonly called "anisotropic optimization").
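
As a sketch of what that could look like with the sampler description above (the count of 8 and the device pointer are just placeholders):
[source lang="cpp"]samplerDescription.Filter = D3D11_FILTER_ANISOTROPIC;
samplerDescription.MaxAnisotropy = 8;   // hint: up to 8 samples, valid range is 1-16

ID3D11SamplerState* samplerState = nullptr;
device->CreateSamplerState(&samplerDescription, &samplerState);[/source]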

Niko Suni

To be honest I don't have any experience using MIP filtering. How would I go about turning it off? To use my texture in my pixel shader, I currently create an ID3D11ShaderResourceView using the D3DX11CreateShaderResourceViewFromFile function (my image is a PNG file). As I understand, the D3DX11 library is deprecated, but at this point I've yet to learn how else to create a texture from a file for use as an ID3D11ShaderResourceView.

So, using the D3DX11CreateShaderResourceViewFromFile function, I see the pLoadInfo parameter does contain a MipLevels field; however, I'm currently passing NULL for that parameter. By default, do multiple MIP levels get created from a single image file? I have not explicitly created various sizes of the same texture - so unless this is done automatically, I don't know why MIP filtering would even be happening with my current code.
You want to set the MipLevels field to 1; that way no mip chain will be generated for you. Passing NULL is the same as passing 0, which generates a full mip chain according to the docs on that struct.

http://msdn.microsoft.com/en-us/library/windows/desktop/ff476317(v=vs.85).aspx
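
Something along these lines should do it (a sketch; the device pointer and file name are whatever you already use):
[source lang="cpp"]D3DX11_IMAGE_LOAD_INFO loadInfo;   // the constructor leaves every field at D3DX11_DEFAULT
loadInfo.MipLevels = 1;            // keep only the top-level image, no mip chain

ID3D11ShaderResourceView* buttonSRV = nullptr;
HRESULT hr = D3DX11CreateShaderResourceViewFromFile(
    device,                        // your ID3D11Device
    L"ButtonBackground.png",       // placeholder path
    &loadInfo,
    nullptr,                       // no thread pump - load synchronously
    &buttonSRV,
    nullptr);[/source]
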
You guys are my heroes. Rather than passing NULL for the pLoadInfo parameter, I passed in a D3DX11_IMAGE_LOAD_INFO structure where I only set the MipLevels value to 1 (left all the rest at defaults). Worked like a charm!

I learn something new every day. I didn't realize that DirectX would generate mips for you - I thought you had to supply your own variations of a texture to set up mip filtering. Guess I should do some more reading up on mips and see what other kinds of trouble I can get myself into ;)

Thanks again.
Tim
