In a texture I have a square sprite (though it does not take up the whole texture, just a small portion).
I set up my UV coordinates to target just the sprite.
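For reference, this is roughly how I build those UVs (the atlas size and sprite rectangle below are hypothetical numbers, not my real ones; I sample at texel centres, hence the half-texel inset):

```cpp
// Pixel-space rectangle of a sprite inside the atlas (hypothetical values).
struct SpriteRect { float x, y, w, h; };

// Resulting UV rectangle, inset by half a texel on each side so that
// point sampling never lands on the neighbouring texels.
struct UVRect { float u0, v0, u1, v1; };

UVRect ToUV(const SpriteRect& r, float texW, float texH)
{
    UVRect uv;
    uv.u0 = (r.x + 0.5f) / texW;
    uv.v0 = (r.y + 0.5f) / texH;
    uv.u1 = (r.x + r.w - 0.5f) / texW;
    uv.v1 = (r.y + r.h - 0.5f) / texH;
    return uv;
}
```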
My sampler state is set up like this (BorderColor is a four-element array, written out per component):

oSamplerDesc.Filter = D3D11_FILTER_MIN_MAG_MIP_POINT;
oSamplerDesc.AddressU = D3D11_TEXTURE_ADDRESS_WRAP;
oSamplerDesc.AddressV = D3D11_TEXTURE_ADDRESS_WRAP;
oSamplerDesc.AddressW = D3D11_TEXTURE_ADDRESS_WRAP;
oSamplerDesc.MipLODBias = 0.0f;
oSamplerDesc.MaxAnisotropy = 1;
oSamplerDesc.ComparisonFunc = D3D11_COMPARISON_ALWAYS;
oSamplerDesc.BorderColor[0] = 0.0f;
oSamplerDesc.BorderColor[1] = 0.0f;
oSamplerDesc.BorderColor[2] = 0.0f;
oSamplerDesc.BorderColor[3] = 0.0f;
oSamplerDesc.MinLOD = 0;
oSamplerDesc.MaxLOD = D3D11_FLOAT32_MAX;
When rendering the sprite to the screen using an orthographic projection with no rotation, it renders perfectly.
However, if I scale the sprite non-uniformly (it is used as a frame that stretches), the parts of the texture surrounding the sprite appear to get sampled as well.
I drew a coloured border around my square sprite and can see the red bleeding into the edges in the above scenario.
A more detailed explanation...
Here are the pieces for a frame: a bar and a corner. The bar is stretched to fit the frame's dimensions, whereas the corner is not.
Here is the frame rendering in-game. Notice the coloured borders are not visible; they were excluded by the UV coordinates I set.
Now here is exactly the same example, except I've reduced the frame's overall scale. (I've also re-shaped the frame, as that is part of its functionality.)
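My current suspicion, sketched here with hypothetical numbers (the real D3D11 LOD calculation is more involved than this): with MaxLOD left at D3D11_FLOAT32_MAX, shrinking the frame raises the mip level the hardware selects, and in the smaller mip levels the coloured border has already been averaged into the sprite's texels.

```cpp
#include <algorithm>
#include <cmath>

// Rough isotropic mip selection: LOD is about log2 of how many texels
// one screen pixel covers. This is a sketch, not the exact D3D11 rule.
float EstimateLod(float spriteWidthTexels, float drawnWidthPixels)
{
    float texelsPerPixel = spriteWidthTexels / drawnWidthPixels;
    return std::log2(std::max(texelsPerPixel, 1.0f));
}
```

By this estimate, a 16-texel-wide bar drawn 16 pixels wide stays at the base level, but drawn 4 pixels wide it would sample mip 2, two levels of averaging away from my carefully chosen UVs.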
I get the positions for my texture sampling like this...
The squares represent pixels, the blue squares are the pixels of a sprite, and the red circles are the positions I sample at.
Why is there sampling around the sprite when the frame is scaled smaller?