I am making a 2D game with D3D11 and implementing my own renderer, which works like this:
1. A sprite has a texture, a size, and a position
2. To draw the sprite, I build a rectangle that reflects the sprite's screen position and size and map it into the vertex buffer
3. The vertex shader just applies an orthographic projection to the rectangle's coordinates
4. In the pixel shader, I sample the sprite's texture and output it as the color
Let's say I am drawing a triangle. The texture is square, but the part that does not contain the triangle has alpha = 0. I expected that only the triangle itself would be drawn and that the alpha = 0 region of the texture would be skipped, but that is not the case: the whole square is drawn, and the transparent area shows up as white. I think I am missing something important, but I'm not sure what it is.
The textures are loaded and stored in DDS format, if that makes any difference. Thanks.
// PIXEL SHADER
Texture2D shaderTexture;
SamplerState texSampler;
struct PixelShaderInput
{
    float4 position : SV_POSITION;
    float2 texCoord : TEXCOORD;
};

float4 main(PixelShaderInput input) : SV_TARGET
{
    // Sample the sprite's texture and output it directly as the pixel color.
    float4 textureColor = shaderTexture.Sample(texSampler, input.texCoord);
    return textureColor;
}