BlendState in an Effect

3 comments, last by JB3DG 10 years ago

Hi guys

Just got into using the FX style of shaders. I would like to know how to set up the BlendState object so that an alpha of 0 is transparent.

Any ideas?

Thanks

JB


BlendState SrcAlphaBlendingAdd 
{ 
	BlendEnable[0] = TRUE; 
	SrcBlend = SRC_COLOR; 
	DestBlend = INV_SRC_COLOR; 
	BlendOp = ADD; 
	SrcBlendAlpha = ZERO; 
	DestBlendAlpha = ZERO; 
	BlendOpAlpha = ADD; 
	RenderTargetWriteMask[0] = 0x0F; 
}; 

So with blending you have two colors: SrcColor, which is the color output by your pixel shader, and DstColor, which is the color that's currently in the render target. The equation goes like this:

FinalColor.rgb = SrcColor.rgb * SrcBlend [BlendOp] DstColor.rgb * DstBlend;
FinalColor.a = SrcColor.a * SrcBlendAlpha [BlendOpAlpha] DstColor.a * DstBlendAlpha;

So with the settings you've specified, you're doing this:

FinalColor.rgb = SrcColor.rgb * SrcColor.rgb + DstColor.rgb * (1.0f - SrcColor.rgb);
FinalColor.a = SrcColor.a * 0 + DstColor.a * 0;

This is clearly not what you want. Instead you want to multiply the SrcColor by your source alpha value, and your DstColor by the inverse of your source alpha:

FinalColor.rgb = SrcColor.rgb * SrcColor.a + DstColor.rgb * (1.0f - SrcColor.a);

You can achieve this by setting SrcBlend = SRC_ALPHA and DestBlend = INV_SRC_ALPHA.
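
For reference, the whole state block would then look something like this. The alpha-channel factors (ONE/ZERO here, which just writes the source alpha straight through) and the technique/pass names are only placeholders for whatever you already have in your effect; the SetBlendState call in the pass is what actually applies the state, with the second argument being the blend factor and the third the sample mask:

BlendState AlphaBlend
{
	BlendEnable[0] = TRUE;
	SrcBlend = SRC_ALPHA;          // scale the pixel shader output by its alpha
	DestBlend = INV_SRC_ALPHA;     // scale what is already in the target by (1 - source alpha)
	BlendOp = ADD;
	SrcBlendAlpha = ONE;           // pass the source alpha through to the render target
	DestBlendAlpha = ZERO;
	BlendOpAlpha = ADD;
	RenderTargetWriteMask[0] = 0x0F;
};

technique11 Render
{
	pass P0
	{
		// SetVertexShader/SetPixelShader calls go here as usual
		SetBlendState(AlphaBlend, float4(0.0f, 0.0f, 0.0f, 0.0f), 0xFFFFFFFF);
	}
}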

Thanks. I figured it would be that way. However, for some reason the alpha component in my texture is not being respected. The texture is a 32-bit BMP, and all the areas that should be transparent have an alpha of 0.

Here is what the pixel shader looks like:


float4 PS( PS_INPUT input ) : SV_Target
{ 
	float4 color = Map.Sample(linearSampler, input.t);
	color.a = color.a*input.opacity;
	return color;      
}

input.opacity is an extra factor I use to adjust the transparency further when I need to.

Any ideas?

Hmm... perhaps your texture loader isn't handling the alpha correctly for bitmaps. Could you try saving the texture as a .png instead and see if that works?

Here is how I load the texture.


D3DX11_IMAGE_LOAD_INFO rtd;
ZeroMemory(&rtd, sizeof(rtd));
rtd.Width = 1024;
rtd.Height = 1024;
rtd.MipLevels = 1;
rtd.Format = DXGI_FORMAT_B8G8R8A8_UNORM;
rtd.Usage = D3D11_USAGE_DEFAULT;
rtd.BindFlags = D3D11_BIND_SHADER_RESOURCE;
rtd.CpuAccessFlags = D3DX11_DEFAULT;
rtd.MiscFlags = D3DX11_DEFAULT;
rtd.Depth = D3DX11_DEFAULT;
rtd.FirstMipLevel = D3DX11_DEFAULT;
rtd.Filter = D3DX11_DEFAULT;
rtd.MipFilter = D3DX11_DEFAULT;
rtd.pSrcInfo = NULL;

HRSRC Srsrc = FindResource(modinst, MAKEINTRESOURCE(200), MAKEINTRESOURCE(10));
//DWORD temp = GetLastError();
HGLOBAL Simg = LoadResource(modinst, Srsrc);
DWORD size = SizeofResource(modinst, Srsrc);
void* buf = LockResource(Simg);
hr = D3DX11CreateShaderResourceViewFromMemory(dev, buf, size, &rtd, NULL, &srv, NULL);
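
For what it's worth, one quick way to narrow down whether the D3DX bitmap decoder is dropping the alpha is to pass NULL instead of the load-info struct, so D3DX keeps whatever format the file itself carries, and then check the returned HRESULT. This is only a rough sketch reusing the modinst/dev/srv variables and the resource IDs from the snippet above:

HRSRC res = FindResource(modinst, MAKEINTRESOURCE(200), MAKEINTRESOURCE(10));
if (res != NULL)
{
	HGLOBAL img = LoadResource(modinst, res);
	DWORD size = SizeofResource(modinst, res);
	void* data = LockResource(img);

	// NULL load-info: let D3DX infer width/height/format/mips from the file itself
	HRESULT hr = D3DX11CreateShaderResourceViewFromMemory(dev, data, size, NULL, NULL, &srv, NULL);
	if (FAILED(hr))
	{
		// The resource loaded, but D3DX rejected the image data - worth logging hr here.
	}
}

If the texture still comes out fully opaque after that, the decoder is most likely ignoring the BMP's alpha channel altogether (quite a few loaders treat 32-bit BMPs as XRGB), which is exactly what re-exporting the image as a .png would confirm.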

This topic is closed to new replies.
