Can't read stencil values in a shader

3 comments, last by JavaCoolDude 13 years, 11 months ago
Hi, I am running out of ideas on why my stencil values aren't showing up in a test shader I wrote. I am on Windows 7 with a Radeon 5870.

Setup code:

// create texture
D3D11_TEXTURE2D_DESC desc;
desc.Width              = width;
desc.Height             = height;
desc.MipLevels          = 1;
desc.ArraySize          = 1;
desc.Format             = DXGI_FORMAT_R24G8_TYPELESS;
desc.SampleDesc.Count   = 1;
desc.SampleDesc.Quality = 0;
desc.Usage              = D3D11_USAGE_DEFAULT;
desc.BindFlags          = D3D11_BIND_SHADER_RESOURCE | D3D11_BIND_DEPTH_STENCIL;
desc.CPUAccessFlags     = 0;
desc.MiscFlags          = 0;
HRESULT hr = d3d->CreateTexture2D( &desc, NULL, &m_DsTexture.m_Texture );

D3D11_DEPTH_STENCIL_VIEW_DESC dsvd;
dsvd.Format             = DXGI_FORMAT_D24_UNORM_S8_UINT;
dsvd.ViewDimension      = D3D11_DSV_DIMENSION_TEXTURE2D;
dsvd.Flags              = 0;
dsvd.Texture2D.MipSlice = 0;
VERIFY_HR( d3d->CreateDepthStencilView( m_DsTexture.m_Texture, // pResource
                                        &dsvd,                 // pDesc
                                        &m_DsWriteView ));     // ppDepthStencilView

// Create the shader resource view
D3D11_SHADER_RESOURCE_VIEW_DESC descSRV;
descSRV.Format                    = DXGI_FORMAT_X24_TYPELESS_G8_UINT;
descSRV.ViewDimension             = D3D11_SRV_DIMENSION_TEXTURE2D;
descSRV.Texture2D.MipLevels       = 1;
descSRV.Texture2D.MostDetailedMip = 0;
VERIFY_HR( d3d->CreateShaderResourceView( m_DsTexture.m_Texture, &descSRV, &m_DsTexture.m_View ) );

Shader code:

Texture2D<uint> g_StencilTexture : MAKE_TEXTURE_REG( TEX_UNIT_BASE );

float4 PS_Stencil_Test( in PS_Tex0 input ) : SV_TARGET
{
    // m_Tex0 contains unnormalized UV coordinates
    float4 sample = g_StencilTexture.Load( int3( input.m_Tex0.xy, 0 ) );
    return float4( sample.xyz != 0.0.xxx, 1.0 );
}

I did try clearing the stencil values to 0xFF, hoping to get a white output, but still no luck! I modified the code so it shows depth just fine, but stencil is failing me badly.

Abdul
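For reference, a minimal sketch of the stencil clear described above; the ctx immediate-context pointer is an assumed name, not from the original code:

// Clear depth to 1.0 and every stencil texel to 0xFF through the DSV
ctx->ClearDepthStencilView( m_DsWriteView,
                            D3D11_CLEAR_DEPTH | D3D11_CLEAR_STENCIL,
                            1.0f,    // depth clear value
                            0xFF );  // stencil clear value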
What do you know, I just switched my shader's compile profile from ps_4_0 to ps_5_0 and now everything works as I would expect.
Sigh!
What are the differences in the asm code between the shader compiled for 4.0 and for 5.0?
ps_4_0
dcl_resource_texture2d (uint,uint,uint,uint) t6
dcl_input_ps linear v1.xy
dcl_output o0.xyzw
dcl_temps 1
ftoi r0.xy, v1.xyxx
mov r0.zw, l(0,0,0,0)
ld r0.xyzw, r0.xyzw, t6.xyzw
utof o0.xyz, r0.yyyy
mov o0.w, l(1.000000)
ret

vs

ps_5_0
dcl_globalFlags refactoringAllowed
dcl_resource_texture2d (uint,uint,uint,uint) t6
dcl_input_ps linear v1.xy
dcl_output o0.xyzw
dcl_temps 1
ftoi r0.xy, v1.xyxx
mov r0.zw, l(0,0,0,0)
ld_indexable(texture2d)(uint,uint,uint,uint) r0.y, r0.xyzw, t6.xyzw
utof o0.xyz, r0.yyyy
mov o0.w, l(1.000000)
ret
It turns out ps_4_0 is just fine. What happened is that when I switched to ps_5_0, I also started declaring my texture resource as Texture2D<uint4> instead of Texture2D<uint>. Since the stencil value is stored in the y (green) component of the texture under the DXGI_FORMAT_X24_TYPELESS_G8_UINT format, I was originally discarding it.
Sigh!
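
For completeness, a minimal sketch of the corrected shader, assuming the same PS_Tex0 input struct and MAKE_TEXTURE_REG macro from the original post:

// Declaring all four components keeps the green channel, where
// DXGI_FORMAT_X24_TYPELESS_G8_UINT exposes the stencil value.
Texture2D<uint4> g_StencilTexture : MAKE_TEXTURE_REG( TEX_UNIT_BASE );

float4 PS_Stencil_Test( in PS_Tex0 input ) : SV_TARGET
{
    // m_Tex0 contains unnormalized texel coordinates
    uint4 sample = g_StencilTexture.Load( int3( input.m_Tex0.xy, 0 ) );
    float lit = (sample.y != 0) ? 1.0 : 0.0;   // stencil lives in .y
    return float4( lit.xxx, 1.0 );
}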

