How the stencil buffer works


Hello, I'm trying to use the stencil buffer, but I'm arriving at the conclusion that I don't quite understand the logic behind it.

When I DrawPrimitive to the back buffer and the depth buffer, assuming ZENABLE is TRUE, the following happens: when the primitive is drawn, the final z-coordinate is checked against the current value in the depth buffer. If it's smaller (meaning it's in front of what was there before), then both the depth and back buffers are updated for the pixel in question.

Now to the stencil buffer. Let's assume I drew something to the stencil buffer. For the sake of argument, let's assume that for each pixel, either all 8 bits are set to 1 or all are set to 0. Now I call DrawPrimitive. What exactly happens? Here are some questions that would probably clarify a lot for me, if somebody would be kind enough to help me out.

1) When I DrawPrimitive, the comparison for the depth buffer is done between what is currently in the depth buffer and the incoming (transformed) z-coordinate. As far as the stencil buffer is concerned, does the incoming primitive I'm drawing figure anywhere in the comparison that the stencil buffer will make? Or is the comparison done 100% based on data already in the stencil buffer? In other words, in order to decide whether the pixel will be updated in the back and depth buffers, is a check made based only upon the value of the pixel in the stencil buffer?

2) If that is the case, then since my stencil buffer has only zeros or only ones, if I don't update it, it will remain as is unless I draw to it again, regardless of the result of the comparison. So, after drawing to my stencil buffer, I set:

device->SetRenderState(D3DRS_STENCILFAIL, D3DSTENCILOP_KEEP);
device->SetRenderState(D3DRS_STENCILPASS, D3DSTENCILOP_KEEP);

In other words, regardless of the result of the comparison, do not update the stencil buffer. Question: if I do this, am I preventing the back and depth buffers from also being updated?

Thanks.

The order of operations goes like this (assuming the stencil test is enabled):

1) Perform stencil test. If failed, perform StencilFail operation and discard pixel. If passed, go to step 2.
2) Perform depth test. If failed, perform StencilZFail operation and discard pixel. If passed, perform StencilPass operation and write pixel color/depth.

So to answer your questions, the stencil test is based solely on the current value in the stencil buffer, the stencil reference value, and the stencil mask. It has to be, since it's done first. The depth test is performed only if the stencil test passes or is disabled.
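To make the ordering concrete, here's a small standalone C++ model of that per-pixel logic. The names are just for illustration (they are not Direct3D types), and for brevity the stencil comparison is fixed at EQUAL and the depth comparison at LESS:

```cpp
#include <cassert>
#include <cstdint>

// Illustrative model of the stencil-then-depth pipeline described above.
enum class Op { Keep, Replace, Zero, Incr, Decr, Invert };

struct StencilState {
    uint8_t ref;            // like D3DRS_STENCILREF
    uint8_t mask;           // like D3DRS_STENCILMASK
    Op fail, zfail, pass;   // like STENCILFAIL / STENCILZFAIL / STENCILPASS
};

static uint8_t apply(Op op, uint8_t cur, uint8_t ref) {
    switch (op) {
        case Op::Keep:    return cur;
        case Op::Replace: return ref;
        case Op::Zero:    return 0;
        case Op::Incr:    return uint8_t(cur + 1);
        case Op::Decr:    return uint8_t(cur - 1);
        case Op::Invert:  return uint8_t(~cur);
    }
    return cur;
}

// Returns true if the pixel's color/depth should be written. Updates the
// stencil value in place according to which test failed or passed.
bool shadePixel(float z, float depthBufZ, uint8_t& stencil, const StencilState& s) {
    if ((s.ref & s.mask) != (stencil & s.mask)) {  // 1) stencil test first
        stencil = apply(s.fail, stencil, s.ref);
        return false;                              // pixel discarded
    }
    if (!(z < depthBufZ)) {                        // 2) then depth test
        stencil = apply(s.zfail, stencil, s.ref);
        return false;
    }
    stencil = apply(s.pass, stencil, s.ref);       // both passed
    return true;                                   // color + depth written
}
```

Note that the incoming primitive contributes nothing to the stencil comparison itself; only the buffer value, the reference, and the mask are involved.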

The DX documentation can provide more details.


Thanks for the quick and clear reply. Maybe you can help me in a follow-up question.

I don't really know how to see exactly what's in the stencil buffer. What I'm trying to do is to do screen-door transparency using a stencil buffer. But I cannot use the alpha channel as I may have to combine screen-door transparency with alpha transparency.

Since all I can see is the end-result - and it's not working... nothing shows up on the screen - I have to keep on guessing where I may have gone wrong.

It would be nice to know that what I wrote to the stencil buffer is correct, so I can go from there.

This is what I'm doing:

1) I have a bmp file for a 32x32 pixel image. In fact, I have two versions of it: in one case, each pixel is 32 bits, with the "on" pixels set to 0xffffffff and the "off" pixels set to 0. In the other case, I have a 1-bit BMP file defined with 2 colors: black and white. In short, all I need from this 32x32 pattern is to know which bits are on and which are off.

2) I create a Texture using CreateTextureFromFile. I then draw the square with the texture applied. Ideally, I would like to use the WRAP option. However, in practice, it doesn't quite work because it would require me to use texture coordinates beyond the maximum for the device. So I just replicate the quadrilateral a few times. I've tested this algorithm on the frame buffer, and the resulting image is exactly what I expect to see. However, how exactly this gets put into the 8 bits of the stencil buffer is a mystery to me, especially since each pixel in the bmp file is 32 bits in depth.

3) When writing to the stencil buffer in step 2 above, I need to set some flags to do this properly. I'm not sure I'm doing this correctly. I want to make sure that I:
- Do NOT change the z buffer
- Do NOT change the back buffer
- DO update the stencil buffer, which has been initially cleared
- DO update the stencil buffer regardless of any other flags I may have set.

Note that all I care about is setting a bit for each "on" pixel in the stencil buffer that I can later use for comparison. I'm using the following render states:

device->SetRenderState(D3DRS_STENCILREF, 0x1);
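More completely, the stencil setup for this fill pass looks something like the following (some of these values are guesses on my part, which is partly why I'm asking):

```cpp
// Fill pass: try to write the pattern into the stencil buffer.
device->SetRenderState(D3DRS_STENCILENABLE, TRUE);
device->SetRenderState(D3DRS_STENCILFUNC, D3DCMP_ALWAYS);        // every pixel passes the stencil test
device->SetRenderState(D3DRS_STENCILREF, 0x1);                   // the value I want written
device->SetRenderState(D3DRS_STENCILPASS, D3DSTENCILOP_REPLACE);
device->SetRenderState(D3DRS_STENCILZFAIL, D3DSTENCILOP_REPLACE);
```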

Are these sufficient to make sure that my textured quads will be written to the stencil buffer? And, if so, are the values optimal?

4) Now that the stencil buffer is set, I need to set render states so that the stencil comparison is enabled, the stencil buffer itself will not be changed, and the stencil comparison will only succeed for the bits that are "on".
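My guess for these test-pass states is something like the following, though I'm not sure it's right:

```cpp
// Test pass: draw normally, but only where the stencil bit was set.
device->SetRenderState(D3DRS_STENCILENABLE, TRUE);
device->SetRenderState(D3DRS_STENCILFUNC, D3DCMP_EQUAL);     // pass only where stencil == ref
device->SetRenderState(D3DRS_STENCILREF, 0x1);
device->SetRenderState(D3DRS_STENCILFAIL, D3DSTENCILOP_KEEP);   // never modify the stencil
device->SetRenderState(D3DRS_STENCILZFAIL, D3DSTENCILOP_KEEP);
device->SetRenderState(D3DRS_STENCILPASS, D3DSTENCILOP_KEEP);
```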

Any suggestions?

Thanks again.


To disable Z-writing, set the D3DRS_ZWRITEENABLE render state to FALSE (and back to TRUE to restore it). Color writing is controlled through D3DRS_COLORWRITEENABLE: set it to 0 to disable writing of all channels, or combine one or more of the channel flags. More information can be found in the SDK documentation.
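For the stencil-fill pass, that would look roughly like this (a sketch, assuming `device` is your `IDirect3DDevice9*`):

```cpp
// Stencil-fill pass: touch neither color nor depth, only stencil.
device->SetRenderState(D3DRS_ZWRITEENABLE, FALSE);
device->SetRenderState(D3DRS_COLORWRITEENABLE, 0);

// ...draw the pattern quads here...

// Restore for normal rendering afterwards.
device->SetRenderState(D3DRS_ZWRITEENABLE, TRUE);
device->SetRenderState(D3DRS_COLORWRITEENABLE,
    D3DCOLORWRITEENABLE_RED | D3DCOLORWRITEENABLE_GREEN |
    D3DCOLORWRITEENABLE_BLUE | D3DCOLORWRITEENABLE_ALPHA);
```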

In order to use a texture to control stencil testing, you also have to utilize alpha testing. Otherwise, there's no way to discriminate between which pixels "make it" to the stencil/depth test - it would just test the entire primitive. There are a bunch of different states associated with alpha testing, but the short of it is that you enable alpha testing (D3DRS_ALPHATESTENABLE), set a reference value (D3DRS_ALPHAREF), and an alpha function (D3DRS_ALPHAFUNC). The output alpha of the pixel shader is compared against the reference value using the supplied comparison function. If the comparison returns true, then the pixel is processed, otherwise it's immediately discarded. So what you can do is set your reference value to 0xFF and the function to D3DCMP_EQUAL, and only pixels with an alpha equal to 0xFF will pass. You can use any format texture you want, as long as the alpha channel contains the "on/off" information (so if it's a single-channel BMP with only a red channel, you use a shader to swizzle it to the alpha channel).
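For your pattern, that boils down to something like this sketch:

```cpp
// Alpha test: only texels whose alpha is exactly 0xFF survive to the
// stencil/depth stage; the rest of the quad's pixels are discarded.
device->SetRenderState(D3DRS_ALPHATESTENABLE, TRUE);
device->SetRenderState(D3DRS_ALPHAREF, 0xFF);
device->SetRenderState(D3DRS_ALPHAFUNC, D3DCMP_EQUAL);
```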

One last thing to be careful of is how you set your stencil operations. It appears that you don't want to write to the Z-buffer, but you still want to perform Z-testing, which makes it suspicious that you'd use REPLACE as the stencil operation on Z-fail (since depth testing is usually done to reject pixels). Maybe you meant this to be KEEP? Also note that if the stencil function is ALWAYS, then setting the StencilFail operation is redundant since it never happens.
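So with that fix, your fill pass would presumably look something like:

```cpp
// Fill pass with KEEP on Z-fail: only pixels that survive both the
// alpha test and the depth test stamp the reference value into stencil.
device->SetRenderState(D3DRS_STENCILENABLE, TRUE);
device->SetRenderState(D3DRS_STENCILFUNC, D3DCMP_ALWAYS);
device->SetRenderState(D3DRS_STENCILREF, 0x1);
device->SetRenderState(D3DRS_STENCILPASS, D3DSTENCILOP_REPLACE);
device->SetRenderState(D3DRS_STENCILZFAIL, D3DSTENCILOP_KEEP);
```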

I hope that helped!
