ZBufferOP

Implementing my own z-Buffer


Recommended Posts

Hi guys,

 

I have some problems understanding the z-buffer and how to interpolate the values between vertices correctly with respect to perspective.

 

What I have done so far is take a vertex in world space (x, y, z) and transform it using my ViewProjection matrix to get coordinates in clip space of the form (x, y, z, w). These coordinates still need to be divided by w.

 

What do I have to use here, z or w? And how do I interpolate it?

 

What I would do is:

 

- normalize (x,y,z,w) => (x/w, y/w, z/w, 1)

- linearly interpolate z/w

 

Would that be correct?

Do I have to divide by 1/w at the end? If not, why not?
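
In code, what I have in mind is roughly this (just a sketch to make the question concrete; the vector/matrix types and function names are placeholders, not any real API):

```cpp
struct Vec4 { float x, y, z, w; };

struct Mat4 {
    float m[4][4];
    // Column-vector convention assumed: result = M * v.
    Vec4 operator*(const Vec4& v) const {
        return {
            m[0][0]*v.x + m[0][1]*v.y + m[0][2]*v.z + m[0][3]*v.w,
            m[1][0]*v.x + m[1][1]*v.y + m[1][2]*v.z + m[1][3]*v.w,
            m[2][0]*v.x + m[2][1]*v.y + m[2][2]*v.z + m[2][3]*v.w,
            m[3][0]*v.x + m[3][1]*v.y + m[3][2]*v.z + m[3][3]*v.w
        };
    }
};

// World space -> clip space -> NDC via the perspective divide.
Vec4 toNdc(const Mat4& viewProjection, const Vec4& worldPos) {
    Vec4 clip = viewProjection * worldPos;            // (x, y, z, w) in clip space
    return { clip.x / clip.w, clip.y / clip.w,
             clip.z / clip.w, 1.0f };                 // z/w is the candidate depth value
}

// The step I am unsure about: plain linear interpolation of z/w between two vertices.
float lerpDepth(float zOverW0, float zOverW1, float t) {
    return zOverW0 + t * (zOverW1 - zOverW0);
}
```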

Interpolating linearly won't do. It's very easy to find information about this on the web.


OpenGL interpolates linearly as well once you are in clip space and have normalized your coordinates, afaik. Your coordinates look like this:

 

(x,y,z,1) where x,y,z are in the range [-1, 1].

 

Any more opinions on this? My impression is that I don't have to do perspective-correct interpolation on z here; linear interpolation is correct since z has already been divided by w.

Really, just do a web search for something like "software rasterizer z interpolation". There is no point in repeating everything here.


Yep, thanks, I am right. You were wrong.

 

After transforming to NDC,

[equation image from the linked article: z in NDC written in terms of the view-space z]

so the depth value can be directly interpolated using z-NDC for the depth test.

 

http://www.altdevblogaday.com/2012/04/29/software-rasterizer-part-2/

 

Conclusion
In this post, the steps to linearly interpolate the vertex in screen space are described. And for rasterizing the depth buffer only (e.g. for occlusion), the depth value can be linearly interpolated directly with the z coordinate in NDC space, which is even simpler.
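
As an illustration of that conclusion, a depth-only triangle fill could look roughly like this, interpolating the NDC z linearly from screen-space barycentric weights. This is my own sketch, not code from the article; it assumes x and y have already been mapped to pixel coordinates and that the depth buffer is cleared to 1.0:

```cpp
#include <algorithm>
#include <vector>

// One vertex after the perspective divide: pixel-space x, y and z in NDC.
struct ScreenVertex { float x, y, zNdc; };

// Edge function (2D cross product) used to get screen-space barycentric weights.
static float edge(const ScreenVertex& a, const ScreenVertex& b, float px, float py) {
    return (px - a.x) * (b.y - a.y) - (py - a.y) * (b.x - a.x);
}

// Depth-only rasterization of one triangle into a z-buffer (cleared to 1.0f = far).
// z is interpolated with a plain linear combination, which is valid for z in NDC.
void rasterizeDepth(const ScreenVertex& v0, const ScreenVertex& v1, const ScreenVertex& v2,
                    std::vector<float>& depthBuffer, int width, int height) {
    float area = edge(v0, v1, v2.x, v2.y);
    if (area == 0.0f) return;                               // degenerate triangle

    int minX = std::max(0, (int)std::min({v0.x, v1.x, v2.x}));
    int maxX = std::min(width  - 1, (int)std::max({v0.x, v1.x, v2.x}));
    int minY = std::max(0, (int)std::min({v0.y, v1.y, v2.y}));
    int maxY = std::min(height - 1, (int)std::max({v0.y, v1.y, v2.y}));

    for (int y = minY; y <= maxY; ++y) {
        for (int x = minX; x <= maxX; ++x) {
            float px = x + 0.5f, py = y + 0.5f;
            float w0 = edge(v1, v2, px, py) / area;         // barycentric weights
            float w1 = edge(v2, v0, px, py) / area;
            float w2 = edge(v0, v1, px, py) / area;
            if (w0 < 0.0f || w1 < 0.0f || w2 < 0.0f) continue;  // outside the triangle

            // Linear interpolation of the NDC z values -- no 1/w involved.
            float z = w0 * v0.zNdc + w1 * v1.zNdc + w2 * v2.zNdc;

            int idx = y * width + x;
            if (z < depthBuffer[idx]) depthBuffer[idx] = z; // depth test, smaller = closer
        }
    }
}
```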

Edited by ZBufferOP


However, we cannot directly interpolate those attributes in screen space, because the projection transform after perspective division is not an affine transformation (i.e. after the transformation, the mid-point of a line segment is no longer the mid-point); this will result in some distortion, and the artifact is even more noticeable when the triangle is large.


But you seem to think you know what you are doing, so go ahead and do linear interpolation and hope something good will happen.
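
For completeness, this is roughly what perspective-correct interpolation of an attribute looks like, the thing you cannot replace with a plain lerp in screen space: interpolate attribute/w and 1/w linearly, then divide at the end. A sketch with made-up names, not code from the article:

```cpp
// Perspective-correct interpolation of one scalar attribute across a triangle.
// Per-vertex inputs: attribute value a*, clip-space w*, and the screen-space
// barycentric weights (b0, b1, b2) of the current pixel.
float interpolatePerspective(float a0, float a1, float a2,
                             float w0, float w1, float w2,
                             float b0, float b1, float b2) {
    // Interpolate attribute/w and 1/w linearly in screen space...
    float aOverW   = b0 * (a0 / w0)   + b1 * (a1 / w1)   + b2 * (a2 / w2);
    float oneOverW = b0 * (1.0f / w0) + b1 * (1.0f / w1) + b2 * (1.0f / w2);
    // ...then divide to recover the perspective-correct attribute value.
    return aOverW / oneOverW;
}
```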
