Deferred shading, analytically calculate derivatives

Hello, I have read that calculating the derivatives can improve the filtering of VSMs in a deferred renderer. I don't have the slightest idea how this is achieved, though, and I'm having trouble finding info on it. Has anybody done this, and does it only apply if you use mipmaps for the VSMs as well? Any help with this is appreciated.
I tried to gather some more useful information on this, but am still unsure what I actually have to do.

Briefly, the context: I do CSM/(E)VSM with deferred shading. To improve the filtering of the shadow map I want to provide the derivatives myself. Mip-mapping is not important for now.

Am I supposed to calculate the shadow texture coords for the current pixel and neighboring pixels, then compute the derivatives and pass those along with the pixel's texture coords? Or do I need the derivatives of something else entirely? I think I remember that I could store the derivatives in a G-Buffer, but that doesn't really make sense if I need the shadow texture coords, as those change from light to light.

Am I on the right track that I need to sample several pixels myself in order to calculate the derivatives? Which ones do I choose? Will the edge of the render target be a problem?

Would be cool if someone could help me out a bit. Thanks!
Don't know if this can help:

Here you will find a chapter about VSM: http://http.developer.nvidia.com/GPUGems3/gpugems3_ch08.html

Check out section "8.4.2 Biasing".

good job :)
Thanks for the reply. Maybe I should add that my (E)VSM implementation is working properly. It is just that I experience some artifacts in filtering. I hope those can be resolved by "manually calculating the derivatives". The problem is that I have only read this statement and have almost no idea how to proceed.
I'm pretty sure that AndyTX knows what I mean and I hope there might be some more people here that could give me a hand. :)
Quote:Original post by B_old
It is just that I experience some artifacts in filtering.


What kind of filtering are you talking about?

Normally, when using VSM, you use a Gaussian filter to blur the shadow map. Better yet, you can use bilateral filtering.
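For illustration, one horizontal pass of such a separable Gaussian blur over the two VSM moments could look roughly like this; the resource names (gShadowMoments, gLinearClamp, gTexelSize) are made up here, not taken from the thread:

// One horizontal pass of a separable Gaussian blur over the two VSM moments.
Texture2D gShadowMoments;
SamplerState gLinearClamp;
float2 gTexelSize;   // 1.0 / shadow map resolution

float2 BlurMomentsH(float2 uv)
{
    // 1-4-6-4-1 binomial weights, normalized (6/16, 4/16, 1/16)
    static const float w[3] = { 0.375f, 0.25f, 0.0625f };
    float2 sum = w[0] * gShadowMoments.SampleLevel(gLinearClamp, uv, 0).xy;
    for (int i = 1; i < 3; ++i)
    {
        float2 offset = float2(i * gTexelSize.x, 0.0f);
        sum += w[i] * gShadowMoments.SampleLevel(gLinearClamp, uv + offset, 0).xy;
        sum += w[i] * gShadowMoments.SampleLevel(gLinearClamp, uv - offset, 0).xy;
    }
    return sum;   // a second pass with a vertical offset completes the blur
}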
I'm pretty sure my problem is not directly related to the way I filter. (I use a very small kernel right now, so I wouldn't really gain anything from blurring in a separate pass).
Do you happen to know how I could calculate the derivatives, or what I'm supposed to calculate them from?
You can treat it as a geometry problem, but it will take some calculation.

If you know the position and normal of the fragment at the pixel (i,j), this defines a plane. Now consider the two adjacent pixels (i+1,j) and (i,j+1), trace virtual rays from the camera through them, and find the intersections with your plane. This will give you two points in space. Now map those into shadow map space and you have your derivatives.
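A minimal HLSL sketch of what this first step might look like, assuming the current pixel's view-space position and normal come from the G-buffer (the function and parameter names are just placeholders):

// Intersect the camera ray through a neighboring pixel with the current pixel's plane.
// pos/normal: view-space position and normal of the current pixel (i, j).
// neighborDir: normalized view-space direction through pixel (i+1, j) or (i, j+1).
// In view space the camera sits at the origin, so the ray is simply x = t * neighborDir.
float3 IntersectNeighborRay(float3 pos, float3 normal, float3 neighborDir)
{
    // Plane through pos with normal n: dot(n, x) == dot(n, pos)
    float t = dot(normal, pos) / dot(normal, neighborDir);
    return t * neighborDir;
}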

Incidentally, why do you want derivatives if you're not mipmapping? Are you doing aniso without mipmaps?
"Math is hard" -Barbie
Quote:Original post by Pragma
If you know the position and normal of the fragment at the pixel (i,j), this defines a plane. Now consider two adjacent pixels (i+1,j) and (i,j+1) and trace virtual rays from the camera and find the intersection with your plane. This will give you two vectors in space. Now map those into shadow map space and you have your derivatives.

I calculate the plane of my current pixel by using pos and normal.
I trace rays from the camera through the pos of adjacent pixels and compute the intersection with the plane. Those intersections are my derivatives?

Doesn't SampleGrad() expect float2 derivatives in the case of two-dimensional textures?
Quote:Original post by Pragma
Incidentally, why do you want derivatives if you're not mipmapping? Are you doing aniso without mipmaps?

Actually I am. Is your question implying that the derivatives will only make a difference if I use either aniso or mipmaps? The bigger my blur kernel, the stronger the artifacts I get when the shadows fall on an edge. I believe this problem comes from deferred shading. Could that be the cause?
Quote:Original post by B_old
I calculate the plane of my current pixel by using pos and normal.
I trace rays from the camera through the pos of adjacent pixels and compute the intersection with the plane. Those intersections are my derivatives?

Doesn't SampleGrad() expect float2 derivatives in case of two-dimensional textures?


There is one more step. Once you have the intersections, you map them into texture space, giving two float2s. The differences between these and your original texture coordinates are your derivatives.
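Roughly, continuing the sketch from a few posts up (g_shadowTransform and shadowUV are assumed names here; the divide by w only matters for perspective shadow projections):

// Map a view-space intersection point into shadow texture space.
float2 ToShadowUV(float3 viewPos, float4x4 shadowTransform)
{
    float4 p = mul(float4(viewPos, 1.0f), shadowTransform);
    return p.xy / p.w;   // w is 1 for an orthographic (CSM) projection
}

// Derivative of the shadow UV along one screen direction: map the neighbor's
// intersection into shadow texture space and subtract the current pixel's UV.
float2 ShadowUVDerivative(float3 neighborIntersection, float2 shadowUV, float4x4 shadowTransform)
{
    return ToShadowUV(neighborIntersection, shadowTransform) - shadowUV;
}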

Quote:Original post by B_old
Actually I am. Is your question implying that the derivatives will only make a difference if I use either aniso or mipmaps? The bigger my blur kernel, the stronger the artifacts I get when the shadows fall on an edge. I believe this problem comes from deferred shading. Could that be the cause?


Yes, derivatives only make a difference when you have either mipmapping or aniso. If your artifacts go away when you turn off aniso, then the problem could be from deferred shading and manual derivatives could help. If you post a picture I might be able to tell if this is the problem.
"Math is hard" -Barbie
Thanks for the answer, Pragma.
Could you help me verify this code?
//calculate derivatives
//pos_a and pos_b are the neighbor pixel's position
//everything is in viewspace, so the rays come from the origin
float numer = -dot(normal, pos);
float3 dir_a = normalize(pos_a);
float3 dir_b = normalize(pos_b);
float denom_a = dot(normal, dir_a);
float denom_b = dot(normal, dir_b);
float3 int_a = (numer / denom_a) * dir_a;  //this is supposed to be the intersection
float3 int_b = (numer / denom_b) * dir_b;
float4 tmp_a = mul(float4(int_a, 1.f), g_shadowTransforms[index]);
float4 tmp_b = mul(float4(int_b, 1.f), g_shadowTransforms[index]);
float2 da = shadowProj.xy - tmp_a.xy;
float2 db = shadowProj.xy - tmp_b.xy;
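Presumably da and db then feed into the shadow map lookup roughly like this (the texture and sampler names are placeholders, and an anisotropic sampler state is assumed):

// Sample the VSM moments with the analytically computed gradients
// instead of the hardware screen-space derivatives.
float2 moments = gShadowMoments.SampleGrad(gAnisoSampler, shadowProj.xy, da, db).xy;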

When I apply this, the shadows certainly look smoother with anisotropic filtering. On the other hand, the artifacts that I wanted to avoid in the first place look worse.
The bigger my filter kernel, or in this case my aniso, the more shadowed pixels are "unshadowed" near geometry edges. AndyTX told me that it has to do with EVSM, but without the exp() it's all the same. That's why I tried this derivatives stuff.

Here is a picture that shows the artifacts I am talking about.
Deferred VSM filter artifacts

