Problems with VSM implementation in DirectX

2 comments, last by MJP 16 years, 6 months ago
Hello all, I have been playing lately with variance shadow maps. To improve shadow quality I split the scene around the camera into 4 areas, and in my pixel shader I choose the right split by computing the distance from the center of the screen to the current shadow pixel (I think the technique is called cascaded shadow mapping). I have implemented this method in both OpenGL and DirectX, but in DirectX I get these weird seams when I transition from one split area to another. The shaders are identical. If I turn off VSM the problem disappears, so I don't know whether it is a DirectX issue or a VSM one. Here is a screenshot of the problem: Any insight is highly appreciated.
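For context, the split selection in the pixel shader goes roughly like this (a simplified sketch, not my exact code; the names and the exact distance metric are illustrative):

// Illustrative sketch of per-pixel cascade selection (not the exact shader).
// splitDistances.x..w hold the far boundary of each of the 4 splits.
float4 splitDistances;
float4x4 shadowMatrix[4];   // world -> shadow-map texture space, one per split

float4 ComputeShadowCoord(float3 worldPos, float pixelDistance, out int split)
{
    // Pick the first split whose far boundary lies beyond this pixel.
    split = 3;
    if (pixelDistance < splitDistances.x)      split = 0;
    else if (pixelDistance < splitDistances.y) split = 1;
    else if (pixelDistance < splitDistances.z) split = 2;

    // Project into that split's shadow map; the VSM lookup samples at this coordinate.
    return mul(float4(worldPos, 1.0f), shadowMatrix[split]);
}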
2+2=5, for big values of 2!
You have to make sure that when you're transforming screen coordinates to texture coordinates you account for D3D9's half-pixel offset, i.e. shift the coordinates by half a texel (0.5 / textureDimensions); pixel centers and texel centers don't line up in D3D9 the way they do in OpenGL.
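A rough HLSL sketch of what I mean when projecting into the shadow map (the uniform and function names here are just placeholders, not anything from your code):

// Sketch of the D3D9 half-texel adjustment when building shadow-map UVs.
// shadowMapSize is an assumed uniform holding the shadow map's resolution.
float shadowMapSize;

float2 ShadowMapUV(float4 shadowClipPos)
{
    float2 uv = shadowClipPos.xy / shadowClipPos.w;   // clip space, -1..1
    uv = uv * float2(0.5f, -0.5f) + 0.5f;             // to texture space, 0..1
    uv += 0.5f / shadowMapSize;                       // half-texel offset (D3D9 only)
    return uv;
}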
Thank you for your suggestion, but it didn't work. The shadows are filtered bilinearly, so there is no need to map pixels to texels directly.
2+2=5, for big values of 2!
Quote: Original post by Dizzy_exe
Thank you for your suggestion, but it didn't work. The shadows are filtered bilinearly, so there is no need to map pixels to texels directly.


Ahh, that's too bad. That's just what immediately pops into my head when people have trouble with OpenGL -> DX conversions.

This topic is closed to new replies.
