[SOLVED] SamplerCubeShadow, strange texture results

Started by Fnord42 · 0 comments, last by Fnord42 12 years, 4 months ago
Hi there,

I'm currently trying to optimize my shadow-mapping shaders.
I'm using a deferred renderer with shadow cubemaps; my system is Linux with the proprietary drivers for my ATI Radeon HD 5700.
What I'm trying to do is use a samplerCubeShadow in my lighting shader to get (at least slightly) smoothed shadows.
As I've read here, a texture lookup through a shadow sampler should return the shadow factor as its result.
I've also read there that, if the texture has linear filtering enabled, the behaviour of a shadow-sampler lookup is implementation-dependent, but it should always return a value in the range [0, 1] that is proportional to the number of samples in the shadow texture that pass the comparison.
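
To make sure I understand that behaviour correctly: with linear filtering I expect the lookup to do roughly the equivalent of the following manual comparison (a conceptual sketch only, written against a plain samplerCube standing in for the same depth data without the compare mode; the function name and offsets are made up for illustration):

uniform samplerCube depthCube; // illustrative: the depth cubemap, sampled without comparison

float expectedShadowFactor(vec3 dir, float referenceDepth)
{
    // Arbitrary tiny offsets, just to illustrate "several nearby samples"
    vec2 offsets[4] = vec2[4](vec2(-1.0, -1.0), vec2( 1.0, -1.0),
                              vec2(-1.0,  1.0), vec2( 1.0,  1.0));
    float passed = 0.0;
    for (int i = 0; i < 4; ++i)
    {
        float storedDepth = texture(depthCube, dir + vec3(offsets[i] * 0.001, 0.0)).r;
        // GL_LEQUAL: a sample passes if the reference depth is not behind the stored depth
        if (referenceDepth <= storedDepth)
            passed += 1.0;
    }
    return passed * 0.25; // fraction of passing samples, in [0, 1]
}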

My results, however, are different.
If I do a texture lookup on my samplerCubeShadow without linear filtering, I get the depth stored in the shadow map as the result.
If I do the lookup with linear filtering, I get a constant 0 or 1 (depending on GL_TEXTURE_COMPARE_FUNC).

The depth values in the shadow cubemap and the depth values I compare them against both look fine.

This is how I initialize the shadow-texture:
glBindTexture(GL_TEXTURE_CUBE_MAP, m_shadowCubemapTextureId);
glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_WRAP_R, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_COMPARE_MODE, GL_COMPARE_R_TO_TEXTURE);
glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_COMPARE_FUNC, GL_LEQUAL);

for (char face = 0; face < 6; face++)
{
    glTexImage2D(GL_TEXTURE_CUBE_MAP_POSITIVE_X + face, 0, GL_DEPTH_COMPONENT32,
                 size, size, 0, GL_DEPTH_COMPONENT, GL_FLOAT, 0);
}


The shaders that create my shadow map compute their own depth for a world-space direction as follows:
// ===== Vertexshader =====
vec3 lightToPos = wsPosition.xyz - wsLightPosition;
vDepth = dot(lightToPos.xyz, lightToPos.xyz) / squaredLightRadius;
gl_Position = lightVPMatrix * wsPosition;

// ===== Fragmentshader =====
gl_FragDepth = vDepth;

The light projection matrix in lightVPMatrix goes from z=1 to z=lightRadius.
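
For completeness, here are those depth-pass shaders with the declarations filled in (a sketch; the attribute/uniform declarations are how I assume them here and may not match my project line for line):

// ===== Vertexshader (sketch) =====
attribute vec4 wsPosition;        // world-space vertex position
uniform vec3 wsLightPosition;
uniform float squaredLightRadius;
uniform mat4 lightVPMatrix;
varying float vDepth;             // normalized squared distance to the light

void main(void)
{
    vec3 lightToPos = wsPosition.xyz - wsLightPosition;
    vDepth = dot(lightToPos, lightToPos) / squaredLightRadius;
    gl_Position = lightVPMatrix * wsPosition;
}

// ===== Fragmentshader (sketch) =====
varying float vDepth;

void main(void)
{
    gl_FragDepth = vDepth;        // store the normalized squared distance instead of window z
}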

The shaders that draw the lighting for a light with its shadow map compute the depth of the current position in the same way (in the fragment shader this time, because of the deferred rendering) and compare it with the depth from the shadow map:
// ===== Fragmentshader =====
uniform sampler2D texDepth;
uniform samplerCubeShadow texShadowCubemap;
...
varying out vec4 frag;

void main(void)
{
    // Reconstruct viewspace-position from depth
    vec3 vsPosition = ...

    vec3 wsLightDir = invNormal * (vsPosition - vsLightPosition);
    float wsDepth = dot(wsLightDir, wsLightDir) / squaredLightRadius;
    float shadowFactor = texture(texShadowCubemap, vec4(wsLightDir, 1), wsDepth - 0.0005);
    // frag = vec4(vec3(shadowFactor), 1); return;
    ...
}


Does anybody know what the problem might be?

Thanks for your time!

Solved: I was abusing the GLSL texture() function...

I did
float shadowFactor = texture(texShadowCubemap, vec4(wsLightDir, 1), wsDepth - 0.0005);
instead of
float shadowFactor = texture(texShadowCubemap, vec4(wsLightDir, wsDepth - 0.0005));

For a samplerCubeShadow the reference depth has to go into the fourth component of the coordinate vector; the optional third parameter of texture() is an LOD bias, not the comparison value, so my lookup was always comparing against 1.0.

Now it works with linear filtering.
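
In case it helps anyone else, this is how I read the relevant texture() overloads for samplerCubeShadow now (the comments paraphrase the GLSL spec; the usage line is just the fixed call from above):

// float texture(samplerCubeShadow sampler, vec4 P);             // P.xyz = direction, P.w = reference depth
// float texture(samplerCubeShadow sampler, vec4 P, float bias); // the optional float is an LOD bias, not a depth offset
float shadowFactor = texture(texShadowCubemap, vec4(wsLightDir, wsDepth - 0.0005));

With linear filtering enabled, shadowFactor is now the filtered comparison result in [0, 1] instead of a constant.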

This topic is closed to new replies.
