Hi all, this might seem a stupid question (and I hope it is), but I've been banging my head for a couple of days trying to find a logical reason why my depth test fails.
First off, the source code:
I define the depth texture with these parameters:
glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri (GL_TEXTURE_2D, GL_DEPTH_TEXTURE_MODE_ARB, GL_LUMINANCE);
glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_COMPARE_MODE_ARB, GL_COMPARE_R_TO_TEXTURE_ARB);
glTexParameteri (GL_TEXTURE_2D, GL_TEXTURE_COMPARE_FUNC_ARB, GL_LEQUAL);
glTexImage2D (GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT24, textSize, textSize, 0,
              GL_DEPTH_COMPONENT, GL_FLOAT, 0);
The depth texture itself works great: I saved it as an image just before passing it to the shader, and I can clearly see the depth values in it.
In the shader, after some calculations, I simply do:
vec4 depthTest = shadow2DProj(dTexture, P);
if (depthTest.r > 0.5)
    R.r = max(dot(normalize(N).xyz, normalize(viewDirection)), 0.0);
Here P is the current vertex position in clip space, remapped into viewport space. I used a vec4 so that I can debug the shader in GLSLdevil.
Here's the problem: depthTest.xyzw is ALWAYS equal to vec4(1.0).
I can't understand why. Any suggestions?