Strange rendering with blurred depth texture shader (simple SSAO)

0 comments, last by Muftobration 13 years ago
I'm working on a simple type of SSAO that involves unsharp masking the depth buffer. To verify that I correctly loaded the depth texture (a texture the same size as the screen containing depth values), I created a simple shader that just colors each fragment according to its depth. These two images show a test object I made with the shader off and on:

[Image: Default output, before 4-quadrant fix]

[Image: Depth visualization, before 4-quadrant fix]




That all seems to be working well, so I moved on to unsharp masking the depth texture (the next step). I wrote a shader to blur the depth texture and display the blurred version, but I'm getting a weird artifact that I can't track down. The render only appears in the lower left quadrant:

[Image: Unsharp mask output, before 4-quadrant fix]




Here's the fragment shader that produced the depth visualization (the second image above):


uniform sampler2D depthValues;

void main()
{
    // Convert window-space fragment coordinates to [0, 1] texture coordinates
    gl_FragColor = texture2D(depthValues, gl_FragCoord.xy / 1024.0);
}





And here's the fragment shader that's misbehaving (third image):


uniform sampler2D depthValues;

// 5x5 Gaussian kernel, stored row-major
float[25] gauss = float[25] (0.0030, 0.0133, 0.0219, 0.0133, 0.0030,
                             0.0133, 0.0596, 0.0983, 0.0596, 0.0133,
                             0.0219, 0.0983, 0.1621, 0.0983, 0.0219,
                             0.0133, 0.0596, 0.0983, 0.0596, 0.0133,
                             0.0030, 0.0133, 0.0219, 0.0133, 0.0030);
const float kernelDimension = 5.0;
const float screenDimension = 1024.0;

void main()
{
    vec4 sum = vec4(0.0);
    int iter = 0;
    int i = int(gl_FragCoord.x);
    int j = int(gl_FragCoord.y);
    int maxX = i + int(floor(kernelDimension / 2.0));
    int maxY = j + int(floor(kernelDimension / 2.0));
    float sampX;
    float sampY;

    // Walk the kernel window centered on this fragment
    for (int x = i - int(floor(kernelDimension / 2.0)); x < maxX; x++)
    {
        for (int y = j - int(floor(kernelDimension / 2.0)); y < maxY; y++, iter++)
        {
            sampX = (gl_FragCoord.x + float(x)) / screenDimension;
            sampY = (gl_FragCoord.y + float(y)) / screenDimension;
            if (sampX >= 0.0 && sampX <= 1.0 && sampY >= 0.0 && sampY <= 1.0)
            {
                sum += texture2D(depthValues, vec2(sampX, sampY)) * gauss[iter];
            }
        }
    }

    // Unsharp mask: original depth minus blurred depth
    gl_FragColor = texture2D(depthValues, gl_FragCoord.xy / screenDimension) - sum;
}



The second shader blurs the depth texture with a 5x5 Gaussian kernel (gauss) and subtracts the blurred result from the original depth. If you know why I'm getting these results, please help me solve the issue.
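As a quick sanity check on the kernel arithmetic (plain Python here, just re-stating the shader's numbers, not GLSL): the weights sum to roughly 1.0, as a normalized blur kernel should. Note, though, that loops of the form `x < maxX` over a window from `i - 2` to `i + 2` visit only 4 values per axis, i.e. 16 of the 25 taps:

```python
# Arithmetic check of the 5x5 Gaussian kernel and the loop bounds
# used in the shader above (values copied from the shader).
gauss = [0.0030, 0.0133, 0.0219, 0.0133, 0.0030,
         0.0133, 0.0596, 0.0983, 0.0596, 0.0133,
         0.0219, 0.0983, 0.1621, 0.0983, 0.0219,
         0.0133, 0.0596, 0.0983, 0.0596, 0.0133,
         0.0030, 0.0133, 0.0219, 0.0133, 0.0030]

# A normalized blur kernel should sum to ~1.0
print(sum(gauss))  # ~0.9997

# Count how many kernel taps the shader's loops actually visit
# for an arbitrary fragment at (i, j), with kernelDimension = 5:
i, j, half = 100, 100, 2
taps = 0
for x in range(i - half, i + half):      # x < maxX, as in the shader
    for y in range(j - half, j + half):  # y < maxY, as in the shader
        taps += 1
print(taps)  # 16, not 25
```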
I've solved the quadrant issue. I was sampling the texture at the wrong positions:

sampX = (gl_FragCoord.x + float(x)) / screenDimension;
sampY = (gl_FragCoord.y + float(y)) / screenDimension;

needed to be changed to

sampX = float(x) / screenDimension;
sampY = float(y) / screenDimension;


Since resolving that issue, I've run into another. The unsharp mask should be mostly black, with lighter values only around depth discontinuities. It's roughly right, but instead of being mostly black it's mostly dark gray, and the shade changes as the object moves closer to or farther from the camera.
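One thing worth checking (a small Python model of the shader's arithmetic, not GLSL): over a constant-depth region the blur should return exactly the original depth, so depth minus blur would be 0 (black). But since the loops above only accumulate the first 16 of the 25 `gauss` weights, the blur underestimates by a constant factor, and the leftover residual scales with depth, which would produce exactly this kind of depth-dependent gray:

```python
# Model of the unsharp-mask arithmetic over a constant-depth region.
# If every sampled depth equals d, the blurred value is
# d * (sum of the kernel weights actually applied),
# so the mask value is d * (1 - that sum).
gauss = [0.0030, 0.0133, 0.0219, 0.0133, 0.0030,
         0.0133, 0.0596, 0.0983, 0.0596, 0.0133,
         0.0219, 0.0983, 0.1621, 0.0983, 0.0219,
         0.0133, 0.0596, 0.0983, 0.0596, 0.0133,
         0.0030, 0.0133, 0.0219, 0.0133, 0.0030]

def unsharp_flat(d, taps):
    """Unsharp-mask value for a flat region of depth d, using only
    the first `taps` kernel weights (as the iter counter does)."""
    blurred = d * sum(gauss[:taps])
    return d - blurred

# Loops with `x < maxX` visit 4x4 = 16 taps, so iter stops at 16:
print(unsharp_flat(0.5, 16))  # nonzero: a gray that scales with depth
print(unsharp_flat(0.9, 16))  # a different depth gives a different gray

# With all 25 taps, a flat region comes out ~0 (black), as intended:
print(unsharp_flat(0.5, 25))
```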

[Image: Averaging over 20 neighbors, buggy shader, farther view]

[Image: Averaging over 20 neighbors, buggy shader]

This topic is closed to new replies.
