I've tried to implement screen-space reflections and it seems to work but only at a very tight reflection angle:
[attachment=17858:giboxssr0.jpg]
Here's what happens when I increase the camera's angle to the surface:
[attachment=17857:giboxssr1.jpg]
And as the image below shows, when I face the surface head-on, it just renders the entire scene from the camera's point of view instead of a reflection:
[attachment=17856:giboxssr2.jpg]
I understand that SSR can only reflect what the camera sees, so at grazing angles to a surface the reflection is expected to disappear. In my case, though, it only works at very tight angles, and I've seen implementations that still show a correct reflection at much larger angles.
Here's my code:
vec4 bColor = vec4(0.0);
float reflDist = 0.0;
vec3 screenSpacePos;
E = normalize(camPos - gsout.worldPos.xyz);
reflDir = normalize(reflect(-E, bumpN));
float currDepth = 0.1;
for (int i = 0; i < 20; i++)
{
    // March along the reflection ray in world space
    vec3 samplePos = gsout.worldPos.xyz + reflDir * reflDist;
    vec4 clipSpace = proj * view * vec4(samplePos, 1.0);
    vec3 NDCSpace = clipSpace.xyz / clipSpace.w;
    screenSpacePos = 0.5 * NDCSpace + 0.5;

    // Depth of the scene at this screen position (stored in reflTex.w)
    float sampleDepth = texture(reflTex, screenSpacePos.xy).w;
    currDepth = clipSpace.z; // same value as (proj*view*vec4(samplePos, 1.0)).z

    float diff = currDepth - sampleDepth;
    if (diff < 0.0)
        bColor.xyz = texture(reflTex, screenSpacePos.xy).xyz;

    reflDist += 0.1;
}
reflTex stores the screen-space rendering of the scene in its xyz components and length(camPos.xyz - worldPos.xyz) in its w component, in a 32-bit floating-point texture.
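In case it helps, here's roughly how I fill reflTex in the earlier pass (a sketch: `shade()` and the `fsin` varying block stand in for my actual shading code and inputs):

```glsl
// Fragment shader for the pass that writes reflTex (RGBA32F target).
layout(location = 0) out vec4 outColor;

uniform vec3 camPos;
in VSOut { vec3 worldPos; } fsin;  // placeholder for my actual varyings

void main()
{
    vec3 sceneColor = shade();                         // placeholder shading
    float camDist = length(camPos - fsin.worldPos);    // world-space distance
    outColor = vec4(sceneColor, camDist);              // color in xyz, distance in w
}
```

So the w channel holds a world-space distance from the camera, not a post-projection depth.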
Would anyone be able to give me some tips on what I may be doing wrong?