Reflection/refraction map edges

Started by
7 comments, last by george7378 10 years, 10 months ago

Hi everyone,

I've got a little problem with the reflection and refraction textures that I'm using for my water. I use a clipping plane to avoid rendering stuff above the water line for reflection, but I also perturb the texture coordinates of my reflection map to simulate a rippling surface. This means that at the edges, the texture coordinates spill over and I get black showing up:

[attachment=16044:reflectionmapedges.jpg]

Is there an easy way to fix this? I can do it by moving the clip plane down slightly, but this causes things below the water line to be reflected.

Thanks!


Short answer: look at Half-Life 2's water, for example (I know it isn't a recent example - most current games use either cube maps or screen-space reflections - but HL2's water has exactly the same issue visible). From that you can deduce that it's mostly ignored, since players won't notice the effect that much.

Long answer: the perturbation is the problem. Because you're performing it in screen space, you're basically just offsetting into parts of the texture that don't have the info you want. One possible solution would be to store the distance of the y-flipped world from the clipping plane and weight the perturbation based on that distance (clamped, of course) - this might work, but I haven't tried it yet, so it might look weird. The other possible solution is to move the clip plane upwards, but this results in other artifacts.
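For concreteness, here's a rough HLSL-style sketch of that idea (untested; the sampler and variable names are made up, and it assumes the reflection pass has already written the clamped distance of each reflected fragment above the clip plane into the alpha channel of the reflection map):

```hlsl
// Hypothetical pixel-shader snippet: fade the ripple offset out where the
// reflected geometry sits right at the clip plane, so the lookup can't
// wander into the clipped (black) region. All names are illustrative.
sampler2D reflectionMap;   // rgb = reflected scene, a = clamped distance above clip plane
sampler2D rippleNormalMap; // animated ripple normals
float perturbStrength;     // e.g. 0.02

float4 SampleReflection(float2 screenUV, float2 rippleUV)
{
    float2 perturb = (tex2D(rippleNormalMap, rippleUV).rg * 2.0f - 1.0f) * perturbStrength;
    float  weight  = saturate(tex2D(reflectionMap, screenUV).a);
    return tex2D(reflectionMap, screenUV + perturb * weight);
}
```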

My current blog on programming, linux and stuff - http://gameprogrammerdiary.blogspot.com

Hey - thanks for the answer - yes, I knew it was something to do with the sampling coordinates. Ah well, I will try and find a way to fix it, or at least make it less noticeable. It's interesting (but annoying) how each of these algorithms (reflection mapping, shadow mapping, etc...) has its own little flaws which always seem to creep in, no matter what.

Most of the solutions are going to be hacks of some sort ;)

When rendering your reflection map, you can clear alpha to zero, then make sure you always write a non-zero alpha value for any polys that you draw.

Then when sampling the reflection map, you can inspect the alpha component to see if you've gotten a dud value - if so, you can use a backup texture sample (e.g. with zero offset to the screen coords).
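A minimal pixel-shader sketch of that fallback, assuming HLSL and a tex2D-style lookup (untested; names invented for illustration):

```hlsl
// The reflection pass clears alpha to 0 and writes alpha = 1 for everything
// drawn, so alpha == 0 after perturbation means the lookup landed in a
// clipped region.
sampler2D reflectionMap;

float4 SampleWithFallback(float2 screenUV, float2 perturb)
{
    float4 reflected = tex2D(reflectionMap, screenUV + perturb);
    float4 backup    = tex2D(reflectionMap, screenUV);   // zero-offset backup sample

    // Keep the perturbed sample only where it actually hit rendered geometry.
    return (reflected.a > 0.0f) ? reflected : backup;
}
```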

When rendering the water plane, you could also sample the depth buffer to find out how far the ray travels through the water before hitting the surface underneath. Your artefacts will appear in the areas where this distance becomes close to zero. So, you could use this distance to automagically scale down your uv distortion scale at these edges.
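A sketch of the depth-based damping, again hedged: it assumes you can read back a linear view-space depth for the scene under the water and that you know the water surface's own depth at that pixel (how you obtain those depends on your setup; the names here are made up):

```hlsl
// Shrink the uv distortion to zero where the water becomes very shallow,
// i.e. exactly where the edge artefacts appear.
sampler2D sceneDepthMap;   // linear depth of the refracted scene
float     fadeDistance;    // over how much water depth the ripples fade in, e.g. 1.0

float2 ScaleDistortion(float2 baseOffset, float2 screenUV, float waterSurfaceDepth)
{
    float sceneDepth     = tex2D(sceneDepthMap, screenUV).r;
    float waterThickness = max(sceneDepth - waterSurfaceDepth, 0.0f);

    // 0 at the shoreline, 1 once the ray travels through fadeDistance of water.
    return baseOffset * saturate(waterThickness / fadeDistance);
}
```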

I don't understand - why can't you clamp or saturate() the texture coordinates?

The texture that I'm sampling from contains the whole scene, so there's no easy way to tell which parts of the texture correspond to the black parts and which correspond to the 'good' parts.

I assume the black is coming from the texture map; if that's the case, you could use a photo editor to color the black areas to something that matches the tiles a little better. The errors will at least show less until they're properly fixed. Maybe you could come up with a formula that dampens the texture coordinate shift as the coordinates approach the edges of the water's geometry - perhaps one with asymptotes along the outer edges? If you're not worried about older machines, you could load a mask in the vertex shader: if the mask is white with blurred black edges and is multiplied against the texture coordinate shift, it should work as well as some complicated, inefficient formula (see the sketch below).
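A rough sketch of the mask idea, done per-pixel rather than in the vertex shader for simplicity (untested; the mask texture and its UV mapping over the water geometry are assumptions):

```hlsl
// edgeMask is an authored texture mapped over the water plane: white in open
// water, fading to black along the shoreline. Multiplying it into the shift
// stops the offset from spilling past the water's edges.
sampler2D edgeMask;
sampler2D reflectionMap;

float4 SampleMasked(float2 waterUV, float2 screenUV, float2 perturb)
{
    float mask = tex2D(edgeMask, waterUV).r;   // 0 at the edges, 1 in open water
    return tex2D(reflectionMap, screenUV + perturb * mask);
}
```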

Consider it pure joy, my brothers and sisters, whenever you face trials of many kinds, because you know that the testing of your faith produces perseverance. Let perseverance finish its work so that you may be mature and complete, not lacking anything.

I assume the black is coming from the texture map; if that's the case, you could use a photo editor to color the black areas to something that matches the tiles a little better. The errors will at least show less until they're properly fixed.

That's not going to work for a reflection texture built in realtime.

Direct3D has need of instancing, but we do not. We have plenty of glVertexAttrib calls.

Yeah, the refraction map is built in realtime so I can't tweak the pictures manually. I am thinking that the easiest ways are:

- Clear the render targets to a specific colour, and if the sampled colour is equal to the specified colour in the shader, I know that the sampling has overspilled and I should choose another coordinate (see the sketch after this list).

...or...

- Find the depth of each pixel in the refraction/reflection maps and lerp the sampling offset based on the distance of the pixel below the plane. That would kinda work because in real life, if there is more water between you and the object, there will be more refraction.
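For the first option, a minimal sketch of the sentinel-colour check (untested; pure magenta as the clear colour is just an assumption - any colour that can't occur in the scene would do):

```hlsl
// The reflection target is cleared to CLEAR_COLOUR before the reflection pass;
// reading that colour back means the perturbed lookup overspilled.
sampler2D reflectionMap;
static const float3 CLEAR_COLOUR = float3(1.0f, 0.0f, 1.0f);

float4 SampleWithSentinel(float2 screenUV, float2 perturb)
{
    float4 reflected = tex2D(reflectionMap, screenUV + perturb);

    // Fall back to the unperturbed coordinate when we hit the clear colour.
    if (distance(reflected.rgb, CLEAR_COLOUR) < 0.01f)
        reflected = tex2D(reflectionMap, screenUV);

    return reflected;
}
```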
