Soft water edges

Started by
22 comments, last by Kaptein 9 years, 6 months ago

Hello. Can someone explain how to do soft water edges, or link me to a tutorial? I also tried to do this myself, but it doesn't work. The only thing that happens is that water fragments farther from the camera are more transparent, and fragments near the camera are less transparent (see image below). If someone could help me, here is the fragment shader subroutine I currently use while rendering water:


layout(binding = 1) uniform sampler2D waterReflection;
layout(binding = 2) uniform sampler2D waterRefraction;
layout(binding = 3) uniform sampler2D normalMap;
layout(binding = 4) uniform sampler2D dudvMap;
layout(binding = 5) uniform sampler2D sceneDepth;
layout(binding = 6) uniform sampler2D waterDepth;

in vec2 TexCoord;

uniform vec4 WaterColor;
uniform float kShine;
uniform float kDistortion;
uniform float kRefraction;

subroutine(RenderPassType)
void drawWater()
{
	vec4 distOffset = texture2D(dudvMap, NormalUV) * kDistortion;
	vec4 dudvColor = texture2D(dudvMap, RefractionUV + distOffset.xy);
	dudvColor = normalize(dudvColor * 2.0 - 1.0) * kRefraction;
	
	vec4 normalVector = texture2D(normalMap, RefractionUV + distOffset.xy);
	normalVector = normalVector * 2.0 - 1.0;
	normalVector.a = 0.0;
	
	vec4 lightReflection = normalize(reflect(-LightTangentSpace, normalVector));
	
	vec4 fresnelTerm = vec4(dot(normalVector, lightReflection));
	vec4 invertedFresnel = 1.0 - fresnelTerm;
	
	vec4 ViewCoordY = ViewCoord;
	ViewCoordY.y = -ViewCoordY.y;
	vec4 projCoord = ViewCoordY / ViewCoord.q;
	projCoord = (projCoord + 1.0) * 0.5;
	projCoord += dudvColor;
	projCoord = clamp(projCoord, 0.001, 0.999);
	vec4 reflectionColor = texture2D(waterReflection, projCoord.xy);
	
	projCoord = ViewCoord / ViewCoord.q;
	projCoord = (projCoord + 1.0) * 0.5;
	projCoord += dudvColor;
	projCoord = clamp(projCoord, 0.001, 0.999);
	vec4 refractionColor = texture2D(waterRefraction, projCoord.xy);
	float dColor = texture2D(sceneDepth, projCoord.xy).x;
	vec4 depthValue = vec4(dColor, dColor, dColor, 1.0); //Scene depth
	
	vec4 invDepth = 1.0 - depthValue;
	refractionColor *= invertedFresnel * invDepth;
	refractionColor += WaterColor * depthValue * invertedFresnel;
	
	reflectionColor *= fresnelTerm;
	
	vec4 localView = normalize(ViewTangentSpace);		
	float intensity = max(0.0, dot(lightReflection, localView));
	vec4 specular = vec4(pow(intensity, kShine));
	
	vec4 color = refractionColor + reflectionColor + specular;
	
	dColor = texture2D(waterDepth, projCoord.xy).x;
	vec4 vWaterDepth = vec4(dColor, dColor, dColor, 1.0);
	
	color.a = (depthValue.x - vWaterDepth.x) * 2.5; //Subtract water depth from scene depth and scale the result
	
	FragColor = color;
}

And here is an image of how it looks now: http://imageshack.com/a/img540/1930/Dfkzrr.jpg I have actually seen somewhere that I need to subtract the scene's depth value from the water's depth value, but I don't know which texture coordinates I should use when reading the depth value from the depth texture with texture2D. I have several choices: projCoord.xy, gl_FragCoord.xy, TexCoord, and maybe more.

https://www.opengl.org/discussion_boards/showthread.php/177986-Confused-About-gl_FragCoord-use-with-Textures

Or, simpler: vec2 coord = gl_FragCoord.xy / screenSize.xy;

Where screenSize is input by you. You can also use textureSize(waterDepth, 0).

Keep in mind that textureSize(downscaledTexture, 0) will not work; it needs to be the size of the screen. You sometimes want to adjust for a half-pixel offset, but most of the time it's not important at all and just adds extra calculations where none are needed.
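A minimal sketch of the screen-space lookup described above, assuming the application uploads a `ScreenSize` uniform holding the framebuffer size in pixels:

```glsl
uniform vec2 ScreenSize; // framebuffer size in pixels, set by the application

// gl_FragCoord.xy is in window (pixel) coordinates, so dividing by the
// framebuffer size gives [0, 1] UVs that line up with a full-screen
// depth texture of the same resolution.
vec2 screenUV = gl_FragCoord.xy / ScreenSize;
float sceneZ = texture2D(sceneDepth, screenUV).x;
```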

Subtracting and scaling is correct. Make it [0, 1] if possible, and use that for various things like wave-height and darkness.

So now, after modifying my code, the alpha calculation looks like this:


vec2 coord = gl_FragCoord.xy / ScreenSize;
dColor = texture2D(waterDepth, coord).x;
float vWaterDepth = dColor;
float vSceneDepth = depthValue.x;
	
color.a = clamp((vSceneDepth - vWaterDepth) * 10.0, 0.0, 1.0); //Subtract water depth from scene depth and scale the result

And the result is almost the same: the edges look good, but when I move my camera away from the water, it's just not visible (alpha = 0). Here are two images of how it looks, when the camera is close to the water edge: http://imageshack.com/a/img912/1438/jDQpGJ.jpg and when the camera moves away from the water: http://imageshack.com/a/img909/8340/WJ4QwI.jpg

P.s. For reading the depth value from the scene's depth texture, I'm not using the coordinates in the coord variable; I'm still using projCoord.xy, because whether I use coord or projCoord.xy, it doesn't change anything.

You should already have waterDepth in linear space. It should be the value of length(mvpos.xyz).

vWaterDepth needs to be linearized too. They are not in the same coordinate space, and they both need to be linear or this will not work.

Googling for an example (where f = far, and n = near): http://www.ozone3d.net/blogs/lab/20090206/how-to-linearize-the-depth-value/
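A sketch of that linearization as a GLSL helper; the name `linearizeDepth` is made up here, and `near`/`far` must match the projection used when the depth texture was rendered. This variant folds the [0, 1] → NDC remap into the constants and returns eye-space depth directly:

```glsl
uniform float near; // e.g. 0.1, must match the depth-pass projection
uniform float far;  // e.g. 1000.0

// Recovers eye-space depth from a [0, 1] depth-buffer sample.
// Divide by `far` if you want a normalized value instead.
float linearizeDepth(float zBuf)
{
    return (near * far) / (far - zBuf * (far - near));
}
```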

If waterDepth is in linear space, why should I linearize vWaterDepth? vWaterDepth is just the value sampled from the water depth texture (waterDepth); isn't it the same?:


vec2 coord = gl_FragCoord.xy / ScreenSize;
float vWaterDepth = texture2D(waterDepth, coord).x;

What? Why do you subtract a depth-value from a color?

You need to calculate the depth of the water fragment too, otherwise you have no way to know the actual depth of the water. Like I wrote, you need the length of the eye vector.

Assuming this is what you call ViewCoord, then waterDepth = length(ViewCoord.xyz), and sceneDepth = linear(depthFromTexture)

Subtract them, scale and saturate. Form color.
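Those steps, as one hedged sketch (`EdgeSoftness` and the eye-space linearization here are illustrative choices, not something fixed by the thread):

```glsl
// Eye-space depth of the water surface at this fragment
// (ViewCoord is assumed to be ModelViewMatrix * vertex position).
float waterZ = length(ViewCoord.xyz);

// Eye-space depth of the scene behind the water. The linearization
// must return the same units as waterZ (eye-space distance), not a
// normalized [0, 1] value; n and f are the near/far plane distances.
vec2 screenUV = gl_FragCoord.xy / ScreenSize;
float zBuf = texture2D(sceneDepth, screenUV).x;
float sceneZ = (n * f) / (f - zBuf * (f - n));

// Thin water (small depth difference) fades out; deep water stays opaque.
float EdgeSoftness = 2.5; // tuning constant, in world units
color.a = clamp((sceneZ - waterZ) / EdgeSoftness, 0.0, 1.0);
```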

So water depth equals length(ModelViewMatrix * VertexPosition)? Okay, I did this; then I linearize the depth value read from the scene depth texture:


const float f = 1000.0;
const float n = 0.1;
	
vec2 coord = gl_FragCoord.xy / ScreenSize;
float vWaterDepth = length(MVPos.xyz);
float vSceneDepth = (2 * n) / (f + n - texture2D(sceneDepth, coord).x * (f - n));
		
color.a = (vSceneDepth - vWaterDepth) * 1.0; //Subtract scene depth from water depth and scale the result

And then I can't even see the water; it's invisible.

Are you just fading out the water edge itself, or are you blending the edges with the terrain?

fading out water edges

How is the sceneDepth texture generated? Isn't this already linear?

Why are you taking the length of the water vector? You only need the Z component.
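One likely mismatch in the snippet above (a guess, not confirmed in the thread): the linearization formula used there returns a roughly normalized [0, 1] value, while length(MVPos.xyz) is an eye-space distance, so the subtraction mixes units and alpha goes strongly negative. Keeping both values in eye-space units, and using only the Z component as suggested, might look like:

```glsl
const float f = 1000.0;
const float n = 0.1;

vec2 coord = gl_FragCoord.xy / ScreenSize;

// Eye space looks down -Z, so -MVPos.z is the depth of the water fragment.
float vWaterDepth = -MVPos.z;

// Eye-space depth from the [0, 1] depth-buffer sample
// (the [0, 1] -> NDC remap is folded into the constants).
float zBuf = texture2D(sceneDepth, coord).x;
float vSceneDepth = (n * f) / (f - zBuf * (f - n));

// Subtract, scale, and saturate; the scale factor is a tuning value.
color.a = clamp((vSceneDepth - vWaterDepth) * 1.0, 0.0, 1.0);
```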
