Soft water edges

Hello. Can someone explain to me how to do soft water edges, or give me a link to a tutorial? I tried to do this myself, but nothing happens. The only thing that happens is that water fragments farther from the camera are more transparent, and fragments near the camera are less transparent (see image below). If someone could help, here is the fragment shader subroutine I currently use when rendering water:

layout(binding = 1) uniform sampler2D waterReflection;
layout(binding = 2) uniform sampler2D waterRefraction;
layout(binding = 3) uniform sampler2D normalMap;
layout(binding = 4) uniform sampler2D dudvMap;
layout(binding = 5) uniform sampler2D sceneDepth;
layout(binding = 6) uniform sampler2D waterDepth;

in vec2 TexCoord;

uniform vec4 WaterColor;
uniform float kShine;
uniform float kDistortion;
uniform float kRefraction;

subroutine(RenderPassType)
void drawWater()
{
	vec4 distOffset = texture2D(dudvMap, NormalUV) * kDistortion;
	vec4 dudvColor = texture2D(dudvMap, RefractionUV + distOffset.xy);
	dudvColor = normalize(dudvColor * 2.0 - 1.0) * kRefraction;
	
	vec4 normalVector = texture2D(normalMap, RefractionUV + distOffset.xy);
	normalVector = normalVector * 2.0 - 1.0;
	normalVector.a = 0.0;
	
	vec4 lightReflection = normalize(reflect(-LightTangentSpace, normalVector));
	
	vec4 fresnelTerm = vec4(dot(normalVector, lightReflection));
	vec4 invertedFresnel = 1.0 - fresnelTerm;
	
	vec4 ViewCoordY = ViewCoord;
	ViewCoordY.y = -ViewCoordY.y;
	vec4 projCoord = ViewCoordY / ViewCoord.q;
	projCoord = (projCoord + 1.0) * 0.5;
	projCoord += dudvColor;
	projCoord = clamp(projCoord, 0.001, 0.999);
	vec4 reflectionColor = texture2D(waterReflection, projCoord.xy);
	
	projCoord = ViewCoord / ViewCoord.q;
	projCoord = (projCoord + 1.0) * 0.5;
	projCoord += dudvColor;
	projCoord = clamp(projCoord, 0.001, 0.999);
	vec4 refractionColor = texture2D(waterRefraction, projCoord.xy);
	float dColor = texture2D(sceneDepth, projCoord.xy).x;
	vec4 depthValue = vec4(dColor, dColor, dColor, 1.0); //Scene depth
	
	vec4 invDepth = 1.0 - depthValue;
	refractionColor *= invertedFresnel * invDepth;
	refractionColor += WaterColor * depthValue * invertedFresnel;
	
	reflectionColor *= fresnelTerm;
	
	vec4 localView = normalize(ViewTangentSpace);		
	float intensity = max(0.0, dot(lightReflection, localView));
	vec4 specular = vec4(pow(intensity, kShine));
	
	vec4 color = refractionColor + reflectionColor + specular;
	
	dColor = texture2D(waterDepth, projCoord.xy).x;
	vec4 vWaterDepth = vec4(dColor, dColor, dColor, 1.0);
	
	color.a = (depthValue.x - vWaterDepth.x) * 2.5; //Subtract the water depth from the scene depth and scale the result
	
	FragColor = color;
}

And here is an image of how it looks now: http://imageshack.com/a/img540/1930/Dfkzrr.jpg I have actually seen somewhere that I need to subtract the scene's depth value from the water depth value, but I don't know what texture coordinates I should use when reading the depth texture with texture2D. I have several choices: projCoord.xy, gl_FragCoord.xy, TexCoord, and maybe more.

https://www.opengl.org/discussion_boards/showthread.php/177986-Confused-About-gl_FragCoord-use-with-Textures

 

Or, simpler: vec2 coord = gl_FragCoord.xy / screenSize.xy;

where screenSize is a uniform you supply yourself. You can also use textureSize(waterDepth, 0).

 

Keep in mind that textureSize(downscaledTexture) will not work; it needs to be the size of the screen. You sometimes want to adjust for a half-pixel offset, but most of the time it's not important at all and just adds extra calculations where none are needed.
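A minimal sketch of both options (an assumption: a GLSL 1.30+ context, since textureSize takes the sampler plus a mip level; screenSize is a uniform you set from the host):

#version 330 core

uniform sampler2D waterDepth;
uniform vec2 screenSize; // framebuffer size, set from the host

out vec4 FragColor;

void main()
{
	// Option 1: divide by the screen-size uniform you supply yourself
	vec2 coord = gl_FragCoord.xy / screenSize;

	// Option 2: derive the size from the (full-screen) texture itself
	vec2 coordAlt = gl_FragCoord.xy / vec2(textureSize(waterDepth, 0));

	// visualize the sampled depth as grayscale
	float d = texture(waterDepth, coord).x;
	FragColor = vec4(vec3(d), 1.0);
}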

 

Subtracting and scaling is correct. Map the result into [0, 1] if possible, and use that for various things like wave height and darkness.

So now, after modifying my code, the alpha calculation looks like this:

vec2 coord = gl_FragCoord.xy / ScreenSize;
dColor = texture2D(waterDepth, coord).x;
float vWaterDepth = dColor;
float vSceneDepth = depthValue.x;

color.a = clamp((vSceneDepth - vWaterDepth) * 10.0, 0.0, 1.0); //Subtract the water depth from the scene depth and scale the result

And the result is almost the same: the edges look good, but when I move the camera away from the water, it just isn't visible (alpha = 0). Here are two images: when the camera is close to the water edge: http://imageshack.com/a/img912/1438/jDQpGJ.jpg and when the camera moves away from the water: http://imageshack.com/a/img909/8340/WJ4QwI.jpg

 

P.S. For getting the depth value from the scene's depth texture I'm not using the coordinates in the coord variable; I'm still using projCoord.xy, because whether I use coord or projCoord.xy, it doesn't change anything.

You should already have waterDepth in linear space. It should be the value of length(mvpos.xyz).

vWaterDepth needs to be linearized too. They are not in the same coordinate space, and they both need to be linear or this will not work.

Googling for an example (where f = far, and n = near): http://www.ozone3d.net/blogs/lab/20090206/how-to-linearize-the-depth-value/
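That formula as a GLSL helper (a sketch; linearizeDepth is a hypothetical name, and the near/far constants are assumptions that must match your projection matrix):

const float n = 0.1; // assumed near plane
const float f = 1000.0; // assumed far plane

// Map an exponential depth-buffer value into a roughly linear [0, 1] range
float linearizeDepth(float z)
{
	return (2.0 * n) / (f + n - z * (f - n));
}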

If waterDepth is in linear space, why should I linearize vWaterDepth? vWaterDepth is just the value sampled from the water depth texture (waterDepth); isn't that the same thing?

vec2 coord = gl_FragCoord.xy / ScreenSize;
float vWaterDepth = texture2D(waterDepth, coord).x;
What? Why do you subtract a depth value from a color?

You need to calculate the depth of the water fragment too, otherwise you have no way to know the actual depth of the water. Like I wrote, you need the length of the eye vector.

Assuming this is what you call ViewCoord, then waterDepth = length(ViewCoord.xyz) and sceneDepth = linear(depthFromTexture).

Subtract them, scale and saturate. Form the color.
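A sketch of that recipe with the names used in this thread (linearize() stands for whatever depth linearization you use, and depthScale is a hypothetical tuning constant):

float waterDist = length(ViewCoord.xyz); // eye-space distance to the water fragment
float sceneDist = linearize(texture2D(sceneDepth, coord).x); // linearized scene depth behind it

// subtract, scale, saturate: the deeper the water, the more opaque it gets
color.a = clamp((sceneDist - waterDist) * depthScale, 0.0, 1.0);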

So water depth equals length(ModelViewMatrix * VertexPosition)? Okay, I did this, and then I linearize the depth read from the scene depth texture:

const float f = 1000.0;
const float n = 0.1;
	
vec2 coord = gl_FragCoord.xy / ScreenSize;
float vWaterDepth = length(MVPos.xyz);
float vSceneDepth = (2 * n) / (f + n - texture2D(sceneDepth, coord).x * (f - n));
		
color.a = (vSceneDepth - vWaterDepth) * 1.0; //Subtract the water depth from the scene depth and scale the result

And then I can't even see the water; it's invisible.

How is the sceneDepth texture generated? Isn't it already linear?

Why are you taking the length of the water vector? You only need the Z component.

This is how I generate the sceneDepth texture:

// Allocate a screen-sized 32-bit depth texture (LWJGL)
depthTexture = glGenTextures();
glActiveTexture(GL_TEXTURE5);
glBindTexture(GL_TEXTURE_2D, depthTexture);
ARBTextureStorage.glTexStorage2D(GL_TEXTURE_2D, 1, GL14.GL_DEPTH_COMPONENT32, Display.getWidth(), Display.getHeight());
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

// Attach it as the depth attachment of an FBO with no color output
depthFbo = glGenFramebuffers();
glBindFramebuffer(GL_FRAMEBUFFER, depthFbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_TEXTURE_2D, depthTexture, 0);
glDrawBuffer(GL_NONE);

So if I don't linearize the scene depth, and take only the Z component, the code becomes this:

vec2 coord = gl_FragCoord.xy / ScreenSize;
float vWaterDepth = length(ViewCoord.z);
float vSceneDepth = texture2D(sceneDepth, coord).x;
		
color.a = clamp((vSceneDepth - vWaterDepth) * 1.0, 0.0, 1.0);

And the result: http://imageshack.com/a/img674/2048/kzOGM9.jpg The water is only a circle of 1.0 to 2.0 units around the camera.

length(MVPos) has to be divided by zfar.

It's also possible vSceneDepth is [0, zfar] instead of [0, 1], so try dividing.
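In code, that would be something like this (a sketch; it assumes ZFAR matches the projection's far plane, and the second division is only needed if the texture really stores [0, zfar]):

float waterDist = length(MVPos.xyz) / ZFAR; // normalize to [0, 1]
float sceneDist = vSceneDepth / ZFAR; // only if the texture stores [0, zfar]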

 

You can have a look at this one, but it's a production shader, so nothing in there is educational. If it helps, good; if not, then I guess ask:

https://github.com/fwsGonzo/cppcraft/blob/master/Debug/shaders/blocks_water.glsl

If I divide both, the water is invisible; if I divide only length(MVPos), the water alpha is 1.0.

I am still not really sure how you get the scene depth texture. Do you copy the depth buffer? I have the feeling that you are mixing two different ways of doing the distance calculation.

 

In the attached image there are two ways this could be done. On the left side is the physically correct way; on the right side is a hack that is a bit faster but produces wrong results.

 

Like I wrote: choose one, but don't mix them up.
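The two options side by side in GLSL (a sketch; viewPos is assumed to be the eye-space position, and whichever option you pick must be used for both the water fragment and the scene sample):

// physically correct: radial distance from the eye to the fragment
float distRadial = length(viewPos.xyz);

// faster hack: depth along the view axis only (view-space z is negative
// in front of the camera, hence the sign flip)
float distPlanar = -viewPos.z;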

For the scene depth I render the scene to the depth texture with an FBO after rendering the refraction texture (the clip plane is disabled while rendering the depth texture). Then I render the water plane to another depth texture to get the water depth. If I sample the depths from those textures with this code:

vec2 coord = gl_FragCoord.xy / ScreenSize;
float vWaterDepth = texture2D(waterDepth, coord.xy).x;
float vSceneDepth = texture2D(sceneDepth, coord.xy).x;
	
vec4 color = refractionColor + reflectionColor + specular;
	
color.a = clamp((vSceneDepth - vWaterDepth) * 1.0, 0.0, 1.0);

Result looks like this: https://imagizer.imageshack.us/v2/640x480q90/540/Dfkzrr.jpg

 

And you said to use the length function to get the depth, but how would I get the terrain depth? I only render the water plane, and the VertexPosition attribute is the water's vertex position; how would I get the terrain vertex to compute its depth with the length function?

You can reconstruct the fragment position from the depth value:

 

http://stackoverflow.com/questions/17292397/reconstruct-fragment-position-from-depth

 

So the code should look like this?

float vWaterDepth = length(ModelViewPosition.z);
	
vec4 TerrainVertexPosition = vec4(0.0);
TerrainVertexPosition.x = gl_FragCoord.x / ScreenSize.x;
TerrainVertexPosition.y = gl_FragCoord.y / ScreenSize.y;
TerrainVertexPosition.z = texture2D(sceneDepth, TerrainVertexPosition.xy).x;
TerrainVertexPosition.w = 0.0;
vec3 xyz = (InverseProjectionMatrix * TerrainVertexPosition).xyz;
	
float vSceneDepth = length(xyz.z);
float alpha = (vSceneDepth - vWaterDepth);
	
vec4 color = refractionColor + reflectionColor + specular;
color.a = alpha;

Result: https://imagizer.imageshack.us/v2/640x480q90/674/kzOGM9.jpg
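For comparison, the reconstruction described in the linked Stack Overflow answer would look roughly like this (a sketch using the uniform names from this thread; note that both the UV and the depth must be remapped to [-1, 1], w must be 1.0 before the multiply, and the result divided by w):

vec3 reconstructViewPos(vec2 fragCoord)
{
	vec2 uv = fragCoord / ScreenSize;
	float z = texture2D(sceneDepth, uv).x;
	// back to normalized device coordinates in [-1, 1]
	vec4 ndc = vec4(uv * 2.0 - 1.0, z * 2.0 - 1.0, 1.0);
	vec4 view = InverseProjectionMatrix * ndc;
	return view.xyz / view.w; // perspective divide
}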

Modestas, I think it would be better if you removed everything except the alpha calculation from the shader for now. Maybe even output the alpha as grayscale so you can focus better on the problem; currently it's really difficult to see what's wrong. You could also output the depth value as a color for debugging, as in the sketch below.
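For example (a sketch):

// debug: visualize the computed alpha (or any depth value) as grayscale
FragColor = vec4(vec3(color.a), 1.0);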

The value you want I simply called wdepth; look how simple it is:

// read underwater depth

float wdepth = getDepth(refcoord) - dist;

 

where dist is the distance from the eye to the vertex... and for some reason you only calculate length() of the scalar .z, which is the same as doing sqrt(z*z) == abs(z).

dist = length(v_pos);

 

Btw, you can't send "dist" in my example as an input to the fragment shader, because it doesn't interpolate well on geometry that isn't in a grid pattern;

instead you need to send vec3 viewPos and then take the length in the fragment shader.
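In shader form (a sketch; the names follow the thread's conventions):

// vertex shader: pass the eye-space position through
uniform mat4 ModelViewMatrix;
in vec4 VertexPosition;
out vec3 viewPos;

void main()
{
	viewPos = (ModelViewMatrix * VertexPosition).xyz;
	// ... set gl_Position as usual
}

// fragment shader: take the length per fragment
in vec3 viewPos;

void main()
{
	// interpolating viewPos (rather than its length) is what keeps the
	// distance correct across the triangle
	float dist = length(viewPos);
	// ...
}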

 

----------------

Let's keep things simple: forget all about the window-space-ness of the depth value and just use it directly.

refcoord here is the texture coordinate as seen on-screen. I hope that is abundantly clear, as you are reading a screen-space texture.

vec2 refcoord = gl_FragCoord.xy / screenSize.xy;

 

From this we can read from our texture:

float depth = getDepth(refcoord);

float getDepth(in vec2 uv)
{
   // exp. depth in window-space
   float wsDepth = texture(depthtexture, uv).x;
   // linear depth in window-space
   wsDepth = ZNEAR / (ZFAR - wsDepth * (ZFAR - ZNEAR));
   
   // this converts it to eye-space (but lets ignore that for now, since the value is almost
   //     the same anyways, google nearplanehalfsize after you get the reconstruction working)
   //return wsDepth * length(vec3((uv * 2.0 - vec2(1.0)) * nearPlaneHalfSize, -1.0));
   return wsDepth;
}

You should now have a linear depth that is good enough to prove that this works.

If you use the depth difference correctly, you should see that when you rotate the camera, the depth changes slightly at the left and right edges of the screen, as well as at the top and bottom. This is because the real eye vector forms a small dome: the distances to points on a constant-Z plane are not constant.

 

Now that you have the depth difference, you need to use it for something useful.

The difference value is pretty small, so you need to rescale it to fit your water.

 

for example:

float deep = min(1.0, wdepth * 16.0);

color = mix(shallowColor, deepColor, deep);

 

Some proof here: http://fbcraft.fwsnet.net/proof.png

 

Calculation for nearPlaneHalfSize (host-side code):

// calculate half near-plane size
const double pio180 = 4.0 * atan(1.0) / 180.0;

float halfTan = tan(fov * pio180 / 2.0);
nearPlaneHalfSize = vec2(halfTan * aspect, halfTan);

Still nothing. Going by what you've said, this is what I got (I removed all the unnecessary things such as refraction and reflection, and left only the deep and shallow colors):

const float ZFAR = 1000.0;
const float ZNEAR = 0.1;

float getDepth(in vec2 uv)
{
	float wsDepth = texture(sceneDepth, uv).x;
	wsDepth = ZNEAR / (ZFAR - wsDepth * (ZFAR - ZNEAR));
	
	const float pio180 = 4.0 * atan(1.0) / 180.0;
	float halfTan = tan(45.0 * pio180 / 2.0);
	float nearPlaneHalfSize = vec2(halfTan * 1.3333334, halfTan);
	return wsDepth * length(vec3((uv * 2.0 - 1.0) * nearPlaneHalfSize, -1.0));
}

subroutine(RenderPassType)
void drawWater()
{
	//texture-coord of as seen on-screen
	vec2 refcoord = gl_FragCoord.xy / ScreenSize;
	//dist is the distance to vertex from eye
	//v_pos = -(ModelViewMatrix * VertexPosition).xyz;
	float dist = length(v_pos);
	//read underwater depth
	float wdepth = getDepth(refcoord) - dist;
	//scale value
	float deep = min(1.0, wdepth * 16.0);
	
	const vec3 deepColor = vec3(42, 73, 87) * vec3(1.0 / 255.0);
	const vec3 shallowColor = vec3(0.35, 0.6, 0.45);
	FragColor = vec4(mix(shallowColor, deepColor, deep), 1.0);
}

And the only thing I get is a circle that moves when I rotate or move the camera: http://imageshack.com/a/img911/8084/mDB8ac.jpg And when the camera moves higher (on the Y axis), it disappears.

You need to divide dist by ZFAR :)

float dist = length(v_pos) / ZFAR; 

I also see a minor error: you defined nearPlaneHalfSize as a float when it should be a vec2, but that alone won't make or break it:

vec2 nearPlaneHalfSize = vec2(screenSize.x / screenSize.y, 1.0) * halfTan;
 
If things still don't work out after that:

1. Check that the value of getDepth(refcoord) is constant for all positions in the scene

This is the most likely issue: your depth texture may be wrong somehow.

 

2. Check that length(v_pos) is constant as you move around

I seriously doubt this one is wrong, but you never know

 

In both cases:

1. If you rotate your camera and move around, the color should only vary depending on distance from camera

2. The values should only range from [0, 1]

3. The values should be completely linear

 

Once you have verified which of your inputs is simply wrong, you'll need to figure out why.

E.g., are the values of ZNEAR/ZFAR correct?

Okay, I fixed the first two problems you mentioned, and this is what I got: http://scrapeshare.com/?vid=1411064900979-cqexqe Now there is something like a gradient (far fragments are white, near ones gray), but that's not what I want...

 

 

1. Check that the value of getDepth(refcoord) is constant for all positions in the scene

This is the most likely issue: your depth texture may be wrong somehow.

 

I think the depth texture is good; I tested it by rendering a plane with the depth texture bound to it and it looked right. If you need it, I can post a screenshot.

 

 

2. Check that length(v_pos) is constant as you move around

I seriously doubt this one is wrong, but you never know

 

I actually didn't know how to test that, but I set FragColor = vec4(dist); and here is the result: http://scrapeshare.com/?vid=1411065277580-rx2foy If dist is a value between 0.0 and 1.0, then I think it's good: all fragments up to a distance of 1.0 have some transparency, and anything farther than 1.0 is pure white, because the alpha and color are >= 1.0.

Finally! After so many days of googling I found the solution. I now realize that I need both the scene depth and the water depth values to be linear, then subtract them and scale. Here is the code if someone needs it:

vec2 textureCoords = gl_FragCoord.xy / ScreenSize;
float sceneZ = texture2D(sceneDepthTexture, textureCoords).x;
float linearSceneDepth = (2.0 * ZNEAR) / (ZFAR + ZNEAR - sceneZ * (ZFAR - ZNEAR));
float linearWaterDepth = (2.0 * ZNEAR) / (ZFAR + ZNEAR - gl_FragCoord.z * (ZFAR - ZNEAR));
const float alphaScale = 1000.0;
float alpha = clamp((linearSceneDepth - linearWaterDepth) * alphaScale, 0.0, 1.0); // scale first, then saturate
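For the alpha to have any visible effect, the water also has to be drawn with blending enabled on the host (glEnable(GL_BLEND) with a standard alpha blend function). A minimal sketch of wiring the result into the shader output, reusing names from earlier in the thread:

vec4 color = refractionColor + reflectionColor + specular;
color.a = alpha; // fades the water out where it meets the terrain
FragColor = color; // drawn with GL_SRC_ALPHA / GL_ONE_MINUS_SRC_ALPHA blending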