
Yann ALET

Member Since 11 Jun 2008
Offline Last Active Mar 27 2014 04:56 PM
-----

Topics I've Started

Good specular for water surface

20 March 2014 - 04:32 PM

Hi there,
 
I am trying to build a convincing ocean shader, but I am struggling a bit with the specular effect.
Here is what I have now:
 
Big waves
 
 
 
Small waves
 
 
 
I ported this tutorial from OpenGL to DirectX 11 to simulate the ocean with FFT (currently running on the CPU, so performance is very poor).
 
I'd like this to look a bit more like AC IV's ocean.
 
 
I followed this tutorial, which uses a normal map to render more realistic water.
 
And things got a lot worse:
 
 
 
Here is my pixel shader :
 
float4 PS(VertexOut pin) : SV_Target
{
    float zFar = 300.0f;
    pin.NormalV = normalize(pin.NormalV);
    float3 viewRay = normalize(pin.PosV);
    float3 refractRay = refract(viewRay, pin.NormalV, 0.75f);
    float2 InvTextureSize = float2(1.0f / gBackBufferWidth, 1.0f / gBackBufferHeight);
    float2 texCoord = float2(pin.PosH.x, pin.PosH.y) * InvTextureSize;

    float3 diff = viewRay - refractRay;
    float3 newTarget = viewRay + refractRay;

    texCoord = texCoord + (diff.xz * 0.1f);

    float3 n1 = gNormalMap.Sample(PointSampler, (texCoord / 0.1f) + (float2(1.0f, 0.0f) * (gTimer * 0.25f)));
    float3 n2 = gNormalMap.Sample(PointSampler, (texCoord / 0.2f) + (float2(1.0f, 0.0f) * (gTimer * 0.25f)));

    // Expand the range of the normal from (0,1) to (-1,+1).
    n1 = (n1 * 2.0f) - 1.0f;
    n2 = (n2 * 2.0f) - 1.0f;

    float3 n0 = normalize(n1 + n2);

    float zw = gDepthMap.Sample(PointSampler, texCoord).x;

    // Reconstruct linear depth.
    float depth2 = ProjectionB / (zw - ProjectionA);

    float3 groundPosV = normalize(newTarget) * depth2;
    float distance = DistanceBetweenPoints(groundPosV, pin.PosV);
    float3 sourceColor = gGroundMap.Sample(PointSampler, texCoord + (n0 * 0.03f));
    float3 sandColor;

    if (sourceColor.x == 0.0f)
    {
        sandColor = float3(0.0f, 0.0f, 0.0f);
    }
    else
    {
        sandColor = GetWaterColorAt(sourceColor, distance);
    }

    // Add ambient and specular.
    float4 emissive_color = float4(sandColor, 1.0);
    float4 ambient_color  = float4(0.31, 0.54, 0.75, 1.0);
    float4 diffuse_color  = float4(0.5, 0.65, 0.75, 1.0);
    float4 specular_color = float4(1.0, 1.0, 1.0, 1.0);

    float emissive_contribution = 0.80f;
    float ambient_contribution  = 0.30f;
    float diffuse_contribution  = 0.30f;
    float specular_contribution = 0.30f;

    float3 lightVec = normalize(gSunLightDir);
    float3 toEye = normalize(gEyePosW - pin.PosW);

    // This is where I add the normal map value to the normal I already have.
    float3 normal = normalize(pin.NormalW + n0);

    float d = dot(normal, lightVec);

    float4 finalColor = emissive_color * emissive_contribution +
                        ambient_color  * ambient_contribution  +
                        diffuse_color  * diffuse_contribution  * max(d, 0);

    // Calculate the reflection vector using the normal and the direction of the light.
    float3 reflection = -reflect(lightVec, normal);

    // Calculate the specular light based on the reflection and the camera position.
    float specular = dot(normalize(reflection), normalize(pin.ViewDirection));

    float specularShininess = 20.0f;

    if (specular > 0.0f)
    {
        // Increase the specular light by the shininess value.
        specular = pow(specular, specularShininess);

        // Add the specular to the final color.
        finalColor = saturate(finalColor + specular);
    }

    finalColor.a = 1.0f;
    return finalColor;
}
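For reference, here is the specular math from the shader above reduced to plain C++ (a sketch with made-up example vectors, not the actual scene data), together with the Blinn-Phong half-vector variant that is often used for sun glints on water because it degrades more gracefully at grazing angles:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// CPU-side sketch of the specular term, for experimenting outside of HLSL.
// 'n' is the surface normal; 'l' and 'v' point from the surface toward the
// light and the eye respectively. All values here are illustrative only.
struct Vec3 { float x, y, z; };

float dot3(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

Vec3 normalize3(Vec3 a) {
    float len = std::sqrt(dot3(a, a));
    return { a.x / len, a.y / len, a.z / len };
}

// Phong: reflect the light direction about the normal, compare with the view.
float phongSpecular(Vec3 n, Vec3 l, Vec3 v, float shininess) {
    float dln = dot3(l, n);
    Vec3 r = { 2.0f * dln * n.x - l.x,
               2.0f * dln * n.y - l.y,
               2.0f * dln * n.z - l.z };
    float s = std::max(dot3(r, v), 0.0f);
    return std::pow(s, shininess);
}

// Blinn-Phong: compare the normal with the half vector between light and view.
float blinnSpecular(Vec3 n, Vec3 l, Vec3 v, float shininess) {
    Vec3 h = normalize3({ l.x + v.x, l.y + v.y, l.z + v.z });
    float s = std::max(dot3(n, h), 0.0f);
    return std::pow(s, shininess);
}
```

Both variants agree in the mirror direction, but at a grazing view the Phong term cuts off to zero while the half-vector term still leaves a faint highlight.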
 
I took a look at the NVIDIA ocean demo and realized they use an ocean mesh with a lot more vertices.
 
 
I thought it came from my ocean mesh resolution, so I increased my grid resolution, but it still did not do the trick:
 
 
NVIDIA mentions something about using Perlin noise to add more detail, but does not say why or how.
 
So before I attempt to decipher NVIDIA's code, can anyone give me a high-level explanation of why my ocean looks so bad?
I know I still don't have reflections & caustics, but still, does that explain everything?
 
You can download my binaries here; the shaders are compiled at runtime, so you can edit them and see what's wrong :-/
You can tweak the settings.xml file to play with the grid resolution and wave amplitude.
 
Thanks,
Yann

Water refraction bug

23 December 2013 - 01:56 PM

Hi there,

 

I am trying to implement refraction for a water surface, and I am struggling a bit with the implementation, as this is the first time I have attempted such a thing. The water surface is not flat: I use FFT on a small grid to displace the vertices, and I use each vertex's normal to calculate the refraction vector. Then I subtract the refraction ray from the view ray to know how much I need to offset my texture sampling when shading the water surface.

 

Here is the shader

 

VertexOut VS(VertexIn vin)
{
    VertexOut vout;

    // Transform to world space.
    vout.PosW = mul(float4(vin.PosL, 1.0f), gWorld).xyz;

    // Find transformed normals.
    vout.NormalW = mul(vin.NormalL, (float3x3)gWorld);
    vout.NormalV = mul(vin.NormalL, (float3x3)gWorldView);

    // Transform to homogeneous clip space.
    vout.PosH = mul(float4(vout.PosW, 1.0f), gViewProj);

    vout.Tex = vin.Tex;

    vout.PosV = mul(float4(vin.PosL, 1.0f), gWorldView).xyz;

    return vout;
}

float4 PS(VertexOut pin) : SV_Target
{
    float backBufferWidth = 1024.0f;
    float backBufferHeight = 768.0f;
    float zFar = 300.0f;
    pin.NormalV = normalize(pin.NormalV);
    float3 viewRay = normalize(pin.PosV);
    float3 refractRay = refract(viewRay, pin.NormalV, 0.75f);
    float2 InvTextureSize = float2(1.0f / backBufferWidth, 1.0f / backBufferHeight);
    float2 texCoord = float2(pin.PosH.x, pin.PosH.y) * InvTextureSize;
    float zw = gDepthMap.Sample(PointSampler, texCoord).x;

    // Reconstruct linear depth.
    float depth2 = ProjectionB / (zw - ProjectionA);

    // Calculate the view-space position of the ground fragment.
    float3 groundPosV = viewRay * depth2;

    // Calculate the sampling offset caused by refraction.
    float3 diff = normalize(normalize(viewRay) - normalize(refractRay));

    texCoord = texCoord + diff.xz;

    float3 sourceColor = gGroundMap.Sample(PointSampler, texCoord);
    return float4(sourceColor, 1.0f);
}
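As a side note, the `ProjectionA` / `ProjectionB` reconstruction used above can be sanity-checked on the CPU. Assuming a standard D3D-style projection where the depth buffer stores `zw = A + B / zView` with `A = zFar / (zFar - zNear)` and `B = -zNear * zFar / (zFar - zNear)` (the near/far values below are made-up examples, not the program's actual ones):

```cpp
#include <cassert>
#include <cmath>

// Forward projection: view-space depth -> [0,1] depth-buffer value,
// for a standard D3D projection matrix.
float projectDepth(float zView, float zNear, float zFar) {
    float A = zFar / (zFar - zNear);
    float B = -zNear * zFar / (zFar - zNear);
    return A + B / zView;
}

// Inverse: the shader's "depth2 = ProjectionB / (zw - ProjectionA)".
float reconstructDepth(float zw, float zNear, float zFar) {
    float A = zFar / (zFar - zNear);
    float B = -zNear * zFar / (zFar - zNear);
    return B / (zw - A);
}
```

Round-tripping any view-space depth through both functions should return the original value, which confirms the reconstruction formula itself is consistent.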

And here is the result :

http://www.youtube.com/watch?v=mkoA7Ec1BjU&feature=youtu.be

 

 

I think the issue is around these lines :

// Calculate sampling offset caused by refraction.
float3 diff = normalize(normalize(viewRay) - normalize(refractRay));
texCoord = texCoord + diff.xz;

as I am not sure this will keep me in UV space.
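That worry seems justified: after `normalize`, the components of `diff.xz` can reach ±1, i.e. an offset of almost a whole screen. A common approach is to scale the offset by a small strength factor and clamp the result into [0, 1]. A minimal CPU-side sketch (the `strength` value of 0.05 is an arbitrary example, not a recommendation):

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Sketch of keeping a refraction-perturbed texture coordinate inside UV space.
// 'offset' plays the role of diff.xz in the shader, whose components can be
// as large as +/-1; 'strength' shrinks it to a plausible screen-space shift.
struct Float2 { float x, y; };

Float2 perturbUV(Float2 uv, Float2 offset, float strength) {
    auto clamp01 = [](float v) { return std::min(1.0f, std::max(0.0f, v)); };
    return { clamp01(uv.x + offset.x * strength),
             clamp01(uv.y + offset.y * strength) };
}
```

With a scale like this, a fully saturated offset moves the sample point by a few percent of the screen instead of off the texture entirely.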

 

Any ideas?

 

Thanks :-)


Sampling depth buffer in shader

12 September 2013 - 08:56 PM

Hi there,

 

I am trying to improve my water shader so it takes into account how deep the water is.

 

For this, the water shader needs to sample the depth buffer (I render the ground before the ocean, so the depth buffer already holds the information I need by the time I render the ocean).

 

The problem comes when I need to send it to my ocean shader. I read that I need to unbind the depth buffer from the current render target to be able to sample it, so right before rendering the ocean I call this:

 

md3dImmediateContext->OMSetRenderTargets(1, &mRenderTargetView, NULL);

 

But then depth testing is no longer properly performed for my ocean grid, and it looks just wrong.

If I don't unbind the depth buffer, then I only ever sample the value 0...

 

I googled this a bit, but it's always explained for engines using deferred rendering. I might switch to deferred rendering at some point, but for now I just have a "normal" forward rendering pipeline. Is that incompatible with the ability to sample the hardware depth buffer?
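For anyone landing here later: on D3D11 a forward renderer can keep depth testing while sampling the same depth texture, by creating a second, read-only depth-stencil view. A sketch under assumptions (error handling omitted; `md3dDevice` and `mDepthTexture` are hypothetical member names alongside the existing `md3dImmediateContext`, and the format must match how the depth texture was created):

```cpp
// Create a read-only DSV over the same depth texture.
D3D11_DEPTH_STENCIL_VIEW_DESC dsvDesc = {};
dsvDesc.Format        = DXGI_FORMAT_D24_UNORM_S8_UINT;   // must match the depth texture
dsvDesc.ViewDimension = D3D11_DSV_DIMENSION_TEXTURE2D;
dsvDesc.Flags         = D3D11_DSV_READ_ONLY_DEPTH | D3D11_DSV_READ_ONLY_STENCIL;

ID3D11DepthStencilView* mReadOnlyDSV = nullptr;
md3dDevice->CreateDepthStencilView(mDepthTexture, &dsvDesc, &mReadOnlyDSV);

// When rendering the ocean: depth-test reads still happen, depth writes are
// disabled, and binding an SRV over the same texture is now legal.
md3dImmediateContext->OMSetRenderTargets(1, &mRenderTargetView, mReadOnlyDSV);
```

Note that to sample a `D24_UNORM_S8_UINT` depth texture, the texture itself must be created with a typeless format and the shader resource view with a matching read format such as `DXGI_FORMAT_R24_UNORM_X8_TYPELESS`.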


Instancing : Different textures for instances

10 September 2013 - 06:20 PM

Hi there,
 
I am using instancing to draw a very large number of cubes in a voxel world (yes, another one... ^_^). So far, all cubes have used the same texture, but now I want to use different ones when I build the world.
All my textures are in a single atlas sent to the GPU, which I sample from in the pixel shader.
So my question is: performance-wise, what is the best approach to displaying instanced cubes with different textures?
With my limited experience, I see only two ways:
1- Instead of calling DrawIndexedInstanced once for my 2 million cubes, call it multiple times, once for each type of cube (cubes with different textures).
2- Still call it once, but add the texture's index in the atlas to the instance data, and do the UV lookup directly in the shader.
3- ?
 
I am tempted to go with option 2, but I fear that doing the texture lookup so often will end up hurting performance badly.
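For what it's worth, the per-pixel cost of option 2 is tiny: turning a tile index into atlas UVs is one divide/modulo plus a multiply-add. A CPU-side sketch of the lookup the shader would do (the 4x4 atlas layout is a made-up example, not the actual atlas):

```cpp
#include <cassert>

// Option 2 sketch: each instance carries a tile index into the atlas; the
// shader turns it into a UV offset. Mimicked here on the CPU for clarity.
struct Float2 { float u, v; };

Float2 atlasUV(int tileIndex, int tilesPerRow, Float2 localUV) {
    float tileSize = 1.0f / tilesPerRow;
    int tx = tileIndex % tilesPerRow;     // column of the tile in the atlas
    int ty = tileIndex / tilesPerRow;     // row of the tile in the atlas
    // Scale the per-face UV into one tile, then offset to that tile's corner.
    return { (tx + localUV.u) * tileSize, (ty + localUV.v) * tileSize };
}
```

The texture sample itself happens either way; only the coordinate arithmetic is added, so the instance-data index usually costs far less than splitting the draw into many `DrawIndexedInstanced` calls.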
 
Any suggestions ?
 
Thanks :-)

Ocean opacity

03 August 2013 - 01:05 PM

Hi there,
I'm trying to add water to a voxel engine I am working on. My naive first approach was to send with each vertex the depth of the water just below that vertex.
The deeper the water, the closer the alpha value gets to 1.0f.
It gives acceptable results from above, looking down...
[screenshot: PALaJkr.png]
 
...but a very unrealistic result from other angles, as you can see below:
 
[screenshot: A1NBKvm.png]
 
 
So I guess what I need to do is cast a ray from the camera to each vertex of the ocean grid and calculate how much water there is between the vertex hit by the ray and the triangle the same ray hits at the bottom of the ocean. But that sounds very expensive :(
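One cheap approximation, assuming the sea bottom is roughly flat under each vertex: the path length through water is just the vertical depth divided by the cosine of the view ray's angle to the vertical, with no ray cast needed. A sketch (the depth, ray, and absorption values are made-up examples):

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Approximate "how much water the view ray crosses" for a flat-ish bottom:
// verticalDepth is the per-vertex depth already stored, rayDirY is the
// normalized view ray's downward component (negative when looking down).
float waterPathLength(float verticalDepth, float rayDirY) {
    // Guard against grazing rays, where the path length would explode.
    float cosAngle = std::max(0.05f, -rayDirY);
    return verticalDepth / cosAngle;
}

// Beer-Lambert-style opacity from path length; k is an arbitrary
// absorption factor to tune by eye.
float waterAlpha(float pathLength, float k) {
    return 1.0f - std::exp(-k * pathLength);
}
```

Looking straight down, this reduces to the current per-vertex depth, and at shallow viewing angles the path (and so the opacity) grows, which is exactly the missing effect in the second screenshot.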
 
Current approach :
 
[screenshot: TTy2DLW.png]
 
What i need :
 
[screenshot: jbFuV8D.png]
 
In your opinion, what is the most efficient way to do that?
Or maybe there is a completely different, cheap way of faking ocean depth you can point me to?
 
Thanks :-)
