
# DX11 Normal Interpolation issues on my generated terrain

## 9 posts in this topic

Hi all,
I am having a weird problem with normals on my generated terrain. I am not sure whether it is a shader or a mesh issue, but here is how it looks:

As you can see, I get this pattern along the edges of the triangles. This reminds me of per-vertex shading; however, I am aiming for per-pixel shading.

Here are my vertex and pixel shaders:

VS:

```hlsl
cbuffer cbToProjection
{
    float4x4 matToProj;
};

struct VS_IN
{
    float4 Position : POSITION;
    float3 Normal   : NORMAL;
};

struct VS_OUT
{
    float4 Position : SV_POSITION;
    float3 NormalWS : TEXCOORD1;
};

VS_OUT main(VS_IN IN)
{
    VS_OUT OUT;
    OUT.Position = mul(IN.Position, matToProj);
    OUT.NormalWS = normalize(IN.Normal);
    return OUT;
}
```

PS:

```hlsl
struct PS_IN
{
    float4 Position : SV_POSITION;
    float3 NormalWS : TEXCOORD1;
};

float4 main(PS_IN IN) : SV_TARGET0
{
    float3 normal = normalize(IN.NormalWS);
    float3 toLight = normalize(float3(1, 3, -2));
    float NDotL = saturate(dot(toLight, normal));
    float4 color = float4(1.0f, 1.0f, 1.0f, 1.0f);
    color.rgb *= NDotL;
    return color;
}
```

So, what am I doing wrong?


##### Share on other sites

From your shader code, it looks like you are doing this correctly (although you could remove the normalize call on the normal vector in the VS, since you renormalize in the PS after rasterization). My guess is that your terrain is defining three vertices for each triangle face, rather than one vertex at each grid point. You can verify this by checking the number of vertices you are passing in with your draw call, or you can check it with PIX/the Graphics Debugger to see how many primitives are generated from how many input vertices.
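For illustration, here is a minimal sketch of that layout: one shared vertex per grid point, with an index buffer referencing each grid point from every cell that touches it. The `Vertex` struct and `HeightAt` lookup are placeholders, not the actual code from this thread.

```cpp
#include <cstdint>
#include <vector>

// Placeholder vertex layout and heightmap lookup, for illustration only.
struct Vertex { float px, py, pz; float nx, ny, nz; };
float HeightAt(int x, int z);   // your heightmap lookup

// Builds an N x N cell grid with one shared vertex per grid point
// ((N+1) x (N+1) vertices) and six indices per cell.
void BuildGrid(int cells, float spacing,
               std::vector<Vertex>& vertices, std::vector<uint32_t>& indices)
{
    const int verts = cells + 1;
    vertices.resize(verts * verts);
    for (int z = 0; z < verts; ++z)
        for (int x = 0; x < verts; ++x)
            vertices[z * verts + x] = { x * spacing, HeightAt(x, z), z * spacing,
                                        0.0f, 1.0f, 0.0f };   // normals computed later

    indices.reserve(cells * cells * 6);
    for (int z = 0; z < cells; ++z)
        for (int x = 0; x < cells; ++x)
        {
            const uint32_t i0 = z * verts + x;   // near-left corner of the cell
            const uint32_t i1 = i0 + 1;          // near-right
            const uint32_t i2 = i0 + verts;      // far-left
            const uint32_t i3 = i2 + 1;          // far-right
            // Winding may need flipping depending on your cull mode/handedness.
            indices.insert(indices.end(), { i0, i2, i1,  i1, i2, i3 });
        }
}
```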


##### Share on other sites
> My guess is that your terrain is defining three vertices for each triangle face, rather than one vertex at each grid point.

So I switched to indexed rendering, so there is only one normal per vertex, and I still get the same result.


##### Share on other sites

Have you tried with different light directions? The dark areas simply look dark because they are facing away from the parallel light you have hard-coded in your shader (i.e. the dark sections are always on the -z axis in your image).

It may also be worth outputting the normal to the frame buffer so you can visually see any issues with the interpolation.

```hlsl
color.rgb = normal * 0.5f + 0.5f;   // remap [-1, 1] into [0, 1] for display
return color;
```

You can also change your light vector so that it's always directly above the terrain (float3 toLight = float3( 0.0f, 1.0f, 0.0f );), which should give you more even lighting across the terrain, again helping to show any issue with the normals. With the off-centre light angle it's difficult to say what is wrong, sorry :) But your shader code certainly looks fine. If it isn't the light direction confusing you, then it may be the normals themselves.

My first guess is just that the light is at an angle really ;)

n!


##### Share on other sites

It could also be that I'm misunderstanding what you're complaining about.

Sometimes people build their terrain such that the vertices look like:

Whereas you can avoid some artifacts on terrain lighting if you structure your vertices like:
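(The original images are missing here. A common pair of layouts this could refer to is splitting every quad along the same diagonal versus alternating the diagonal per quad; the sketch below shows the alternating variant, under that assumption, using the same shared-vertex indexing as above.)

```cpp
#include <cstdint>
#include <vector>

// Appends one terrain cell (x, z) as two triangles, alternating the split
// diagonal per cell so the diagonals form a criss-cross pattern.
// 'verts' is the number of vertices per row of the shared-vertex grid.
void AppendCell(int x, int z, int verts, std::vector<uint32_t>& indices)
{
    const uint32_t i0 = z * verts + x;   // near-left corner
    const uint32_t i1 = i0 + 1;          // near-right
    const uint32_t i2 = i0 + verts;      // far-left
    const uint32_t i3 = i2 + 1;          // far-right

    if ((x + z) & 1)                     // split along the i1-i2 diagonal
        indices.insert(indices.end(), { i0, i2, i1,  i1, i2, i3 });
    else                                 // split along the i0-i3 diagonal
        indices.insert(indices.end(), { i0, i2, i3,  i0, i3, i1 });
}
```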

n!


##### Share on other sites
> So I switched to indexed rendering, so there is only one normal per vertex, and I still get the same result.

That may or may not mean that there is exactly one vertex normal being used at each grid point. How many vertices are in your vertex buffer, how many indices in your index buffer, and how many primitives are you drawing? Compare that with your grid size and make sure that you only have (N+1) x (N+1) vertices for a grid of N x N cells.


##### Share on other sites

> Have you tried with different light directions? [...] It may also be worth outputting the normal to the frame buffer so you can visually see any issues with the interpolation. [...] You can also change your light vector so that it's always directly above the terrain.

Setting the light to (0, 1, 0) doesn't help; the problem persists. Rendering the normals into the frame buffer shows the same problem.

> Sometimes people build their terrain such that the vertices look like: [...] Whereas you can avoid some artifacts on terrain lighting if you structure your vertices like: [...]

I am building the mesh the way shown in the first picture; I will try the other way, thanks.

> How many vertices are in your vertex buffer, how many indices in your index buffer, and how many primitives are you drawing? Compare that with your grid size and make sure that you only have (N+1) x (N+1) vertices for a grid of N x N cells.

For a 16x16 grid there are 256 vertices, and the index buffer holds 1350 indices.
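(For what it's worth, those numbers are internally consistent if '16x16' here means 16x16 vertices, i.e. 15x15 cells:)

```cpp
// Assuming "16x16" means 16x16 vertices (i.e. 15x15 cells):
// 16 * 16 = 256 vertices
// 15 * 15 = 225 cells
// 225 * 2 = 450 triangles
// 450 * 3 = 1350 indices   -> matches the counts above
```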


##### Share on other sites

> Whereas you can avoid some artifacts on terrain lighting if you structure your vertices like: [...]

So, I tried building the mesh as in the second image; unfortunately the problems did not go away.

16x16 mesh:

normals:


##### Share on other sites
Hey, thanks for the images. The lighting in the first one shows your problem much more clearly for me. It looks like your normals are incorrect. For terrain normals I take the height sample at x,y and compute the normal for that sample with:

```cpp
// Heights of the four neighbouring samples around (x, y).
float h0 = GetSample( x + 0, y - 1 );
float h1 = GetSample( x - 1, y + 0 );
float h2 = GetSample( x + 1, y + 0 );
float h3 = GetSample( x + 0, y + 1 );

Vector3 normal;

normal.x = h1 - h2;
normal.y = separation; // separation = distance between samples (I use 1.0f).
normal.z = h0 - h3;

normal.Normalize();

return normal;
```
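(A sketch of how that could be applied across the whole heightmap when filling the per-vertex normals; the border clamping, the `Float3` type, and the layout of `heights` are assumptions for illustration, not the poster's actual code.)

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

struct Float3 { float x, y, z; };

// Fills one normal per grid vertex from central height differences.
// 'heights' is (verts x verts), row-major; 'separation' is the grid spacing.
std::vector<Float3> BuildNormals(const std::vector<float>& heights,
                                 int verts, float separation)
{
    auto sample = [&](int x, int z)
    {
        x = std::clamp(x, 0, verts - 1);   // clamp lookups at the borders
        z = std::clamp(z, 0, verts - 1);
        return heights[z * verts + x];
    };

    std::vector<Float3> normals(verts * verts);
    for (int z = 0; z < verts; ++z)
        for (int x = 0; x < verts; ++x)
        {
            const float h0 = sample(x, z - 1);
            const float h1 = sample(x - 1, z);
            const float h2 = sample(x + 1, z);
            const float h3 = sample(x, z + 1);

            Float3 n{ h1 - h2, separation, h0 - h3 };
            const float len = std::sqrt(n.x * n.x + n.y * n.y + n.z * n.z);
            normals[z * verts + x] = { n.x / len, n.y / len, n.z / len };
        }
    return normals;
}
```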

If that doesn't help, perhaps post your normal calculation code?

n!

Edited by nfactorial

##### Share on other sites

> For terrain normals I take the height sample at x,y and compute the normal for that sample with:
>
> normal.x = h1 - h2;
> normal.y = separation; // separation = distance between samples (I use 1.0f).
> normal.z = h0 - h3;

Are you using y-up for your normals, but z-up for your sample locations in this response?

EDIT: Also, I think your y value for the normal should be 2 * the separation, according to this post at least: http://www.gamedev.net/topic/163625-fast-way-to-calculate-heightmap-normals/
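(A quick sketch of where the factor of two comes from, assuming y-up and central differences between neighbours that are 2 * separation apart:)

```cpp
// With s = separation, the two tangents across the sample (x, y) are:
//   Tx = ( 2*s,  h2 - h1,  0   )   // along +x
//   Tz = ( 0,    h3 - h0,  2*s )   // along the second sample axis
// Their cross product Tz x Tx points upward:
//   ( 2*s*(h1 - h2),  4*s*s,  2*s*(h0 - h3) )
// and dividing through by 2*s gives
//   ( h1 - h2,  2*s,  h0 - h3 )
// i.e. the same vector as above, but with y = 2 * separation.
```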

Edited by Vexal