thallish

'wireframe' showing up somehow on terrain


Recommended Posts

Hi. I did not know what to call this thread, but here is an image that shows the issue. As you can see, there are lines showing up on the terrain, and I don't know whether they are supposed to be there or not. The terrain is only lit, not textured. The lines seem to correspond to the triangle edges. Anybody got a clue why this is happening?

This is an artifact of per-vertex lighting, and there just isn't much you can do about it. It has to do with the way the color is interpolated across a triangle by the hardware. Using per-pixel lighting would probably make it far less obvious, but probably won't fix it completely either.

The good news is that when texturing is enabled, the artifact can barely be seen (depending on the texture, of course).

If you keep your triangle count high enough to avoid cases where a single triangle gets too big, it shouldn't prove to be a problem.

Hope this helps.

Okay, that is good to hear [smile], and I am in fact using per-pixel lighting.

But on with the texturing then.

Thank you.

If you're using per-pixel lighting and still getting these artifacts, you may indeed have an error in your shader. I'd recommend you have another look at it, and at some per-pixel lighting articles, to make sure it is correct.
If you'd like, you could post some code here and get some comments on its validity.

It may also be your input data: if no vertices are shared between triangles, and each vertex normal just points in the same direction as its triangle's face normal, then you get nothing out of per-pixel lighting. Smoothing the normals, by averaging the face normals of all triangles that share a vertex, can improve the quality of per-pixel lighting; see the sketch below.
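A minimal sketch of that smoothing pass (not from this thread; it assumes a vertex struct with mPos/mNormal members and Vector3 helpers such as operator-, operator+=, Cross and Normalize):

// Average each triangle's face normal into the vertices it touches.
void ComputeSmoothNormals( VertexPosNormal* verts, int numVerts,
                           const DWORD* indices, int numTris )
{
    // start every vertex normal at zero
    for ( int v = 0; v < numVerts; v++ )
        verts[v].mNormal = Vector3( 0.0f, 0.0f, 0.0f );

    // accumulate the (unnormalized) face normal of each triangle
    for ( int t = 0; t < numTris; t++ )
    {
        DWORD i0 = indices[t * 3 + 0];
        DWORD i1 = indices[t * 3 + 1];
        DWORD i2 = indices[t * 3 + 2];

        Vector3 e0 = verts[i1].mPos - verts[i0].mPos;
        Vector3 e1 = verts[i2].mPos - verts[i0].mPos;
        Vector3 faceNormal = Cross( e0, e1 );

        verts[i0].mNormal += faceNormal;
        verts[i1].mNormal += faceNormal;
        verts[i2].mNormal += faceNormal;
    }

    // normalize the accumulated sums to get the smoothed normals
    for ( int v = 0; v < numVerts; v++ )
        verts[v].mNormal = Normalize( verts[v].mNormal );
}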

Hi

Here is my shader code:


// FX parameter. Accessible from C++ file
uniform extern float4x4 WorldViewProjMatrix;

// vectors describing view position
uniform extern float3 viewPos;

// degree of shininess
uniform float specularPower = 50.0f;

// direction of the incoming diffuse light
uniform float3 lightDiffusePos = float3(0.0f,-1.0f,0.0f);

// color vectors describing the different materials
uniform float4 diffuseMaterial = {0.9f, 0.9f,0.92f,1.0f};
uniform float4 ambientMaterial = {1.0f, 1.0f,1.0f,1.0f};
uniform float4 specularMaterial = {0.8f, 0.8f, 0.86f,1.0f};

// color vectors describing the different light colors
uniform float4 lightDiffuseCol = float4(0.85, 0.85, 0.85, 1.0f);
uniform float4 lightAmbientCol = float4(0.2f, 0.22f, 0.22f, 1.0f);
uniform float4 lightSpecCol = float4(0.7f,0.7f,0.7f, 1.0f);

// the output vertex structure
struct OutputVS
{
// POSITION0 tells that this member corresponds to the
// data member in the custom vertex structure with usage
// D3DDECLUSAGE_POSITION and index 0
float4 posH : POSITION0;
float3 normal : TEXCOORD0;
float3 posW : TEXCOORD1;
};

// VERTEX SHADER
// input: the current position and normal of the vertex
// output: the transformed vertex
OutputVS VS(float3 posL : POSITION0, float3 _normal : NORMAL0)
{
// initialize the output structure by zeroing it out
OutputVS outVS = (OutputVS)0;

// pass the normal
outVS.normal = _normal;

// pass the vertex position, already in world space
outVS.posW = posL;

// transform the vertex to homogeneous clip space
outVS.posH = mul( float4( posL, 1.0f ), WorldViewProjMatrix );

// return the vertex
return outVS;
}

// PIXEL SHADER
// input: a normal and position
// output: color of the current pixel by per-pixel-lighting
float4 PS(float3 normal : TEXCOORD0, float3 posW : TEXCOORD1) : COLOR
{
// be sure the normal is unit length
normal = normalize( normal );

// flip the light direction so it points from the surface towards the light, and normalize it
lightDiffusePos = normalize( -lightDiffusePos );

/***** Diffuse Component *****/

float lambDiff = max( dot( lightDiffusePos, normal ), 0.0f );
float3 diff = lambDiff * ( diffuseMaterial * lightDiffuseCol ).rgb;

/***** Specular Component *****/
// find the view vector
float3 viewVector = normalize( viewPos - posW );

// calculate the amount of specular light using the half-way vector
float lambSpecHalfWay = pow( max( dot( normal, normalize( lightDiffusePos + viewVector ) ), 0.0f ), 4 * specularPower );
float3 spec = lambSpecHalfWay * ( specularMaterial * lightSpecCol ).rgb;

/***** Ambient Component *****/
float3 amb = ( ambientMaterial * lightAmbientCol ).rgb;

// suppress the specular highlight on surfaces facing away from the light
if( lambDiff <= 0.0f )
spec = 0.0f;

return float4( amb + diff + spec, diffuseMaterial.a );

}

// TECHNIQUE (i.e. the composition of the shaders)
technique Terrain
{
pass P0
{
// set rendering states
FillMode = Solid;

// which vertex and pixel shader is associated with the pass
vertexShader = compile vs_2_0 VS();
pixelShader = compile ps_2_0 PS();
}
}




After having looked at it more closely, it certainly looks like the normals are somehow screwed up.

Here is how I create my mesh and compute the normals; maybe there is an error in it:


/************************************************************************/
/* Run through all the vertices and calculate the position. */
/* Push the vectors onto the mesh vertex buffer */
/************************************************************************/
int vertNum = 0;

VertexPosNormal* v = 0;
DXRESULT( m_pTerrainMesh->LockVertexBuffer( 0, (void**)&v ) );

for (int i = 0; i < vertexRows; i++)
{
for (int j = 0; j < vertexCols; j++)
{
// calculate the vertex position
float vertPosZ = m_position.z + (float)( m_sizeOfQuads * i );
float vertPosX = m_position.x + (float)( m_sizeOfQuads * j );
float vertPosY = 0.0f;

// translate vertex positions so that the middle vertex ends
// up at the position specified at construction
vertPosX = vertPosX - (float)( numCols * m_sizeOfQuads ) / 2;
vertPosZ = vertPosZ - (float)( numRows * m_sizeOfQuads ) / 2;

// test for filtering of heights and specify the height
if ( filter )
vertPosY = m_position.y + tempHeights[vertNum] - 200;
else
vertPosY = m_position.y + in[vertNum] - 10;

// push the vertices, assign temp normal
v[vertNum].mPos = Vector3( vertPosX, vertPosY, vertPosZ );
v[vertNum].mNormal = Vector3( 0.0f, 1.0f, 0.0f );
vertNum++;
}
}

DXRESULT( m_pTerrainMesh->UnlockVertexBuffer() );

/************************************************************************/
/* Specify the indices for the mesh */
/************************************************************************/
int quad = 0;
std::vector<DWORD> indices;
indices.resize(numIndices);

for (int i = 0; i < numRows; i++)
{
for ( int j = 0; j < numCols; j++ )
{
// filling up indices
indices[quad] = (DWORD)( j + i * vertexCols);
indices[quad + 1] = (DWORD)( j + (i+1) * vertexCols);
indices[quad + 2] = (DWORD)( j + i * vertexCols + 1);

indices[quad + 3] = (DWORD)( j + i * vertexCols + 1 );
indices[quad + 4] = (DWORD)( j + (i+1) * vertexCols );
indices[quad + 5] = (DWORD)( j + 1 + (i+1) * vertexCols );

quad += 6;
}
}

WORD* indexBuffer = 0;
DWORD* attBuffer = 0;

DXRESULT( m_pTerrainMesh->LockIndexBuffer( 0, (void**)&indexBuffer) );
DXRESULT( m_pTerrainMesh->LockAttributeBuffer( 0, &attBuffer) );

for (unsigned int i = 0; i < numTris; i++)
{
indexBuffer[i*3+0] = (WORD)indices[i*3+0];
indexBuffer[i*3+1] = (WORD)indices[i*3+1];
indexBuffer[i*3+2] = (WORD)indices[i*3+2];

attBuffer[i] = 0; // every face belongs to subset 0
}

DXRESULT( m_pTerrainMesh->UnlockIndexBuffer() );
DXRESULT( m_pTerrainMesh->UnlockAttributeBuffer() );

// compute normals
DXRESULT( D3DXComputeNormals( m_pTerrainMesh, 0 ) );

DWORD* adj = new DWORD[ m_pTerrainMesh->GetNumFaces() * 3 ];
DXRESULT( m_pTerrainMesh->GenerateAdjacency( 0.001f, adj ) );
DXRESULT( m_pTerrainMesh->OptimizeInplace( D3DXMESHOPT_VERTEXCACHE | D3DXMESHOPT_ATTRSORT, adj, 0, 0, 0 ) );
delete[] adj;




Tell me if there is anything more you need [wink]

Out of curiosity, does sending in the adjacency to the normal generation function affect anything?
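For reference (not from the thread), that would just mean reusing the adjacency buffer already built for OptimizeInplace and handing it to D3DXComputeNormals instead of 0, roughly:

// Sketch: generate the adjacency first, then pass it to the normal
// computation instead of 0 (same m_pTerrainMesh and DXRESULT macro as above).
DWORD* adj = new DWORD[ m_pTerrainMesh->GetNumFaces() * 3 ];
DXRESULT( m_pTerrainMesh->GenerateAdjacency( 0.001f, adj ) );

DXRESULT( D3DXComputeNormals( m_pTerrainMesh, adj ) );

DXRESULT( m_pTerrainMesh->OptimizeInplace( D3DXMESHOPT_VERTEXCACHE | D3DXMESHOPT_ATTRSORT, adj, 0, 0, 0 ) );
delete[] adj;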

I think this is called "Mach banding"; see 16.2.3 Polygon Mesh Shading in

http://www.futuretech.blinkenlights.nl/gouraud.html

I think it will only go away with normal mapping, because with Phong shading the normals are still interpolated. But normal mapping a terrain might not be practical because of the size of the normal map needed. As others have said, though, once a texture is on it the banding can hardly be seen.
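For what it's worth, a rough sketch of the per-texel normal you would bake into such a normal map (central differences over the height field; the row-major heights layout and the Vector3/Normalize helpers are assumptions, and <algorithm> is needed for std::min/std::max):

// Sketch: per-texel terrain normal from the height field via central differences.
Vector3 NormalFromHeightField( const float* heights, int cols, int rows,
                               int x, int z, float quadSize )
{
    // clamp neighbour lookups to the grid
    int xl = std::max( x - 1, 0 ), xr = std::min( x + 1, cols - 1 );
    int zd = std::max( z - 1, 0 ), zu = std::min( z + 1, rows - 1 );

    // slope of the height field along X and Z
    float dhdx = ( heights[z * cols + xr] - heights[z * cols + xl] ) / ( (xr - xl) * quadSize );
    float dhdz = ( heights[zu * cols + x] - heights[zd * cols + x] ) / ( (zu - zd) * quadSize );

    // the normal of the surface y = h(x, z) is (-dh/dx, 1, -dh/dz), normalized
    return Normalize( Vector3( -dhdx, 1.0f, -dhdz ) );
}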
