
Heightmap Normals


Hello,

I'm trying to add simple shading to my heightmaps, so I need normals; however, there seems to be some problem with them. The issues are visible only where the height difference is large (image). At first I thought normalize() was missing in the pixel shader, but it's there. The normal calculations may be wrong, but I couldn't find any mistakes. Does anyone know what's wrong? Here's the relevant code:

Vertex & Pixel Shaders:
VS_OUTPUT MainVS(VS_INPUT Input) {
    VS_OUTPUT Output;
    Output.Position = mul(float4(Input.Position, 1), WorldViewProj);
    Output.Normal = Input.Normal;
    Output.Tex0 = Input.Tex0;
    return Output;
}

PS_OUTPUT MainPS(VS_OUTPUT Input) {
    PS_OUTPUT Output;
    Output.Color0 = float4(normalize(Input.Normal), 1);
    return Output;
}
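
For context, the "simple shading" I'm aiming for is just a basic Lambert term driven by these normals; a minimal sketch (LightDir is a hypothetical constant, not something in the code above) would look like this:

float3 LightDir; // assumed: unit vector pointing from the surface toward the light

PS_OUTPUT LambertPS(VS_OUTPUT Input) {
    PS_OUTPUT Output;
    // assumes the terrain's world transform doesn't rotate or non-uniformly
    // scale the mesh, so the object-space normal can be used directly
    float3 N = normalize(Input.Normal);
    float  NdotL = saturate(dot(N, LightDir)); // basic Lambert term
    Output.Color0 = float4(NdotL, NdotL, NdotL, 1);
    return Output;
}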



Normal Calculation:

// triangle normals (two per heightmap cell, grid of 127 x 127 cells)
MyEngine::Vector3 TriangleNormals[127][127][2];
for(uint8 x = 0; x < 127; x++) {
    for(uint8 z = 0; z < 127; z++) {
        MyEngine::Vector3 U1 = Vertices[z+1][x+1].Position - Vertices[z][x].Position;
        MyEngine::Vector3 Y1 = Vertices[z][x+1].Position - Vertices[z][x].Position;
        TriangleNormals[z][x][0] = U1.cross(Y1).normalized();

        MyEngine::Vector3 U2 = Vertices[z+1][x].Position - Vertices[z][x].Position;
        MyEngine::Vector3 Y2 = Vertices[z+1][x+1].Position - Vertices[z][x].Position;
        TriangleNormals[z][x][1] = U2.cross(Y2).normalized();
    }
}

// vertex normals, using a macro to check for bounds
// (the triangle grid is 127 x 127, hence the < 127 checks)
#define Q(i, j, x, z) \
    if((x) >= 0 && (x) < 127 && (z) >= 0 && (z) < 127) { \
        count += 2; \
        Vertices[(i)][(j)].Normal += TriangleNormals[z][x][0] + TriangleNormals[z][x][1]; \
    }

for(uint8 x = 0; x < 128; x++) {
    for(uint8 z = 0; z < 128; z++) {
        float count = 0;
        Q(z, x, x, z);
        Q(z, x, x-1, z);
        Q(z, x, x, z-1);
        Q(z, x, x-1, z-1);
        Vertices[z][x].Normal /= count;
        Vertices[z][x].Normal.normalize();
    }
}



Thank you in advance.

After some thinking I figured there was a calculation mistake somewhere. Thanks to Google I've found another way to calculate the normals:
for(auto i = 0; i < 127 * 127 * 2; i++) { // number of indices divided by 3
    auto i0 = Indices[i * 3 + 0];
    auto i1 = Indices[i * 3 + 1];
    auto i2 = Indices[i * 3 + 2];
    // Vertices is a 2D array; dereference it to treat it as one long 1D array
    MyEngine::Vector3 U1 = (*Vertices)[i1].Position - (*Vertices)[i0].Position;
    MyEngine::Vector3 U2 = (*Vertices)[i2].Position - (*Vertices)[i0].Position;
    MyEngine::Vector3 N0 = U1.cross(U2).normalized();
    (*Vertices)[i0].Normal += N0;
    (*Vertices)[i1].Normal += N0;
    (*Vertices)[i2].Normal += N0;
}

for(auto i = 0; i < 128 * 128; i++) // number of vertices
    (*Vertices)[i].Normal.normalize();



This way seems much easier and makes more sense; however, the normals still look wrong on slopes: http://img851.imageshack.us/img851/8470/54800837.png
I feel it's supposed to be completely smooth, not faceted like mine; please correct me if I'm wrong.

Thank you in advance.

If your heightmap is a loaded bitmap (i.e. not generated on the fly by your application), I found it's *miles* easier to just run the bitmap through the normal map filter in Photoshop, save that as a separate bitmap, and load it as another shader input. This has the added advantage that you get the full resolution of the normal map even when the polygons of your heightmapped mesh are larger than one pixel of the heightmap.

You will need to scale the normals to account for the max height value used when the normal map was created, unless your mesh uses the same max height.
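
Roughly, the scaling I mean looks like this (just a sketch, untested; NormalMap and HeightRatio are placeholder names, with HeightRatio being your mesh's max height divided by the height the filter assumed):

sampler2D NormalMap;   // the Photoshop-generated normal map
float     HeightRatio; // placeholder: meshMaxHeight / filterMaxHeight

float3 SampleTerrainNormal(float2 uv) {
    // unpack from [0,1] into [-1,1]
    float3 n = tex2D(NormalMap, uv).xyz * 2.0f - 1.0f;
    // the slope components grow linearly with the height scale,
    // so rescale them and renormalize
    n.xy *= HeightRatio;
    return normalize(n);
}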


> This way seems much easier and makes more sense; however, the normals still look wrong on slopes: http://img851.imageshack.us/img851/8470/54800837.png
> I feel it's supposed to be completely smooth, not faceted like mine; please correct me if I'm wrong.
> Thank you in advance.


Hello,

This is a common problem when using per-vertex normals; you can find a million posts on the same subject.
The problem comes from the fact that every pixel drawn is affected only by the three normals of the triangle it belongs to. This makes the lighting of adjacent triangles behave the way you see in your screenshot.

As far as I know, the only way to solve this is to use a normal map. You can store the very same world-space normals in a texture of the same size as your terrain. When reading from a texture with bilinear filtering, each pixel gets an averaged normal influenced by the 4 surrounding texels, which should give you smoother lighting across the terrain.

Of course, a normal map also gives you the option of using a higher resolution than the mesh itself, to make your terrain look more detailed than it actually is.
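
On the shader side it could look something like this (just a sketch; NormalMapSampler is a placeholder name, and it assumes the normals were packed as 0.5 * n + 0.5 when the texture was filled):

sampler2D NormalMapSampler; // 128x128 world-space normals, bilinear filtering

PS_OUTPUT MainPS(VS_OUTPUT Input) {
    PS_OUTPUT Output;
    // unpack the filtered normal and renormalize, since bilinear
    // interpolation shortens blended normals
    float3 N = normalize(tex2D(NormalMapSampler, Input.Tex0).xyz * 2.0f - 1.0f);
    Output.Color0 = float4(N, 1);
    return Output;
}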

Cheers!
