Landscape lighting with Normal map

16 comments, last by Hodgman 13 years ago
That's the result of the smoothstepping from the second link. I think it looks worse than the simple hardware bilinear filtering from the previous screenshots (sad):

[attached image: 13144600840035597059.png]

What am I supposed to do now?
Hmm, you don't really want to do the smoothstep() on the normal map... Can you instead sample the heightmap directly several times and then construct the normal in the pixel shader? E.g. you can sample the heightmap 3 times in an L shape and construct the normal with a cross product, or like so (taken from my own HLSL terrain shader):


float4 heights;
float t = texel_size; // offset of one heightmap texel in UV space

// Four axis-aligned neighbour taps around the current texel
heights[0] = tex2D(heightmap, uv + float2( 0, -t)).r * height_scale;
heights[1] = tex2D(heightmap, uv + float2(-t,  0)).r * height_scale;
heights[2] = tex2D(heightmap, uv + float2( t,  0)).r * height_scale;
heights[3] = tex2D(heightmap, uv + float2( 0,  t)).r * height_scale;

// Central differences give the slope along x and z; y is up
float3 normal;
normal.x = heights[1] - heights[2];
normal.y = 2.0;
normal.z = heights[0] - heights[3];

normal = normalize(normal);



Instead of doing the tex2D directly, you can use the filtered version and construct the normal from that.

Let me know if that works out ok,
T
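
As a sketch of what "the filtered version" could mean here: one common variant is to warp the fractional bilinear blend weights through smoothstep() before blending the four height taps. The snippet below is Python rather than HLSL so it's self-contained; the function names are my own, and this is only one possible filtered variant, not necessarily the one the poster means.

```python
import math

def smoothstep(t):
    # Hermite curve 3t^2 - 2t^3: flattens the blend near texel centers
    return t * t * (3.0 - 2.0 * t)

def sample_height(grid, u, v):
    """Bilinear sample of a heightmap with smoothstep-warped fractional
    weights. `grid` is a list of rows; (u, v) are in texel coordinates."""
    rows, cols = len(grid), len(grid[0])
    x0, y0 = int(math.floor(u)), int(math.floor(v))
    fx, fy = smoothstep(u - x0), smoothstep(v - y0)
    x1, y1 = min(x0 + 1, cols - 1), min(y0 + 1, rows - 1)
    top = grid[y0][x0] + (grid[y0][x1] - grid[y0][x0]) * fx
    bot = grid[y1][x0] + (grid[y1][x1] - grid[y1][x0]) * fx
    return top + (bot - top) * fy
```

Because the warped weight has a continuous derivative at texel boundaries, the faceted look softens without extra taps; the normal is then built from these filtered heights in place of the raw tex2D reads.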

I've tried this right at the beginning - and it's horrible, especially for LOD-based terrain; besides, constructing normals from only 4 samples is crude. I build my normal map during preprocessing using a Canny-edge-detection-style approach - filtering the height map with Gaussian kernels - which gives me high-quality normal maps. The kernels take more than 4 neighbours into account; in brief, they use as many neighbours as the sigma parameter specifies (as in the Gaussian function). So I just don't see why I would compute normals in the shader...
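
For readers who haven't seen the Gaussian-weighted gradient idea, it can be sketched roughly as below (Python rather than HLSL so it stands alone; this is my own simplified approximation, not the poster's actual preprocessing code - function names and parameter defaults are assumptions):

```python
import math

def grad_weights(sigma, radius):
    """Gaussian-weighted central-difference weights (a derivative-of-Gaussian
    up to normalization); larger sigma lets more neighbours contribute."""
    taps = range(-radius, radius + 1)
    w = [t * math.exp(-t * t / (2.0 * sigma * sigma)) for t in taps]
    # Normalize so a unit ramp h[x] = x yields a gradient of exactly 1
    norm = sum(wt * t for wt, t in zip(w, taps))
    return [wt / norm for wt in w]

def normal_at(height, x, y, sigma=1.5, radius=3, height_scale=1.0):
    """Normal from smoothed gradients of a 2-D heightmap (list of rows)."""
    w = grad_weights(sigma, radius)
    rows, cols = len(height), len(height[0])
    clamp = lambda v, hi: max(0, min(v, hi))
    taps = range(-radius, radius + 1)
    gx = sum(wt * height[y][clamp(x + t, cols - 1)] for wt, t in zip(w, taps))
    gy = sum(wt * height[clamp(y + t, rows - 1)][x] for wt, t in zip(w, taps))
    # Surface tilts away from the uphill direction; y is up
    n = (-gx * height_scale, 1.0, -gy * height_scale)
    length = math.sqrt(sum(c * c for c in n))
    return tuple(c / length for c in n)
```

Increasing sigma and radius pulls in more neighbours and smooths the resulting normals, which is the knob being described.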
Just looking at how your terrain is put together, it isn't very realistic as far as good terrains go.
Stepping up a whole height level across one very steep grid step, then doing it again a few grid steps later, isn't right.
The height is supposed to change gradually anyway... I think if you built your terrain properly this wouldn't be a problem.

Since these quaddy artifacts appear right across the edges of the paired triangles (which form a diamond or quad - see the screenshot), I believe it is not about bilinear interpolation - I just can't believe it!


Actually, along the edges of the square formed from 4 texel centers is exactly where bilinear interpolation of normals breaks down. If you draw a line between the ends of any two normalized vectors and then evenly space points on that line (linear interpolation) the angles formed from these points and the origin will not be equal. They will tend to bunch up at the beginning and end of your line.

In terms of a normal map, it means that most of the normals you reconstruct from linear interpolation will be close to the normals at the edges of the interpolation. In 2D, aka bilinear filtering, that makes your normals bunch up at the edges of a quad.

Spherical linear interpolation (or slerp) would evenly space the reconstructed normals in terms of angles on a sphere and would look better. Unfortunately, slerp is a lot more expensive than lerp and GPU hardware interpolators and instructions rarely support it.
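
The bunching is easy to verify numerically. A minimal Python sketch (function names are my own) comparing normalized lerp against slerp for two unit vectors 90 degrees apart:

```python
import math

def lerp_dir(a, b, t):
    # Linear interpolation of two unit vectors, renormalized afterwards
    v = tuple(ai + (bi - ai) * t for ai, bi in zip(a, b))
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def slerp_dir(a, b, t):
    # Spherical interpolation: constant angular speed between endpoints
    dot = max(-1.0, min(1.0, sum(ai * bi for ai, bi in zip(a, b))))
    theta = math.acos(dot)
    s = math.sin(theta)
    wa = math.sin((1.0 - t) * theta) / s
    wb = math.sin(t * theta) / s
    return tuple(wa * ai + wb * bi for ai, bi in zip(a, b))

def angle_from(a, v):
    # Angle (radians) between unit vectors a and v
    return math.acos(max(-1.0, min(1.0, sum(ai * vi for ai, vi in zip(a, v)))))

a, b = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)
# A quarter of the way along: slerp gives exactly 22.5 degrees,
# while lerp has already bunched toward the endpoint (about 18.4 degrees)
angle_slerp = angle_from(a, slerp_dir(a, b, 0.25))
angle_lerp = angle_from(a, lerp_dir(a, b, 0.25))
```

With lerp the reconstructed angles cluster near t = 0 and t = 1, which is exactly the banding along texel-cell edges described above.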

Sampling the heightmap in the pixel shader works well for LOD based terrain, since it's independent of LOD.

Doing 4 taps to construct the normal isn't great - an offline approach is always going to beat it. The nice thing about heightmaps, though, is that they're linearly filterable (unlike normal maps, as Wanderingbort points out - although again, whether you notice depends on the application) and more amenable to tricks (such as the filtering one).

T
Use more than an 8-bit heightmap, and test with a realistic landscape with small variations as well.
If that's not enough, use bicubic filtering (or better) on the normals to get rid of the bilinear artifacts.
i.e. vertex : normal = 1 : 1
if the normals are supplied per vertex, then they should be output by the vertex shader

Wrong it is crap approach for ... this is LOD-based terrain

Well, that means the 'if' doesn't apply. No need to break out the W word.

Does increasing the radius / number of neighbors in your normal-map generation step help smooth things out?

This topic is closed to new replies.
