Sharp lines with linear filtering look awful (WHY!?)

5 comments, last by phil_t 11 years, 9 months ago
My previous post is here:
http://www.gamedev.net/topic/626376-per-vertex-lighting-are-these-ugly-lines-normal/

When trying to light some procedurally-generated terrain, I see these alarmingly bright lines:
example2.png

These lines run along polygon edges, so I thought doing the lighting in the fragment shader might help. But the vertex normals are still linearly interpolated on their way to the fragment shader, and even after re-normalizing them, I get almost exactly the same lines.
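
For reference, the per-fragment version looks roughly like this (a simplified sketch, not my exact shader; the varying/uniform names are just illustrative):

precision mediump float;

varying vec3 vNormal;            // per-vertex normal, linearly interpolated by the rasterizer
uniform vec3 uLightingDirection; // normalized direction toward the light

void main() {
    // Interpolation shortens the normal between vertices, so re-normalize first
    vec3 n = normalize(vNormal);
    float diffuse = max(dot(n, uLightingDirection), 0.0);
    gl_FragColor = vec4(vec3(diffuse), 1.0);
}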

I tried vertex smoothing. It helped a little, mainly by removing the areas of highest contrast. I tried tessellating each quad into four triangles instead of two (with a generated midpoint vertex), but that didn't help at all.

Finally, I decided the problem must be that I want to filter linearly across each quad, and I can only do that when sampling a texture, so I tried packing my lighting data into a texture with one texel per vertex.
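
The idea, roughly (a sketch; the attribute/uniform names and the lightmap setup are just illustrative):

// Vertex shader (sketch)
attribute vec3 aPosition;
attribute vec2 aGridPos;       // integer (column, row) of this vertex in the terrain grid
uniform mat4 uMVPMatrix;
uniform vec2 uLightmapSize;    // lightmap size in texels (one texel per vertex)
varying vec2 vLightUV;

void main() {
    // +0.5 centers the lookup on the texel belonging to this vertex, so linear
    // filtering blends between the texels of the four surrounding vertices
    vLightUV = (aGridPos + 0.5) / uLightmapSize;
    gl_Position = uMVPMatrix * vec4(aPosition, 1.0);
}

// Fragment shader (sketch)
precision mediump float;
uniform sampler2D uLightmap;
varying vec2 vLightUV;

void main() {
    gl_FragColor = texture2D(uLightmap, vLightUV);
}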

Result:
example3.png

On the left is the terrain rendered with that lighting texture using nearest filtering; you can see there is one texel per vertex, centered neatly on it. On the right is the same thing with linear filtering. I really, really did not expect to see those sharp lines. I expected nice blurry shapes.

I tried resizing the image in paint.net and was surprised to see that bilinear interpolation in an image editing tool shows the same lines:
Filters.png
The third image is bicubic filtering. That is more like the result I was hoping for.


I feel like I'm doing this all wrong. I don't think I want to generate a higher-res texture just to do some fancy spline interpolation on it. I think I just have to hide the artifacts using some noise from the ground texture, normal smoothing, less contrast, more ambient light, and maybe some noise in the lighting calculation.
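
For what it's worth, if I ever do try spline interpolation, it shouldn't actually need a higher-res texture; the bicubic fetch can be done in the fragment shader at sample time. Something like this (a Catmull-Rom sketch I haven't tested; the lightmap names match the sketch above):

precision mediump float;

uniform sampler2D uLightmap;
uniform vec2 uLightmapSize;   // texture size in texels
varying vec2 vLightUV;

// Catmull-Rom cubic weight for an offset x in [-2, 2]
float cubicWeight(float x) {
    x = abs(x);
    if (x < 1.0) return 1.5 * x * x * x - 2.5 * x * x + 1.0;
    if (x < 2.0) return -0.5 * x * x * x + 2.5 * x * x - 4.0 * x + 2.0;
    return 0.0;
}

// Weighted sum of the 4x4 texel neighborhood around uv
vec4 sampleBicubic(sampler2D tex, vec2 uv, vec2 texSize) {
    vec2 pos  = uv * texSize - 0.5;  // position in texel space
    vec2 base = floor(pos);
    vec2 f    = pos - base;
    vec4 sum  = vec4(0.0);
    for (int j = -1; j <= 2; j++) {
        for (int i = -1; i <= 2; i++) {
            vec2 offset = vec2(float(i), float(j));
            float w = cubicWeight(offset.x - f.x) * cubicWeight(offset.y - f.y);
            vec2 st = (base + offset + 0.5) / texSize;  // back to normalized texel centers
            sum += w * texture2D(tex, st);
        }
    }
    return sum;
}

void main() {
    gl_FragColor = sampleBicubic(uLightmap, vLightUV, uLightmapSize);
}

That's 16 texture fetches per lookup, so it isn't free, but no extra texture is needed.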

Any thoughts are appreciated.

One more thing: in the side-by-side, you can see that on the left side, even though all the coloring comes from the fragment shader [s]with no lighting whatsoever (just a nearest-pixel texture lookup), you can still see lines on polygon edges. I have absolutely no explanation for that.[/s] Where can that possibly be coming from?

Dumb, dumb, dumb. I accidentally left some lighting in the shader; it wasn't just a texture lookup. With that removed, the lines along polygon edges on the left disappear as expected, but the lines in the bilinear-filtered version remain:
example4.png
It just doesn't look "linear" to me.
The problem is that you have too much noise in your heightmap, or, from another perspective, not enough spatial resolution. You will have to smooth out your heightmap to make the terrain more nicely curved.
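
For example, if your heightfield lives in a texture, even a single 3x3 box blur pass over it takes the worst of the noise out. A rough sketch (the names are made up); run it over a fullscreen quad into a render target, or do the same averaging on the CPU before you build the mesh:

precision mediump float;

uniform sampler2D uHeightmap;  // one height value per texel, stored in the red channel
uniform vec2 uHeightmapSize;   // heightmap size in texels
varying vec2 vUV;              // from a plain fullscreen-quad vertex shader

void main() {
    vec2 texel = 1.0 / uHeightmapSize;
    float sum = 0.0;
    // Average the 3x3 neighborhood around this texel
    for (int j = -1; j <= 1; j++) {
        for (int i = -1; i <= 1; i++) {
            sum += texture2D(uHeightmap, vUV + vec2(float(i), float(j)) * texel).r;
        }
    }
    gl_FragColor = vec4(vec3(sum / 9.0), 1.0);
}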

Edit: I don't quite get the confusion. This is exactly how bilinear filtering on low-res, high-contrast images has always looked: the interpolation is continuous in value but not in slope, and your eye amplifies those slope changes into visible lines (Mach bands). Haven't you played any games around 1997? ;)
Madhed, you're probably right. This is how bilinear filtering on low res high contrast images looks.

I feel like I'm missing something, though. A few years back I spent a lot of time doing procedural terrain generation with Managed DirectX, using vertex lighting, and I didn't notice these sorts of artifacts. That was with the fixed-function pipeline, so I figure I must be doing something wrong now.

The type of data and terrain are very similar. I don't know. I think this version looks bad even with very small polygons.
example5.png

Here, I've smoothed the normals from above:
example6.png
It looks better, but the lines are still apparent; you really notice them when the camera or the light is moving. I'd rather not smooth the normals anyway, because it robs the landscape of depth and makes the whole thing look blurry.

And looking at the normals themselves, it's not like they're crazy; they roll pretty smoothly along the terrain.
example7.png

Even with triangles just a few pixels wide, I'm seeing this.
example8.png


It's possible I'm only imagining that this wasn't a problem before, but I'm wondering if the lighting algorithm in the fixed-function pipeline was different. My lighting calculation is:
max(dot(transformedNormal, uLightingDirection), 0.0)

I know I'm being a perfectionist, but I'm certain I'm missing something basic.
Someone said that once you apply a ground texture, it's hard to notice. This WebGL demo has a similar problem on one slope, and when it adds polygon detail to the steep part of the slope, the artifacts actually get worse:
http://www.oak3d.com/demo/Engine_Terrain.html

I can see that part of the problem there is too much noise in the heightmap, and I can see how much it helps to smooth my raw height data.

In the long run I plan on running splines along the surface, which will accomplish something similar.
I'm also beginning to wonder if this has something to do with the fact that the intensity of light on my screen doesn't scale linearly with the RGB value of the pixel. Maybe 200 and 220 aren't that different, but 240 looks *much* brighter. Or something.
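
If that is what's going on, I suppose I should do the lighting math in linear space and only convert to gamma at the end. Untested sketch, assuming a simple 2.2 exponent as an approximation of the display response:

precision mediump float;

varying vec3 vNormal;
uniform vec3 uLightingDirection;

void main() {
    float diffuse = max(dot(normalize(vNormal), uLightingDirection), 0.0);
    // Lighting computed in linear space, converted to gamma space for display
    vec3 color = pow(vec3(diffuse), vec3(1.0 / 2.2));
    gl_FragColor = vec4(color, 1.0);
}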

Or maybe I'm grasping at straws now.
I think you should just go on and implement texturing and add a little ambient light. I bet the artifacts will become much weaker then.
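
Something along these lines (just a sketch; the ground texture sampler and the 0.2/0.8 ambient/diffuse split are placeholders to tune):

precision mediump float;

varying vec3 vNormal;
varying vec2 vTexCoord;
uniform vec3 uLightingDirection;
uniform sampler2D uGroundTexture;  // placeholder ground/albedo texture

void main() {
    vec3 albedo = texture2D(uGroundTexture, vTexCoord).rgb;
    float diffuse = max(dot(normalize(vNormal), uLightingDirection), 0.0);
    // The ambient floor keeps faces turned away from the light from going fully dark,
    // and the texture detail helps break up the banding along edges
    vec3 lighting = vec3(0.2) + vec3(0.8) * diffuse;
    gl_FragColor = vec4(albedo * lighting, 1.0);
}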


[quote]Even with triangles just a few pixels wide, I'm seeing this.
example8.png[/quote]

You may be able to reduce the appearance of lines here by triangulating your terrain in the opposite manner, depending on the direction the slope faces.

The "Additional Optimizations" section of this article explains it a bit:
http://mtnphil.wordpress.com/2011/09/22/terrain-engine/

This topic is closed to new replies.
