Below is a section of a hex grid where each tile corner is assigned a slope value, expressed as a shade of grey*. Each per-vertex value is interpolated from the three tile slopes that meet at that vertex. The tile slopes themselves are calculated directly from the height map using bilinear interpolation.
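For clarity, here's roughly what the per-vertex calculation amounts to, as a minimal Python sketch (the `bilinear` sampler and the three-tile averaging are my paraphrase of the setup above, not the actual shader code):

```python
def bilinear(heightmap, x, y):
    """Sample a 2D height map (list of rows) with bilinear interpolation.
    Assumes x, y are at least one texel away from the far edges."""
    x0, y0 = int(x), int(y)
    x1, y1 = x0 + 1, y0 + 1
    fx, fy = x - x0, y - y0
    top = heightmap[y0][x0] * (1 - fx) + heightmap[y0][x1] * fx
    bot = heightmap[y1][x0] * (1 - fx) + heightmap[y1][x1] * fx
    return top * (1 - fy) + bot * fy

def vertex_slope(adjacent_tile_slopes):
    """A corner's value is the mean of the slopes of the three
    tiles that meet at that corner."""
    return sum(adjacent_tile_slopes) / len(adjacent_tile_slopes)
```

If the three tile slopes vary smoothly, the averaged corner values should vary smoothly as well, which is why the hard edges are surprising.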
However, I'm having trouble identifying why the output is so uneven (note the dark column and the ridge snaking from top to bottom on the right side), and why the tiles appear to have internal structure/brighter edges even though all the vertex values are uniform.
[attachment=27131:hexagons.jpg]
The hexagons are generated in a geometry shader from a single point and are emitted as a triangle strip.
I realize part of the problem is perceptual (Mach banding makes the boundaries between adjacent gradients look harsher than they really are). However, this doesn't quite account for the noticeably harsh edges, both within the strips themselves and between hexagons.
PS - I'm packing the slope values, but the loss of precision is uniform (as evidenced by the lack of discontinuities) and does not account for the harsh edges. As far as I can tell the output should be smooth.
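To illustrate the precision point: assuming an 8-bit quantization (my packing scheme may differ in detail, but the argument is the same), the worst-case error is half a quantization step everywhere, so packing alone shouldn't produce localized ridges:

```python
def pack(slope):
    """Quantize a slope in [0, 1] to an 8-bit value (assumed scheme)."""
    return round(slope * 255)

def unpack(byte):
    """Recover the approximate slope from the packed byte."""
    return byte / 255.0

# Round-trip error is bounded by half a step (~0.002) uniformly
# across the whole [0, 1] range - no discontinuities.
max_err = max(abs(unpack(pack(s / 1000)) - s / 1000) for s in range(1001))
```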
* apparently there's more than 50