# Generating parametrised terrain


## Recommended Posts

Okay, this is a pretty difficult problem, and I'm not certain there is a solution that doesn't require a super-computer, but here goes: I am generating height maps procedurally using noise functions (Perlin-based multi-fractals). I want them to appear as realistic as possible, i.e. sets of mountains together, plains, river valleys, peaked dunes, etc. The variation in natural landscapes is so great that I can't imagine it is possible to write a noise function by hand that could generate something close.

So I had an idea: generate the noise function itself using real-world height map data as input. Basically I would need to parametrise my noise function somehow, generate data with it, check that data against real-world data, and use a global optimization technique to refine the input parameters.

Now, most multi-fractal noise functions already have some very basic parametrisation: scale, offset, lacunarity (basically the balance between high-frequency and low-frequency noise), and number of octaves. What I need to work out is what other parameters are required and how to actually integrate them into a noise algorithm. Here is a basic ridged multi-fractal combiner (after Musgrave et al.):
```cpp
template < class FTy_, class NPTy_ >
typename RidgedMultifractalProvider<FTy_, NPTy_>::value_type
RidgedMultifractalProvider<FTy_, NPTy_>::operator()(const SSEVectorType& exponents,
                                                    const noise_provider* noisep,
                                                    value_type lacunarity, value_type octaves,
                                                    value_type vx, value_type vy, value_type vz) const
{
    value_type result, signal, weight;

    /* get the first octave */
    signal = (*noisep)( vx, vy, vz );
    /* take the absolute value of the signal (this creates the ridges) */
    if ( signal < 0.0 ) signal = -signal;
    /* invert and translate (note that "offset" should be ~= 1.0) */
    signal = _offset - signal;
    /* square the signal, to increase the "sharpness" of the ridges */
    signal *= signal;
    result = signal;
    weight = 1.0;

    for ( int i = 1; i < octaves; ++i ) {
        /* increase the frequency */
        vx *= lacunarity;
        vy *= lacunarity;
        vz *= lacunarity;

        /* weight successive contributions by the previous signal */
        weight = signal * _gain;
        if ( weight > 1.0 ) weight = 1.0;
        if ( weight < 0.0 ) weight = 0.0;
        signal = (*noisep)( vx, vy, vz );
        if ( signal < 0.0 ) signal = -signal;
        signal = _offset - signal;
        signal *= signal;
        /* weight the contribution */
        signal *= weight;
        /* scale by this octave's precomputed spectral weight */
        result += signal * exponents[i];
    }

    return result;
}
```


So, does anyone have any ideas of how I might go about achieving this? I know it's a lot to ask, but I hope someone out there has a better grasp of this kind of optimization than me! Any hints at all, resources I can check out, etc. would be appreciated; I'm not even sure where to start.

---
I would highly recommend libnoise: http://libnoise.sourceforge.net/

It's a plugin-based noise generator, so you can layer all kinds of noise and get some pretty convincing effects. It has a wide community, so there are lots of examples and plenty of support already. The underlying generators also operate over a continuous domain, which makes it easy to generate infinite continuous terrain one patch at a time.

---
Sorry, I forgot to mention: it needs to be pseudo-real-time, i.e. fast enough to generate while the program is running and update LOD chunks in a chunked-LOD scheme! The best example I can see in libnoise (http://libnoise.sourceforge.net/examples/complexplanet/index.html) takes 25 minutes to generate a 2048x2048 map; I need it to be more like 20 seconds or less. My single-precision SSE ridged multi-fractal currently takes about 4 seconds at the same resolution.
I know, it's a tall order!

---
Is the idea to create a noise function that approximates a particular heightmap, or is it to create one whose output somehow has the same statistics as a collection of real heightmaps?

In other words: You mention a global optimizer; what function would you be optimizing?

Anyway, if it's the former you want, my impulse would just be to approximate the heightmap by a basis expansion and then add some (Perlin?) noise on top. If it's the latter, then my ideas are less well formed, but they essentially involve building a sort of Markov model of your terrain, in which the probability of a given pixel/region having a certain value/parametrization is a function only of the value/parametrization of its neighbor pixels/regions.
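The first (basis-expansion) idea can be sketched very crudely. In the sketch below, which is purely illustrative and not from any library, the "basis" is just an m x m grid of cell averages reconstructed by bilinear interpolation, and a cheap integer hash stands in for Perlin noise; `coarse_fit` and `hash_noise` are hypothetical names:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <vector>

// Sketch of the basis-expansion idea: average an n x n heightmap into m x m
// cells (assumes 2 <= m <= n), then reconstruct it by bilinear interpolation.
// The detail lost by the fit would be re-added as noise on top.
std::vector<float> coarse_fit(const std::vector<float>& h, int n, int m)
{
    std::vector<float> cells(m * m, 0.0f);
    std::vector<int> counts(m * m, 0);
    for (int y = 0; y < n; ++y)
        for (int x = 0; x < n; ++x) {
            int cx = x * m / n, cy = y * m / n;
            cells[cy * m + cx] += h[y * n + x];
            counts[cy * m + cx] += 1;
        }
    for (int i = 0; i < m * m; ++i) cells[i] /= counts[i];

    std::vector<float> out(n * n);
    for (int y = 0; y < n; ++y)
        for (int x = 0; x < n; ++x) {
            /* map the pixel centre into cell space and interpolate */
            float fx = (x + 0.5f) * m / n - 0.5f;
            float fy = (y + 0.5f) * m / n - 0.5f;
            int x0 = std::max(0, std::min(m - 2, (int)std::floor(fx)));
            int y0 = std::max(0, std::min(m - 2, (int)std::floor(fy)));
            float tx = std::min(1.0f, std::max(0.0f, fx - (float)x0));
            float ty = std::min(1.0f, std::max(0.0f, fy - (float)y0));
            float a = cells[y0 * m + x0],       b = cells[y0 * m + x0 + 1];
            float c = cells[(y0 + 1) * m + x0], d = cells[(y0 + 1) * m + x0 + 1];
            out[y * n + x] = (a * (1 - tx) + b * tx) * (1 - ty)
                           + (c * (1 - tx) + d * tx) * ty;
        }
    return out;
}

// Cheap integer hash noise in [-1, 1], a stand-in for real Perlin noise.
float hash_noise(int x, int y)
{
    uint32_t h = (uint32_t)x * 374761393u + (uint32_t)y * 668265263u;
    h = (h ^ (h >> 13)) * 1274126177u;
    return (float)((h ^ (h >> 16)) & 0xffffu) / 32767.5f - 1.0f;
}
```

A real implementation would use a smoother basis (B-splines, cosines) and proper gradient noise, but the split into "low-frequency fit plus stochastic detail" is the same.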

---
I want to generate height maps that have the same statistics as a collection of example (real world) height maps. The problem is working out what those statistics are, how to determine their values for a particular height map, and how to make a height map generator that will take these as inputs.

I can't imagine I could choose 'complex' statistics, like values indicating how "mountainous", "rolling", or "hilly" the terrain is, as they are too difficult to quantify from pure height map data. It would probably have to be something more easily specified mathematically, but I can't really work out what.

I also can't imagine I will be able to find a single set of parameters that I can both extract from a height map, as well as plug into a height map generator. That is where the optimization comes in:
If I can come up with a noise generator that will produce a wide variety of terrains from a certain set of parameters (probably quite abstract ones, not directly related to the terrain statistics I mentioned above), I can then perform comparisons with its output against the real-world height maps, and determine how "good" a particular set of parameters is. I can then use some optimization method to narrow down a set of parameters that generates "good" terrain.
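That outer loop could start out as the simplest possible global optimizer, plain random search; in this sketch `random_search` and its cost callback are placeholder names, and the cost function would be whatever comparison against the real-world height maps ends up looking like:

```cpp
#include <functional>
#include <limits>
#include <random>
#include <vector>

// Plain random search over a box [lo, hi] in parameter space. The cost
// callback stands in for "generate terrain with these parameters, compare
// its statistics to the real-world examples, return a badness score".
std::vector<double> random_search(
    const std::function<double(const std::vector<double>&)>& cost,
    const std::vector<double>& lo, const std::vector<double>& hi,
    int iterations, unsigned seed)
{
    std::mt19937 rng(seed);
    std::vector<double> best(lo.size()), trial(lo.size());
    double best_cost = std::numeric_limits<double>::infinity();
    for (int it = 0; it < iterations; ++it) {
        /* draw a uniform random point in the parameter box */
        for (size_t i = 0; i < lo.size(); ++i) {
            std::uniform_real_distribution<double> d(lo[i], hi[i]);
            trial[i] = d(rng);
        }
        double c = cost(trial);
        if (c < best_cost) { best_cost = c; best = trial; }
    }
    return best;
}
```

Once this skeleton works, the sampler can be swapped for something smarter (simulated annealing, an evolutionary strategy) without touching the cost function.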

One problem with this is that if I just directly compare the generated terrain with the real-world height maps (e.g. using RMS of the subtracted height values), then I will end up just optimizing to parameters that produce height maps that have similar heights at the same locations, not terrains that have the same types of features, but randomly arranged (which is what I am going for).
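To make that concrete: a translation-invariant statistic scores two maps with the same features in different places as similar, where pointwise RMS would not. The normalised height histogram below is a deliberately simple example of such a statistic (far too weak on its own; slope and ridge statistics would be needed too), with illustrative names:

```cpp
#include <cmath>
#include <vector>

// Normalised histogram of height values: invariant to where features sit in
// the map, unlike a pointwise RMS comparison.
std::vector<double> height_histogram(const std::vector<float>& h,
                                     float min_h, float max_h, int bins)
{
    std::vector<double> hist(bins, 0.0);
    for (float v : h) {
        int b = (int)((v - min_h) / (max_h - min_h) * bins);
        if (b < 0) b = 0;
        if (b >= bins) b = bins - 1;
        hist[b] += 1.0;
    }
    for (double& x : hist) x /= h.size();
    return hist;
}

// L1 distance between two histograms of equal size.
double histogram_distance(const std::vector<double>& a, const std::vector<double>& b)
{
    double d = 0.0;
    for (size_t i = 0; i < a.size(); ++i) d += std::fabs(a[i] - b[i]);
    return d;
}
```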

So really I need to be able to parametrise the terrains for comparison as well as parametrise the height generator. It would be great if they were the same set of parameters, but that would be a bit too good to be true (i.e. difficult/impossible for me!). I will look into the things you mention. (It's late now, must sleep!)

---
The technical term for the kind of model I'm suggesting might be useful is a Markov random field (MRF). I'll admit at this point though that although I'm quite familiar with Markov chains, Markov random fields fall outside the set of things I really know...

My first question would be how to generate images according to a given MRF model. A quick google on the subject seems to indicate that MRFs can be sampled by simulating a corresponding Markov chain (MC). In essence, the state of the MC is an image, and the stationary distribution of the MC agrees with the MRF -- so, the gist of it seems to be that if you simulate the MC for a long time to let it "run in" and then stop it, you'll get a sample from the MRF (i.e., an image generated according to the MRF). It seems that there exist both Metropolis and Gibbs samplers for this. So the whole thing falls under the general heading of Markov Chain Monte Carlo (MCMC) methods.
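A toy version of such a sampler, under heavy assumptions: the MRF here is just a smoothness prior whose energy is the sum of squared 4-neighbour differences, and `local_energy` / `metropolis_sweep` are made-up names; a learned terrain model would replace `local_energy` with probabilities estimated from real heightmaps:

```cpp
#include <cmath>
#include <random>
#include <vector>

// Energy contribution of placing value v at pixel (x, y) of an n x n field,
// under a squared-difference smoothness prior over 4-neighbours.
double local_energy(const std::vector<double>& h, int n, int x, int y, double v)
{
    double e = 0.0;
    if (x > 0)     { double d = v - h[y * n + x - 1];   e += d * d; }
    if (x < n - 1) { double d = v - h[y * n + x + 1];   e += d * d; }
    if (y > 0)     { double d = v - h[(y - 1) * n + x]; e += d * d; }
    if (y < n - 1) { double d = v - h[(y + 1) * n + x]; e += d * d; }
    return e;
}

// One Metropolis sweep: perturb each pixel, accept downhill moves always and
// uphill moves with Boltzmann probability exp(-dE / T).
void metropolis_sweep(std::vector<double>& h, int n, double step,
                      double temperature, std::mt19937& rng)
{
    std::uniform_real_distribution<double> u(0.0, 1.0);
    std::normal_distribution<double> prop(0.0, step);
    for (int y = 0; y < n; ++y)
        for (int x = 0; x < n; ++x) {
            double v  = h[y * n + x] + prop(rng);
            double dE = local_energy(h, n, x, y, v)
                      - local_energy(h, n, x, y, h[y * n + x]);
            if (dE <= 0.0 || u(rng) < std::exp(-dE / temperature))
                h[y * n + x] = v;
        }
}
```

Run for many sweeps ("run in") and the field is approximately a sample from the MRF; at low temperature it simply relaxes towards smoothness.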

My second question would be how to efficiently store the conditional probabilities, and how to generate enough statistics that your estimates of these probabilities would be meaningful. The naive thing to do would be to just use a big table, but this rapidly gets impractical. For instance, consider a stationary 4-connected MRF model for grayscale images whose values are quantized to 256 levels: You'd need, for each of 256^5 = 2^40 possible combinations of a pixel's value together with its neighbors', to store a probability. Apart from the simple fact that you can't even address that much memory on a 32-bit computer, the amount of data you'd have to analyze to estimate probabilities with any confidence would be staggering... so clearly we need to include more assumptions about the PDF than just the Markov property. If we assume certain kinds of conditional independence, we could represent the probabilities by a Bayes net; that might do what we want...

You could also assume a linear model + white noise; then you'd basically be solving a giant least-squares problem; that might be more feasible...
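The linear-model idea in its most stripped-down form: fit h(x, y) ~= a*h(x-1, y) + b*h(x, y-1) + c by least squares via the 3x3 normal equations. All names below are illustrative, and a real model would use a larger neighbourhood:

```cpp
#include <cmath>
#include <vector>

struct LinearModel { double a, b, c; };

// Least-squares fit of h(x,y) ~= a*h(x-1,y) + b*h(x,y-1) + c over an n x n
// map, accumulating the 3x3 normal equations and solving by Cramer's rule.
LinearModel fit_linear_model(const std::vector<double>& h, int n)
{
    double M[3][3] = {{0.0}}, r[3] = {0.0};
    for (int y = 1; y < n; ++y)
        for (int x = 1; x < n; ++x) {
            double f[3] = { h[y * n + x - 1], h[(y - 1) * n + x], 1.0 };
            double t = h[y * n + x];
            for (int i = 0; i < 3; ++i) {
                for (int j = 0; j < 3; ++j) M[i][j] += f[i] * f[j];
                r[i] += f[i] * t;
            }
        }
    auto det3 = [](double m[3][3]) {
        return m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
             - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
             + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]);
    };
    double D = det3(M);
    LinearModel out{};
    double tmp[3][3];
    double* sol[3] = { &out.a, &out.b, &out.c };
    for (int k = 0; k < 3; ++k) {
        for (int i = 0; i < 3; ++i)
            for (int j = 0; j < 3; ++j) tmp[i][j] = (j == k) ? r[i] : M[i][j];
        *sol[k] = det3(tmp) / D;
    }
    return out;
}
```

The fitted coefficients become generator parameters: synthesise new terrain by running the model forward from a random boundary and adding white noise with the residual's variance.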

I'll bet you though that I'm now reinventing a thoroughly investigated wheel. So if you google some of this stuff I strongly suspect you'll find information.

It's also entirely possible I'm leading you down an overcomplicated path though, so keep lots of things in mind!

---
Have you looked at World Machine?

It's a node-based height map generator. Take a look at the examples and also the user community and tutorials. It should at least give you some ideas...
