
Procedural Terrain/World



#1 studentTeacher   Members   -  Reputation: 820


Posted 04 June 2013 - 07:13 PM

Hello!

 

I've been working on an idea for an application that creates procedural terrain with weathering effects (basically, you can speed up the simulation and weather the terrain to shape it with some "history", if you want to). Unfortunately, I've hit a snag in my research -- finding material I can actually learn from!

 

I've started reading Texturing and Modeling: A Procedural Approach by David Ebert et al., but I haven't yet grasped everything I can get from this book. I'm working with voxels that I can translate into triangle meshes. I've learned about using noise and other techniques to create height maps for terrain, but I'm looking for something else -- how to generate specific kinds of terrain.

 

What I mean is that deserts will have thin lines of different sediment on their mountains where past rivers have dug down into the ground, and rocks look different in different biomes. Are there techniques for modeling things in a particular way, or for realistic terrain in general? Does anyone have resources or keywords I could search with? My research is running a little dry :/

 

Thanks! :D

 

BTW: this guy on YouTube: (link here) is sort of what I'm looking for. I've also seen his blog, but I'm still running low on research and leads . . .




#2 JTippetts   Moderators   -  Reputation: 8333


Posted 04 June 2013 - 09:35 PM

Even after all these years, terrain generation is still very much a field of experimentation. The book you mentioned even talks about the difficulty of exploring the procedural space in a directed fashion, due to the sheer unpredictability and complexity of the math involved. Really the only reliable trick is to work with functions a lot and get a feel for what certain functions do and how they will interact.
 
For example:
 
Sedimentary rock can be modeled by assigning a gradient of coloration based on the Y coordinate (or whichever axis is your vertical). You can model local tilt of a chunk of strata using a small scale randomized rotation around an arbitrary horizontal axis, perhaps defined by another set of functions. You can model cracks in rock, or caves, or any other kind of fractures by inverting the output of a ridged fractal function and using it multiplicatively. And so forth.
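Here is a rough sketch (plain Lua, since that's what I script with) of how those three ideas fit together. noise3() is just a cheap stand-in hash, not a proper noise function; swap in whatever gradient or value noise you are already using:

-- Cheap stand-in for real 3D noise; returns a value in roughly [-1, 1].
local function noise3(x, y, z)
  local n = math.sin(x * 12.9898 + y * 78.233 + z * 37.719) * 43758.5453
  return (n - math.floor(n)) * 2.0 - 1.0
end

-- Ridged variant: fold the noise so ridge lines form along its zero crossings.
local function ridged(x, y, z)
  return 1.0 - math.abs(noise3(x, y, z))
end

-- Strata coloration: bands follow a tilted "vertical" coordinate. The shear here is a
-- cheap approximation of a small rotation around a horizontal axis.
local function strataBand(x, y, z, tiltX, tiltZ)
  local layerY = y + tiltX * x + tiltZ * z
  return (math.sin(layerY * 12.0) + 1.0) * 0.5   -- repeating bands in [0, 1]
end

-- Cracks/caves: invert the ridged output and apply it multiplicatively to density,
-- so density collapses toward 0 along the ridge lines.
local function crackedDensity(x, y, z, baseDensity)
  local crack = 1.0 - ridged(x * 4.0, y * 4.0, z * 4.0)
  return baseDensity * crack
end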
 
One trick that I frequently do is to build a program that will randomly generate a chain or tree of functions, then map the output to see what it looks like. Typically, I will script it to generate several hundred permutations and dump PNGs of the result, as well as an XML or YAML file describing the function chain, to browse at my leisure. Whenever I see a pattern I like I will file it away into a library, or open up the YAML descriptor to see exactly what it is doing and maybe figure out how I can tweak it to better accomplish what I have in mind. With a large enough library of these randomized chains, I can usually browse and find something that sort of fits what I'm looking for at any given time.
 
Here is an image I had posted in an earlier blog entry, showing a few of the results of one such set of randomized functions:
 
1vbrtou.jpg

Alternatively, the texturing and modeling book also discusses (briefly) the idea of using some sort of genetic algorithm to construct noise functions, with input from a user (with plenty of time on his hands) to direct the model toward aesthetically pleasing results. This idea seems to have merit, but would involve a great deal of rather complicated coding and a large investment of time.
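As a very rough sketch of the selection/mutation loop at the core of that idea (my own; the candidates here are flat parameter vectors rather than whole function trees, and scoreFn stands in for however the user's ratings get collected):

local function evolve(population, scoreFn, generations, mutationRate)
  for _ = 1, generations do
    -- Rank candidates by the user-assigned score, highest first.
    local scores = {}
    for _, c in ipairs(population) do scores[c] = scoreFn(c) end
    table.sort(population, function(a, b) return scores[a] > scores[b] end)

    -- Keep the top half, refill the bottom half with mutated copies of survivors.
    local half = math.floor(#population / 2)
    for i = half + 1, #population do
      local parent = population[math.random(half)]
      local child = {}
      for k, v in ipairs(parent) do
        child[k] = v + (math.random() - 0.5) * mutationRate
      end
      population[i] = child
    end
  end
  return population[1]   -- the best candidate found so far
end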
 
If you play with noise generation enough, you start to get a feel for things. Be prepared to pour a lot of time into just playing around. The up-side of this is that it's great fun.

Bear in mind that some things are difficult to accomplish using noise functions alone, such as the types of terrain formed by erosion. At this and this, a couple of noise variants are described that utilize the first-order derivative of a particular layer of noise in order to modify the fractal contributions of successive layers. The results are decent approximations of certain types of eroded terrain, but if you want to achieve any sort of realism in terrain generation you're going to have to write code to simulate water flow and erosion.
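To give a feel for the shape such a simulation takes, here is a bare-bones droplet sketch in plain Lua (my own, with made-up constants, not taken from any particular paper). 'height' is a flat array of w*h values; each droplet walks downhill, scrapes material off steep slopes and drops it again where the terrain flattens out:

local function erode(height, w, h, numDroplets)
  local function getH(x, y) return height[y * w + x + 1] end
  local function setH(x, y, v) height[y * w + x + 1] = v end

  for _ = 1, numDroplets do
    local x, y = math.random(1, w - 2), math.random(1, h - 2)
    local sediment, capacity = 0.0, 0.05

    for _ = 1, 64 do                              -- cap each droplet's lifetime
      -- Find the lowest of the four neighbours.
      local bestX, bestY, bestH = x, y, getH(x, y)
      for _, d in ipairs({ {1,0}, {-1,0}, {0,1}, {0,-1} }) do
        local nx, ny = x + d[1], y + d[2]
        if nx >= 0 and nx < w and ny >= 0 and ny < h and getH(nx, ny) < bestH then
          bestX, bestY, bestH = nx, ny, getH(nx, ny)
        end
      end

      local drop = getH(x, y) - bestH
      if drop <= 0 then
        setH(x, y, getH(x, y) + sediment)         -- stuck in a pit: dump everything
        break
      end

      local eroded = math.min(drop * 0.5, capacity - sediment)
      if eroded > 0 then
        setH(x, y, getH(x, y) - eroded)           -- steep: pick up material
        sediment = sediment + eroded
      else
        local deposit = sediment * 0.5            -- over capacity: shed some sediment
        setH(x, y, getH(x, y) + deposit)
        sediment = sediment - deposit
      end

      x, y = bestX, bestY                         -- step downhill
    end
  end
end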

#3 shocobenn   Members   -  Reputation: 273


Posted 05 June 2013 - 06:31 AM

Hello! As said before, you should first generate a heightmap and use Y (the vertical axis) to separate fields vs. mountains vs. water (you can use one big plane for the water).

Then generate a second noise map, but with big cells (e.g. 16x16), and use it to delimit regions (you can see this easily in Minecraft).

You can also use those region zones to populate the world, with one function per zone: the forest gets populated with trees but no cactus, etc.

For the ground shader, you can use one big shader, give it a base color from the zone, and then play with it.

Or use one shader per zone; in the desert, for example, do another Perlin noise pass for little rocks or cracks.

When you do procedural things, inception is your friend!
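A tiny sketch of the "big cells" idea in plain Lua (the hash, the cell size and the region names are just examples of mine, not from any library): every 16x16 block of columns hashes to one region id, and each region id picks which populate function runs there.

local REGION_CELL = 16

-- Deterministic per-cell hash so the same cell always lands in the same region.
local function cellHash(cx, cz, seed)
  local n = math.sin(cx * 127.1 + cz * 311.7 + seed * 74.7) * 43758.5453
  return n - math.floor(n)                  -- value in [0, 1)
end

local function regionAt(x, z, seed)
  local cx = math.floor(x / REGION_CELL)
  local cz = math.floor(z / REGION_CELL)
  local r = cellHash(cx, cz, seed)
  if r < 0.4 then return "plains"
  elseif r < 0.7 then return "forest"
  else return "desert" end
end

-- One populate function per region: trees in the forest but no cactus, etc.
local populate = {
  plains = function(x, z) --[[ grass, flowers ]] end,
  forest = function(x, z) --[[ trees ]] end,
  desert = function(x, z) --[[ cactus, extra noise pass for small rocks/cracks ]] end,
}

-- usage: populate[regionAt(x, z, worldSeed)](x, z)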



#4 studentTeacher   Members   -  Reputation: 820


Posted 05 June 2013 - 08:45 PM

[quotes JTippetts's reply above in full]

This is beyond helpful! You've definitely helped my research pick right back up. I do have one question, though: when you talk about a tree of functions (which I understand conceptually), I have trouble picturing the implementation. Are these functions that you hard-code, with the program choosing a tree-like structure for chaining them together? Or is there something I'm missing here? I apologize if this is a bad question . . .

 

Otherwise, thank you for this monumental post of yours. I've got lots of fun, amazing work ahead of me!

 

EDIT:

Also, have you heard of "generators"? I've heard of people using "sand generators" when creating deserts, among other types. Is this like the simulation you talk about for water flow and erosion, but applied to other things too, like sand?


Edited by studentTeacher, 05 June 2013 - 08:47 PM.


#5 JTippetts   Moderators   -  Reputation: 8333


Posted 05 June 2013 - 10:12 PM

When I talk about a tree of functions, I mean that I have a collection of general black-box function "pieces" that I hook together to create more complex functions. Some examples:
 
This is an image of a gradient function.
 
ypd4R1S.png
 
It is set up like so:
 
y=anl.CYAMLTreeContainer()
 
y:gradient("Grad1", 0,1,0,0,0,0)
 
It is a black-box function that sets up a gradient based on an input line segment. In the above example, the line segment is oriented from (x=0,y=0,z=0)->(x=1,y=0,z=0). Any input coordinate is projected onto this line and assigned a value based on where it projects to. Values that lie less than or equal to the first line segment end-point map to the value of 0, values that lie greater than or equal to the second end-point map to 1, and values in between map to some value on the gradient from 0 to 1. In the above image, I am mapping a 2D range from (x=0,y=0,z=0) to (x=1,y=1,z=0) to get the final image.
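Under the hood it is just a projection onto that segment. A plain-Lua re-implementation of the idea (my own, not ANL's actual code) looks like this:

local function gradient(x1, y1, z1, x2, y2, z2)
  local dx, dy, dz = x2 - x1, y2 - y1, z2 - z1
  local len2 = dx * dx + dy * dy + dz * dz
  return function(x, y, z)
    -- Parametric position of (x,y,z) projected onto the segment, clamped to [0,1].
    local t = ((x - x1) * dx + (y - y1) * dy + (z - z1) * dz) / len2
    if t < 0 then return 0 end
    if t > 1 then return 1 end
    return t
  end
end

-- Same segment as Grad1 above: a left-to-right ramp along X.
local grad1 = gradient(0, 0, 0, 1, 0, 0)
print(grad1(0.25, 0.7, 0))   --> 0.25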
 
Now, I can take the output of the gradient function and use it as the input for another function, say a function to multiply it by 36:
 
y:gradient("Grad1", 0,1,0,0,0,0)
y:math("Math1", anl.MULTIPLY, "Grad1", 36.0)
 
Result:
oRvIgDs.png
 
You can see that the gradient section is now squeezed into a smaller portion of the image. I can take that signal and pass it along to another function; say, a cosine function:
 
y:gradient("Grad1", 0,1,0,0,0,0)
y:math("Math1", anl.MULTIPLY, "Grad1", 36.0)
y:math("Math2", anl.COS, "Math1", 0.0)
 
Vo9mUQm.png

Now, let's get a little bit more complex by setting up a fractal and using it to translate the x input coordinate to the cosine function:
y:gradient("Grad1", 0,1,0,0,0,0)
y:math("Math1", anl.MULTIPLY, "Grad1", 36.0)
y:math("Math2", anl.COS, "Math1", 0.0)

y:fractal("Fractal1", anl.FBM, anl.GRADIENT, anl.QUINTIC, 8, 3)
y:autoCorrect("AutoCorrect1", "Fractal1", -0.25, 0.25)

y:translateDomain("Translate1", "Math2", "AutoCorrect1", 0.0, 0.0)
This function chain sets up a fractal and a translateDomain function, sets the fractal to be the X-axis input, and sets the output of the cosine module to be the source input. The TranslateDomain module has the effect of translating the input coordinate of its source function using the values obtained from the X, Y and Z input functions. In this case, when you call the function with a given (X,Y,Z) coordinate, it first calls the AutoCorrect1 function (a helper attached to Fractal1 that re-maps Fractal1's output into a more useful range, here (-0.25, 0.25)) to obtain the value of the fractal at (X,Y,Z). It then adds this value to the X component of the input coordinate and uses the new coordinate to call Math2, obtaining the value of the cosine function chain. The result looks like the classic turbulence marble effect you often see in beginner texts on procedural textures:

NLLmPQd.png
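Stripped of the ANL plumbing, domain translation is nothing more than calling the source function at a shifted coordinate. A bare sketch of the idea in plain Lua (mine, not ANL code):

local function translateDomainX(source, xOffset)
  return function(x, y, z)
    -- Perturb the X coordinate by the offset function before sampling the source.
    return source(x + xOffset(x, y, z), y, z)
  end
end

-- e.g. marble = translateDomainX(cosineStripes, turbulenceFractal)
-- where cosineStripes/turbulenceFractal stand for the Math2 and AutoCorrect1 chains above.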

Now, this is still outputting values in the range (-1,1) so anything that drops below 0 is clamped to 0, meaning a lot of the texture is black. Let's send the output to another module to correct it to the range (0,1):
y:gradient("Grad1", 0,1,0,0,0,0)
y:math("Math1", anl.MULTIPLY, "Grad1", 36.0)
y:math("Math2", anl.COS, "Math1", 0.0)

y:fractal("Fractal1", anl.FBM, anl.GRADIENT, anl.QUINTIC, 8, 3)
y:autoCorrect("AutoCorrect1", "Fractal1", -0.25, 0.25)

y:translateDomain("Translate1", "Math2", "AutoCorrect1", 0.0, 0.0)

y:scaleOffset("ScaleOffset1", "Translate1", 0.5, 0.5)

result:

WPNe7cg.png

Now, just for giggles, let's duplicate that whole set 3 times, and use all 3 as the Red, Green and Blue components of an RGB color-composing function:
y:gradient("Grad1", 0,1,0,0,0,0)
y:math("Math1", anl.MULTIPLY, "Grad1", 36.0)
y:math("Math2", anl.COS, "Math1", 0.0)

y:fractal("Fractal1", anl.FBM, anl.GRADIENT, anl.QUINTIC, 8, 3)
y:autoCorrect("AutoCorrect1", "Fractal1", -0.25, 0.25)

y:translateDomain("Translate1", "Math2", "AutoCorrect1", 0.0, 0.0)

y:scaleOffset("ScaleOffset1", "Translate1", 0.5, 0.5)

y:gradient("Grad2", 0,1,0,0,0,0)
y:math("Math3", anl.MULTIPLY, "Grad2", 36.0)
y:math("Math4", anl.COS, "Math3", 0.0)

y:fractal("Fractal2", anl.FBM, anl.GRADIENT, anl.QUINTIC, 8, 3)
y:autoCorrect("AutoCorrect2", "Fractal2", -0.25, 0.25)

y:translateDomain("Translate2", "Math4", "AutoCorrect2", 0.0, 0.0)

y:scaleOffset("ScaleOffset2", "Translate2", 0.5, 0.5)


y:gradient("Grad3", 0,1,0,0,0,0)
y:math("Math5", anl.MULTIPLY, "Grad3", 36.0)
y:math("Math6", anl.COS, "Math5", 0.0)

y:fractal("Fractal3", anl.FBM, anl.GRADIENT, anl.QUINTIC, 8, 3)
y:autoCorrect("AutoCorrect3", "Fractal3", -0.25, 0.25)

y:translateDomain("Translate3", "Math6", "AutoCorrect3", 0.0, 0.0)

y:scaleOffset("ScaleOffset3", "Translate3", 0.5, 0.5)

y:rgbaCompositeChannels("RGBA", "ScaleOffset1", "ScaleOffset2", "ScaleOffset3", 1.0, anl.RGB)
Result:

XXII4uf.png

The three sub-chains are identical copies of one another, and differ only in the seed used. But there is no real need to duplicate them. Each of the inputs, Red, Green, Blue and Alpha, used by the rgbaCompositeChannels() function could be completely different functions.

By chaining modular functions together in this manner, you can build complex effects out of simple building blocks.


To build modules, I can either hard-code them as in the above examples, or I can write a Lua script that will recursively generate a randomized tree. It starts at the root and chooses a function randomly from a table of possible functions. Then for each input of the chosen function, additional functions are chosen randomly from a table. This continues on up the tree until I hit some specified depth, at which point some leaf-node functions are randomly selected. These, of course, would be generators such as fractal(), which wouldn't have higher input functions.
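The generator script boils down to something like this (a trimmed-down sketch, not the actual script; the op and fractal-type lists are just examples):

local combiners = { "MULTIPLY", "ADD", "MIN", "MAX", "COS" }
local generators = { "FBM", "RIDGEDMULTI", "BILLOW" }

local nodeCount = 0
local function newName(prefix)
  nodeCount = nodeCount + 1
  return prefix .. nodeCount
end

local function randomTree(depth, maxDepth)
  if depth >= maxDepth then
    -- Leaf node: a generator such as a fractal, which takes no higher input functions.
    return { name = newName("Fractal"), kind = "fractal",
             fractalType = generators[math.random(#generators)] }
  end
  -- Interior node: a combiner whose inputs are themselves random sub-trees.
  return { name = newName("Math"), kind = "math",
           op = combiners[math.random(#combiners)],
           inputs = { randomTree(depth + 1, maxDepth),
                      randomTree(depth + 1, maxDepth) } }
end

local tree = randomTree(0, 4)
-- Walk 'tree' to emit the y:fractal()/y:math() calls, render the result to a PNG,
-- and dump the same structure to a YAML/XML descriptor for later browsing.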

By studying the results of such a randomized tree I can get a feel for how certain functions will behave, and I can store the generated trees away for future reference.

#6 studentTeacher   Members   -  Reputation: 820


Posted 06 June 2013 - 03:41 PM

Thanks, JTippetts, I understand what you're talking about now. Focusing my research, I've found that grammars might be a good way to go for creating content (especially architecture and trees, which are both good examples I've found), and I think I could potentially use grammars to define these noise generators too. That's thinking way ahead, and I'll need to go over my college programming assignments on grammars again, but oh well. I'll focus on noise and representation for now. Again, thanks for your very thorough help! :D
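The core of the grammar idea is just repeated rewriting. A tiny L-system style sketch in plain Lua (toy rules of my own, just to remind myself how it works):

local rules = {
  F = "FF",              -- trunk segments lengthen
  X = "F[+X][-X]FX",     -- each X sprouts a left and a right branch
}

local function expand(axiom, passes)
  local s = axiom
  for _ = 1, passes do
    s = s:gsub("%a", function(c) return rules[c] or c end)
  end
  return s
end

print(expand("X", 3))
-- Interpret the string with a turtle ('F' = draw forward, '+'/'-' = turn,
-- '[' / ']' = push/pop state) to get a branching tree; the same rewriting trick
-- could decide which noise modules get chained together.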





