Procedural Texture Generation

Recommended Posts

Hello, I am trying to do procedural planet generation. I want to generate a texture that covers the planet using Perlin noise, and then make this texture more detailed as the player gets closer to the surface. Eventually it will render voxels instead of this texture. I am a little confused about how to do this.

First of all, I feel like there is a bit of interesting math involved in getting the right coordinates for the noise. For instance, if I am in space viewing the planet and generating a texture to cover it, I am sampling the surface coordinates at a very coarse level: every pixel in the texture would map to points far apart on the surface (maybe 1 mile apart). But if I am close to the surface, then 1 pixel of the texture could cover only a couple hundred feet. What would the math be to do this? Am I thinking about this correctly?
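For concreteness, the zoom arithmetic being described can be sketched like this. All the numbers here are made up for illustration (an Earth-sized radius and a 2048-texel texture are assumptions, not values from any actual engine):

```python
import math

# Hypothetical numbers for illustration only.
PLANET_RADIUS_MILES = 3959.0
circumference = 2 * math.pi * PLANET_RADIUS_MILES

def miles_per_texel(covered_miles, texture_width):
    """Ground distance between adjacent noise samples (texels)."""
    return covered_miles / texture_width

# Far away: one 2048-texel texture wraps the whole equator,
# so adjacent samples are roughly 12 miles apart.
far = miles_per_texel(circumference, 2048)

# Close up: the same texture size covers only a 10-mile patch,
# so adjacent samples are about 26 feet apart.
near = miles_per_texel(10.0, 2048)
```

The sample spacing is just (ground distance covered by the texture) divided by (texels across the texture), so zooming in means shrinking the covered distance while keeping the resolution fixed.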

I know I am probably not conveying this very well, but any help or ideas on how to do this would be really appreciated. Thanks.

[quote]First of all, I feel like there is a bit of interesting math involved in getting the right coordinates for the noise. For instance, if I am in space viewing the planet and generating a texture to cover it, I am sampling the surface coordinates at a very coarse level: every pixel in the texture would map to points far apart on the surface (maybe 1 mile apart). But if I am close to the surface, then 1 pixel of the texture could cover only a couple hundred feet. What would the math be to do this? Am I thinking about this correctly?[/quote]
Normally this is handled implicitly by the quantization of your planet into a raster image. At a distance, one pixel may cover an effective surface area of several square kilometers, whereas when you are standing on the surface, one pixel may represent only a square centimeter. Since you work on pixels, this problem is effectively solved by projecting your planet from world space into screen space (projecting the 3D representation of your planet onto the 2D screen). This indirectly gives you the correct 3D coordinates for each pixel of the screen.

Usually this is done with a perspective projection matrix (combined with a view matrix, in general), which you multiply the 3D coordinates of each point (or rather, vertex) making up your planet by.
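A rough sketch of that projection step, assuming an OpenGL-style convention with the camera at the origin looking down -z (the view matrix is omitted for brevity, and `perspective` and `project` are hypothetical helper names, not a library API):

```python
import math

def perspective(fov_y_deg, aspect, near, far):
    """OpenGL-style perspective projection matrix (row-major)."""
    f = 1.0 / math.tan(math.radians(fov_y_deg) / 2.0)
    return [
        [f / aspect, 0.0, 0.0, 0.0],
        [0.0, f, 0.0, 0.0],
        [0.0, 0.0, (far + near) / (near - far), 2 * far * near / (near - far)],
        [0.0, 0.0, -1.0, 0.0],
    ]

def project(m, p):
    """Multiply a 3D point (as a homogeneous column vector) by m,
    then perform the perspective divide to get normalized device coords."""
    x, y, z = p
    v = [sum(m[r][c] * [x, y, z, 1.0][c] for c in range(4)) for r in range(4)]
    w = v[3]
    return (v[0] / w, v[1] / w, v[2] / w)

# A vertex on the camera axis, 10 units in front of the camera,
# projects to the center of the screen in x/y.
m = perspective(60.0, 16 / 9, 0.1, 1000.0)
ndc = project(m, (0.0, 0.0, -10.0))
```

Going the other direction (pixel back to a point on the planet) is the inverse of the same pipeline: unproject the pixel's NDC coordinates and intersect the resulting ray with the sphere.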

You don't need to do anything special with regard to the function itself; just use a 3D Perlin noise fractal function, and ensure that the fractal is composed of enough layers, or octaves, to produce the desired detail at the highest level of zoom.
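A minimal sketch of such a fractal (fBm) function. Hash-based value noise is used here as an illustrative stand-in for true gradient (Perlin) noise, since it is short enough to show whole; the octave loop is structured the same way either way, and all names are hypothetical:

```python
import math

def _hash(x, y, z):
    """Cheap integer hash -> pseudo-random value in [0, 1).
    A stand-in for Perlin's gradient lattice, for illustration only."""
    h = (x * 374761393 + y * 668265263 + z * 2147483647) & 0xFFFFFFFF
    h = ((h ^ (h >> 13)) * 1274126177) & 0xFFFFFFFF
    return (h ^ (h >> 16)) / 0xFFFFFFFF

def _smooth(t):
    return t * t * (3 - 2 * t)  # smoothstep fade curve

def value_noise(x, y, z):
    """Trilinearly interpolated lattice noise in [0, 1)."""
    xi, yi, zi = math.floor(x), math.floor(y), math.floor(z)
    tx, ty, tz = _smooth(x - xi), _smooth(y - yi), _smooth(z - zi)
    def lerp(a, b, t):
        return a + (b - a) * t
    # Hash the 8 corners of the lattice cell containing the point.
    c = [[[_hash(xi + i, yi + j, zi + k) for k in (0, 1)]
          for j in (0, 1)] for i in (0, 1)]
    return lerp(
        lerp(lerp(c[0][0][0], c[0][0][1], tz),
             lerp(c[0][1][0], c[0][1][1], tz), ty),
        lerp(lerp(c[1][0][0], c[1][0][1], tz),
             lerp(c[1][1][0], c[1][1][1], tz), ty),
        tx)

def fbm(x, y, z, octaves=6, lacunarity=2.0, gain=0.5):
    """Fractal sum: each octave doubles frequency and halves amplitude,
    so more octaves add finer detail for close-up zoom levels."""
    total, amp, freq, norm = 0.0, 1.0, 1.0, 0.0
    for _ in range(octaves):
        total += amp * value_noise(x * freq, y * freq, z * freq)
        norm += amp
        amp *= gain
        freq *= lacunarity
    return total / norm  # normalized back to roughly [0, 1)
```

Because the function is deterministic in its 3D input, any point on the sphere can be re-sampled at any zoom level and always produce the same terrain.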

Each point on the surface of a sphere has an associated 3D coordinate, which is used to index the Perlin function. At low levels of zoom, i.e. from outer space, the sampling granularity is coarser, meaning the samples are taken quite far apart, and each pixel covers quite a large area. In cases like this, you need to perform filtering or anti-aliasing in order for the picture to not "sparkle". You can anti-alias a Perlin function by sampling an area of points, and performing a weighted blend of them to get a final pixel value that is more representative of the area than a single point sample would be.
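That area-averaging idea can be sketched as follows, assuming the texel's footprint lies in the noise function's x/y plane (a simplification) and that `noise_fn` is any 3D noise callable; the name `supersample` and the uniform box-filter weighting are illustrative choices, not the only option:

```python
def supersample(noise_fn, x, y, z, footprint, n=3):
    """Average an n*n grid of sub-samples spread across one texel's
    footprint, approximating a box filter over the area the pixel
    covers instead of taking a single point sample."""
    total = 0.0
    for i in range(n):
        for j in range(n):
            # Offsets in [-0.5, 0.5) spread sub-samples evenly
            # across the footprint, centered on (x, y).
            ox = (i + 0.5) / n - 0.5
            oy = (j + 0.5) / n - 0.5
            total += noise_fn(x + ox * footprint, y + oy * footprint, z)
    return total / (n * n)

# Usage: pass in whatever noise function you have, e.g.
# supersample(my_fractal_noise, x, y, z, footprint=miles_per_texel)
```

The footprint should match the ground distance one texel covers at the current zoom level, so distant views blend over large areas while close-up views converge back toward a point sample.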
