
## Seamless Noise

Posted by JTippetts, 08 May 2010 · 26,774 views

I've been playing with various types of noise generation for quite a while, and I use generated noise as an integral part of many of my various workflows: texture creation, level creation, etc... It's great stuff. But there has been something that has bugged me for a while now, and that is the algorithm I have been using to make seamless, tiling noise.

The simplest form of seamless fractal noise is generated by making the domains of the underlying grid of the basis functions forming each octave of the function wrap around. However, this method imposes some restrictions on the function, and is no good for the sort of work I do. My noise library provides the ability for any arbitrary noise function to act as an octave for yet another noise function, and noise functions are composited from a fairly large library of operations, including operations that transform the domain, either via rotation, scaling or translation. It is the rotation that particularly plays hell with the simple brand of tiling, since it effectively removes a noise function from "the grid", making the brain-dead domain looping technique ineffective. Quite simply, I need general-case looping that can work on any function.
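For reference, that "brain-dead" domain-looping technique can be sketched as follows. This is a minimal Python illustration (the post's own snippets are Lua-flavored), using a simple value-noise basis rather than anything from my library: the lattice indices are simply wrapped modulo the tile period, so the function tiles by construction.

```python
import math
import random

# Sketch of the simple seamless method: wrap the lattice coordinates of a
# value-noise basis modulo the tile period. Works only for functions that
# live on a grid -- exactly the limitation discussed above.
def make_periodic_value_noise(period, seed=0):
    rng = random.Random(seed)
    lattice = [[rng.random() for _ in range(period)] for _ in range(period)]

    def smooth(u):  # smoothstep interpolant
        return u * u * (3 - 2 * u)

    def noise(x, y):
        x0, y0 = int(math.floor(x)), int(math.floor(y))
        fx, fy = x - x0, y - y0
        sx, sy = smooth(fx), smooth(fy)

        def v(ix, iy):
            # wrapping the indices makes the function tile with the period
            return lattice[iy % period][ix % period]

        top = v(x0, y0) * (1 - sx) + v(x0 + 1, y0) * sx
        bot = v(x0, y0 + 1) * (1 - sx) + v(x0 + 1, y0 + 1) * sx
        return top * (1 - sy) + bot * sy

    return noise
```

Because the wrap happens at the lattice level, any domain transformation applied afterward (a rotation, say) moves samples off the wrapped grid and breaks the tiling, which is precisely the problem.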

Like many people, I have been using this basic formula for creating seamless noise:

```
F_tile(x, y, w, h) = (
    F(x, y)         * (w - x) * (h - y) +
    F(x - w, y)     * x       * (h - y) +
    F(x - w, y - h) * x       * y       +
    F(x, y - h)     * (w - x) * y
) / (w * h)
```

And the code I would use to map a buffer of seamless noise looked something like this:

```
for x = 0, bufferwidth - 1, 1 do
    for y = 0, bufferheight - 1, 1 do
        local s = x / bufferwidth
        local t = y / bufferheight

        -- map buffer coordinates into the noise domain [x1,x2] x [y1,y2]
        local nx = x1 + s * (x2 - x1)
        local ny = y1 + t * (y2 - y1)

        buffer:set(x, y, F_tile(nx, ny, x2 - x1, y2 - y1))
    end
end
```

Essentially, this function samples the noise function 4 times from 4 different locations, and mixes the 4 samples together, weighted by their proximity to edges, in such a way that effectively 4 regions of noise are blended into 1. It works okay, the edges match up well and the function remains continuous; however, there are artifacts of the blending process that can be quite annoying. Due to the way the weighting is performed, the samples near the center of the region are a more even mix of the 4 than the samples near the edges of the region. The result is that the values in the center take on a "muddled", lower-contrast appearance compared to the edges. This is bad, since it introduces low-frequency artifacts that make the grid obvious. As an example, here is a basic simplex fractal with bump-map:

and an equivalent seamless version with bump-map, created with this "old" method using my Accidental Noise Library functions:

As you can see, there are artifacts on the edges of the seamless map that make an appearance, and can spoil any efforts to hide the grid when using this seamless method to create assets, especially assets such as ground tiles that may show up in your game covering large areas and repeating many times.
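The muddling can be seen directly in the blend weights of F_tile. The following Python sketch (assuming the formula above, normalized by w*h) shows that at the tile center each of the four samples contributes exactly 1/4, so the four regions average together there, while at a corner a single sample dominates completely:

```python
# Sketch: the four normalized blend weights from the F_tile formula above.
# At the tile center every sample contributes equally (1/4 each), which is
# the source of the low-contrast "muddled" middle region.
def blend_weights(x, y, w, h):
    return (
        (w - x) * (h - y) / (w * h),  # weight of F(x, y)
        x * (h - y) / (w * h),        # weight of F(x - w, y)
        x * y / (w * h),              # weight of F(x - w, y - h)
        (w - x) * y / (w * h),        # weight of F(x, y - h)
    )
```

The weights always sum to 1, so the result stays in range; what changes across the tile is how evenly the mix is split, and that evenness is what flattens the contrast in the middle.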

I have tried a number of schemes for reducing this so-called "muddling", most with only mixed success. I have tried masked blending with a non-seamless source, and while this does somewhat reduce the artifacts, it does not really eliminate them, and in large-scale repeating textures, patterns continue to show themselves. I have also tried generating 2 seamless buffers, offsetting one by 1/2 in each dimension, then blending them together; again, this method has mixed results, and doesn't really alleviate the issue.

So today, I was constructing some dirt tiles and cursing as I attempted to edit out the artifacts, when I had an idea. I've experimented before with mapping a torus of 3D noise onto a 2D plane, generating the seamless noise implicitly as a domain transformation, rather than as a blend of 4 regions, but this produced very bad results, as the distortion from the mapping was extreme. But as I was working today, I started to think about 4D noise. 4D noise is often seen as 3D noise animated over "time", but that is just an abstraction that helps you to think about it. In reality, it is solid, static noise extending in 4 orthogonal dimensions; the "time" aspect is just a convenience. As I was thinking, I struck upon an idea: what if I attempted to map a 4D function onto a 2D plane? Couldn't I work it so that the distortion introduced along each of the 4 axes cancelled each other out?

Now, of course, 4D noise is quite a bit more complex than 3D noise, but I just happened to have a 4D version of simplex noise available to test out my ideas. You can read the paper Simplex Noise De-Mystified for a good introduction to simplex noise, and a ready-made implementation of 2, 3 and 4 dimensional versions of it. Of course, my version is a bit different, having written it for my Accidental Noise Library some time back, but it works just the same. At any rate, I cobbled together some tests.

The idea I had was this: imagine you are trawling along the X-axis of a 2D seamless noise fractal image. At position 0, you are sitting on a certain value. As you progress along the axis, the values change, but as you near the farthest X position, the values start to circle back around so that once you hit furthest X, you are sitting on the same value as at 0. So I imagined this track to be a big circle in some dimensional space. The same happens along the Y axis. Another big, theoretical circle. However, if you attempt to map just 2 axes to circular paths like this, and sample the resultant function to a 2D buffer, some pretty heady distortion occurs. For example, take this image:

For this, I constructed the mapping function as:

```
for x = 0, bufferwidth - 1, 1 do
    for y = 0, bufferheight - 1, 1 do
        local s = x / bufferwidth
        local t = y / bufferheight

        -- one trig axis each: this mapping distorts, as explained below
        local nx = cos(s * 2 * pi)
        local ny = sin(t * 2 * pi)

        buffer:set(x, y, Noise(nx, ny))
    end
end
```

Now, if you look at that image, you can see the distortion, and if you study the function you can understand why it distorts. It's sort of like tossing a ball into the air: you throw the ball up, it rapidly decelerates until it reverses direction, then it comes back down. In just this fashion, the value of nx = cos(s*2*pi) starts at 1, sweeps down to -1 at the midpoint, then retraces back to 1. (Of course, a truer abstraction would be if the ball were at the center of the earth, oscillating an equal distance up and down, but hey...) This means that the domain is mirrored about the center of the image, and the non-linear progression of nx mapped to the linear progression of x results in a distortion of the function space.
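That deceleration can be quantified with a small Python sketch (a numerical check, not part of the original code): equal steps along the buffer's s axis produce very unequal steps in nx = cos(s*2*pi), tiny near the turnaround at s = 0 and largest near s = 0.25.

```python
import math

# Sketch: how far nx = cos(s*2*pi) moves for a fixed step ds in s, at
# different positions along the axis. Near the turnaround (s = 0) the value
# barely moves; at mid-sweep (s = 0.25) it moves fastest. This nonuniform
# sweep produces the mirrored, stretched look in the image.
def step_size(s, ds=0.01):
    return abs(math.cos((s + ds) * 2 * math.pi) - math.cos(s * 2 * math.pi))

slow = step_size(0.0)   # at the turnaround
fast = step_size(0.25)  # at mid-sweep
```

The ratio between the two step sizes is large, which is exactly the stretching visible near the mirrored edges.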

Now, extend the same idea into 2 dimensions, where instead of metaphorically tossing a ball into the air to have it fall back down, you instead are spinning the ball in a full circle around some center position. The addition of a second axis of sinusoidal movement counteracts the "distortion" and forces the ball along a circular path rather than forcing it to retrace its steps back to its origin.

So, since we are tossing balls along 2 separate axes, we need 2 additional axes orthogonal to the first 2 to provide the necessary "second dimension" that gives the balls' paths their depth. A 3D noise function, of course, doesn't have enough axes, but a 4D function does. To test my idea, I constructed a mapping function as so:

```
for x = 0, bufferwidth - 1, 1 do
    for y = 0, bufferheight - 1, 1 do
        local s = x / bufferwidth
        local t = y / bufferheight
        local dx = x2 - x1
        local dy = y2 - y1

        -- x traces a circle in the nx/nz plane, y a circle in the ny/nw plane
        local nx = x1 + cos(s * 2 * pi) * dx / (2 * pi)
        local ny = y1 + cos(t * 2 * pi) * dy / (2 * pi)
        local nz = x1 + sin(s * 2 * pi) * dx / (2 * pi)
        local nw = y1 + sin(t * 2 * pi) * dy / (2 * pi)

        buffer:set(x, y, Noise4D(nx, ny, nz, nw))
    end
end
```

The idea here is that the X-axis progression of the 2D buffer describes a circle in 2D space defined by the values of nx and nz, and the Y-axis progression describes a circle in 2D space defined by the values of ny and nw. The 2D spaces of these circles are perpendicular to one another, completing the 4D space. The domains loop around in unbroken circles back to their starting points, and truly seamless, continuous noise is mapped as a result. It can be a little tricky to visualize it, but it works in practical use:
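The seamlessness comes from the mapping alone, not from the noise, so it can be checked with any smooth 4D function. Here is a Python sketch of the same mapping (toy_noise4d is a stand-in I made up for demonstration, not real simplex noise): s = 0 and s = 1 land on identical 4D coordinates, so the left and right edges of the buffer match exactly, and likewise for t.

```python
import math

# Stand-in smooth 4D function; any 4D function works for the demonstration.
def toy_noise4d(x, y, z, w):
    return math.sin(1.7 * x + 2.3 * y) * math.cos(0.9 * z - 1.1 * w)

# Python sketch of the 4D mapping above: the X progression traces a circle
# in the nx/nz plane, the Y progression a circle in the ny/nw plane.
def map_seamless(s, t, x1, y1, x2, y2, noise4d=toy_noise4d):
    dx, dy = x2 - x1, y2 - y1
    nx = x1 + math.cos(s * 2 * math.pi) * dx / (2 * math.pi)
    ny = y1 + math.cos(t * 2 * math.pi) * dy / (2 * math.pi)
    nz = x1 + math.sin(s * 2 * math.pi) * dx / (2 * math.pi)
    nw = y1 + math.sin(t * 2 * math.pi) * dy / (2 * math.pi)
    return noise4d(nx, ny, nz, nw)
```

Because cos and sin return to their starting values after a full period, the wrap holds for every row and column, with no blending step involved.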

You can see that the character of the maps remains uniform and consistent throughout the images, with none of the middle-region averaging displayed by the earlier method. And of course, the ability to apply this algorithm to any general composite of noise functions makes it much more desirable than the simple method of wrapping the octave basis functions. There are drawbacks to this technique, however. The mapping of a curved surface in 4D space to a flat surface in 2D space effects subtle changes in the overall character of the function, and thus functions with regular pattern or grid-like elements to their nature do not work well with this method. Of course, they tend not to work well with the other method, either. As for the general case, I can work around the changes to the function's character as long as that character remains consistent across the entire map.

Of course, the addition of a general 4th dimension throws my entire library out of whack, since I short-sightedly coded most of the modules for 3D functions only, but all in all, the improvement of 4D seamless noise over that currently produced by the library makes the endeavour worthwhile, I think.

EDIT:
Here is an instructional image of a single-octave cellular function with coefficients F1=-1 F2=1. On the left, the "old" way of multi-sampling and blending, and on the right the new way as described above:

In the left image, you can very clearly see the muddling/mixing going on in the middle. On the right, you can see how the strange mapping subtly alters the character of the function; nevertheless, the function is crisp and clean. It also computes much faster; my cellular function is brain-dead and fairly heavy-weight, so the multi-sampling approach is slow and clunky.

I admit I didn't quite understand your analogy, but surely a spherical mapping would work, something like:

```
for x = 0, bufferwidth - 1, 1 do
    for y = 0, bufferheight - 1, 1 do
        local s = x / bufferwidth
        local t = y / bufferheight

        local nx = cos(s * 2 * pi) * sin(t * 2 * pi)
        local ny = sin(s * 2 * pi) * sin(t * 2 * pi)
        local nz = cos(t * 2 * pi)

        buffer:set(x, y, Noise3D(nx, ny, nz))
    end
end
```
Nope, spherical mapping doesn't work like that. Unwrapping a 3D sphere onto a 2D plane produces singularities at the poles; the closer to the pole you get, the greater the distortion. This is a simple fBm fractal unwrapped using your spherical mapping:

See the areas where it's "spread out"? That's great if you intend to re-map the image back onto a sphere in 3D space, since the distortion disappears, but the bulk of my work is done in 2D, so I need seamless unwrappings that don't distort in 2D space.
Let me see if I can try to explain it a bit better. Take the case where you want to generate a 1-dimensional buffer of looping noise. You could do the blending method, sampling 2 regions along a 1-dimensional noise function and blending them, or you could start with a 2D function and trace a circle somewhere through it, mapping values from the rim of the circle back into your 1D buffer of noise as you go. By the time you get to the end of the buffer, you've returned to your starting spot on the circle, so the noise loops. So with this method, to create 1D looping noise you need a 2D function, since you need 2 axes to describe the circular path.
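The 1D case is small enough to sketch in a few lines of Python (toy_noise2d is a made-up stand-in for any 2D noise function): the buffer samples the rim of a circle, and the seam closes because s = 0 and s = 1 are the same point on that circle.

```python
import math

# Stand-in smooth 2D function, in place of a real 2D noise function.
def toy_noise2d(x, y):
    return math.sin(3.1 * x) * math.cos(2.2 * y)

# Sketch of the 1D case: trace a circle through a 2D function and read the
# rim values into a 1D buffer. The buffer loops because s = 0 and s = 1
# land on the same point of the circle.
def loop_1d(length, radius=1.0, cx=0.0, cy=0.0, noise2d=toy_noise2d):
    buf = []
    for i in range(length):
        s = i / length
        nx = cx + math.cos(s * 2 * math.pi) * radius
        ny = cy + math.sin(s * 2 * math.pi) * radius
        buf.append(noise2d(nx, ny))
    return buf
```

Note that the buffer covers s in [0, 1), so the seam sits between the last sample and the first; both come from adjacent points on the same circle, which is why no blend is needed.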

Extending it to 2D looping noise, you would need to trace 2 circles through a 4D function, one circle mapping to the X-axis of the 2D buffer, and one mapping to the Y axis of the 2D buffer. Imagine the X-axis inscribing a "circle" along the x/z plane in 4D space, and the Y-axis inscribing a "circle" along the y/w plane, and the resulting 4-tuple being used to evaluate the 4D noise function. Since there is no distortion introduced from mapping a 2D circle to a 1-D line, correspondingly there is no distortion introduced from mapping our 4D circles onto 2D planes.

To take it even further, if you wanted to animate this seamless 2D mapping over time, just as seamlessly and without distortion along the time-frame axis, you could introduce 2 more axes to the function, and inscribe yet another "circle" along the plane formed by those two axes. (6 dimensional noise. Yummy!)
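The coordinate mapping for that 6-dimensional case can be sketched as below (only the mapping; a real 6D noise function would be evaluated at these coordinates, and the function and axis names here are my own invention for illustration):

```python
import math

# Sketch: coordinates for seamlessly looping *animated* 2D noise. A third
# circle, traced in two extra axes as time advances, makes the animation
# loop by the same trick as the spatial axes.
def loop_coords_6d(s, t, time, scale=1.0):
    r = scale / (2 * math.pi)
    return (
        math.cos(s * 2 * math.pi) * r,    # X circle...
        math.sin(s * 2 * math.pi) * r,    # ...in axes 1 and 2
        math.cos(t * 2 * math.pi) * r,    # Y circle...
        math.sin(t * 2 * math.pi) * r,    # ...in axes 3 and 4
        math.cos(time * 2 * math.pi) * r, # time circle...
        math.sin(time * 2 * math.pi) * r, # ...in axes 5 and 6
    )
```

At time = 0 and time = 1 the coordinates coincide, so the animation wraps frame-for-frame just as the spatial tiling wraps edge-for-edge.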

Granted, for a lot of applications, this might be overkill, but I do a lot of creation of procedural textures such as stone, dirt, grass, etc... that map onto flat planes, and I've always been bugged by having obvious distortion and anomalies in my seamless noise functions.
Wow. This is really cool. Shame libnoise only does 3D noise.
That is ridiculously clever. For my generation (because I'm not doing anything as complex as you), I went with making the octaves line up and loop, themselves. This is much better for anything more complex than that.

Thanks for this!
Hi JTippetts,

This trick is awesome because it completely solves the problem.
Thank you for explaining it.

Kindest regards

I may be missing something obvious, but where do x1,x2,y1,y2 come from?

Whatever you specify them to be. I commonly use x1=-1,y1=-1, x2=1,y2=1 or x1=0,y1=0,x2=1,y2=1 in my work.

I had some trouble understanding your explanation. I am pretty sure you are mapping a clifford torus (http://en.wikipedia.org/wiki/Clifford_torus) to a square tile. Is this correct?

Yep, that's pretty much it.

Another question about x1, x2, y1, y2. It's one thing to know possible values for these, but another to know their function! Can you shed some light on what exactly these parameters do?

bull_dog: it looks to me like these parameters define the region of the 4D space within which the circles are drawn. In other words, they define where in the 4D space the samples should be taken (X1) and over how large an extent (X2-X1). Since this type of noise is deterministically pseudo-random, i.e. it always comes out exactly the same given the same input parameters, you have to alter (X1, Y1) to get different chunks of noise. You'd pick (X2, Y2) to be (X1, Y1) plus however much of the 4D space you want to consume. If you grab a lot of it, it's going to be more chaotic. If you grab a small portion, it's going to be smoother.

It seems to me that really these parameters could be (X1, Z1, Radius1) and (Y1, W1, Radius2), but I admit I can't think of an example where this would be all that useful.
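That (center, radius) parameterization can be sketched directly (a hypothetical variant, not something from the library): each axis circle gets an explicit center and radius, and the seam still closes for any choice of either, because a full period of cos/sin always returns to its start.

```python
import math

# Sketch of the (center, radius) parameterization suggested above: the
# circle traced for one buffer axis, with an explicit center (cx, cz) and
# radius. The seam closes regardless of the parameter values.
def x_circle(s, cx, cz, radius):
    return (cx + math.cos(s * 2 * math.pi) * radius,
            cz + math.sin(s * 2 * math.pi) * radius)
```

The original dx/(2*pi) factor is just a particular radius choice (one that makes the circle's circumference equal to the domain extent dx), which is why the two parameterizations are equivalent.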

Thanks JTippetts for this post!

The code in this post is pretty hard to understand. It's undocumented and unexplained. It should be easier to grasp in a running shader.