You might be able to find more by googling for "n-dimensional simplex noise". The simplest variant to convert to higher dimensions is the original Perlin noise, since it simply interpolates the corners of an N-dimensional hypercube. Conceptually it's simple, but the cost grows exponentially with dimension: each sample must interpolate 2^N corners. You can decompose the recursive structure of standard Perlin noise into polynomial form, but the higher orders get pretty sticky. I talk about my efforts to write polynomial versions of 6D noise at http://www.gamedev.net/blog/33/entry-2254250-derivative-noise/
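To make the exponential growth concrete, here is a minimal sketch of lattice noise generalized to N dimensions: it loops over all 2^N corners of the surrounding hypercube and blends them with smoothed weights. The `hash01` lattice hash is a hypothetical stand-in for Perlin's gradient table, so this is value noise rather than true gradient noise, but the corner-interpolation structure is the same.

```python
import math

def hash01(cell):
    # Hypothetical integer-lattice hash in [0, 1]; stands in for a
    # real gradient/permutation table.
    h = 0
    for c in cell:
        h = (h * 374761393 + c * 668265263) & 0xFFFFFFFF
        h = ((h ^ (h >> 13)) * 1274126177) & 0xFFFFFFFF
    return (h & 0xFFFFFF) / 0xFFFFFF

def smooth(t):
    # Smoothstep fade curve.
    return t * t * (3.0 - 2.0 * t)

def value_noise(point):
    """N-dimensional lattice noise: interpolate all 2^N hypercube corners."""
    n = len(point)
    base = [math.floor(p) for p in point]
    frac = [p - b for p, b in zip(point, base)]
    total = 0.0
    for corner in range(1 << n):          # 2^N corners -- exponential in N
        weight = 1.0
        cell = []
        for d in range(n):
            bit = (corner >> d) & 1
            cell.append(base[d] + bit)
            t = smooth(frac[d])
            weight *= t if bit else (1.0 - t)
        total += weight * hash01(tuple(cell))
    return total
```

At N=6 that inner loop already touches 64 corners per sample, which is why the polynomial decomposition gets sticky.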

The code posted by vinterberg demonstrates the blending method, which is mathematically the simplest approach. For basic cloud noise it's *probably* suitable. However, for high-contrast functions, or functions with strongly defined patterns, it can produce blending artifacts, as this image shows:
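Since vinterberg's code isn't reproduced here, this is a sketch of the blending idea: sample four copies of an ordinary (non-tiling) noise function at offsets of one tile width/height and mix them with bilinear weights so opposite edges match. The `noise2` function is a hypothetical placeholder for any real 2D noise.

```python
import math

def noise2(x, y):
    # Hypothetical placeholder for a real (non-tiling) 2D noise function.
    return math.sin(x * 1.7 + y * 2.3) * 0.5 + 0.5

def blended_tileable(x, y, w, h):
    """Blend four shifted noise samples so the result tiles on a w-by-h region.

    The weights are bilinear in (x, y), so at each edge the sample shifted
    across that edge takes over completely, forcing the seams to match."""
    return (noise2(x,     y)     * (w - x) * (h - y)
          + noise2(x - w, y)     * x       * (h - y)
          + noise2(x - w, y - h) * x       * y
          + noise2(x,     y - h) * (w - x) * y) / (w * h)
```

Note that near the tile center all four samples carry roughly equal weight, which is exactly where the contrast-flattening artifact described below comes from.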

The variant on the left uses the blending scheme: four areas of 2D cellular noise are generated, then blended together to create the seamless pattern. At the edges the pattern achieves a clarity the center lacks, because the center is a near-equal blend of four different noise sets while the edges are weighted strongly toward a single one. Expanded out across a grid, the pattern becomes unmistakable:

The one on the right uses 4-dimensional noise with the Clifford torus mapping to generate the seamless pattern. While the curvature of the domain space shows in the curvy distortion of the shapes (in standard 2D cellular noise these shapes are convex polygons with straight edges), the overall contrast of the pattern is preserved throughout the image, as is apparent when it is tiled:
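For reference, the Clifford torus mapping itself is short: wrap each 2D axis onto its own circle in 4D, then sample 4D noise at the resulting point. Because each circle closes on itself, the output tiles in both directions by construction, with no blending pass. The `noise4` function here is a hypothetical stand-in for real 4D gradient or cellular noise.

```python
import math

def noise4(x, y, z, w):
    # Hypothetical placeholder for real 4D noise (gradient, cellular, etc.).
    return math.sin(x * 1.7 + y * 2.3 + z * 3.1 + w * 4.1) * 0.5 + 0.5

def torus_tileable(u, v, scale=1.0):
    """Sample (u, v) in [0,1)^2 via the Clifford torus embedded in 4D.

    u and v each parameterize a circle, so u=0 and u=1 map to the same
    4D point (likewise for v): the result is seamless by construction."""
    r = scale / (2.0 * math.pi)  # keep 4D arc length comparable to 2D distance
    a = 2.0 * math.pi * u
    b = 2.0 * math.pi * v
    return noise4(r * math.cos(a), r * math.sin(a),
                  r * math.cos(b), r * math.sin(b))
```

The curvature artifact mentioned above comes from this mapping: straight lines in (u, v) become arcs on the torus, so cell edges bow slightly, but every sample is a single full-contrast noise evaluation rather than a blend.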