Started 20 Mar 2002 - 01:43 AM

138 replies to this topic

Posted 28 May 2002 - 08:11 AM

How should the sun be blended with the clouds? I can't come up with a way of doing it that's really pleasing to the eye.

Posted 28 May 2002 - 09:29 AM

To render shafts of light, use a trapezoid with the top, narrow edge across the disk of the sun, and the wide bottom edge on the ground. Use a 1-dimensional texture going across the top edge; this texture stretched across the trapezoid will give stripes of brightness and darkness that start narrow (at the sun) and widen realistically as they approach the ground (the base of the trapezoid).

When you have clouds, you should base the 1-dimensional texture on the profile of clouds crossing the sun. So, if a cloud is covering the right half of the sun, then the right half of the texture should be black (or have alpha = 0), the left half should be bright, and so a shaft of light leaves the left half of the sun.

This is the basic simple idea. I imagine getting the shafts and the colors just right requires playing around a little. (For example, if there are no clouds crossing the sun, then you probably don't want any visible shafts, so set the 1-d texture to have alpha = 0 everywhere.)

The cloudbox idea is an optimization for reducing the size of textures that need to be sent to the graphics card (to save bandwidth). It isn't really necessary until you get the other parts working. Also, if you are generating the noise on the graphics card itself (as Yann seems to be working towards), I'm not sure if it helps you at all. My understanding is that basically, instead of a very large rectangle across the sky with a high-resolution (e.g., 2048x2048) cloud texture on it, you draw a cube surrounding the camera (of course you can cull faces which aren't visible) which has the cloud texture projected onto it from the large rectangle. This is exactly the same as if you are trying to make a cube environment map. The code I posted above does this projection for the front face of the cube.

   --      <-- Sun at top of trapezoid
  /  \
 /    \
------     <-- Ground at bottom of trapezoid
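A minimal sketch of the 1-D texture idea (texture width and function name are my assumptions, not from the thread): sample the cloud alpha along the line where the cloud layer crosses the sun disc, and invert it, so covered texels emit no shaft.

```c
#define SHAFT_TEX_W 8   /* assumed width of the 1-D shaft texture */

/* Sketch: build the 1-D light-shaft texture from cloud coverage sampled
 * across the sun disc.  cloudAlpha[i] is cloud coverage at texel i
 * (0 = clear sky, 1 = fully covered); the shaft texture gets the inverse,
 * so a shaft of light only leaves uncovered parts of the sun. */
void buildShaftTexture(const float cloudAlpha[SHAFT_TEX_W],
                       float shaft[SHAFT_TEX_W])
{
    int i;
    for (i = 0; i < SHAFT_TEX_W; i++)
        shaft[i] = 1.0f - cloudAlpha[i];
}
```

Stretching this texture across the trapezoid (and blurring it a little, as discussed below) gives the widening bright and dark stripes.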


Posted 28 May 2002 - 02:49 PM

quote:greeneggs wrote:

This is the basic simple idea. I imagine getting the shafts and the colors just right requires playing around a little. (For example, if there are no clouds crossing the sun, then you probably don''t want any visible shafts, so set the 1-d texture to have alpha = 0 everywhere.)

Yeah, that's kind of tricky. There is no real physical counterpart to the trapezoid idea (the physically real shafts of light would be impossible to render in realtime), so I just improvised. I have coupled the intensity of the lightrays (the amount blended) with the visibility of the sun (averaged over a rectangle), the colour, and the spherical angle over the horizon (at dusk or dawn you have stronger rays than at full daytime). For the 1D raytexture I basically did what greeneggs explained, just with an additional gaussian blur over it (to make the rays less sharp).

I also blend an additional circular glow over the sun, using the 'smooth-add' blending operator (see below). This is only done when the spherical angle to the horizon is lower than a given threshold; I then slowly fade that glow in (again modulated by the visibility of the sun, a bit like one of those widely used flares/halos around lightsources). This gives additional atmospheric 'thickness' to sunsets.

Both glow and lightrays are additively blended into the scene. But to avoid those ugly oversaturation effects you get with additive blending, I used the add-smooth blend mode:

Instead of plain additive blending, c = src + dst, add-smooth computes c = src + (1 - src) * dst: the result approaches full brightness smoothly and can never overflow.
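In code, the two operators look like this per channel, with colours normalised to [0,1] (the helper names are mine; only the operator itself comes from the thread):

```c
/* Plain additive blending: oversaturates and clamps hard at white. */
float blendAdd(float src, float dst)
{
    float c = src + dst;
    return c > 1.0f ? 1.0f : c;
}

/* Add-smooth: src + (1-src)*dst.  Folds the destination in by the
 * remaining headroom, so the result approaches 1 but never exceeds it. */
float blendAddSmooth(float src, float dst)
{
    return src + (1.0f - src) * dst;
}
```

In fixed-function hardware this corresponds to blending with source factor ONE and destination factor one-minus-source-colour.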

quote:kill wrote:

How should the sun be blended with the clouds? I can't come up with a way of doing it that's really pleasing to the eye.

The standard way of doing that is as follows:

You have your gradient skydome. Now blend your primary sunglow onto that skydome, using additive blending (the real additive one; we actually want the oversaturation here!). You should do that in a single pass: c = skygradient + suntexture; this saves fillrate. This should also give you that typical 'hyperbolic' colour distribution around the sun when it approaches the horizon. Make sure you choose a nice colour here; this is crucial for a visually pleasing result.

Now it's time for the clouds: render them with whatever method you use (plane, cloudbox, etc.) using standard alpha blending, as you already did in your screenshots; that looked fine to me.

The effect that makes the clouds 'glow' around the sun is actually the multiple-scattering cloud shading. The results you will get at this point highly depend on your specific shading system.

If you want, you can now add an additional sunglow over the clouds, using additive or smooth-add blending. This one should be subtle, not too strong, or you might lose image detail around the sun due to saturation. Modulate the intensity of that glow by the current visibility of the sun disc. The colour should more or less match the sun colour, but you can vary it a bit to get interesting effects (e.g. atmospheric diffraction).

The final touch would then be the lightrays. Or a rainbow.

quote:kill wrote:

I just came back from Lake George (I went there for the Memorial Day weekend). The scenery there is beautiful, however I couldn't fully enjoy it because I kept looking at the clouds, landscape, vegetation and water and kept trying to point out all the different effects I should implement. Ignorance is truly bliss. My non-programming friends enjoyed everything so much more than I did.

Heh, I know that... So many beautiful natural effects, just waiting to be implemented in realtime. Oh well, should we really live in...

/ Yann

Posted 28 May 2002 - 03:33 PM

quote:Original post by kill

I just came back from Lake George (I went there for the Memorial Day weekend). The scenery there is beautiful, however I couldn't fully enjoy it because I kept looking at the clouds, landscape, vegetation and water and kept trying to point out all the different effects I should implement. Ignorance is truly bliss. My non-programming friends enjoyed everything so much more than I did.

ROFL, I feel that pain... I now look at the world as if it's one big rendered scene from a massive computer in Heaven, wondering how nature does such cool effects.

I guess this is more of a question about a skybox, but how do you make the curvature of the sky, and to what degree? Is it something you just play around with until it looks right?

I should probably read some basic material on this before going into the advanced aspects (which are being discussed right now...). You guys make me feel so stupid -_-;

Anyway, I like the setting of your skies, Kill. I think a sun would really make them look great.

Posted 29 May 2002 - 07:45 AM

okonomiyaki: I had the curvature problem before. If you read the posts in this thread you'll see the part where I asked Yann for some template values to start off with because my sky looked flat. Depending on what you're using, things might be easier or harder. With a skyplane there are a lot of parameters to play around with, and it's very hard to get it to look just right. A little off to one side, and your sky will look like a picture put in front of you. A little off to the other side, and it will look like a picture right overhead. If you make it too curved, it will look like there's absolutely no perspective. The only way to get it to look right is to play around with the values.

One suggestion I can give you: make as many values as possible configurable at runtime. Recompiling your code a thousand times is very time-consuming and frustrating; it's much easier to press a couple of keys at runtime. It can make the difference between taking a week and taking an hour to get it to look right.

On a different note, I just found out that point sprites have a very small maximum size on many systems. On a GeForce2 MX it's 64 pixels in screen space, so I have to reimplement the functionality myself.
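The usual workaround is to expand each particle into a camera-facing quad on the CPU. A minimal sketch (the struct and function are hypothetical helpers, not from the thread): offset the particle centre along the camera's world-space right/up basis vectors, which has no size cap.

```c
typedef struct { float x, y, z; } Vec3;

/* Sketch: expand a point sprite into a camera-facing quad.  'right' and
 * 'up' are the camera's world-space basis vectors, 'halfSize' the
 * world-space half-extent.  Corners are emitted counter-clockwise:
 * (-,-), (+,-), (+,+), (-,+). */
void expandSprite(Vec3 c, Vec3 right, Vec3 up, float halfSize, Vec3 quad[4])
{
    float sr[4] = { -1.0f,  1.0f, 1.0f, -1.0f };  /* sign along 'right' */
    float su[4] = { -1.0f, -1.0f, 1.0f,  1.0f };  /* sign along 'up'    */
    int i;
    for (i = 0; i < 4; i++) {
        quad[i].x = c.x + halfSize * (sr[i] * right.x + su[i] * up.x);
        quad[i].y = c.y + halfSize * (sr[i] * right.y + su[i] * up.y);
        quad[i].z = c.z + halfSize * (sr[i] * right.z + su[i] * up.z);
    }
}
```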


Posted 29 May 2002 - 11:52 AM

Yann, thanks for being so helpful. I implemented the 2.5D ray-tracer for tracing through cloud "heightfields." This certainly should improve cloud shading when the sun is at a low elevation.

A very realistic rendering would need to trace through the heightfield for shading and then also for rendering. I think clouds are flatter on the bottom, so the heightfield would have some low fraction beneath the cloud plane and a high fraction above the cloud plane. This should give very good results, and the ray-tracing for rendering would only need to be updated occasionally as the clouds move (not every frame), and should not be too slow, as it could be done at low resolution (as long as the cloud-shading ray-tracing was done at high resolution). There are various optimizations one could make, including tracing only for low-elevation clouds, and taking advantage of the fact that most of the cloud height is above the cloud plane to trace in only one direction.

More precisely, take the clouds stretched across the cloud plane. To render a ray that hits the cloud texture (cloud heightfield), trace the ray backwards until it leaves the cloud and use the computed shade color for the point at which the ray exits the cloud (shades stored in a 2D texture, indexed by horizontal position, just like the cloud heightfield). To render a ray that misses the cloud texture, trace the ray backwards to verify that it doesn't intersect any clouds before reaching the cloud plane (or skip this step since clouds lie mostly above the cloud plane so it is unlikely that you'll hit anything); then trace the ray forwards. If the ray hits the top of a cloud, then use the computed shade color for the point at which the ray enters the cloud (this point is on *top* of the cloud and so is shaded differently from the bottom of the cloud -- this is a second shading texture). The ray-tracing is done on the CPU, but at low resolution.
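The core of both the backward and forward traces can be sketched in 2-D (one horizontal axis plus height; grid size, step, and function name are my assumptions) as a simple march across the heightfield:

```c
#define HF_W 8   /* assumed heightfield width */

/* Sketch of the 2.5-D trace: march a ray across a cloud heightfield and
 * report whether it clears the cloud tops before leaving the grid.
 * (x, z) is the ray position with z = height above the cloud plane;
 * (dx, dz) is the per-step displacement.  Returns 1 if the ray exits
 * above every cloud column it crosses, 0 if it enters a cloud. */
int rayClearsClouds(const float heights[HF_W],
                    float x, float z, float dx, float dz)
{
    while (x >= 0.0f && x < (float) HF_W) {
        if (z <= heights[(int) x])   /* ray is inside this cloud column */
            return 0;
        x += dx;
        z += dz;
    }
    return 1;   /* left the grid without hitting a cloud */
}
```

A real implementation would return the entry point instead of a flag and look up the precomputed shade texture there, as described above.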

Without any ray-tracing for rendering, though, I don't see how one can expect good results. For example, consider noontime when (assume) the sun is directly overhead. The shading ray-tracer is tracing through parallel vertical lines, and so is trivial (it just takes the value from the heightfield). A typical cloud will be dark in the middle, with a lighter stripe around the outside, since this is what the height field looks like. However, even clouds near the horizon will look like this, foreshortened in perspective. And this is incorrect. Clouds near the horizon should have puffy white tops and narrow black bottoms -- because you are looking over the clouds. The tops of the clouds are shaded differently from the bottoms; light coming off the top has just been scattered once, while light off the bottom has gone all the way through the cloud. The basic problem is knowing whether you are looking at the top or the bottom of the cloud.

This is especially important for an elevated perspective, but seems to be important even for a ground-level viewer (as I see out my window). I believe I can see this effect in your screenshot "clouds at daytime."

This looks right! I think that you are not actually shading at high resolution, but are using the heightfield, exponentiated, to give the additional detail? This isn't really physically accurate but if it works who am I to argue? Then all ray-tracing can be done at low resolution.

I've thought that the perspective information can be precomputed for a few viewing directions (three or four) and then linearly combined, but it might look odd. Also when the sun is near the horizon there are other hacks to give the correct effect without ray-tracing to the viewer. But I'm curious how you've done it so efficiently.

On another note, I'm curious if anyone has gotten the Hoffman-Preetham air scattering results working. I implemented it and it seems to work well. It does not give good sky gradients, however. There are exposure problems. Plus, sky gradients would really benefit from full-spectrum calculations, I think. I find I need to hack the radius of the earth (or equivalently, the depth of the atmosphere) depending on the sun height, which isn't so cool.

[Edit: one last remark!]

In "A Method for Modeling Clouds Based on Atmospheric Fluid Dynamics" (Miyazaki, Yoshida, Dobashi & Nishita), they show some results on making 2-dimensional Bénard convection cells for cirrocumulus clouds. Has anyone gotten this running in real time? (I imagine the three-dimensional computations are too slow, although clouds change slowly, so maybe it's doable.) These are very impressive results, much better than their cloud-automata models.

[Edit: removed copyrighted image]

[edited by - greeneggs on May 31, 2002 8:06:09 PM]


Posted 30 May 2002 - 10:11 AM

Here is some CML source code, and a screenshot of an unsuccessful attempt to grow a cumulus cloud. (Bottom left is the vapor source distribution, bottom right is a velocity-field cross-section, top right shows vapor and droplet levels, and top left is a poor rendering of the droplets (clouds). In the upper left-hand corner is the number of seconds simulated per computer second.)

[Edit: Added images of Bénard convection cells suitable for cirrocumulus clouds (?). Phase transition doesn't work yet.]

[Edit: Updated code parameters and fixed phase transition]

[edited by - greeneggs on June 20, 2002 11:42:48 AM]

/*

* convection.c

*

*/

#include "convection.h"

#include <stdlib.h>

#include <math.h> // for exp

#include <stdio.h>

typedef int bool;

#define true 1

#define false 0

static float *vx, *vy, *vz, *vxnew, *vynew, *vznew;

static float *E, *Enew;

static float *wv, *wl, *wvnew, *wlnew;

static float deltaT = 0;

static float vaporSource = 0;

void convectionVelocity(float **vxh, float **vyh, float **vzh) { *vxh = vx; *vyh = vy; *vzh = vz; }

float *convectionTemperature() { return E; }

float convectionDeltaT() { return deltaT; }

float *convectionVapor() { return wv; }

float *convectionDroplets() { return wl; }

float convectionVaporSource() { return vaporSource; }

#define array(a,x,y,z) a[((z)+1)*(wy+2)*(wx+2) + ((y)+1)*(wx+2) + (x)+1]

#define cleararray(a) for (z = -1; z < wz+1; z++) for (y = -1; y < wy+1; y++) for (x = -1; x < wx+1; x++) a(x,y,z) = 0;

#define swaparray(a,b) temp = a; a = b; b = temp;

#define addperiodic(a) \
for (z = -1; z < wz+1; z++) { \
    for (x = -1; x < wx+1; x++) { \
        a(x,wy-1,z) += a(x,-1,z); a(x,-1,z) = a(x,wy-1,z); \
        a(x, 0,z) += a(x,wy,z);  a(x,wy,z) = a(x, 0,z); \
    } \
    for (y = -1; y < wy+1; y++) { \
        a(wx-1,y,z) += a(-1,y,z); a(-1,y,z) = a(wx-1,y,z); \
        a( 0,y,z) += a(wx,y,z);  a(wx,y,z) = a( 0,y,z); \
    } \
}

#define copyperiodic(a) \
for (z = -1; z < wz+1; z++) { \
    for (x = -1; x < wx+1; x++) { a(x,-1,z) = a(x,wy-1,z); a(x,wy,z) = a(x,0,z); } \
    for (y = -1; y < wy+1; y++) { a(-1,y,z) = a(wx-1,y,z); a(wx,y,z) = a(0,y,z); } \
}

#define addreflections(a) \
for (x = -1; x < wx+1; x++) for (y = -1; y < wy+1; y++) { a(x,y,1) += a(x,y,-1); a(x,y,wz-2) += a(x,y,wz); }

#define swapreflections(a) \
for (x = -1; x < wx+1; x++) for (y = -1; y < wy+1; y++) { a(x,y,1) -= a(x,y,-1); a(x,y,wz-2) -= a(x,y,wz); }

#define clearreflections(a) \
for (x = -1; x < wx+1; x++) for (y = -1; y < wy+1; y++) { a(x,y,0) = 0; a(x,y,wz-1) = 0; }

#define vx(x,y,z) array( vx,x,y,z)

#define vy(x,y,z) array( vy,x,y,z)

#define vz(x,y,z) array( vz,x,y,z)

#define vxnew(x,y,z) array(vxnew,x,y,z)

#define vynew(x,y,z) array(vynew,x,y,z)

#define vznew(x,y,z) array(vznew,x,y,z)

static bool velocityInitialized = false;

static void initializeVelocity(float vamp) {

int x, y, z;

if (velocityInitialized) return;

vx = (float *) malloc((wx+2) * (wy+2) * (wz+2) * sizeof(float));

vy = (float *) malloc((wx+2) * (wy+2) * (wz+2) * sizeof(float));

vz = (float *) malloc((wx+2) * (wy+2) * (wz+2) * sizeof(float));

cleararray(vx)

cleararray(vy)

cleararray(vz)

// small amplitude random initial velocities

for (z = 0; z < wz; z++) {

for (y = 0; y < wy; y++) {

for (x = 0; x < wx; x++) {

vx(x,y,z) = vamp * (rand() / (float) RAND_MAX - 0.5f);

vy(x,y,z) = vamp * (rand() / (float) RAND_MAX - 0.5f);

vz(x,y,z) = vamp * (rand() / (float) RAND_MAX - 0.5f);

}

}

}

copyperiodic(vx)

copyperiodic(vy)

copyperiodic(vz)

clearreflections(vz)

vxnew = (float *) malloc((wx+2) * (wy+2) * (wz+2) * sizeof(float));

vynew = (float *) malloc((wx+2) * (wy+2) * (wz+2) * sizeof(float));

vznew = (float *) malloc((wx+2) * (wy+2) * (wz+2) * sizeof(float));

cleararray(vxnew)

cleararray(vynew)

cleararray(vznew)

clearreflections(vz)

velocityInitialized = true;

}

#define E(x,y,z) array(E,x,y,z)

#define Enew(x,y,z) array(Enew,x,y,z)

static bool temperatureInitialized = false;

static void initializeTemperature(float dT) {

int x, y, z;

if (temperatureInitialized) return;

deltaT = dT;

E = (float *) malloc((wx+2) * (wy+2) * (wz+2) * sizeof(float));

cleararray(E)

// small amplitude random initial temperatures

for (z = 1; z < wz-1; z++) {

for (y = 0; y < wy; y++) {

for (x = 0; x < wx; x++) {

E(x,y,z) = deltaT * (rand() / (float) RAND_MAX - 0.5f);

}

}

}

// top and bottom plates are separated by fixed temperature differential 2 deltaT

for (y = 0; y < wy; y++) {

for (x = 0; x < wx; x++) {

E(x,y, 0) = deltaT;

E(x,y,wz-1) = -deltaT;

}

}

copyperiodic(E)

Enew = (float *) malloc((wx+2) * (wy+2) * (wz+2) * sizeof(float));

cleararray(Enew)

for (y = 0; y < wy; y++) {

for (x = 0; x < wx; x++) {

Enew(x,y, 0) = deltaT;

Enew(x,y,wz-1) = -deltaT;

}

}

copyperiodic(Enew)

temperatureInitialized = true;

}

#define wv(x,y,z) array( wv,x,y,z)

#define wl(x,y,z) array( wl,x,y,z)

#define wvnew(x,y,z) array(wvnew,x,y,z)

#define wlnew(x,y,z) array(wlnew,x,y,z)

static bool vaporInitialized = false;

static void initializeVapor(float vs) {

int x, y, z;

if (vaporInitialized) return;

vaporSource = vs;

wv = (float *) malloc((wx+2) * (wy+2) * (wz+2) * sizeof(float));

wl = (float *) malloc((wx+2) * (wy+2) * (wz+2) * sizeof(float));

cleararray(wv)

cleararray(wl)

// top and bottom plates are separated by fixed vapor differential vaporSource

for (y = 0; y < wy; y++) {

for (x = 0; x < wx; x++) {

wv(x,y, 0) = vaporSource;

wv(x,y,wz-1) = 0;

}

}

copyperiodic(wv)

copyperiodic(wl)

wvnew = (float *) malloc((wx+2) * (wy+2) * (wz+2) * sizeof(float));

wlnew = (float *) malloc((wx+2) * (wy+2) * (wz+2) * sizeof(float));

cleararray(wvnew)

cleararray(wlnew)

copyperiodic(wvnew)

copyperiodic(wlnew)

vaporInitialized = true;

}

static bool convectionInitialized = false;

void initializeConvection() {

if (convectionInitialized) return;

initializeVelocity(1.0f);

initializeTemperature(.04f); //.01f for stable pattern // 1.8 * lambda + .3 for onset of oscillation?

initializeVapor(5.75f);

convectionInitialized = true;

}

#define delta(arr,x,y,z) \
(.1666666666667f * (arr(x-1,y,z)+arr(x+1,y,z)+arr(x,y-1,z)+arr(x,y+1,z)+arr(x,y,z-1)+arr(x,y,z+1)) - arr(x,y,z))

// should the second constant be .125f? check
#define graddivvx(x,y,z) \
(.5f * (vx(x+1,y,z)+vx(x-1,y,z)) - vx(x,y,z) + .25f * \
(vy(x+1,y+1,z)-vy(x+1,y-1,z)-vy(x-1,y+1,z)+vy(x-1,y-1,z)+vz(x+1,y,z+1)-vz(x+1,y,z-1)-vz(x-1,y,z+1)+vz(x-1,y,z-1)))

#define graddivvy(x,y,z) \
(.5f * (vy(x,y+1,z)+vy(x,y-1,z)) - vy(x,y,z) + .25f * \
(vx(x+1,y+1,z)-vx(x-1,y+1,z)-vx(x+1,y-1,z)+vx(x-1,y-1,z)+vz(x,y+1,z+1)-vz(x,y+1,z-1)-vz(x,y-1,z+1)+vz(x,y-1,z-1)))

#define graddivvz(x,y,z) \
(.5f * (vz(x,y,z+1)+vz(x,y,z-1)) - vz(x,y,z) + .25f * \
(vx(x+1,y,z+1)-vx(x-1,y,z+1)-vx(x+1,y,z-1)+vx(x-1,y,z-1)+vy(x,y+1,z+1)-vy(x,y-1,z+1)-vy(x,y+1,z-1)+vy(x,y-1,z-1)))

void convectionTimestep(float dt) {

int x, y, z;

int l, m, n;

float dx, dy, dz;

float *temp;

if (!convectionInitialized) initializeConvection();

// Eulerian part

{ // (a) buoyancy procedure

float kb = 3E0f; // not specified in [Yanagita & Kaneko 95], 5 or 6 seems to be what they use, though

for (z = 0; z < wz; z++) { // 0, wz for fixed; 1, wz-1 for reflective boundary

for (y = 0; y < wy; y++) {

for (x = 0; x < wx; x++) {

vznew(x,y,z) = vz(x,y,z) + kb * ( E(x,y,z) - .25f*(E(x+1,y,z)+E(x-1,y,z)+E(x,y+1,z)+E(x,y-1,z)) );

}

}

}

swaparray(vz,vznew)

copyperiodic(vz)

}

{ // (b) heat diffusion

float kdE = .2f; // .4f, .02f

for (z = 1; z < wz-1; z++) { // 1, wz-1 always

for (y = 0; y < wy; y++) {

for (x = 0; x < wx; x++) {

Enew(x,y,z) = E(x,y,z) + kdE * delta(E,x,y,z);

}

}

}

swaparray(E,Enew)

copyperiodic(E)

}

{ // vapor diffusion -- do water droplets diffuse?

float kdw = 1E-1f; // .4f, .02f

for (z = 1; z < wz-1; z++) { // 1, wz-1 always

for (y = 0; y < wy; y++) {

for (x = 0; x < wx; x++) {

wvnew(x,y,z) = wv(x,y,z) + kdw * delta(wv,x,y,z);

wlnew(x,y,z) = wl(x,y,z) + kdw * delta(wl,x,y,z);

}

}

}

swaparray(wv,wvnew)

//swaparray(wl,wlnew)

copyperiodic(wv)

//copyperiodic(wl)

}

{ // (c) viscosity and pressure effect

float kv = .2f;

float kp = .2f;

for (z = 0; z < wz; z++) { // 1, wz-1 for reflection

for (y = 0; y < wy; y++) {

for (x = 0; x < wx; x++) {

vxnew(x,y,z) = vx(x,y,z) + kv * delta(vx,x,y,z)

+ kp * graddivvx(x,y,z);

vynew(x,y,z) = vy(x,y,z) + kv * delta(vy,x,y,z)

+ kp * graddivvy(x,y,z);

vznew(x,y,z) = vz(x,y,z) + kv * delta(vz,x,y,z)

+ kp * graddivvz(x,y,z);

}

}

}

swaparray(vx,vxnew)

swaparray(vy,vynew)

swaparray(vz,vznew)

copyperiodic(vx)

copyperiodic(vy)

copyperiodic(vz)

}

// Lagrangian part

#define advect(a,b) \
b( l, m, n) += (1-dx)*(1-dy)*(1-dz) * a(x,y,z); \
b(l+1, m, n) += dx *(1-dy)*(1-dz) * a(x,y,z); \
b( l,m+1, n) += (1-dx)* dy *(1-dz) * a(x,y,z); \
b(l+1,m+1, n) += dx * dy *(1-dz) * a(x,y,z); \
b( l, m,n+1) += (1-dx)*(1-dy)* dz * a(x,y,z); \
b(l+1, m,n+1) += dx *(1-dy)* dz * a(x,y,z); \
b( l,m+1,n+1) += (1-dx)* dy * dz * a(x,y,z); \
b(l+1,m+1,n+1) += dx * dy * dz * a(x,y,z);

{

cleararray(vxnew)

cleararray(vynew)

cleararray(vznew)

cleararray(Enew)

cleararray(wlnew)

cleararray(wvnew)

for (z = 1; z < wz-1; z++) { // 0, wz

for (y = 0; y < wy; y++) {

for (x = 0; x < wx; x++) {

dz = z + vz(x,y,z);

if (dz < 0) dz = -dz; //continue; // for fixed boundary, just continue

if (dz >= wz-1) dz = (float) (2*(wz-1)-z);//continue;

dy = y + vy(x,y,z);

if (dy < 0) dy += wy;

if (dy >= wy) dy -= wy;

dx = x + vx(x,y,z);

if (dx < 0) dx += wx;

if (dx >= wx) dx -= wx;

l = (int) dx; dx -= l;

m = (int) dy; dy -= m;

n = (int) dz; dz -= n;

advect(vx,vxnew)

advect(vy,vynew)

advect(vz,vznew)

advect(E,Enew)

advect(wl,wlnew)

advect(wv,wvnew)

}

}

}

swaparray(vx,vxnew)

swaparray(vy,vynew)

swaparray(vz,vznew)

swaparray(E,Enew)

swaparray(wl,wlnew)

swaparray(wv,wvnew)

addperiodic(vx)

addperiodic(vy)

addperiodic(vz)

addperiodic(E)

addperiodic(wl)

addperiodic(wv)

addreflections(vx)

addreflections(vy)

swapreflections(vz)

addreflections(E)

//clearreflections(vx)

//clearreflections(vy)

clearreflections(vz)

addreflections(wv)

addreflections(wl)

}

for (y = -1; y < wy+1; y++) {

for (x = -1; x < wx+1; x++) {

E(x,y, 0) = deltaT;

E(x,y,wz-1) = -deltaT;

wv(x,y, 0) = vaporSource;

//wv(x,y,wz-1) = 0;

wl(x,y,0) = 0;

//wl(x,y,wz-1) = 0;

}

}

{ // phase transition

float alpha = 1E-2f;

float Q = 7E-4f; // cal / g??

#define altitude 6000.0f

#define wmax(T) (217 * (float) exp(19.482f - 4303.4f / ((T) - 29.5f)) / (T))

float wmax = .5f;//300 - 0.6f * altitude / 100.0f;

float delta;

//float temperature;

for (z = 0; z < wz; z++) {

//temperature = 300 - 0.6f * (altitude + 20 * z) / 100.0f; // for cumulus clouds only

//wmax = wmax(temperature);

for (y = 0; y < wy; y++) {

x = 0;

//printf("%d %d %d %4.1f %4.5f %4.5f %4.5f %4.2f\t", x,y,z,E(x,y,z),wmax,delta,wv(x,y,z),wl(x,y,z));

for (x = 0; x < wx; x++) {

wmax = wmax(264+E(x,y,z));

delta = wv(x,y,z) - wmax;

//if (x == wy/2)

//printf("%d %d %d %4.1f %4.5f %4.5f %4.5f %4.2f\t\n", x,y,z,E(x,y,z),wmax,delta,wv(x,y,z),wl(x,y,z));

//printf("%d %d %d %4.1f %4.5f %4.5f %4.5f %4.2f\t", x,y,z,E(x,y,z),wmax,delta,wv(x,y,z),wl(x,y,z));

if (delta > 0) {

wvnew(x,y,z) = wv(x,y,z) - alpha * delta;

wlnew(x,y,z) = wl(x,y,z) + alpha * delta;

} else {

if (wl(x,y,z) + alpha * delta < 0) delta = -wl(x,y,z);

wvnew(x,y,z) = wv(x,y,z) - alpha * delta;

wlnew(x,y,z) = wl(x,y,z) + alpha * delta;

}

Enew(x,y,z) = E(x,y,z) - Q * delta;

}

}

}

swaparray(wv,wvnew)

swaparray(wl,wlnew)

swaparray(E,Enew)

copyperiodic(wv)

copyperiodic(wl)

copyperiodic(E)

}

{ // restore boundary conditions

for (y = -1; y < wy+1; y++) {

for (x = -1; x < wx+1; x++) {

E(x,y, 0) = deltaT;

E(x,y,wz-1) = -deltaT;

wv(x,y, 0) = vaporSource;

//wv(x,y,wz-1) = 0;

wl(x,y,0) = 0;

//wl(x,y,wz-1) = 0;

}

}

{ // try setting a horizontal flow as a boundary condition? (as in [Miyazaki, Yoshida, Dobashi & Nishita])

for (y = -1; y < wy+1; y++) {

for (x = -1; x < wx+1; x++) {

#define tempscale 1.25f

vx(x,y, 0) = .125f * tempscale;

vx(x,y,wz-1) = -.01625f * tempscale;

vy(x,y, 0) = .01625f * tempscale;

vy(x,y,wz-1) = .125f * tempscale;

//vz(x,y, 0) = .25f;

//vz(x,y,wz-1) = -.125f;

}

}

}

}

}


Posted 30 May 2002 - 10:33 AM

This thread is getting interesting.

I prepared a longer reply, but I have to draw some images to make it complete; I'll post it later tonight.

/ Yann


Posted 31 May 2002 - 04:30 AM

quote:

Without any ray-tracing for rendering, though, I don't see how one can expect good results. For example, consider noontime when (assume) the sun is directly overhead. The shading ray-tracer is tracing through parallel vertical lines, and so is trivial (it just takes the value from the heightfield). A typical cloud will be dark in the middle, with a lighter stripe around the outside, since this is what the height field looks like. However, even clouds near the horizon will look like this, foreshortened in perspective. And this is incorrect.

Exactly. What you are mentioning is the second term in Dobashi's equations, the scattering towards the eye. I take that into account by simply tracing a ray from each voxel bottom to the eye position, which is assumed fixed at (0,0,0).

So basically:

1) trace a ray from the sun to the voxel, approximate the scattering integral for the light that reaches the voxel.

2) trace a ray from the voxel to (0,0,0), approximate the light scattered from the voxel towards the eye.

But this alone won't solve the problem. The problem is not so much the scattering towards the eye (it doesn't actually make much difference if I take it out; the clouds just get a bit darker). The problem is that we are using a flat 2D representation of the clouds.

In the real world, when you see a cloud far away at the horizon, you are not looking at its bottom, but at its side. Keep in mind that real clouds are 3D objects. If the sun is high above (e.g. at noon), and sunlight is approximated by a directional light source, then you get this situation:

(OK, I know my drawing skills suck)

V is the viewpoint, the yellow arrows represent the sunlight direction, and the blue line is our 2D cloud plane. Assume that clouds A and B are identical. If only the sunlight direction is taken into account, both will be illuminated the same way, regardless of their position. Consider the red voxel in both clouds: it receives very little light, since the ray has to travel through the whole cloud to reach it, so it will be very dark. This behaviour yields correct lighting for cloud A, but not for cloud B.

The reason why it works in reality is simple: consider the view frustum from the viewpoint to cloud B. If the cloud were a real 3D object, you would actually see the side of the first voxel, which is highly illuminated. The red voxel would be almost invisible, since it would be hidden by the first two voxels.

But our cloud system is 2D: you always see the bottom of the clouds. That is incorrect, and leads to artifacts when the clouds are viewed from an inadequate perspective. It's a bit like fixed billboards: they look nice and realistic if you are right in front of them, but as the camera rises, the perspective is lost.

So what can we do? A solution would be to use full 3D clouds, but that would be rather expensive...

Solution: we fake the perspective in the lighting instead.

Normally, we use a directional light to approximate the sun, but this results in a uniform shading of all clouds. Also, directional light sources contain no perspective information; they are orthogonal projections. The best way to introduce perspective into the lighting is to use a perspective light source: a point light.

Consider this situation:

As you can see, I replaced the directional sunlight by a point light source (the yellow 'blob' is supposed to be the sun). This is physically highly incorrect, but it gives the desired result. For cloud A, the lighting is still correct, since the incident light is more or less directional. For cloud B, however, we now have a fully perspective form of lighting: the formerly red voxel is now fully lit (green arrow). The cloud will appear as if it were seen from its side.

The distance between the sun and our cloud plane (distance L) needs to be adjusted by trial and error; changing it influences the strength of the perspective fake. Play around with various values for L and the multiple scattering extinction coefficients until you get a visually pleasing result.

Of course, being a fake, this method has its drawbacks. It introduces errors if the sun is near the horizon and a cloud is just in front of it. With our method, the light rays now come from behind the cloud, while in reality they would come from above. They have to travel through the whole cloud, and this results in an over-attenuation during the scattering integration: we get a dark spot in the cloud.

You can see this effect on the first screenshot:

However, I think this error isn't too bad, considering that the method is very fast and can even be hardware accelerated. There would also be ways to overcome this problem, perhaps by adjusting the perspective 'fake factor' based on the cloud's distance from the viewer.

quote:

I think that you are not actually shading at high resolution, but are using the heightfield, exponentiated, to give the additional detail? This isn't really physically accurate but if it works who am I to argue?

That's what I'm doing. The shading is done with 3-4 octaves of noise, while the opacity detail uses 8 to 12 octaves. Sure, it's not physically accurate, but neither is the whole 2D-plane Perlin noise idea in the first place. And as long as it looks good and is fast...

quote:

In "A method for modeling clouds based on atmospheric fluid dynamics" (Miyazaki, Yoshida, Dobashi & Nishita), they show some results on making 2-dimensional Benard convection cells for cirrocumulus clouds. Has anyone gotten this running in real-time? (I imagine the three dimensional computations are too slow, although clouds change slowly so maybe it's doable.) These are very impressive results, much better than their cloud automata models.

Hmmm, I downloaded the paper; it looks very interesting. I currently don't have the time to investigate their algorithms, but I think I'll add this to my to-do list. It would be very nice to have a somewhat more controllable system than Perlin noise, something where you can specify atmospheric and climatic conditions and the system will automatically synthesize the appropriate clouds. Although you'd then have to calculate the clouds entirely on the CPU, and I don't know if that can be done in real time at high cloud-texture resolutions.

/ Yann

[edited by - Yann L on June 7, 2002 8:12:57 PM]

Posted 31 May 2002 - 06:09 AM

A bit off topic, but I think there should be a lot more threads like this; this thread is just so damn interesting.

Death of one is a tragedy, death of a million is just a statistic.


Posted 01 June 2002 - 12:35 AM

For some new ideas on Perlin noise you should look at Ken Perlin's new paper: http://mrl.nyu.edu/~perlin/paper445.pdf

You should never let your fears become the boundaries of your dreams.


Posted 01 June 2002 - 02:52 AM

..just wanted to share a link on Intel's site for those of you who are interested:

They have an article and a demo on "Generating Procedural Clouds in real time on 3D HW" as well as other interesting stuff. Check it out at:

Intel Developer Services/Training/Software Development/Games/(Developer Centers/Games/Graphics)

OR here's a direct link:

http://cedar.intel.com/cgi-bin/ids.dll/topic.jsp?catCode=CLH

OR a direct link to the article:

http://cedar.intel.com/cgi-bin/ids.dll/content/content.jsp?cntKey=Generic+Editorial%3a%3aclouds&cntType=IDS_EDITORIAL&catCode=CLH

They also have a webcast presentation on procedural 3D content generation somewhere (sorry, I forgot where, just browse) where they talk about clouds as well.

...Also, if you guys ever checked out Nvidia's Effects Browser - under Effects/Animation/Atmospheric/Cloud Cover - they do it with a vertex shader. Although it only works on the better video cards like the Ti and GTS, it is damn fast and creates shadows on an "underlying" mesh. (Yet I could not figure out whether they have correct shadows on the clouds themselves.)


Posted 01 June 2002 - 07:44 AM

I know a site with some pretty nice stuff about terrain and cloud rendering. Check it out here. Only problem: it's on GeoCities, so I think the bandwidth is *very* limited; the screenshots might not appear. If that happens, try again later.

I'll include a screenshot here:

[Removed due to geocities crap, see next post]

Y.

[edited by - Ysaneya on June 1, 2002 2:47:08 PM]

[edited by - Ysaneya on June 2, 2002 6:40:54 AM]


Posted 01 June 2002 - 11:40 PM

Ah yes, that's a lot better.. I'll remove the image from my previous post, since it's using space for nothing.

Y.


Posted 03 June 2002 - 04:41 PM

Dammit...

Just added this thread to my bookmarks.

By all means, Yann, write that tutorial!


Posted 06 June 2002 - 04:10 AM

I've been pretty much keeping up with this extremely interesting thread, but I have a few questions. Please excuse me if these have been covered (I looked, and they really haven't been answered).

So how do you animate the clouds, Yann? Do you just animate the Perlin noise and that's it? Can you just (linearly) interpolate 2 noise sets together, or maybe more? How can you simulate this scenario: it starts out a bright sunny day, later it gets cloudy, and finally at dusk all hell breaks loose with violent clouds (I won't mention the lightning :-).

How do you shoot rays through the 2D plane? You mentioned voxels. So you break up the noise texture as a plane (when it is in 3D space) into a bunch of axis-aligned bounding boxes or something similar, maybe? This step really stumps me. And you do this to simulate scattering, right? Would this be an isotropic or anisotropic process?

Also, maybe off topic: have you seen any of the research using particles to simulate clouds, with billboard impostors (and view-frustum culling) as performance enhancements? It creates realistic 3D volumetric clouds and they look amazing, but for a game with a ground perspective it seems a bit useless. But is it really? Have you done any research into the performance trade-offs? For something that looks better, do you think it is too expensive, or would the Perlin noise calculations compare?

Thanks, and sorry if the questions are a little lame! ;-)


Posted 06 June 2002 - 07:54 AM

Dirge, I'll let Yann answer the other questions, but the particle->impostor cloud paper is the one he referenced at the beginning of the thread as his source for the scattering approximation formula.

Posted 06 June 2002 - 08:51 AM

In order to animate Perlin noise you use another dimension. So if you need a noise texture (2D), you add another dimension, generate noise in 3D, and make the third dimension time. As time moves on, the clouds animate. Perlin noise IS a smooth blending of noise, so the third dimension will slowly blend between two different noise slices.

Regarding different weather, you slowly change the density and fluffiness values.


Posted 06 June 2002 - 10:41 AM

quote:

So how do you animate the cloud Yann? Do you just animate the Perlin Noise and thats it? Can you just (linearly) interpolate 2 Noises together, or maybe more? How can you simulate this scenario: It starts out a bright sunny day, later it gets cloudy, and finally at dusk all hell breaks loose with violent clouds (I won't mention the lighting :-).

Linear interpolation between 2 Perlin noise sets is just a very basic idea. An extension would be full 3D Perlin noise, as kill suggested. Besides the noise set, you can also modify the exponential ramp. You can interpolate different octaves using different functions. You also have tons of alternatives to simple linear interpolators (even fractal ones).

I know I've just barely mentioned the topic of cloud animation in this thread, but I would get into legal trouble with my company if I went into details. Just one thing: if you are creative in the use of your noise and exponent interpolators, you can actually create a whole weather simulator. Scenarios such as yours above wouldn't be a problem; you could even go much further... Just hypothetically speaking, of course... (My boss reads the forums. Hi Alex!)
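The weather-transition idea can be sketched without revealing anything proprietary. This is a hypothetical illustration (the `WeatherState` fields and values are invented): each weather state is a set of ramp parameters, and a transition is just an interpolation of those parameters over time before they feed the density function.

```c
/* Hypothetical weather state: parameters driving the exponential
   cloud ramp. The field names and values are illustrative only. */
typedef struct { float cover; float sharpness; } WeatherState;

static float lerpf(float a, float b, float t) { return a + t * (b - a); }

/* Blend two weather states; t in [0,1], 0 = 'from', 1 = 'to'.
   A fancier system could use non-linear or per-octave interpolators,
   as described above. */
WeatherState blend_weather(WeatherState from, WeatherState to, float t)
{
    WeatherState w;
    w.cover     = lerpf(from.cover,     to.cover,     t);
    w.sharpness = lerpf(from.sharpness, to.sharpness, t);
    return w;
}
```

Driving t with game time turns "sunny morning -> cloudy afternoon -> stormy dusk" into a simple sequence of keyframed states.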

quote:

How do you shoot rays through the 2D Plane. You mentioned voxels. So you break up the noise texture as a plane (when it is in 3D Space) into a bunch of Axis Aligned Bounding Boxes or something similar maybe?

Not really bounding boxes; more an axis-aligned voxel field. Kind of like an inverted heightmap terrain.

quote:

This step really stumps me. And you do this to simulate scattering, right?

Yep. The voxel field is used to approximate the (otherwise continuous) multiple scattering integral over a discrete data field.

quote:

Would this be an Isotropic or Anisotropic process?

Which part do you mean? If applied to the tracing itself, it is isotropic in direction (assuming that your grid spacing is equal in all directions).

quote:

Also, maybe off topic. Have you sen any of the research using particles to simulate clouds with billboard impostors (and VFC) as performance enhancements. It creates realistic 3D volumetric clouds and they look amazing, but for a game with a ground perspective, seems a bit useless. But is it really? Have you done any research into performance wayoff's? For something that looks better, do you think it is too expensive or would the perlin noise calculations compare?

As already stated, I use a similar technique (the Harris/Lastra paper) to calculate the shading. Using true 3D clouds from a ground viewpoint wouldn't be very useful: you'd waste resources and the visual results wouldn't be very different. Creating impostors on the fly also makes animation more difficult. 3D clouds are interesting if you're doing things like flight simulators, though.

/ Yann