jmaupay

Clouds Perlin & Scattering


OK, this is YATC: Yet Another Thread on Clouds! I know there are several on this subject, but... I can't find really precise explanations (waiting for Yann L's article!). If what I'm looking for is in a book, please let me know.
Quote:
fboivin: JF Dube has a good article in Game Programming Gems 5 about cloud rendering using the latest hardware (ps3.0)
How to implement realtime (60 Hz) changing clouds? Using:
- a sky dome
- a procedural texture (Perlin noise)
- some nice lighting/scattering/shading
- shaders and modern hardware

See Yann L's example. Other implementations (i.e. impostors...) are off topic.

References:
- Ken Perlin himself! Improved Noise reference implementation in Java (the C++ translation is obvious).
- Hugo Elias: Perlin noise implementation and Clouds, and the article on gamedev from Francis "AK 47" Huang.
- THE thread on clouds (Yann L).
- Other threads (okonomiyaki: "Cloud 2.5d raytracing ... need more eyes", HellRaiZer: "Cloud Shading", ...).

So, my questions are:
- could someone explain the whole process in detail (especially scattering)?
- and could you also give some explanation of what is done on the CPU and what on the GPU?

What I understand so far:

0. The process:
0.1 Compute Perlin noise in N octaves.
0.2 Compute the scattering of the sun using a 2.5D raytrace (Bresenham line drawing, for example) on a low-resolution version of the noise with fewer octaves.
0.3 In the fragment shader, compute the final sum of octaves and the exponentiation (using a lookup texture?), and compute the blending with the sky colour and the sun glow.

1. Perlin noise:
1.0 Noise generation. The aim is to generate N octaves of noise; they are basically 2D arrays. The basic implementation uses Ken Perlin's noise generator, which does the noise generation + interpolation + smoothing. Several other implementations use tricks to minimize the number of calculations.
1.1 Do I need seamless noise?
1.2 If yes, how do I generate seamless noise efficiently?
1.3 Does anyone know whether hardware vendors will implement the GLSL noise() functions?

Trick 1 (cf. Yann L's thread): precompute sums of octaves in an RGB texture:
R0 = (1*O1 + 1/2*O2 + 1/4*O3)
G0 = (1*O4 + 1/2*O5 + 1/4*O6)
B0 = (1*O7 + 1/2*O8)
Then the final sum is: final = R0 + 1/8*G0 + 1/64*B0

Trick 2: remember that the low-frequency octaves give the general shape and the high-frequency octaves give the small variations. One simple trick is to compute only the low-frequency octaves when needed, always reusing the same high-frequency octaves.

Trick 3 (cf. coelurus): the lookup texture for the exponentiation could be:
lookup_texture[u, v] = 255 * (1 - pow(a, u + v))

2. Sky dome
2.1 Polygons. Can I use a dome generated by program? For example: http://www.spheregames.com/files/SkyDomesPDF.zip
2.2 Tessellation. I understand that the dome needs to be more subdivided at the horizon than at the zenith; how do I do that by program?

3. Scattering
3.1 I understand that I have to treat the noise value as the thickness of the cloud (cf. Yann L's thread).
3.2 How do I get a voxel from the noise map? Do I need a render-to-texture phase?
Trick 1: the sun is a point light. See Yann L's thread.
Trick 2: Mark Harris does the scattering with a two-phase rendering; see his paper, Algorithm 1, p. 118. Place the observer at the sun position, render the clouds and read back the opacity (from the colour buffer or the z-buffer?), then render the scene using that opacity for the scattering. Could we do that with our 2D texture?

4. Colour of the sky
The reference document seems to be Dobashi: http://nis-lab.is.s.u-tokyo.ac.jp/~nis/cdrom/BasisSky.ps. How to implement it?

J.

[Edited by - jmaupay on September 8, 2005 4:09:53 AM]

May I ask, what is seamless noise? Perlin's noise interpolation seems reasonably seamless to me; you will notice repeating patterns if you view a texture over a large enough domain, but nothing I'd consider seams?

Tim

It would be nice if you managed this thread and put answers to your questions in your first post. A sort of reference for cloud... stuff.

0.1-0.3) Using the original Perlin noise-approach, you can add up octaves immediately per texel. Loop through the octaves you want, scale and offset coords for the noise function, scale result and sum up. Scaling buffers with noise isn't very good and you can get much nicer and smoother results using real Perlin noise. You shouldn't really use 2048x2048 textures either, you can tile high freq noise textures and use 64x64 textures for 3 octaves per texture.
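In code, the per-texel loop might look like this sketch (the `noise()` here is a dummy placeholder, not real Perlin noise; plug in a proper noise function):

```cpp
#include <cassert>
#include <cmath>

// Placeholder 2D noise in [-1, 1]; substitute a real Perlin noise function.
double noise(double x, double y) {
    return std::sin(x * 12.9898 + y * 78.233); // NOT real Perlin noise
}

// Sum the octaves immediately per texel: double the frequency and halve
// the amplitude each octave, then normalize back into [-1, 1].
double fbm(double x, double y, int octaves) {
    double total = 0.0, amplitude = 1.0, frequency = 1.0, norm = 0.0;
    for (int i = 0; i < octaves; ++i) {
        total += noise(x * frequency, y * frequency) * amplitude;
        norm += amplitude;
        frequency *= 2.0;
        amplitude *= 0.5;
    }
    return total / norm;
}
```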

1.1) If you want to get rid of 2048x2048 textures, yes.
1.2) Try the original approach, it's quick enough and with a proper RNG (search for Mersenne Twister on google), it looks great.

2.1) A skydome is just a collection of tris spanning a capped sphere, get it in any way you want.
2.2) Doesn't matter if it's efficient or not, the skydome should not be regenerated or reloaded every frame anyway.

3.2) As you mentioned in 3.1, the cloud array is a collection of noise values that specify cloud thickness. A cloud voxel exists at position (x, y, z) in the cloud array if "0 <= y < cloud[x + z * width]" (y being up). You can trick this out a bit and try:
c = cloud[x + z * width]
-c/4 <= y < c
to get some "substance" under the clouds during scattering, it really makes a difference.
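As a sketch, the membership test is just this (`cloud[]` being the flat 2D array of thickness values):

```cpp
#include <cassert>
#include <vector>

// A cloud voxel at (x, y, z) exists if -c/4 <= y < c, where
// c = cloud[x + z * width] (y being up). The -c/4 lower bound adds
// some "substance" under the cloud base during scattering.
bool voxelInCloud(const std::vector<float>& cloud, int width,
                  int x, int y, int z) {
    float c = cloud[x + z * width];
    return -c / 4.0f <= (float)y && (float)y < c;
}
```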

That's what I got to say atm [smile]

Quote:
Original post by timw
May I ask, what is seamless noise? Perlin's noise interpolation seems reasonably seamless to me; you will notice repeating patterns if you view a texture over a large enough domain, but nothing I'd consider seams?
Tim


Seamless for me means it can repeat without showing the tile, and when I generate noise it is not repeatable at all. The first octave gives the general pattern, so something has to be done to the octaves to make them repeatable. So my question is: how do I get repeatable noise? For example, Matt Zucker's
The Perlin noise math FAQ
explains how to do that, but I feel it is not an efficient way of doing it?

If you take the 3DLabs 3D-noise texture generation from the GLSL demos for the "Orange Book", it's tileable, but they do it another way (which I don't understand). So if you can explain the "how to"...
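From what I understand of Zucker's FAQ, the trick is to wrap the integer lattice coordinates modulo the period before looking them up, so the value at x = period equals the value at x = 0. A minimal sketch with value noise and a made-up hash (not real Perlin gradient noise, but the wrapping idea is the same):

```cpp
#include <cassert>
#include <cmath>

// Made-up integer hash; any decent hash works here.
static unsigned hash2(int x, int y) {
    unsigned h = (unsigned)x * 374761393u + (unsigned)y * 668265263u;
    h = (h ^ (h >> 13)) * 1274126177u;
    return h ^ (h >> 16);
}

// Value noise that repeats every `period` lattice cells in x and y,
// because the lattice coordinates are wrapped before hashing.
double tileableNoise(double x, double y, int period) {
    int xi = (int)std::floor(x), yi = (int)std::floor(y);
    double fx = x - xi, fy = y - yi;
    auto lattice = [&](int ix, int iy) {
        ix = ((ix % period) + period) % period;   // wrap -> seamless tiling
        iy = ((iy % period) + period) % period;
        return (hash2(ix, iy) & 0xffff) / 65535.0;
    };
    auto lerp = [](double a, double b, double t) { return a + (b - a) * t; };
    double top = lerp(lattice(xi, yi),     lattice(xi + 1, yi),     fx);
    double bot = lerp(lattice(xi, yi + 1), lattice(xi + 1, yi + 1), fx);
    return lerp(top, bot, fy);
}
```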


Quote:
Original post by coelurus
It would be nice if you managed this thread and put answers to your questions in your first post. A sort of reference for cloud... stuff.
I'll try to do that
Quote:

0.1-0.3) Using the original Perlin noise-approach, you can add up octaves immediately per texel. Loop through the octaves you want, scale and offset coords for the noise function, scale result and sum up. Scaling buffers with noise isn't very good and you can get much nicer and smoother results using real Perlin noise. You shouldn't really use 2048x2048 textures either, you can tile high freq noise textures and use 64x64 textures for 3 octaves per texture.


OK. No problem with that. By the way, the "original Perlin noise" is from Ken Perlin himself?! Improved Noise reference implementation in Java
(the C++ translation is obvious).

Why don't we use that one? Is there a copyright problem? Or non-random results due to the pre-computed permutation table? A limitation in dimension (256)? Or poor interpolation?

Then generate octaves and sum them:
(pseudo code from Hugo Elias:)

function PerlinNoise_2D(float x, float y)

total = 0
p = persistence
n = Number_Of_Octaves - 1

loop i from 0 to n

frequency = 2^i
amplitude = p^i

total = total + InterpolatedNoise_i(x * frequency, y * frequency) * amplitude

end of i loop

return total

end function



But I thought Yann L's approach is not this one; he does the sum on the GPU. I feel he could provide smaller textures and then let the GPU do the magnification interpolation?

Quote:

2.1) A skydome is just a collection of tris spanning a capped sphere, get it in any way you want.
2.2) Doesn't matter if it's efficient or not, the skydome should not be regenerated or reloaded every frame anyway.

OK, my point 2) is stupid. I'll remove it, perhaps providing some links on how to generate a sky dome, for example here:
Sky dome tutorial from Spheregames

Any other good url for that?

Quote:

3.2) As you mentioned in 3.1, the cloud array is a collection of noise values that specify cloud thickness. A cloud voxel exists at position (x, y, z) in the cloud array if "0 <= y < cloud[x + z * width]" (y being up). You can trick this out a bit and try:
c = cloud[x + z * width]
-c/4 <= y < c
to get some "substance" under the clouds during scattering, it really makes a difference.


Sorry, I don't understand. What is cloud[]? Is it my noise texture? For the moment I have a texture with several octaves (the sum and exponentiation are done in the fragment shader). That texture is mapped onto the dome (the dome is quite large, which is why I wanted the texture to be 2048x2048 and not repeated) using spherical texture coordinates (glBindTexture/glTexCoord). Where do I have a voxel? Probably in the fragment program, but I feel that is not what you are talking about?

First of all, here's a shot of a demo of mine that I took 1.5 years ago or so, just to make sure you know what I know [smile]
Dark clouds [Apoch: link redacted 2008-11-29 - no longer links to pleasant things]

I'll rearrange my answers here a bit:

The Perlin-noise link you found is _the_ one, afaik you should be able to simply copy and paste that code (and make it 2D for your clouds). It's way better than what Hugo Elias did. Make sure you get a proper RNG (Random Number Generator), rand() is usually not good enough.

Here's how I stored and added noise octaves (iirc):
I had 4 octaves for R and 4 for G, making 8 octaves: 2 128x128 textures with 4 octaves each, using octave coefficients around 1/2, 1/4, 1/8 and 1/16. The textures are identical up to this point; the only thing that differs is the noise values, the texture sizes and octave setups are the same. Then I rendered the first texture onto the skydome into the R channel without any scaling, and the next into G scaled by 2^-4, and repeated. I did another pass with a 2D exponentiation-and-sum texture ( lookup_texture[u, v] = 255 * (1 - exp(a, u + v)) or whatnot ). I tried the 3D texture approach, but it slowed down considerably on the Ti4400 I had at the time.
There's no need to sum everything on the GPU, otherwise you'd have to send a whole bunch of ultra-large textures to the GPU, which is not practical. Repeating the high-frequency cloud noise isn't bad either, it's practically impossible to see any repeating patterns. Low-freq octaves should not be scaled at all, but that doesn't mean the texture has to be large.
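To be concrete, generating that exponentiation lookup texture might look like the following sketch. I'm reading exp(a, u + v) as a^(u + v), and the value of 'a' here is just a guess to tune by eye:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Build the 2D exponentiation-and-sum lookup texture:
// entry (u, v) holds 255 * (1 - a^(u + v)), with u and v mapped to [0, 1].
std::vector<unsigned char> makeLookup(int size, double a) {
    std::vector<unsigned char> tex(size * size);
    for (int v = 0; v < size; ++v)
        for (int u = 0; u < size; ++u) {
            double s = (double)u / (size - 1) + (double)v / (size - 1);
            double val = 255.0 * (1.0 - std::pow(a, s));
            if (val < 0.0) val = 0.0;         // clamp just in case
            tex[u + v * size] = (unsigned char)val;
        }
    return tex;
}
```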

I never looked for refs on generating skydomes, it's simply the surface that's left when you cut a spherical shell with a cone. Play with restricted spherical coordinates and you should get a dome rather quickly. You don't have to worry about tessellation, flat domes with uniform distribution of vertices will compress geometry in image-space at the dome edges since they get far away from the camera viewpoint.

What kind of raycasting are you gonna do, preprocess on the CPU or by using the newest fragment program model in realtime? I have no experience at all from "ps2.0" and up so I can't comment at all on realtime solutions. The CPU preprocess is just a straightforward scan through the lowest freq octaves of your clouds. Loop through each entry in your generated cloud texture (you do generate it in some memory I hope?) and cast rays from cloud particles to light source and camera views. There are no distinct voxels in the clouds, only pillars of voxels so you have to check for lightrays that cut cloud pillars.

It's pretty tricky to explain the entire process in a post...

[Edited by - ApochPiQ on November 28, 2008 11:21:20 PM]

Quote:
Original post by coelurus
Dark clouds [Apoch: link redacted]

Very nice shot !

Quote:

I'll rearrange my answers here a bit:

The Perlin-noise link you found is _the_ one, afaik you should be able to simply copy and paste that code (and make it 2D for your clouds).


Well, I use Z as time and I would like the clouds to change over time; just taking x,y values at a fixed z (time) is straightforward. New textures will be computed over several frames, then swapped in when completed (= time slicing). z changes only when a new cloud texture is completed, and it changes very slowly. So it's necessary to keep the 3D noise function.

Quote:

Make sure you get a proper RNG (Random Number Generator), rand() is usually not good enough.


Where do I need to use an RNG? Ken Perlin uses a permutation table; I don't see any rand. If I generated white noise and then did the smoothing and interpolation myself, an RNG would be needed to generate the white noise. But with Ken Perlin's function, where would I use rand()?

Quote:

What kind of raycasting are you gonna do, preprocess on the CPU or by using the newest fragment program model in realtime? I have no experience at all from "ps2.0" and up so I can't comment at all on realtime solutions. The CPU preprocess is just a straightforward scan through the lowest freq octaves of your clouds. Loop through each entry in your generated cloud texture (you do generate it in some memory I hope?) and cast rays from cloud particles to light source and camera views. There are no distinct voxels in the clouds, only pillars of voxels so you have to check for lightrays that cut cloud pillars.

It's pretty tricky to explain the entire process in a post...


I don't know if I'll use preprocessed or GPU raycasting. For the moment I have to understand how to do it on the CPU (preprocess); I'll see afterwards whether shaders can help in some way. (Yes, I have a texture in memory.)

What you mean is (your texture is 128x128):
for (int i=0;i<128;i++)
for (int j=0;j<128;j++)
RayCast(LowFreqNoise[i][j],SunPosition,CameraPosition);

I have a transformation problem. The texture will be mapped onto a dome, so how do I transform the i,j coordinates to x,y,z coordinates on the dome? I mean, the sun and camera are in world coordinates; how do I transform i,j to that coordinate system? (I assume that the texture is not repeated and is entirely mapped onto the dome, i.e. the u,v coordinates vary from 0,0 to 1,1.)

[Edited by - ApochPiQ on November 28, 2008 11:47:02 PM]

I wonder where I got the idea of using a custom RNG from; as you said, it's not needed unless you plan on generating your own permutation tables.

The dome should be very flat and always right above the player. You can approximate the skydome using a virtual plane and maybe offset the Y coord slightly by the distance from the zenith to adjust for the approximation. It depends a little on how you generate or get your model, try using only a virtual plane first and extend if the shading looks funny.

You should also think about interpolating cloud textures on the GPU, even small changes can look jerky.
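For illustration, the cross-fade is just a per-texel lerp (shown on the CPU here; on the GPU it would be one mix(old, new, t) per texel using two texture units):

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Cross-fade between the old and new cloud noise textures.
// `t` ramps from 0 to 1 over the time-slice interval.
std::vector<float> blendCloudTextures(const std::vector<float>& oldTex,
                                      const std::vector<float>& newTex,
                                      float t) {
    std::vector<float> out(oldTex.size());
    for (std::size_t i = 0; i < oldTex.size(); ++i)
        out[i] = oldTex[i] * (1.0f - t) + newTex[i] * t;
    return out;
}
```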

Quote:
Well, I use Z as time and I would like the clouds to change over time; just taking x,y values at a fixed z (time) is straightforward. New textures will be computed over several frames, then swapped in when completed (= time slicing). z changes only when a new cloud texture is completed, and it changes very slowly. So it's necessary to keep the 3D noise function.


I implemented a time-slicing method as well, but didn't use the 3rd dimension explicitly like you are. I just kept a second cloud array and populated it with values over several frames. Memory-wise this costs about the same, but it would allow you to remove the 3rd dimension from your cloud array. However, the problem I found with the time-slicing method is that there weren't enough updates being performed, so each time the new cloud texture was combined with the old one there was a jerk. I'm in the process of porting what I've done into a shader though, so that should help. Trying to recalculate everything on the CPU just wasn't fast enough.

Guest Anonymous Poster
Well, I don't have a 3D noise array; I just have a 2D noise array populated using the 3D function. To make this clearer (my English is so bad):

NoiseArray[i][j] = PerlinNoise(x,y,z)

where:
- x,y are calculated from i,j (offset, scale, frequency) and
- z depends on time.
Then the changes from one array of noise to the next should be smooth (they are coherent noise in all 3 dimensions).
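In code it is something like this (PerlinNoise replaced by a dummy 3D stand-in here; any real 3D noise function plugs in):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Placeholder 3D noise in [-1, 1]; substitute a real PerlinNoise(x, y, z).
double noise3(double x, double y, double z) {
    return std::sin(x * 1.7 + y * 2.3 + z * 3.1); // NOT real Perlin noise
}

// Populate a 2D array from the 3D noise function, using z as time:
// NoiseArray[i][j] = PerlinNoise(x, y, z).
void fillSlice(std::vector<double>& out, int size,
               double scale, double time) {
    out.resize(size * size);
    for (int j = 0; j < size; ++j)
        for (int i = 0; i < size; ++i)
            out[i + j * size] = noise3(i * scale, j * scale, time);
}
```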

As coelurus said, you have to interpolate your 2 sets of cloud values. This could be done on the GPU using 2 texture units, for example, and it's probably better to do it on the original noise, not on the exponentiated values.

I'm currently trying to implement scattering, but I have strange pixelized lines crossing the clouds. Does anyone know if this is a normal artifact of Bresenham raycasting? (The white square is supposed to be the sun!)

Visible blocking in shading can depend on too low resolution, but the artifacts in your image look pretty bad. Would you care to give some pseudo code for your raycaster?

I'm not sure, but I would guess you have a height problem in your rays: the sun seems to be at the same height as the clouds. Those white pixels could be caused by errors in the raytracing; the colour for that pixel isn't computed, or is computed erroneously.

Quote:
Original post by coelurus
Visible blocking in shading can depend on too low resolution, but the artifacts in your image look pretty bad. Would you care to give some pseudo code for your raycaster?


My "scattering" is a very simple and very intuitive 2D calculation of the thickness of cloud that the ray traverses.

Here a drawing of what I do:


Pseudo code is:
foreach texel
{
voxel0 = current texel x,y , z = noise value
RayTrace(voxel0, sun)
}

RayTrace(voxel0, sun)
{
calculate the equation of the red line (mA and mB)
mScatteringValue = 0
BresenhamLine(from voxel0, to sun)
finaltexel0 = mScatteringValue
}

BresenhamLine(voxel0,sun)
{
foreach texel in the line from voxel0 to sun
CalculateScattering(x,y)
}

CalculateScattering(x,y)
{
calculate distance in X,Y from voxel0 to that texel
calculate the rayZ (using the equation of the red line)
if ( rayZ is inside the cloud)
mScatteringValue += thickness of that texel
}

And here is the C++ code:

class ScatteringCalculator
{
public:
void SetSun(int aSunX,int aSunY,int aSunZ) ;
void Apply(CorNoiMap2D<float>& vNoise,CorNoiMap2D<float>& vNoiseDest) ;
protected:
virtual bool CalculateScattering(int x,int y) ;
void RayTrace(int aVoxelX,int aVoxelY) ;
void BresenhamLineRunning(int x1,int y1,int x2,int y2) ;

float Angle(int x0,int y0,float z0,int x1,int y1, float z1) ;

// Position of the sun in the sky coordinate (0,0,0 is the top/left corner of the sky, with no cloud)
int mSunX ;
int mSunY ;
int mSunZ ;

// Start of current ray
int mVoxel0X ;
int mVoxel0Y ;

// Equation of the current Ray: y = a x + b
double mA ;
double mB ;

double mScatteringValue ;
CorNoiMap2D<float>* mNoise ;
CorNoiMap2D<float>* mNoiseDest ;

} ;

void ScatteringCalculator::SetSun(int aSunX,int aSunY,int aSunZ)
{
mSunX = aSunX ;
mSunY = aSunY ;
mSunZ = aSunZ ;
}


void ScatteringCalculator::Apply(CorNoiMap2D<float>& aNoise,CorNoiMap2D<float>& aNoiseDest)
{
mNoise = &aNoise ;
mNoiseDest = &aNoiseDest ;
int i ;
int j ;
for (i=0;i<aNoise.SizeX();i++)
{
for (j=0;j<aNoise.SizeY();j++)
{
RayTrace(i,j) ;
}
}
}

void ScatteringCalculator::RayTrace(int aVoxelX,int aVoxelY)
{
// No cloud = no scattering needed
if ( mNoise->Elt(aVoxelX,aVoxelY) < 0.01 )
{
mNoiseDest->Elt(aVoxelX,aVoxelY) = 0. ;
return ;
}

mScatteringValue = 0. ;
mVoxel0X = aVoxelX ;
mVoxel0Y = aVoxelY ;

double vDistanceSunVoxel = sqrt( (aVoxelX-mSunX) * (aVoxelX-mSunX) + (aVoxelY-mSunY) *(aVoxelY-mSunY) ) ;
double vVz0 = - mNoise->Elt(aVoxelX,aVoxelY) ;
// Compute the line parameters
if (vDistanceSunVoxel == 0)
{
mA = 0 ;
}
else
{
mA = ((float)(mSunZ-vVz0)) / vDistanceSunVoxel ; // Slope
}
mB = vVz0 ; // Starting height


BresenhamLineRunning(aVoxelX,aVoxelY,mSunX,mSunY) ;

float vResult = 1 - (mScatteringValue*0.1) ;
if (vResult < 0)
vResult = 0 ;
if (vResult >1.)
vResult = 1. ;
mNoiseDest->Elt(aVoxelX,aVoxelY) = vResult ;
}


bool ScatteringCalculator::CalculateScattering(int x,int y)
{
double vVz = mNoise->Elt(x,y) ;

double vDistVoxel0Voxel = sqrt ( (mVoxel0X-x) * (mVoxel0X-x) + (mVoxel0Y-y) *(mVoxel0Y-y) ) ;

double vRayZ = mA * vDistVoxel0Voxel + mB ;

if ( -vVz <= vRayZ && vRayZ <= vVz )
{
mScatteringValue += vVz - vRayZ ;
return true ;
}
return false ;
}

// cf. http://www.gamedev.net/reference/articles/article1275.asp
// Calls CalculateScattering for each pixel instead of drawing it.
// If CalculateScattering returns true the continue, else exit
void ScatteringCalculator::BresenhamLineRunning(int x1,int y1,int x2,int y2)
...



"ScatteringCalculator::CalculateScattering" looks a little funny:


mScatteringValue += vVz - vRayZ;


It looks like you're taking the height of the cloud column above each ray sample and adding that to the scattering "value". When you sample light rays in voxel volumes, each voxel hit by the light attenuates the incoming light on its own.

Just some info (maybe superfluous, but better too much than too little [smile]):

The idea behind light scattering in clouds is that a percentage of the light hitting a cloud particle scatters away from the light path and the rest continues traveling through the cloud (more stuff happens, but that's irrelevant). For each voxel that's in the way of a light ray (imagine a light ray thinking: "if there is a voxel right where I am now, I lose some of my light"), the amount of light that passes through the clouds decreases:


cloud_hits = 0
for each voxel in "path of light ray" do
if ray cuts voxel then
cloud_hits++
light = light0 * pow(a, cloud_hits)


where 'a' describes how much light passes through a cloud voxel and 'light0' is the initial light intensity entering the clouds. Experimentation is the only way to get decent values there, approximate roughly first and then play around.
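In C++ that pseudo code becomes something like this (the `path` array here is illustrative: 1 where the ray cuts a cloud voxel, 0 where it doesn't):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Walk the voxels along a light ray, count the hits, and attenuate the
// light exponentially: l = light0 * a^hits, where 'a' is the per-voxel
// transmission and 'light0' the intensity entering the clouds.
double lightThroughClouds(const std::vector<int>& path,
                          double light0, double a) {
    int cloudHits = 0;
    for (int v : path)
        if (v) ++cloudHits;
    return light0 * std::pow(a, cloudHits);
}
```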

Quote:
Original post by coelurus
"ScatteringCalculator::CalculateScattering" looks a little funny:


mScatteringValue += vVz - vRayZ;


Well, yes. I did that because if I take

mScatteringValue += 1 ;

The final image looks like this:

(For the moment there is no lookup texture; the texture is only greyscale + alpha, don't worry.)
And taking the "thickness" of the cloud smooths the result.

Thank you for your explanations, they are not superfluous.

But my problem probably also comes from how I use that "scattering texture". How do you use it in your illumination calculations? For the moment I'm just using it as a multiplier of the noise value.

i.e.
for each pixel:
final color = noise * scattering

How do you use the accumulated number of voxel-hits for a lightray? Just multiplying by a factor 0.1 like in the code you posted before won't give proper shading:

If you single out a cloud voxel, light that hits the voxel will separate into some light moving on and the rest "disappearing" by absorption, scattering etc. The light that passes through a cloud voxel (which is a large group of cloud particles, so we can think in average) is not dependent on some absolute reference value, but on the incoming light.

lo = l0 * a

'a' is a factor telling us how much light passes through a cloud voxel, 'l0' is the incoming light intensity to a voxel and 'lo' the output light intensity. Hitting two voxels means:

lo1 = l0 * a
lo2 = lo1 * a = l0 * a * a

=> The final formula: l = l0 * a^n, n = number of voxel hits

Multiplication by some constant would give something like this:

l = l0 * (a + ... + a)

which means that the light that hits a voxel is independent of the path taken before hitting the voxel, which is obviously not correct.


As for how to combine cloud coverage (alpha) and shading (RGB): I basically just put them into an RGBA quadruplet and blend that onto the framebuffer. Using accuracy tricks such as splitting the noise into 2 or more color channels can be a little intricate, but the basic idea is simply to let the noise values work as alpha when blending the shading map.


I'm starting to feel the urge to write myself a new, improved cloud demo...

Quote:
Original post by coelurus
l = l0 * a^n


Is "l0" the same for each voxel? (= a constant value over the sky.) Or does it depend on the distance from the sun? The sun/voxel angle? ... Something else?

Quote:
Original post by coelurus
I'm starting to feel the urge to write myself a new, improved cloud demo...


Yessss ! great !

Quote:
Original post by jmaupay
Is "l0" the same for each voxel ? (= a constant value over the sky). Or does it depend on the distance from sun ? the angle sun/voxel ? ... Something else ?


Generally, a constant value should suffice, but Harris mentions a dot product between the voxel-sun and voxel-eye rays in his paper. This means that light rays with rather straight paths shine through more than those that have to reflect out of the clouds from far away.
The distance to the sun should not matter at all, because all rays hit the clouds from above. The point-light-sun trick for shading doesn't change this; you can think of the trick backwards: we transform the clouds until the sun rays become parallel and do our shading in that space. Distance still doesn't matter.

Quote:
Original post by jmaupay
Quote:
Original post by coelurus
I'm starting to feel the urge to write myself a new, improved cloud demo...

Yessss ! great !


Thanks for inspiring me, but I got an engine to write and studies to keep in mind (hmm, flip those two), and I'm not in CS [looksaround]

Well, I always get such strange shading, and analyzing the code I find that the result is coherent, but not what I expect...



I have a 128x128 grid. I populate it with some noise (8 octaves added). I perform the scattering as described previously:
for each texel:
- calculate nb_hits of the ray in the noise grid
- that texel value = L0 * pow(a,nb_hits)

In the good case (the sun very close to the clouds), a ray hits a maximum of 60 texels (nb_hits max = 60). If the sun is not close to the clouds, a ray hits, say, 10 texels.

So the final texture is a 128x128 grid with values from 0 (no hit) to 10 (10 hits); used as a greyscale it has only 10 distinct values (the pow() doesn't change that).

What did I miss ?

[Edited by - jmaupay on September 12, 2005 9:02:52 AM]

I had another look at your code and it looks as if you step in the XY-plane and let the Z hang tight to where (X, Y) go. If the sun is right above a really thick cloud, Z will jump in large steps and very few voxels will be tested when maybe 200 voxels should be considered. Try a full 3D raycaster that will step in small steps in any direction.

The following is almost a straight copy from my code for computing the shading values:

intens = exp(hits * -0.02);
dot = lx*vx + ly*vy + lz*vz;
intens *= 0.9 * (1 + dot*dot);
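A minimal fixed-step 3D marcher could look like this sketch (the cloud layout, bounds handling and step size are illustrative, not from my demo):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// March in small uniform steps from a cloud cell (x0, y0, z0) toward the
// sun at (sx, sy, sz), counting voxel hits. Stepping in full 3D avoids
// the large Z jumps of a pure XY-plane walk. `cloud` holds the column
// thickness per (x, y) cell; a sample is a hit while 0 <= z < thickness.
int countHits(const std::vector<float>& cloud, int width, int height,
              float x0, float y0, float z0,
              float sx, float sy, float sz, float step = 0.25f) {
    float dx = sx - x0, dy = sy - y0, dz = sz - z0;
    float len = std::sqrt(dx * dx + dy * dy + dz * dz);
    dx /= len; dy /= len; dz /= len;
    int hits = 0;
    for (float t = 0.0f; t < len; t += step) {
        int ix = (int)(x0 + dx * t), iy = (int)(y0 + dy * t);
        if (ix < 0 || ix >= width || iy < 0 || iy >= height) break;
        float c = cloud[ix + iy * width];   // column thickness at (ix, iy)
        float z = z0 + dz * t;              // current ray height
        if (0.0f <= z && z < c) ++hits;     // inside the cloud pillar
    }
    return hits;
}
```

The hit count then feeds straight into the `intens = exp(hits * -0.02)` shading above.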

