# Atmospheric Scattering from Space (O'Neil)


## Recommended Posts

Sorry for the repost; I already posted this in the DX forum but unfortunately wasn't able to solve the issue there, so I'm posting here hoping to get to the bottom of it. In short, my HLSL implementation of O'Neil's atmospheric scattering algorithm isn't working. As the following picture shows, it only seems to work around the edges (notice the faint blue outline); the central pixels are all white. Using PIX I discovered that the problem lies in the "startAngle" computation: for central pixels its values come out around -0.8 to -0.9, close to -1, so when they are passed to the "expScale" function it returns astronomical values. I've checked and double-checked my implementation against dozens of implementations available here and on the net, and nothing seems to be very different. Here is the vertex shader source:
```hlsl
float cameraHeight;        // The camera's current height
float ESun;
float kr = 0.0025f;
float km = 0.0015f;
float krESun;              // Kr * ESun
float kmESun;              // Km * ESun
float kr4PI;               // Kr * 4 * PI
float km4PI;               // Km * 4 * PI
float scaleDepth = 0.25f;  // The scale depth (i.e. the altitude at which the atmosphere's average density is found)
float scaleOverScaleDepth; // fScale / fScaleDepth
float g = -0.95f;
int samples = 4;

// These were missing from the posted listing but are referenced below
float innerRadius;         // The inner (planetary) radius
float outerRadius;         // The outer (atmosphere) radius
float outerRadius2;        // fOuterRadius^2
float scale;               // 1 / (fOuterRadius - fInnerRadius)

float4 waveLength;
float4 lightDirection;
float4 cameraPosition;
float4 invWavelength;

float cameraHeight2;

float4x4 WorldViewProj;

struct vpconn
{
    float4 Position : POSITION;
    float3 t0 : TEXCOORD0;
    float3 c0 : COLOR;  // The Rayleigh color
    float3 c1 : COLOR1; // The Mie color
};

void SetupValues()
{
    float Pi = 3.14159265359;
    lightDirection = normalize(lightDirection);
    scaleOverScaleDepth = scale / scaleDepth;

    float4 waveLength4 = pow(waveLength, 4);

    cameraHeight = length(cameraPosition);
    cameraHeight2 = pow(cameraHeight, 2);

    krESun = kr * ESun;
    kmESun = km * ESun;
    kr4PI = kr * 4.0f * Pi;
    km4PI = km * 4.0f * Pi;

    invWavelength = 1 / waveLength4;
}

// The scale equation calculated by Vernier's Graphical Analysis
float expScale(float fCos)
{
    float x = 1.0 - fCos;
    return scaleDepth * exp(-0.00287 + x*(0.459 + x*(3.83 + x*(-6.80 + x*5.25))));
}

// Calculates the Mie phase function
float getMiePhase(float fCos, float fCos2, float g, float g2)
{
    return 1.5 * ((1.0 - g2) / (2.0 + g2)) * (1.0 + fCos2) / pow(1.0 + g2 - 2.0*g*fCos, 1.5);
}

// Calculates the Rayleigh phase function
float getRayleighPhase(float fCos2)
{
    return 0.75 * (1.0 + fCos2);
}

// Returns the near intersection point of a line and a sphere
float getNearIntersection(float3 v3Pos, float3 v3Ray, float fDistance2, float fRadius2)
{
    float B = 2.0 * dot(v3Pos, v3Ray);
    float C = fDistance2 - fRadius2;
    float fDet = max(0.0, B*B - 4.0 * C);
    return 0.5 * (-B - sqrt(fDet));
}

// Returns the far intersection point of a line and a sphere
float getFarIntersection(float3 v3Pos, float3 v3Ray, float fDistance2, float fRadius2)
{
    float B = 2.0 * dot(v3Pos, v3Ray);
    float C = fDistance2 - fRadius2;
    float fDet = max(0.0, B*B - 4.0 * C);
    return 0.5 * (-B + sqrt(fDet));
}

vpconn AtmosphereFromSpaceVS(float4 vPos : POSITION)
{
    SetupValues();

    float3 ray = vPos.xyz - cameraPosition.xyz;
    float far = length(ray);
    ray /= far;

    float near = getNearIntersection(cameraPosition.xyz, ray, cameraHeight2, outerRadius2);

    float3 start = cameraPosition.xyz + ray * near;
    far -= near;

    float startAngle = dot(ray, start) / outerRadius;
    float startDepth = exp(scaleOverScaleDepth * (innerRadius - cameraHeight));
    float startOffset = startDepth * expScale(startAngle);

    float sampleLength = far / samples;
    float scaledLength = sampleLength * scale;
    float3 sampleRay = ray * sampleLength;
    float3 samplePoint = start + sampleRay * 0.5f;

    float3 frontColor = float3(0, 0, 0);

    for (int i = 0; i < samples; i++)
    {
        float height = length(samplePoint);
        float depth = exp(scaleOverScaleDepth * (innerRadius - height));
        float lightAngle = dot(lightDirection.xyz, samplePoint) / height;
        float cameraAngle = dot(ray, samplePoint) / height;
        float scatter = startOffset + depth * (expScale(lightAngle) - expScale(cameraAngle));

        float3 attenuate = exp(-scatter * (invWavelength.xyz * kr4PI + km4PI));

        frontColor += attenuate * (depth * scaledLength);

        samplePoint += sampleRay;
    }

    vpconn OUT;
    OUT.t0 = cameraPosition.xyz - vPos.xyz;
    OUT.Position = mul(vPos, WorldViewProj);
    OUT.c0.xyz = frontColor * (invWavelength.xyz * krESun);
    OUT.c1.xyz = frontColor * kmESun;

    return OUT;
}
```
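As an aside for anyone reading along: the two intersection helpers above are just the quadratic formula with a = 1 (the ray is normalized before the call). A small Python sketch of the same math, using made-up camera and radius values in O'Neil's demo scale, not numbers from the thread:

```python
import math

def near_far_intersection(pos, ray, radius):
    """Near/far hits of a normalized ray against a sphere at the origin.
    Mirrors getNearIntersection/getFarIntersection: with a normalized
    ray, the quadratic a*t^2 + b*t + c = 0 has a = 1."""
    b = 2.0 * sum(p * r for p, r in zip(pos, ray))
    c = sum(p * p for p in pos) - radius * radius  # fDistance2 - fRadius2
    det = max(0.0, b * b - 4.0 * c)                # clamp like the shader
    root = math.sqrt(det)
    return 0.5 * (-b - root), 0.5 * (-b + root)

# Camera at distance 25 on the x axis, looking straight at an
# atmosphere sphere of radius 10.25:
near, far = near_far_intersection((25.0, 0.0, 0.0), (-1.0, 0.0, 0.0), 10.25)
# near = 25 - 10.25 = 14.75, far = 25 + 10.25 = 35.25
```

Note the `max(0.0, ...)` clamp silently turns a miss into a grazing hit, which is fine here because the atmosphere mesh guarantees every rasterized vertex ray actually hits the sphere.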


Other info can be found in my other thread here. If anyone would like to download my source code and have a go at it, here's the link. Thanks in advance.

##### Share on other sites
Are you rendering only one sphere?

So, if there is no surface mesh, your output might actually be somewhat OK, since the surface would hide (some of) the strange white areas.

EDIT:
"The problem is in the front vertices.. everything is white."
The atmosphere should be rendered so that you see the inside of the sphere rather than the outside, no?

##### Share on other sites
Yes only one sphere.
It might cover those areas, but I don't think it's working as it should, because, as I said, the expScale function returns very big values (on the order of billions) that get capped at 1.0 (resulting in white). And I'd expect it to fade to black gradually, without leaving white squares all over.

The sphere I'm rendering is not "hollow". If I were to render such a sphere, wouldn't it appear invisible from my point of view (which is external to the sphere), unless it had faces on my side as well?

A correct implementation should (I think) color the center pixels with a faint blue shade, so that the final color (added to the color of the planet inside the atmosphere) has a slight blue tint.

##### Share on other sites
If I remember correctly, O'Neil does render the back faces of the atmosphere so that the inside of the sphere is showing (I missed this at first as well). Depending on your methods, this saves you a sphere intersection check as well since the vertices rendered consist of one side of the intersection. Though if you calculate all the intersections and distances correctly I believe it shouldn't matter.

As well you need to discard the points that would have otherwise intersected the planet, either by calculating the intersection, and adjusting your distance, or rendering a sphere the size of your inner radius to cover them.

As the cosine of your start angle approaches -1, x = 1 - fCos in expScale approaches 2, which drives the exponent inside the exp() call toward roughly 45.8 (the inner bracketed polynomial alone reaches about 22.919), and exp() of that is extremely large, as you have seen.
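Putting numbers on that with a direct Python transcription of the shader's expScale (scaleDepth = 0.25 assumed, as in the thread):

```python
import math

def exp_scale(f_cos, scale_depth=0.25):
    """Python transcription of the shader's expScale (Vernier's fit)."""
    x = 1.0 - f_cos
    return scale_depth * math.exp(
        -0.00287 + x * (0.459 + x * (3.83 + x * (-6.80 + x * 5.25))))

print(exp_scale(1.0))    # looking straight along the vertical: a small, sane value
print(exp_scale(-0.95))  # through the sphere's middle: astronomically large
```

The fit was only ever meant to be evaluated over the angles that occur above the horizon, which is why feeding it near -1 cosines blows up instead of failing gracefully.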

Where the planet would otherwise be covering the atmosphere, the distance the ray covers is much larger than the distances you get from horizon to horizon. Toward the center of the sphere that distance approaches the sphere's diameter; also, as the sample point moves through the "center", the angle between the sample and the view/light directions increases, flipping the sign of the cosines of those angles. Since the scattering tends toward white at the horizon, and the planet/atmosphere diameter is much larger than the distance to the horizon, combined with the cosine flip, you get white across the center of the sphere.

Without handling intersections with the planet, the distances will only be within a suitable range in the "relatively" small area of the atmosphere where the planet does not appear, which is why your "edges" appear to look correct. Otherwise, rendering the planet with the same shader is what fills in the rest of what you see, as well as accounting for terrain color, etc.

Hope I worded that right, and hope it makes sense.

Feel free to correct me, heh.

I'll try and take a look and see if I find anything else.

[Edited by - Amadeus on July 17, 2008 7:21:40 AM]

##### Share on other sites
So you're saying that this is the correct way it should render, and that I've been trying to find a nonexistent error? I hope so :)
I'll have to try rendering the planet itself and see what happens. But have you implemented this algorithm? Can you tell me whether, when rendering the atmosphere only, those jagged areas at the bottom appear anyway? In all implementations the lit part of the planet seems to gradually fade to black. Are those areas perhaps occluded by the planet?

Also, I haven't perfectly understood the requirements of the model itself. Should the normals of the atmosphere sphere face toward its center or away from it? I didn't find anything explicitly stating that a "hollow" sphere was required, so I assumed a normal sphere would suffice.

##### Share on other sites
For the atmosphere, because the planet is in the way, the backfaces are really all you will ever see, especially when inside the atmosphere at lower altitudes. So yes, the artifacts you are seeing would normally be occluded by the planet geometry.

It has been a while since I have messed with this, and I am just getting back to playing with it as well, but I remember having this and a number of similar issues when I originally tried. Rendering the planet, or accounting for where the intersection with the planet would be, should take care of it. And from what I have seen, at least at first glance, your problem is "nonexistent". I would have to delve deeper to be sure.

The other problem with using the parameters as they are is that the inner and outer radii are 10 and 10.25 respectively. An atmosphere thickness of 0.25 is extremely small, so very little of the atmosphere is actually visible. Try scaling your inner radius value, and thus your mesh as well, and see if that helps you evaluate whether you are getting a correct result.
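As a concrete check on those numbers (my own arithmetic, using the 10/10.25 radii from O'Neil's demo scale): the atmosphere shell is only 2.5% of the planet radius, and the shader's `scale` uniform is the reciprocal of the shell thickness.

```python
inner_radius = 10.0
outer_radius = 10.25               # O'Neil's demo: outer = inner * 1.025

thickness = outer_radius - inner_radius
scale = 1.0 / thickness            # the shader's `scale` uniform

print(thickness / inner_radius)    # 0.025 -> the shell is 2.5% of the planet radius
print(scale)                       # 4.0
```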

For the atmosphere at least, it should not matter which way the normals are, as long as it is centered at the origin. Just flip your culling based on which side of the sphere you want visible. The normals are not used directly in the calculation, though implicitly the normalization of the points on the sphere create vectors that are equivalent to the normal directions.

Here is a slight modification of your shader taking planet intersections into account, to give you an idea of what it would look like. Normally rendering the planet would do this instead of handling it all in the same shader. I have not tried it from within the atmosphere, but you should probably be able to figure out if anything needs to be changed. You can also try uncommenting the two lines in the pixel shader and playing with the exposure to see if you get better-looking results. The extra intersection check I added is normally not needed, as the planet and atmosphere are rendered separately with front or back faces visible respectively, which keeps the first length calculation valid for the farthest intersection.

```hlsl
float4x4 mWorldViewProj;
float4 invWavelength;
float4 vEyePos;
float4 vLightDir;
float cameraHeight;        // The camera's current height
float cameraHeight2;       // fCameraHeight^2
float outerRadius;         // The outer (atmosphere) radius
float outerRadius2;        // fOuterRadius^2
float innerRadius;         // The inner (planetary) radius
//float innerRadius2;      // fInnerRadius^2
float krESun;              // Kr * ESun
float kmESun;              // Km * ESun
float kr4PI;               // Kr * 4 * PI
float km4PI;               // Km * 4 * PI
float scale;               // 1 / (fOuterRadius - fInnerRadius)
float scaleDepth;          // The scale depth (i.e. the altitude at which the atmosphere's average density is found)
float scaleOverScaleDepth; // fScale / fScaleDepth
float g;
float g2;

struct vpconn
{
    float4 Position : POSITION;
    float3 t0 : TEXCOORD0;
    //float DepthBlur : TEXCOORD1;
    float3 c0 : COLOR;  // The Rayleigh color
    float3 c1 : COLOR1; // The Mie color
};

struct vsconn
{
    float4 rt0 : COLOR0;
    //float4 rt1 : COLOR1;
};

// The scale equation calculated by Vernier's Graphical Analysis
float expScale(float fCos)
{
    //float x = 1.0 - fCos;
    float x = 1 - fCos;
    return scaleDepth * exp(-0.00287 + x*(0.459 + x*(3.83 + x*(-6.80 + x*5.25))));
}

// Calculates the Mie phase function
float getMiePhase(float fCos, float fCos2, float g, float g2)
{
    return 1.5 * ((1.0 - g2) / (2.0 + g2)) * (1.0 + fCos2) / pow(1.0 + g2 - 2.0*g*fCos, 1.5);
}

// Calculates the Rayleigh phase function
float getRayleighPhase(float fCos2)
{
    return 0.75 * (1.0 + fCos2);
}

// Returns the near intersection point of a line and a sphere
float getNearIntersection(float3 v3Pos, float3 v3Ray, float fDistance2, float fRadius2)
{
    float B = 2.0 * dot(v3Pos, v3Ray);
    float C = fDistance2 - fRadius2;
    float fDet = max(0.0, B*B - 4.0 * C);
    return 0.5 * (-B - sqrt(fDet));
}

// Returns the far intersection point of a line and a sphere
float getFarIntersection(float3 v3Pos, float3 v3Ray, float fDistance2, float fRadius2)
{
    float B = 2.0 * dot(v3Pos, v3Ray);
    float C = fDistance2 - fRadius2;
    float fDet = max(0.0, B*B - 4.0 * C);
    return 0.5 * (-B + sqrt(fDet));
}

vpconn AtmosphereFromSpaceVS(float4 vPos : POSITION)
{
    float3 pos = vPos.xyz;
    float3 ray = pos - vEyePos.xyz;
    pos = normalize(pos);

    float far = length(ray);
    ray /= far;

    // check if this point is obscured by the planet
    float B = 2.0 * dot(vEyePos, ray);
    float C = cameraHeight2 - (innerRadius*innerRadius);
    float fDet = (B*B - 4.0 * C);
    if (fDet >= 0)
    {
        // compute the intersection if so
        far = 0.5 * (-B - sqrt(fDet));
    }

    float near = getNearIntersection(vEyePos, ray, cameraHeight2, outerRadius2);
    float3 start = vEyePos + ray * near;
    far -= near;

    float startAngle = dot(ray, start) / outerRadius;
    float startDepth = exp(scaleOverScaleDepth * (innerRadius - cameraHeight));
    //float startDepth = exp((innerRadius - cameraHeight) / scaleDepth);
    //float startDepth = exp(-(1.0f / 0.25));
    float startOffset = startDepth * expScale(startAngle);

    float sampleLength = far / 1.0f;
    float scaledLength = sampleLength * scale;
    float3 sampleRay = ray * sampleLength;
    float3 samplePoint = start + sampleRay * 0.5f;

    float3 frontColor = float3(0,0,0);
    for (int i = 0; i < 1; i++)
    {
        float height = length(samplePoint);
        float depth = exp(scaleOverScaleDepth * (innerRadius - height));
        float lightAngle = dot(vLightDir, samplePoint) / height;
        float cameraAngle = dot(-ray, samplePoint) / height;
        float scatter = (startOffset + depth * (expScale(lightAngle) - expScale(cameraAngle)));
        float3 attenuate = exp(-scatter * (invWavelength.xyz * kr4PI + km4PI));
        frontColor += attenuate * (depth * scaledLength);
        samplePoint += sampleRay;
    }

    vpconn OUT;
    //OUT.DepthBlur = ComputeDepthBlur(mul(vPos, viewMatrix));
    OUT.t0 = vEyePos.xyz - vPos.xyz;
    OUT.Position = mul(vPos, mWorldViewProj);
    OUT.c0.xyz = frontColor * (invWavelength.xyz * krESun);
    OUT.c1.xyz = frontColor * kmESun;
    return OUT;
}

vsconn AtmosphereFromSpacePS1(vpconn IN)
{
    float3 color = IN.c0;
    vsconn OUT;
    OUT.rt0.rgb = color.rgb;
    OUT.rt0.a = 1.0f;
    return OUT;
}

vsconn AtmosphereFromSpacePS(vpconn IN)
{
    vsconn OUT;
    float cos = saturate(dot(vLightDir, IN.t0) / length(IN.t0));
    float cos2 = cos*cos;
    float fMiePhase = 1.5 * ((1.0 - g2) / (2.0 + g2)) * (1.0 + cos*cos) / pow(1.0 + g2 - 2.0*g*cos, 1.5);
    float fRayleighPhase = 0.75 * (1.0 + cos*cos);
    OUT.rt0.rgb = (fRayleighPhase * IN.c0 + fMiePhase * IN.c1);
    //float exposure = 0.85;
    //OUT.rt0.rgb = 1.0 - exp(-exposure * (fRayleighPhase * IN.c0 + fMiePhase * IN.c1));
    OUT.rt0.a = OUT.rt0.b;
    //OUT.rt1 = ComputeDepthBlur(IN.DepthBlur);
    return OUT;
}

technique AtmosphereFromSpace
{
    pass P0
    {
        AlphaBlendEnable = true;
        DestBlend = ONE;
        SrcBlend = ONE;
        CullMode = CW;
        VertexShader = compile vs_3_0 AtmosphereFromSpaceVS();
        PixelShader = compile ps_3_0 AtmosphereFromSpacePS();
    }
}
```
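For anyone sanity-checking the phase terms in that pixel shader, here is a quick Python transcription (my own, assuming g = -0.95 as in the thread's constants): the Rayleigh phase stays in a modest range while the Mie phase spikes sharply in the forward-scattering direction.

```python
import math

def rayleigh_phase(cos_a):
    """Rayleigh phase term as written in the pixel shader."""
    return 0.75 * (1.0 + cos_a * cos_a)

def mie_phase(cos_a, g=-0.95):
    """Henyey-Greenstein-style Mie phase term from the pixel shader."""
    g2 = g * g
    return (1.5 * ((1.0 - g2) / (2.0 + g2))
            * (1.0 + cos_a * cos_a)
            / math.pow(1.0 + g2 - 2.0 * g * cos_a, 1.5))

# With g = -0.95 the Mie lobe peaks where the cosine is -1
# (the sign convention depends on how the light vector is oriented):
print(rayleigh_phase(-1.0))  # 1.5
print(mie_phase(-1.0))       # large forward-scattering spike
print(mie_phase(1.0))        # tiny away from the light
```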

Like I said, it has been a while since I fully messed with O'Neil's shaders, so there may still be some values you need to play with to get everything in scale and acting correctly from both sides of the atmosphere. I don't remember needing the saturate call I added in the pixel shader, but I definitely remember getting the artifact it causes if you remove it in my original testing. I can't remember whether some angles flip when moving from space into the atmosphere, but I think comparing the original variations of the shaders and where they differ should shed some light on that.

Hope that helps.

I'll have to see if I can dig up my original tests in this area, though I plan on looking into it again fully sometime soon anyway.

[Edited by - Amadeus on July 18, 2008 8:36:59 AM]

##### Share on other sites
Hey,

It's been quite some time since I implemented O'Neil's method, but I remember the same thing showing up in mine. I can't remember exactly what I was doing wrong (sorry), but I think I was not passing the viewer's position to the shader correctly. In general the shader is extremely finicky, so if you aren't passing exactly the right constants, drawing in the right order, or whatever, you'll end up with problems. The way I ended up solving my problem was downloading the reference implementation from O'Neil's site, commenting out everything past the first stage of the shader, and comparing my inputs with his until everything matched. Then I uncommented some of the code and continued on until all was well.

Regards,
Jesse

##### Share on other sites
I tried some more experiments, but still no luck... :(

I used your shader instead, and the difference is that only the blue halo is rendered while the inner pixels appear white. I'm now also rendering a planet, but the result doesn't seem to be correct.

The planet itself is being rendered with a variation of O'Neil's GroundFromSpace shader. These shaders were sent to me quite some time ago by a GameDev user named Sensate. He also sent me a link to what the final result should look like; here's a movie. Well, anyway, mine is very different. My "results" only show the blue outline around the planet, while the final result should also overlay the atmosphere over the rest of the planet. For example, if the atmosphere is a bluish color, the surface of the planet should get a faint bluish tint, which is absent from mine.

Anyway, while implementing the ground shader, which should be some sort of specular bump shader, it occurred to me that no specular highlights or bump effects are visible (!). So, as laeuchli said, could the problem lie in the eye position values not being sent correctly to the shader?

As "vEyePos" I'm simply passing the float4 value of the camera position. In my test app the camera is at (0, 15, -30), so the value sent to the shader is (0, 15, -30, 1.0). In his source code O'Neil does a similar thing; he just sends the value (0, 0, 25) to the shader.

Should the camera value be transformed in some way?
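For what it's worth, the usual gotcha here is coordinate space: the shader's math assumes the eye position is expressed in the planet's object space (planet centered at the origin), so a world-space camera position has to go through the inverse of the planet's world matrix first. A sketch of that idea in Python; the rotation, translation, and positions below are made up for illustration, not values from the thread:

```python
# Sketch: bringing a world-space eye position into planet object space.
# For a rigid world transform (rotation R plus translation t), the
# inverse maps p_world -> R^T (p_world - t).

def to_object_space(eye_world, rotation_rows, translation):
    """Apply the inverse of a rigid transform given by a row-major 3x3
    rotation and a translation vector."""
    d = [e - t for e, t in zip(eye_world, translation)]
    # R^T @ d: dot each *column* of R with d
    return [sum(rotation_rows[r][c] * d[r] for r in range(3)) for c in range(3)]

identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
# Hypothetical scene: planet translated to (0, 0, 5), camera at (0, 15, -30):
eye_obj = to_object_space([0.0, 15.0, -30.0], identity, [0.0, 0.0, 5.0])
# eye_obj is (0, 15, -35); its length is the cameraHeight the shader expects
```

If the planet sits at the world origin with no rotation (as in O'Neil's demo), the two spaces coincide and passing the raw camera position works, which can hide the bug until the planet moves.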

Anyway, I also tried to compile O'Neil's C++ source code, but either I have really bad luck or it isn't working. The .exe available from his website shows a result completely different from the .exe I get from building the project: I just see a thick green halo around a white circle :D

I didn't touch anything in the source code; I just had to set the output directory again because the conversion to the 2008 project format probably changed it. Has anyone else encountered this problem, or am I the only one?

I cannot apply the scientific method if I've three different results :D

##### Share on other sites
Hmmm. This is what you should have seen with the shader as I had it. I also moved the camera position to (30, 0, 0), but that shouldn't have mattered since you could move around the planet.

##### Share on other sites

I hope to solve this problem by the end of this century :D

EDIT: I compiled O'Neil's sources on another computer but I'm again getting a thick green halo around a white circle. Is anyone else encountering this problem?
