Strange issue with Sean O'Neil's atmospheric scattering (ground).

I have implemented Sean O'Neil's atmospheric scattering for the atmosphere (works well) and for the ground (does not work well). I am not rendering an entire planet, only a small island seen from the ground. So I made an atmosphere mesh that is a semi-sphere with (very) roughly the radius of the earth's atmosphere (6100 km), and I translate the camera and sample points into "planet-space" by adding the radius of the planet to the y values.

This method worked great for the atmosphere shader, but I am getting strange results for the ground shader. Everything below the camera seems to be working properly, but everything above the camera gets dark very fast. Here are some pictures:

The good:
[screenshot: GCSJ6.jpg]

The bad and the ugly:
[screenshot: dVdAr.png]

[screenshot: nYfCq.jpg]


The only significant difference in the code is that I am performing the calculations in a pixel shader (a post-process pass in my deferred renderer). I know that the pixel positions are correct. Here is the code and the values (pretty much a copy-paste from O'Neil at this point):



const float fOuterRadius = 6100000; // The outer (atmosphere) radius (roughly earth proportions).
const float fInnerRadius = 6000000; // The inner (planetary) radius (roughly earth proportions).

vec3 atmosOffset = vec3(0.0, fInnerRadius, 0.0); // For offsetting world-space positions into atmospheric space.

const float kr = 0.0025; // Rayleigh scattering constant.
const float km = 0.0005; // Mie scattering constant.

const float sun = 25.0; // Sun brightness constant.

float fKrESun = kr * sun;
float fKmESun = km * sun;
float fKr4PI = kr * 4.0 * pi;
float fKm4PI = km * 4.0 * pi;

float fScale = 1.0 / (fOuterRadius - fInnerRadius);
float fScaleDepth = 0.18; // The altitude (as a fraction of atmosphere thickness) at which the atmosphere's average density is found.
float fScaleOverScaleDepth = fScale / fScaleDepth;

void main(void) {
    // Transform pixel position and camera position into planetary space.
    vec3 v3CameraPos = cameraOrigin + atmosOffset;
    vec3 samplePos = pixelOrigin + atmosOffset;

    float fCameraHeight = length(v3CameraPos);

    // Get the ray from the camera to the vertex; its length is the distance to the far point of the ray through the atmosphere.
    vec3 v3Pos = ;
    vec3 v3Ray = v3Pos - v3CameraPos;
    float fFar = length(v3Ray);
    v3Ray /= fFar;

    // Calculate the ray's starting position, then calculate its scattering offset.
    vec3 v3Start = v3CameraPos;
    float fDepth = exp((fInnerRadius - fCameraHeight) / fScaleDepth);
    float fCameraAngle = dot(-v3Ray, v3Pos) / length(v3Pos);
    float fLightAngle = dot(v3LightPos, v3Pos) / length(v3Pos);
    float fCameraScale = scale(fCameraAngle);
    float fLightScale = scale(fLightAngle);
    float fCameraOffset = fDepth * fCameraScale;
    float fTemp = (fLightScale + fCameraScale);

    // Initialize the scattering loop variables.
    float fSampleLength = fFar / fSamples;
    float fScaledLength = fSampleLength * fScale;
    vec3 v3SampleRay = v3Ray * fSampleLength;
    vec3 v3SamplePoint = v3Start + v3SampleRay * 0.5;

    // Now loop through the sample points.
    vec3 v3FrontColor = vec3(0.0, 0.0, 0.0);
    vec3 v3Attenuate;
    for (int i = 0; i < nSamples; i++)
    {
        float fHeight = length(v3SamplePoint);
        float fDepth = exp(fScaleOverScaleDepth * (fInnerRadius - fHeight));
        float fScatter = fDepth * fTemp - fCameraOffset;
        v3Attenuate = exp(-fScatter * (v3InvWavelength * fKr4PI + fKm4PI));
        v3FrontColor += v3Attenuate * (fDepth * fScaledLength);
        v3SamplePoint += v3SampleRay;
    }

    vec3 c1 = v3FrontColor * (v3InvWavelength * fKrESun + fKmESun);

    // Calculate the attenuation factor for the ground.
    vec3 c2 = v3Attenuate;

    vec3 pixelColor = texture2D(texture3, textureCoords).rgb;

    gl_FragData[0].rgb = c1 + pixelColor * c2;
}
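To spell out what I believe the loop is computing (reconstructed from the code above, so treat it as my reading of O'Neil rather than gospel): with density(h) = exp((fInnerRadius - h) * fScaleOverScaleDepth), each sample accumulates

opticalDepth(P) = (v3InvWavelength * fKr4PI + fKm4PI) * (density(P) * (scale(lightAngle) + scale(cameraAngle)) - density(camera) * scale(cameraAngle))
c1 = (v3InvWavelength * fKrESun + fKmESun) * sum over i of density(Pi) * exp(-opticalDepth(Pi)) * fScaledLength
c2 = exp(-opticalDepth(Plast))

so c1 is the in-scattered light added along the view path, and c2 is the attenuation applied to the ground colour at the last sample.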



Any help at all would be appreciated.

Thanks
In O'Neil's GPU Gems demo the planet has a radius of 100; you have yours set to 6,000,000. If you want to understand what the hell is going on in O'Neil's shader, you should read Nishita's original paper. It is worth it just to find out what the different terms O'Neil uses mean. You will probably figure out pretty quickly what is wrong with what you are doing after reading it.
The params for the shader take an inner radius and an outer radius; off the top of my head, it looks like the top of your geometry lies outside the outer radius. The parameters can be very sensitive, though, as I'm sure you've found out.

[quote]The params for the shader take an inner radius and an outer radius; off the top of my head, it looks like the top of your geometry lies outside the outer radius. The parameters can be very sensitive, though, as I'm sure you've found out.[/quote]


Thanks for the reply. I am positive that my coordinates are between the inner and outer radii.

[quote]In O'Neil's GPU Gems demo the planet has a radius of 100; you have yours set to 6,000,000. If you want to understand what the hell is going on in O'Neil's shader, you should read Nishita's original paper. It is worth it just to find out what the different terms O'Neil uses mean. You will probably figure out pretty quickly what is wrong with what you are doing after reading it.[/quote]


O'Neil's shader is supposed to work for any radius.


[quote]In the demo included on this book's CD, the atmosphere's thickness (the distance from the ground to the top of the atmosphere) is 2.5 percent of the planet's radius, and the scale height is 25 percent of the atmosphere's thickness. The radius of the planet doesn't matter as long as those two values stay the same.[/quote]

My planet radius is 6,000,000, making my atmosphere radius 6,150,000 (a thickness of 150,000, i.e. 2.5 percent of the planet radius, with a scale depth of 0.25). I have tried using these values in my shader and I get the same results.

I have read Nishita's paper a couple of times and have a hard time absorbing it completely. After reading it again, I do not see any clues as to why I would be getting these results.
Sorry to bump the thread, but this is really stumping me. I've been re-reading the articles and going over my code, but I'm not understanding where I'm going wrong. If anyone has any help to offer I would really appreciate it.

Thanks

Edit:

I did notice this in the article, which appears to be part of the problem:


[quote]When a vertex is above the camera in the atmosphere, the ray from sample point P through the camera can pass through the planet itself. The height variable is not expected to go so far negative, and it can cause the numbers in the lookup table to go extremely high, losing precision and sometimes even encountering a floating-point overflow. The way to avoid this problem is to reverse the direction of the rays when the vertex is above the camera. Examine the rays passing through the sky vertex (B2) in Figure 16-3 for a graphical representation of this.[/quote]

The code doesn't seem to account for this, however. So I swap the camera position and the ground position and negate the sample-ray direction as the article suggests, and I get better results above the camera, but there is still a dark ring at camera level (screenshots below the sketch).
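For reference, the swap looks roughly like this (a sketch using the variable names from my post-process shader further down):

vec3 ray = pos - cameraPos;
float far = length(ray);
ray /= far;

// March from the ground toward the camera when the fragment is above it.
if (cameraPos.y < pos.y) {
    vec3 t = pos;
    pos = cameraPos;
    cameraPos = t;
    ray = -ray;
}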

[screenshot: fUmfQ.jpg]

[screenshot: nK3Ag.jpg]
How are you ensuring that your input data is definitely within the inner-to-outer atmosphere range? If, for instance, you are using the default or improved Perlin noise algorithm, they do not produce numbers clamped between -1 and 1; they can exceed those bounds. How about doing some shader diagnostics? Output height / (outerRadius - innerRadius) as the colour. Add branches to output red if the value escapes the 0-to-1 bounds. Simplify the shader, then build it back up, adding each stage and testing values against what they should be by outputting them as colour.
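For instance, something along these lines at the end of the shader (a rough sketch using the variable names from the first snippet, and assuming "height" means height above the inner radius):

// Diagnostic: visualise the normalised sample height.
float h = (fHeight - fInnerRadius) / (fOuterRadius - fInnerRadius);
if (h < 0.0 || h > 1.0) {
    gl_FragData[0].rgb = vec3(1.0, 0.0, 0.0); // flag out-of-range heights in red
} else {
    gl_FragData[0].rgb = vec3(h);
}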

[quote]How are you ensuring that your input data is definitely within the inner-to-outer atmosphere range? If, for instance, you are using the default or improved Perlin noise algorithm, they do not produce numbers clamped between -1 and 1; they can exceed those bounds. How about doing some shader diagnostics? Output height / (outerRadius - innerRadius) as the colour. Add branches to output red if the value escapes the 0-to-1 bounds. Simplify the shader, then build it back up, adding each stage and testing values against what they should be by outputting them as colour.[/quote]



Thanks for the reply.

My terrain is generated volumetrically via marching cubes and is subdivided by an octree, so it has a hard bound of 8,192 for the maximum height. I also output the camera's origin through my GUI, so I can tell how high neighboring geometry is. I am 100% certain that the highest mountains of my terrain are only around 6,800 (and outerRadius - innerRadius = 150,000 for me).

Note that the dark band appears at camera level. If the camera is at sea level, the horizon appears dark. As you move up, the horizon lightens and whatever terrain is at camera level darkens. Whatever error is in my computation is relative to the camera, not the terrain. Mathematically it happens as the sample ray approaches horizontal, or as the height of the sample position (the ground) approaches the height of the camera.
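To put a number on that: O'Neil's scale-function polynomial (scaleFactor in the shader below) reaches its most extreme output for exactly those near-horizontal rays:

scaleFactor(1.0) = 0.25 * exp(-0.00287) ≈ 0.25 (ray straight up, x = 0)
scaleFactor(0.0) = 0.25 * exp(-0.00287 + 0.459 + 3.83 - 6.80 + 5.25) ≈ 0.25 * exp(2.74) ≈ 3.9 (ray horizontal, x = 1)

So a horizontal ray gets an optical-depth scale roughly 15 times larger than a vertical one, which at least matches where the artifact appears.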
In that case, please post the code where you are setting all the uniforms for your shader.

/edit

vec3 v3Pos = ;
What is going on here?
I assume you removed the interpolated vertex position variable from here for some reason?
Could you please post your vertex shader as well?

[quote]In that case, please post the code where you are setting all the uniforms for your shader.

/edit

vec3 v3Pos = ;

What is going on here?
I assume you removed the interpolated vertex position variable from here for some reason?
Could you please post your vertex shader as well?[/quote]


The uniforms are set in a Lua script:



-- Atmospheric scattering constants.
renderer:setShaderVariable("invWavelength", Vector3(1.0 / math.pow(0.650, 4),   -- 650 nm for red.
                                                    1.0 / math.pow(0.570, 4),   -- 570 nm for green.
                                                    1.0 / math.pow(0.475, 4)))  -- 475 nm for blue.

local kr = 0.0025
local km = 0.0010

local sun = 20

renderer:setShaderVariable("krSun", kr * sun)
renderer:setShaderVariable("kmSun", km * sun)

renderer:setShaderVariable("kr4pi", kr * 4 * math.pi)
renderer:setShaderVariable("km4pi", km * 4 * math.pi)

local outerRadius = 6150000
local innerRadius = 6000000

renderer:setShaderVariable("outerRadius", outerRadius)
renderer:setShaderVariable("innerRadius", innerRadius)

local scale = 1 / (outerRadius - innerRadius)

renderer:setShaderVariable("scale", scale)

local scaleDepth = 0.25

renderer:setShaderVariable("scaleDepth", scaleDepth)
renderer:setShaderVariable("scaleOverScaleDepth", scale / scaleDepth)


Slightly more complete frag shader.



// Copyright (c) 2011 Colin Hill.

uniform vec3 sunDirection;

uniform float seaLevel;

uniform vec3 invWavelength;

uniform float outerRadius;
uniform float innerRadius;

uniform float krSun;
uniform float kmSun;
uniform float kr4pi;
uniform float km4pi;

uniform float scale;
uniform float scaleDepth;
uniform float scaleOverScaleDepth;

const int numSamples = 2;


uniform vec3 cameraOrigin;
uniform vec3 cameraFarTopLeft, cameraFarTopRight;
uniform vec3 cameraFarBottomLeft, cameraFarBottomRight;
uniform float cameraNearClip, cameraFarClip;
uniform float cameraAspectRatio;

uniform float viewportWidth, viewportHeight;

uniform sampler2D diffuseBuffer;
uniform sampler2D materialBuffer;
uniform sampler2D normalDepthBuffer;

uniform sampler2D backBuffer0;
uniform sampler2D backBuffer1;

varying vec2 textureCoords;

// Reconstructs the world-space view ray for a screen coordinate by
// interpolating the camera's far-plane corners across the screen quad.
vec3 getEyeVec(vec2 screenCoords) {
    return normalize(cameraFarBottomLeft + (cameraFarBottomRight - cameraFarBottomLeft) * screenCoords.x +
                     (cameraFarTopLeft - cameraFarBottomLeft) * screenCoords.y - cameraOrigin);
}

// Fetches the stored view distance for this fragment; distances below the
// near clip mean "no geometry here" and are pushed out to the far clip.
float getFragDistance(vec2 screenCoords) {
    float fragDistance = texture2D(normalDepthBuffer, screenCoords).w;
    if (fragDistance < cameraNearClip) {
        fragDistance = cameraFarClip;
    }

    return clamp(fragDistance, cameraNearClip, cameraFarClip);
}

// Reconstructs the fragment's world-space position from the view ray and distance.
vec3 getFragWorldOrigin(vec2 screenCoords) {
    return cameraOrigin + getEyeVec(screenCoords) * getFragDistance(screenCoords);
}

bool fragIsLit(vec2 screenCoords) {
    float fragLighting = texture2D(materialBuffer, screenCoords).r;

    return fragLighting >= 1.0;
}

// O'Neil's polynomial approximation of the optical-depth lookup table; it takes
// the cosine of the angle between the ray and the local vertical, and the fit
// is only valid for a scale depth of 0.25.
float scaleFactor(float cosi) {
    float x = 1.0 - cosi;
    return scaleDepth * exp(-0.00287 + x * (0.459 + x * (3.83 + x * (-6.80 + x * 5.25))));
}

void main() {
    float fragLighting = texture2D(materialBuffer, textureCoords).r;

    vec3 fragWorldOrigin = getFragWorldOrigin(textureCoords);

    if (fragLighting >= 1.0) { // Only perform atmospheric scattering on pixels that are affected by lights.
        vec3 cameraPos = cameraOrigin + vec3(0.0, innerRadius, 0.0);
        vec3 pos = fragWorldOrigin + vec3(0.0, innerRadius, 0.0);

        vec3 ray = pos - cameraPos;

        float far = length(ray);
        ray /= far;

        // Swap the endpoints and reverse the ray when the fragment is above
        // the camera (see the GPU Gems passage quoted earlier).
        if (cameraPos.y < pos.y) {
            vec3 t = pos;
            pos = cameraPos;
            cameraPos = t;
            ray = -ray;
        }

        pos = normalize(pos);

        vec3 start = cameraPos;
        float depth = exp((innerRadius - cameraPos.y) * (1.0 / scaleDepth));
        float cameraAngle = dot(-ray, pos);
        float lightAngle = dot(sunDirection, pos);
        float cameraScale = scaleFactor(cameraAngle);
        float lightScale = scaleFactor(lightAngle);
        float cameraOffset = depth * cameraScale;
        float temp = lightScale + cameraScale;

        float sampleLength = far / float(numSamples);
        float scaledLength = sampleLength * scale;

        vec3 sampleRay = ray * sampleLength;
        vec3 samplePoint = start + sampleRay * 0.5;

        vec3 frontColor = vec3(0.0, 0.0, 0.0);
        vec3 attenuate = vec3(0.0, 0.0, 0.0);

        for (int i = 0; i < numSamples; i++) {
            float height = samplePoint.y;
            float depth = exp(scaleOverScaleDepth * (innerRadius - height));
            float scatter = depth * temp - cameraOffset;
            attenuate = exp(-scatter * (invWavelength * kr4pi + km4pi));
            frontColor += attenuate * (depth * scaledLength);
            sampleRay += sampleRay;
        }

        vec3 c0 = frontColor * (invWavelength * krSun + kmSun);
        vec3 c1 = attenuate;

        gl_FragData[0].rgb = c0 + c1 * texture2D(backBuffer0, textureCoords).rgb;
    } else {
        gl_FragData[0] = texture2D(backBuffer0, textureCoords);
    }
}


There is no vertex shader; this is entirely a post-process in screen space. Note that I am sure that the fragment world-origin computation is correct (my lighting model looks fine).
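A quick way to sanity-check that, for anyone curious (a minimal sketch; the 0.001 tiling scale is an arbitrary choice):

// Visualise the reconstructed world position as repeating colour stripes.
// Stripes that stick to the geometry while the camera moves suggest the
// reconstruction is sound; swimming or warping stripes indicate an error.
gl_FragData[0].rgb = fract(getFragWorldOrigin(textureCoords) * 0.001);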
