
# O'Neil atmospheric scattering.

9 replies to this topic

### #1piluve  Members

Posted 16 February 2017 - 11:09 AM

Hello once again.

I've been adding the atmosphere model described by Sean O'Neil (http://http.developer.nvidia.com/GPUGems2/gpugems2_chapter16.html) and I have it about 70% working. I'm trying to work out the best way to use it in a scenario where I won't be rendering a planet: I'll just render a terrain (let's say 20x20 km) and a few other elements.

My first idea was to render a 3D sphere for the sky with a radius of 4 km, then specify the outer and inner radius accordingly in the shader. This isn't a great idea, as I have to hack a few values to move all the elements of my scene up (so they are placed at the "earth" surface).
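One way to avoid moving the scene itself (a minimal sketch in Python, with illustrative Earth-like radii; `to_planet_space` is a hypothetical helper, not part of O'Neil's code) is to leave the scene at the origin and only offset the positions fed to the scattering shader so they sit on the inner radius:

```python
# Illustrative O'Neil-style radii, in kilometres (not the demo's values)
INNER_RADIUS = 6371.0                  # "ground" radius
OUTER_RADIUS = INNER_RADIUS * 1.025    # atmosphere shell, ~2.5% above ground

def to_planet_space(pos_km):
    """Lift a flat-scene position onto the planet frame the shader expects."""
    x, y, z = pos_km
    return (x, y + INNER_RADIUS, z)

# A camera 2 m above the flat terrain ends up just above the inner radius
cam = to_planet_space((0.0, 0.002, 0.0))
```

This keeps the hack in one place (the uniforms sent to the sky shader) instead of displacing every object in the scene.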

The other idea is to make it a post-processing effect: define a sphere and raycast it with the scattering shader. I think it would work, but I'm not sure how well it would fit with my forward-rendered scene (water, terrain, clouds) and some transparent objects.

Any ideas?

See you!

Edit:

Here are the vertex and fragment shaders I've been using. They are used with the 3D sphere:

In those shaders I render the sky sphere like a skybox (depth mask disabled, rendered before the scene). The sphere has a 1 m radius and I'm hardcoding the camera position.

Update

Alright, I've been working on the solution using the sphere as a skybox and I'm having some problems with the algorithm itself. The sun turns black when its direction does not lie along the z axis, i.e. (0.0f, 0.0f, 1.0f) or (0.0f, 0.0f, -1.0f). This is how it looks: https://i.gyazo.com/47778fc56b41eeccef798ff79a8a7379.png

Let's see an example: if I set the sun position (in the shader, sun position = sun direction) to (0.0f, 0.5f, 1.0f): https://i.gyazo.com/22c556be20360b32e2f4431dc22816a0.png

I updated the vertex and fragment shaders:

I would like to mention that I'm ignoring the alpha value in the fragment shader.

Edited by piluve, 16 February 2017 - 02:04 PM.

### #2piluve  Members

Posted 17 February 2017 - 11:24 AM

Update 2

I've updated all the variables in the shader that were hardcoded; I'm now using uniforms to set them. I also found a problem with the SunDirection that I pass from the vertex to the fragment shader: I must normalize it. Now the sun is no longer black, but it has a strange shape.
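For what it's worth, here is a small numeric check (Python, with illustrative vectors) of why a direction interpolated between the vertex and fragment stages must be renormalized:

```python
import math

# Two unit-length sun directions as they might leave the vertex shader
a = (0.0, 0.0, 1.0)
b = (0.0, 1.0, 0.0)

# Linear interpolation across the triangle shortens the vector...
mid = tuple(0.5 * (a[i] + b[i]) for i in range(3))
length = math.sqrt(sum(x * x for x in mid))   # ~0.707, not 1.0

# ...so the fragment shader must renormalize before using it
unit = tuple(x / length for x in mid)
```

The interpolated vector's length drops to about 0.707 halfway between two perpendicular unit vectors, which is exactly the kind of error that distorts the sun disc.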

### #3piluve  Members

Posted 02 March 2017 - 01:39 PM

Update 3

I started working again on the atmospheric scattering shader and I'm still figuring out the best way to use it without having to render a sphere the size of the atmosphere.

I've been using a sphere with radius 1 centred on the camera (like you would do with a skybox), but I have to hack some values. For example, I have to displace the sphere downwards; if I don't, the atmosphere appears at the north pole of the sphere...

Maybe the best way would be to implement it as a post-process, but I'm not sure how it would fit with other elements (with alpha).

Any ideas?

Thanks!

### #4swiftcoder  Senior Moderators

Posted 02 March 2017 - 02:33 PM

> Maybe the best way would be to implement it as a post-process, but I'm not sure how it would fit with other elements (with alpha).

I implemented O'Neil's scattering as a post-process a long time ago. It works.

Planets are pretty big, so broadly sorting rendered elements into [ground, ocean, clouds, atmosphere, space] makes it pretty easy to ensure the atmosphere is rendered in the right order such that alpha blending is resolved beforehand.
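A hedged sketch of that coarse bucketing (Python, with a hypothetical draw list; real submission order depends on the renderer):

```python
# Coarse layers as named above; sorting draws by layer index gives a stable,
# predictable point at which the atmosphere pass runs relative to alpha draws.
LAYERS = ["ground", "ocean", "clouds", "atmosphere", "space"]
LAYER_INDEX = {name: i for i, name in enumerate(LAYERS)}

draws = [("stars", "space"), ("terrain", "ground"), ("cumulus", "clouds")]
ordered = sorted(draws, key=lambda d: LAYER_INDEX[d[1]])
# ordered: terrain, cumulus, stars
```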

Tristam MacDonald - Software Engineer @ Amazon - [swiftcoding] [GitHub]

### #5piluve  Members

Posted 02 March 2017 - 02:34 PM

> Maybe the best way would be to implement it as a post-process, but I'm not sure how it would fit with other elements (with alpha).
>
> I implemented O'Neil's scattering as a post-process a long time ago. It works.
>
> Planets are pretty big, so broadly sorting rendered elements into [ground, ocean, clouds, atmosphere, space] makes it pretty easy to ensure the atmosphere is rendered in the right order such that alpha blending is resolved beforehand.

Did you use raycasting to fake the atmosphere?

### #6swiftcoder  Senior Moderators

Posted 02 March 2017 - 11:50 PM

> Did you use raycasting to fake the atmosphere?

Yep. Standard ray/sphere intersection.

My implementation is buried somewhere, but it looks like someone wrote the same thing on Shadertoy.
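For reference, a minimal sketch of that standard ray/sphere intersection (Python rather than GLSL, with illustrative Earth-like radii in km):

```python
import math

def ray_sphere(ro, rd, center, radius):
    """Return (t_near, t_far) for ray origin ro and unit direction rd,
    or None if the ray misses the sphere entirely."""
    oc = [ro[i] - center[i] for i in range(3)]
    b = sum(oc[i] * rd[i] for i in range(3))       # half the usual quadratic b
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - c
    if disc < 0.0:
        return None
    s = math.sqrt(disc)
    return (-b - s, -b + s)

# Camera on the planet surface looking straight up through a 100 km shell:
# the far hit is the distance to the top of the atmosphere.
hit = ray_sphere((0.0, 6371.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 0.0), 6471.0)
```

In the post-process, each fragment's ray is intersected with the atmosphere sphere and the scattering integral is evaluated between the two hit distances.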


### #7piluve  Members

Posted 03 March 2017 - 04:30 AM

> Did you use raycasting to fake the atmosphere?
>
> Yep. Standard ray/sphere intersection.
>
> My implementation is buried somewhere, but it looks like someone wrote the same thing on Shadertoy.

Alright I'll give it a try. Thanks!

### #8piluve  Members

Posted 03 March 2017 - 02:09 PM

> Did you use raycasting to fake the atmosphere?
>
> Yep. Standard ray/sphere intersection.
>
> My implementation is buried somewhere, but it looks like someone wrote the same thing on Shadertoy.

Hey!

I found this project: https://github.com/wwwtyro/glsl-atmosphere and it's basically what I wanted. I implemented it in shadertoy (https://www.shadertoy.com/view/4sXcRS) so I can test with it first.

The question I have right now is how to translate the direction:

vec3 rayDir = normalize(vec3(uv,-1.0));

This ray is in camera space. Should I multiply it by the view matrix? I'm not sure about that.
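A CPU-side sketch (Python, with a hypothetical 90° yaw) of what it takes to move a camera-space ray into world space: you multiply by the inverse of the view matrix's rotation, which for a pure rotation is its transpose.

```python
import math

def mat_vec(m, v):
    return tuple(sum(m[r][c] * v[c] for c in range(3)) for r in range(3))

def transpose(m):
    return [[m[c][r] for c in range(3)] for r in range(3)]

# Hypothetical camera yawed 90 degrees about +Y. Its basis vectors, as
# columns, take camera space to world space; the view matrix's rotation
# block is the transpose (world to camera).
theta = math.pi / 2
cam_basis = [[math.cos(theta), 0.0, math.sin(theta)],
             [0.0,             1.0, 0.0],
             [-math.sin(theta), 0.0, math.cos(theta)]]
view_rot = transpose(cam_basis)                      # world -> camera

# Camera-space ray straight ahead: normalize(vec3(uv, -1.0)) at uv = 0
cam_ray = (0.0, 0.0, -1.0)

# Multiply by the INVERSE view rotation (its transpose) to reach world space
world_ray = mat_vec(transpose(view_rot), cam_ray)    # -> roughly (-1, 0, 0)
```

With a 90° yaw, the forward ray ends up pointing down world -X, as expected.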

See you!

### #9piluve  Members

Posted 04 March 2017 - 07:03 AM

Okay, I found a way to get a ray and to use the current view matrix. Now I'll try to figure out how to mix the post-process with the scene.

### #10piluve  Members

Posted 05 March 2017 - 07:05 AM

Alright, I found a way to mix it with my scene: I render the effect first with the depth mask disabled and then render everything else. That part is working!

I have a problem, though, with how I calculate the ray; you can see it here: https://i.gyazo.com/745305ef1e04aab74b3c4cffaf0812f1.mp4.

As you can see, the sky moves up and down as the camera moves...

I calculate the ray using:

vec3 GetRayDir(vec2 uv)
{
    // Strip the translation so only the rotation part of the view remains
    mat4 rotMatrix = mat4(uView);
    rotMatrix[3] = vec4(0.0f, 0.0f, 0.0f, 1.0f);

    mat4 iView = inverse(rotMatrix);
    mat4 iProj = inverse(uProjection);
    mat4 iVp = iProj * iView;

    vec3 rd = (iVp * vec4(uv.x * uAspect, uv.y, -1.0f, 1.0f)).xyz;
    return normalize(rd);
}

vec3 rayDir = GetRayDir(iPos.xy);


iPos is the position attribute of the fullscreen quad.
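One thing worth double-checking (hedged, since matrix conventions vary): with column vectors, `iProj * iView` applies the inverse view to the clip-space point first, while the usual world-from-clip composite is `iView * iProj`. The camera-space ray that the inverse projection produces can also be written analytically, which makes it easy to sanity-check on the CPU (Python, with hypothetical camera parameters):

```python
import math

# Hypothetical camera parameters
fov_y = math.radians(60.0)
aspect = 16.0 / 9.0

def view_ray(uv_x, uv_y):
    """Camera-space ray for an NDC-style uv in [-1, 1]; equivalent to
    running the NDC point through the inverse projection and dropping w."""
    t = math.tan(fov_y * 0.5)
    return (uv_x * aspect * t, uv_y * t, -1.0)

# The centre of the screen should look straight down camera -Z regardless
# of aspect or fov; a ray that bobs with camera pitch fails this check.
rd = view_ray(0.0, 0.0)
```

If the centre ray is stable in camera space, any remaining up/down drift has to come from the rotation stage, not the unprojection.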

Any ideas?

Thanks!

Edited by piluve, 05 March 2017 - 07:05 AM.