Hello once again.
I've been adding the atmosphere model explained by Sean O'Neil (http://http.developer.nvidia.com/GPUGems2/gpugems2_chapter16.html) and I have it about 70% working. I'm wondering about the best way to implement it in a scenario where I won't be rendering a full planet: I'll just render a terrain (let's say 20x20 km) and a few other elements.
My first idea was to render a 3D sphere for the sky with a radius of 4 km, then specify the outer and inner radius accordingly in the shader. This isn't a great idea because I have to hack a few values to move all the elements of my scene up (so they end up placed on the "earth" surface).
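That offset hack can be reduced to a single transform: treat the shader's planet center as sitting `innerRadius` below the scene origin, so every scene position is lifted onto the planet surface before the scattering shader sees it. A minimal sketch of the idea (the function name and the radius value are mine, not from the article):

```python
# Sketch: convert flat-scene coordinates into the planet-relative
# coordinates an O'Neil-style shader expects. Names are hypothetical.

INNER_RADIUS = 6356.75  # planet (inner) radius in shader units, e.g. km

def scene_to_planet(pos_km):
    """Lift a flat-scene point (x, y, z) onto the planet: the planet
    center is assumed to sit INNER_RADIUS below the scene origin."""
    x, y, z = pos_km
    return (x, y + INNER_RADIUS, z)

# A camera 2 km above the 20x20 km terrain lands just above the surface:
cam = scene_to_planet((10.0, 2.0, 10.0))
height = (cam[0]**2 + cam[1]**2 + cam[2]**2) ** 0.5 - INNER_RADIUS
```

Note this is only exact at the origin: a flat 20x20 km terrain doesn't follow the curvature, so points near the edges end up a few metres above the sphere, which is usually negligible at these scales.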
The other idea is to do it as a postprocessing effect: define a sphere analytically and raycast it with the scattering shader. I think it would work, but I'm not sure how well it would fit into my scene (water, terrain, clouds), which uses forward rendering with some transparent objects.
Here are the vertex and fragment shaders I've been using; they are applied to the 3D sphere:
In those shaders I've been rendering the sky sphere like a skybox (depth mask disabled, rendered before the scene). The sphere in those shaders has a 1 m radius and I'm hardcoding the camera position.
Alright, I've been working on the solution using the sphere as a skybox and I'm having some problems with the algorithm itself. The sun turns black when its direction does not lie on the z axis, i.e. (0.0f, 0.0f, 1.0f) or (0.0f, 0.0f, -1.0f). This is how it looks: https://i.gyazo.com/47778fc56b41eeccef798ff79a8a7379.png
Let's see an example: if I set the sun position (the sun position in the shader is the sun direction) to (0.0f, 0.5f, 1.0f): https://i.gyazo.com/22c556be20360b32e2f4431dc22816a0.png.
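One thing worth checking (a guess from the symptoms, since I can't see your exact shader values): (0.0, 0.5, 1.0) is not unit length, while the axis-aligned directions that do work are. O'Neil's fragment shader uses the light vector in a dot product to get fCos, so with an unnormalized sun direction |fCos| can exceed 1, which can drive the base of the Mie phase function's `pow(..., 1.5)` negative, and `pow` with a negative base is undefined in GLSL (typically NaN, rendered as black). A quick numeric check with the phase function from the chapter:

```python
import math

G = -0.990            # Mie asymmetry constant used in O'Neil's sample
G2 = G * G

def mie_phase(cos_angle):
    """Mie phase function from the GPU Gems 2 ch. 16 fragment shader."""
    base = 1.0 + G2 - 2.0 * G * cos_angle
    if base <= 0.0:
        return float('nan')   # GLSL pow() with a negative base -> undefined/NaN
    return (1.5 * ((1.0 - G2) / (2.0 + G2))
            * (1.0 + cos_angle * cos_angle) / base ** 1.5)

sun_dir = (0.0, 0.5, 1.0)
sun_len = math.sqrt(sum(c * c for c in sun_dir))   # ~1.118, not unit length

# Looking straight at the sun, fCos reaches about -|sun_dir|:
towards_sun_unnormalized = mie_phase(-sun_len)   # NaN -> black sun disc
towards_sun_normalized = mie_phase(-1.0)         # finite, large -> bright disc
```

If that is the cause, normalizing the sun direction on the CPU (or with `normalize()` in the shader) before it reaches the phase function should fix the black disc.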
I've updated the vertex and fragment shaders:
I would like to mention that I'm ignoring the alpha value in the fragment shader.
Edited by piluve, 16 February 2017 - 02:04 PM.