Ambient Occlusion


I have been busy finalizing the rendering pipeline of Geist3D, including HDR, bloom, SSAO, etc. I am really glad now that I created a development environment alongside the graphics engine right from the get-go. The built-in GLSL editor and the scripting capabilities helped streamline the development of all the necessary shaders. Soon I will dedicate a few journal entries to the editor alone, but over the next few entries I will outline everything I have done on the rendering pipeline during the past weeks, starting with Screen Space Ambient Occlusion (SSAO). Check out the Geist3D website for downloads, images and movies of the current status.

Examples and discussions of SSAO are plentiful, and I stitched my implementation together after digging around in various forums. In a nutshell, screen space ambient occlusion darkens every pixel according to the differences in depth, along the viewing axis, between it and its surrounding pixels. The idea is that the larger the depth difference to the neighboring pixels, the more likely it is that the surface is shadowing itself, and thus the darker the pixel becomes. The remarkable thing about SSAO is that it clearly produces a 3D image without there being a light source at all. Yeah, that's right; there is no light source in these images.
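The basic idea can be sketched as a GLSL fragment shader; the texture, uniform names and kernel size here are illustrative assumptions, not the actual Geist3D shaders:

```glsl
// Minimal SSAO sketch (illustrative, not the actual Geist3D shader).
// Assumes a render target whose alpha channel stores eye-space depth.
uniform sampler2D colorDepthTex;   // RGB = diffuse, A = eye-space depth
uniform vec2 sampleOffsets[16];    // screen-space sample pattern
uniform float radius;              // sampling radius in texture space
uniform float depthScale;          // controls how fast occlusion ramps up

void main()
{
    vec2 uv = gl_TexCoord[0].xy;
    vec4 centerSample = texture2D(colorDepthTex, uv);
    float centerDepth = centerSample.a;

    float occlusion = 0.0;
    for (int i = 0; i < 16; ++i)
    {
        float sampleDepth = texture2D(colorDepthTex,
                                      uv + sampleOffsets[i] * radius).a;
        // A neighbor closer to the eye than the center pixel occludes it.
        float diff = centerDepth - sampleDepth;
        occlusion += clamp(diff * depthScale, 0.0, 1.0);
    }
    occlusion /= 16.0;

    gl_FragColor = vec4(centerSample.rgb * (1.0 - occlusion), 1.0);
}
```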

In order to implement this algorithm in Geist3D, it became necessary to render the scene into multiple render targets rather than straight into the back buffer. The occlusion algorithm is then applied in screen space to the content of the render targets. For SSAO, I used two GL_RGBA16F render targets to store color and depth as well as the eye space surface normal:

Buffer 1
R - diffuse red
G - diffuse green
B - diffuse blue
A - eye-space depth

Buffer 2
R - normal X
G - normal Y
B - normal Z
A - unused
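A geometry pass filling the two targets above might look roughly like this; the varying and uniform names are assumptions for illustration, not the Geist3D source:

```glsl
// Geometry-pass sketch that fills the two RGBA16F render targets.
uniform sampler2D diffuseTex;

varying vec3 eyeNormal;   // interpolated eye-space surface normal
varying float eyeDepth;   // eye-space depth (e.g. -eyePosition.z)

void main()
{
    vec3 diffuse = texture2D(diffuseTex, gl_TexCoord[0].xy).rgb;

    // Buffer 1: diffuse color + depth
    gl_FragData[0] = vec4(diffuse, eyeDepth);
    // Buffer 2: eye-space normal (alpha unused)
    gl_FragData[1] = vec4(normalize(eyeNormal), 0.0);
}
```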

Much of the literature suggests using a random set of sample points within a fixed radius around a pixel. It turns out that a regular distribution of sample points produced better results for me. This also makes sense: you want the sample points to cover the surrounding area as evenly as possible.
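One way to build such a regular pattern is to space the directions evenly on a couple of concentric rings; this is an illustrative kernel, not the exact one Geist3D uses, and `occlusionAt` is a hypothetical helper standing in for the per-sample depth test:

```glsl
// Regularly distributed sample offsets: 8 evenly spaced directions on
// 2 concentric rings (16 samples), instead of a random kernel.
float occlusion = 0.0;
for (int ring = 1; ring <= 2; ++ring)
{
    for (int i = 0; i < 8; ++i)
    {
        float angle = 6.2831853 * float(i) / 8.0;      // 2*pi in 8 steps
        vec2 offset = vec2(cos(angle), sin(angle))
                    * radius * float(ring) / 2.0;      // half and full radius
        occlusion += occlusionAt(uv + offset);         // assumed helper
    }
}
occlusion /= 16.0;
```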

Another important step I came across is to reflect a sample point over the surface normal if the angle between the sample vector and the normal is greater than 90 degrees. The assumption is that for angles greater than 90 degrees the sample point lies inside the "surface" and should thus not count. The images below illustrate how much this additional condition improves the quality. The left hand side shows Galactica with the reflection over the normal and the one on the right without it:



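In GLSL the flip described above is a one-liner; this is a minimal sketch of the test, where `offsetDir` is a 3D sample direction and `normal` the eye-space surface normal:

```glsl
// Flip a sample direction that points into the surface (angle to the
// normal greater than 90 degrees) back to the outside of the surface.
if (dot(offsetDir, normal) < 0.0)
    offsetDir = reflect(offsetDir, normal);   // mirror across the surface plane
```

Since `reflect(I, N)` returns `I - 2 * dot(N, I) * N`, a direction with a negative dot product against the normal ends up with a positive one, i.e. on the visible side of the surface.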
I also realized that it is possible to use the displacement and normal maps to compute depth values even within the texture space of a single triangle. Normally, a flat surface such as a wall would generate a smooth gradient of depth values. However, if the wall is textured with normal and displacement maps, it should contain small variations in depth, which are easily computed given the displacement value and the surface normal. The left hand side below shows a wall with depth information at the mesh level, the one on the right with depth at the pixel level:



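A rough sketch of that per-pixel refinement, under my own assumptions about the setup (the texture and uniform names are illustrative, and `dispScale` is an assumed tuning parameter mapping displacement values to eye-space units):

```glsl
// Sketch: refine the interpolated mesh depth with the displacement map so
// a flat, bump-mapped wall no longer yields a perfectly smooth gradient.
uniform sampler2D displacementTex;
uniform float dispScale;   // displacement-to-eye-space scale (assumed)

float pixelDepth(vec2 uv, float meshDepth, vec3 eyeNormal)
{
    float h = texture2D(displacementTex, uv).r;        // height in 0..1
    // Displacing the surface along its normal moves it toward or away
    // from the eye by the normal's z component (in eye space).
    return meshDepth - (h - 0.5) * dispScale * eyeNormal.z;
}
```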
There is still a problem when applying SSAO to the planet surface, since the maximum value of a 16 bit floating point number is only around 65500. At least for now, you can see much further than that when you are on the planet surface or in low orbit. I am therefore "blending" out SSAO up to the maximum of a 16 bit floating point value. I am not quite sure if I am doing this right, but at least there are no visible artifacts. You can see in the images below that the peak in the distance has no SSAO applied to it at all. As soon as I figure out how, I will encode the depth across two RGBA16F channels.



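The blend itself could be as simple as a smoothstep fade on the stored depth; this is a sketch of one way to do it, where `fadeStart` is an assumed tuning parameter rather than a Geist3D constant:

```glsl
// Fade SSAO toward zero as eye-space depth approaches the largest value a
// 16-bit float can represent (65504), leaving distant terrain untouched.
const float HALF_FLOAT_MAX = 65504.0;
uniform float fadeStart;   // depth at which the fade begins (assumed)

float fadeOcclusion(float occlusion, float depth)
{
    float fade = 1.0 - smoothstep(fadeStart, HALF_FLOAT_MAX, depth);
    return occlusion * fade;
}
```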
I have to say that, in general, my SSAO implementation darkens the final images too much. HDR undoes some of that, but I am not quite happy with it. For now I will leave it at that, since my goal is to get the entire rendering infrastructure in place first. Later I will make another pass over everything and refine it.

Unlike many other journals, I again have not posted any code, but all of the shaders are accessible via the Geist3D editor. All you have to do is figure out how to use it :)