SSAO noise/halo reduction

Started by
8 comments, last by Jason Z 12 years, 2 months ago
Hi

I implemented SSAO in my game based on code from here: http://www.gamerendering.com/2009/01/14/ssao/

It looks great, but I have two problems with my implementation:

1. A heavy amount of random noise appears everywhere (it corresponds to the noise texture input).

2. A black halo resembling a dark cloud appears on faraway objects.

I tried blurring the output using a 9x9 separable blur, but for some reason it has no effect.

What methods do most games use to reduce these artifacts?

I have written some optimizations, but I tried using the exact shader from my source and the same artifacts still appeared.
You can adjust the kernel size based on the distance to the camera; this way, far objects wouldn't have those big dark halos.

If your blur isn't working, you should debug it; once it works, it should do quite a good job of reducing the noise. But 9x9 sounds quite big (it could actually extend the dark halos).
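The distance-based scaling suggested above can be sketched like this (a minimal CPU-side Python sketch, not the poster's shader; `base_radius` and `focal_length` are illustrative values, not taken from the thread):

```python
# Sketch: shrink the SSAO sampling radius with view-space depth so
# distant geometry doesn't collect oversized dark halos.
# base_radius and focal_length are hypothetical parameters.

def ssao_sample_radius(view_depth, base_radius=1.0, focal_length=1.5):
    """Return a screen-space sampling radius for a pixel at view_depth.

    Perspective projection divides by depth, so a fixed world-space
    radius maps to a screen-space radius proportional to 1/depth.
    """
    return base_radius * focal_length / max(view_depth, 1e-4)

# A wall 10x farther away gets a 10x smaller kernel:
near = ssao_sample_radius(2.0)   # 0.75
far = ssao_sample_radius(20.0)   # 0.075
```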

Normally you would want to do a bilateral blur to ensure that you aren't filtering across depth discontinuities. You can also get improved results by checking the normal vector of the point being occluded, to ensure you aren't including sample points that are behind your current point. Can you post a screenshot of what you are seeing? That might help in giving suggestions.
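The depth-aware weighting of a bilateral blur can be sketched like this (a minimal CPU-side Python sketch rather than a shader; the sigma values and the 1D layout are illustrative assumptions):

```python
import math

# Sketch of one 1D pass of a bilateral blur over an SSAO buffer.
# A real shader would read depth from the G-buffer; here both the AO
# values and depths are plain lists, and the sigmas are made-up values.

def bilateral_blur_1d(ao, depth, radius=4, sigma_s=2.0, sigma_d=0.1):
    out = []
    for i in range(len(ao)):
        total, weight_sum = 0.0, 0.0
        for k in range(-radius, radius + 1):
            j = min(max(i + k, 0), len(ao) - 1)  # clamp at the edges
            # Spatial Gaussian falloff...
            w = math.exp(-(k * k) / (2.0 * sigma_s ** 2))
            # ...times a depth term that kills the weight across
            # discontinuities, so AO doesn't bleed between objects.
            dz = depth[j] - depth[i]
            w *= math.exp(-(dz * dz) / (2.0 * sigma_d ** 2))
            total += w * ao[j]
            weight_sum += w
        out.append(total / weight_sum)
    return out
```

On a flat surface the noise gets averaged out, while a pixel just past a depth edge keeps its own value instead of picking up a halo from the nearer object.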

[quote name='ic0de' timestamp='1327291479' post='4905322']
Normally you would want to do a bilateral blur to ensure that you aren't filtering across depth discontinuities. You can also get improved results by checking the normal vector of the point being occluded, to ensure you aren't including sample points that are behind your current point. Can you post a screenshot of what you are seeing? That might help in giving suggestions.
[/quote]

I uploaded a screenshot like you asked:

[sharedmedia=core:attachments:6844]

EDIT:

I got the blur working for sure, but it doesn't seem to be enough to remove the noise.
I don't think your blur is working at all; with a 9x9 kernel you should wipe out that noise very easily. I don't see any blurring in either of the screenshots. Try taking a frame grab with PIX and see what is going on with your texture resources at each step of the algorithm (I'm assuming you are using D3D...).

If you are using D3D11, you could take a look at the SSAO sample for my engine (see here). It has a bilateral filter implementation for the compute shader, which can be adapted to a pixel shader if you are using something earlier than D3D11.

Those shots don't have any blur applied to them; with the blur the difference is definitely noticeable, but it's still very noisy.

Oh, and by the way, I'm not using D3D, I'm using OpenGL, but your shader could still be useful, although I can't seem to find it.

[quote name='Jason Z' timestamp='1327433731' post='4905849']
Those shots don't have any blur applied to them; with the blur the difference is definitely noticeable, but it's still very noisy.

Oh, and by the way, I'm not using D3D, I'm using OpenGL, but your shader could still be useful, although I can't seem to find it.
[/quote]

If you have pulled the SVN trunk, then it is in Hieroglyph3 > Applications > Data > Shaders. There are two implementations in separate files: BilateralBruteForceCS and BilateralSeparableCS.
Those artifacts look like you are getting a whole bunch of self-intersection (points on the same plane as P are counted as occluding P). Even without blurring, the interiors of your walls and ground plane should be white (there are no occluders there). You can scale your ambient occlusion by how much "in front" of the pixel P you are shading your random sample point Q is:

float s = max(dot(n, normalize(q - p)), 0.0f);

This is in the Game Programming Gems 8 book by the StarCraft guy.
-----Quat
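A quick sketch of how that weight behaves, with plain Python standing in for the shader line above (the vectors are made-up example values): a sample on the same plane as P gets a weight of zero instead of darkening it, while a sample directly along the normal gets full weight.

```python
# CPU sketch of: float s = max(dot(n, normalize(q - p)), 0.0f);
# n = surface normal at P, q = candidate occluder sample position.

def occluder_weight(n, p, q):
    d = [q[i] - p[i] for i in range(3)]
    length = sum(c * c for c in d) ** 0.5
    if length == 0.0:
        return 0.0                      # degenerate sample: q == p
    d = [c / length for c in d]         # normalize(q - p)
    return max(sum(n[i] * d[i] for i in range(3)), 0.0)

n = (0.0, 1.0, 0.0)        # floor normal
p = (0.0, 0.0, 0.0)
coplanar = (1.0, 0.0, 0.0)  # sample on the same floor plane -> weight 0
above = (0.0, 1.0, 0.0)     # sample directly "in front" of P -> weight 1
```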

Is P in view space or world space?

[quote name='Quat' timestamp='1327514502' post='4906153']
Is P in view space or world space?
[/quote]

As long as P and Q are in the same space, it can be either one. It all depends on how you are reconstructing your positions. Normally it is easiest to work in view space, since the depth data is with respect to the camera, but you can use whichever one you want.
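The view-space reconstruction mentioned above can be sketched like this (Python standing in for shader code; it assumes a standard perspective projection, and the field-of-view and aspect values are illustrative, since the thread doesn't show the poster's projection):

```python
import math

# Sketch: reconstruct a view-space position from a depth-buffer sample,
# assuming a standard symmetric perspective projection.
# fov_y and aspect are example values, not from the thread.

def view_space_position(uv, view_depth, fov_y=math.radians(60.0),
                        aspect=16.0 / 9.0):
    """uv in [0,1]^2; view_depth = positive distance along -Z."""
    # Map uv to [-1, 1] NDC, then scale by the frustum half-extents
    # at the given depth.
    ndc_x = uv[0] * 2.0 - 1.0
    ndc_y = uv[1] * 2.0 - 1.0
    tan_half = math.tan(fov_y * 0.5)
    x = ndc_x * tan_half * aspect * view_depth
    y = ndc_y * tan_half * view_depth
    return (x, y, -view_depth)  # camera looks down -Z in OpenGL
```

The center of the screen always reconstructs to a point straight down the view axis, which is a handy sanity check when debugging.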

