Gaussian blur fading to black - is it normal?

Started by g0nzo. 4 comments, last by g0nzo 17 years, 6 months ago
Hi, I'm doing 8 Gaussian blur passes on the luminance channel only. After about the 5th pass the result is just a black texture. Is that normal for a blur that uses a Gaussian distribution?

I should mention (though I don't know if it makes any difference in this case) that I'm using an approximation of the Gaussian convolution:
- In the original algorithm the kernel size was increased with every pass (3x3 up to ~40x40) and (though I'm not sure) the original image was used as the input for each pass.
- In my approximation I downscale the input texture (with a different downscale factor for each pass), perform the convolution with a fixed 3x3 kernel and upscale the result back to the original size. The input for each pass is the output of the previous one.

[EDIT] It's not normal [smile] I just summed up all 3 kernel weights for different standard deviation values and sometimes the sum is slightly above 1 (not sure how that's possible) and sometimes it's ~0.9. So I probably need to scale them so that the sum is 1, though I'm not sure if that will help. Is there a tutorial anywhere with the equation for the 1D Gaussian distribution, the weight scaling and shader code, so I can compare it with my own?

[Edited by - g0nzo on October 6, 2006 5:49:20 AM]
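For reference, the standard 1D Gaussian and the weight normalisation discussed in the reply below are (sigma is the standard deviation, i the integer tap offset, r the kernel radius):

G(x) = \frac{1}{\sigma\sqrt{2\pi}} \, e^{-x^2 / (2\sigma^2)}

w_i = \frac{G(i)}{\sum_{j=-r}^{r} G(j)} \quad\text{so that}\quad \sum_{i=-r}^{r} w_i = 1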
After calculating the kernel, it is normal for the sum to be less than 1, because you have truncated the 'tails' off the edge of the distribution. So you should follow this up by dividing every element in the kernel by the sum, effectively normalising it. Once this is done, the kernel's sum should be as close to unity as floating point error will allow.
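As a concrete illustration (a minimal sketch, not code from the thread; the function name and parameters are made up), the weights for a 1D kernel of radius r and standard deviation sigma can be generated and normalised like this:

#include <cmath>
#include <vector>

// Build a 1D Gaussian kernel of radius r (2*r + 1 taps) and normalise it
// so the weights sum to 1, compensating for the truncated tails.
std::vector<float> MakeGaussianKernel(int r, float sigma)
{
    std::vector<float> w(2 * r + 1);
    float sum = 0.0f;
    for (int i = -r; i <= r; ++i)
    {
        // exp(-x^2 / (2*sigma^2)); the 1/(sigma*sqrt(2*pi)) factor is
        // omitted because it cancels in the normalisation below.
        w[i + r] = std::exp(-(float)(i * i) / (2.0f * sigma * sigma));
        sum += w[i + r];
    }
    for (size_t i = 0; i < w.size(); ++i)
        w[i] /= sum;   // the kernel is now unit-gain
    return w;
}

A 3x3 (or 5x5) 2D kernel is then the outer product of this 1D kernel with itself, or equivalently the blur can be run as two separable 1D passes.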

I can't imagine why the blur would fade out after several passes if the kernel is unitary. Are you sure that isn't just an illusion? If the image is primarily black with a few small light patches, the blurring will tend to homogenise the image to something pretty dark, which will eventually round down to black due to numerical inaccuracy.

A unitary blur is guaranteed to conserve the total lightness of the image, which is often too subtle for use in bloom effects and the like. It is not uncommon to deliberately scale up the whole kernel to brighten the image as it blurs. However, this is only advised if the number of blur passes is fixed.

Regards
Admiral
Ring3 Circus - Diary of a programmer, journal of a hacker.
Thanks.

Scaling weights helped, but now I've got another problem.

I get white/black pixels at the edges of the image where there shouldn't be any. After the first Gaussian blur I get a grey line along the bottom edge, even though in the original image there were only black pixels there. After downscaling, another Gaussian blur and upscaling, the grey becomes white, and it gets worse with every iteration. That's probably because, when using a 5x5 kernel, pixels on the edge of the image sample the colour of 2 pixels which lie outside the image.

Is there a solution for this problem?
Maybe try wrapping around to the opposite edge of the image when you sample "out-of-bounds" pixels?
Wrapping may solve your problem but it could introduce other artifacts. The most common way to fix an overlapping kernel is to set the texture sampler to clamp or border mode (for both u and v). You may prefer other boundary conditions, so try out a few of them.
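If this is Direct3D 9 (the AddressU/AddressV terminology suggests it), a sketch of setting clamp or border addressing on the sampler stage the blur shader reads from might look like this; 'device' (an IDirect3DDevice9*) and stage 0 are assumptions:

// Requires d3d9.h and an initialised IDirect3DDevice9* named 'device'.
// Clamp both coordinates so out-of-range taps re-use the edge texel:
device->SetSamplerState(0, D3DSAMP_ADDRESSU, D3DTADDRESS_CLAMP);
device->SetSamplerState(0, D3DSAMP_ADDRESSV, D3DTADDRESS_CLAMP);

// Or use a black border so out-of-range taps return zero instead:
// device->SetSamplerState(0, D3DSAMP_ADDRESSU, D3DTADDRESS_BORDER);
// device->SetSamplerState(0, D3DSAMP_ADDRESSV, D3DTADDRESS_BORDER);
// device->SetSamplerState(0, D3DSAMP_BORDERCOLOR, 0x00000000);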
Many people don't like the way the colour tends to congregate at the edges of the image under a clamp sampler, so they code a bounds test into the blur shader to set the out-of-range samples to black.

Note that under a clamp, lightness will be generated at the boundaries, and with boundaries fixed to zero, colours will 'leak' out of the image. In general, unless you wrap or mirror, lightness will not be conserved, though most people don't consider this to be a problem.

Regards
Admiral
Ring3 Circus - Diary of a programmer, journal of a hacker.
Thank you again.

I've set AddressU and V to CLAMP and it looks much better now.
