Gaussian blur kernel calculation

3 comments, last by Zipster 17 years, 7 months ago
Hi, does somebody know a website/applet to generate different Gaussian blur kernels? I want to set up the parameters for, say, a 3x3, 5x5, or 9x9 kernel (or others) and see the weights of the generated kernel. Does anybody know a good example of this? Thank you!
If you check out the Gaussian blur page at Wikipedia, it has the equation you would use to generate the values of the kernel. I'm not aware of a website/applet that does it off the top of my head, but unless there's a specific reason you need one, you could try Googling for it.
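To show what evaluating that equation looks like, here is a small Python sketch (the function name is my own). It samples the 2D Gaussian G(x, y) = exp(-(x^2 + y^2) / (2*sigma^2)) / (2*pi*sigma^2) at integer offsets from the kernel center:

```python
import math

def gaussian_kernel(size, sigma):
    """Build a size x size kernel by sampling the 2D Gaussian
    G(x, y) = exp(-(x^2 + y^2) / (2 * sigma^2)) / (2 * pi * sigma^2)
    at integer offsets from the center texel."""
    r = size // 2
    return [[math.exp(-(x * x + y * y) / (2.0 * sigma * sigma))
             / (2.0 * math.pi * sigma * sigma)
             for x in range(-r, r + 1)]
            for y in range(-r, r + 1)]

# With sigma = 1.0 the center weight is 1 / (2 * pi) ~= 0.1592,
# falling off symmetrically toward the corners.
k = gaussian_kernel(3, 1.0)
```

Note that these raw samples will not sum to 1, which is exactly the normalization issue discussed below.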
Thanks. I've read the article, but I'm asking because I haven't understood everything. :)

I want to blur a texture and tested a 3x3 blur kernel for that (since Matrix4x4 is my largest available type).

I haven't really understood the meaning of the "standard deviation" sigma - how does it affect the resulting kernel?

Until I know what to do with it, I used sigma = 1.0. But the sum of the generated weights was not 1.0, so applying the kernel to a set of texels changes the brightness. Is it correct to rescale the kernel so it sums to 1.0 (multiply every element by 1/sum), or would a different kernel (with a specific sigma) be correct?

But with that kernel I blur a certain texel at 100%. What if I just want to blur a texel by, for example, 30%? Would it simply be
new_texel = (original_texel * 0.7) + (blurred_texel * 0.3), or should these adjustments be made to the kernel itself?

I was thinking about maybe using a 9x9 matrix built from nine Matrix3x3s - or is it usually not worth the effort (on a GPU)?

This is BTW my 3x3 kernel:
0.075114  0.123841  0.075114
0.123841  0.204180  0.123841
0.075114  0.123841  0.075114
Any help with this would be really great. Thank you!
Quote:Original post by quasty
I haven't really understood the meaning of the "standard deviation" sigma - how does it affect the resulting kernel?

The standard deviation in this case controls how "wide" your distribution is. For example, about 68% of the weighting (the area under the curve) falls within one standard deviation of the mean (the middle).

Quote:Original post by quasty
Is it correct to rescale the kernel so it sums to 1.0 (multiply every element by 1/sum), or would a different kernel (with a specific sigma) be correct?

Since the Gaussian function extends to infinity, the weights of any finite kernel will never sum to exactly one. So the usual method is to use a "clamped" Gaussian and rescale your weights so that they sum to one.
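A sketch of that normalization in Python (function name is my own; the constant 1/(2*pi*sigma^2) cancels out during normalization, so it can be dropped). With sigma = 1.0 this reproduces the 3x3 kernel posted above exactly:

```python
import math

def normalized_gaussian_kernel(size, sigma):
    # Sample the clamped 2D Gaussian at integer offsets, then rescale
    # so the weights sum to exactly 1 (preserving image brightness).
    r = size // 2
    raw = [[math.exp(-(x * x + y * y) / (2.0 * sigma * sigma))
            for x in range(-r, r + 1)]
           for y in range(-r, r + 1)]
    total = sum(sum(row) for row in raw)
    return [[w / total for w in row] for row in raw]

# sigma = 1.0 yields center 0.204180, edges 0.123841, corners 0.075114
k = normalized_gaussian_kernel(3, 1.0)
```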

Quote:Original post by quasty
What if I just want to blur a texel by, for example, 30%?

I'm not sure what you mean, but to get a blurrier image you should increase sigma, and probably increase the number of samples you are taking as well.

Quote:Original post by quasty
I was thinking about maybe using a 9x9 matrix built from nine Matrix3x3s - or is it usually not worth the effort (on a GPU)?

The great thing about the Gaussian function is that it's separable, i.e. you can apply it one axis at a time. Unless you need to do it in a single pass, I'd highly recommend doing a horizontal blur pass followed by a vertical blur pass for efficiency: a size-N kernel then costs 2N samples per pixel instead of N^2. This will *really* start to make a difference with large blur kernels like 9x9, etc.
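To illustrate the two-pass idea, here is a CPU sketch in Python (function names are my own; on a GPU each pass would be a fragment shader sampling along one axis, and edge texels are clamped here like a clamp-to-edge sampler):

```python
import math

def gaussian_1d(size, sigma):
    # 1D Gaussian taps at integer offsets, rescaled to sum to 1.
    r = size // 2
    raw = [math.exp(-(x * x) / (2.0 * sigma * sigma))
           for x in range(-r, r + 1)]
    total = sum(raw)
    return [w / total for w in raw]

def blur_separable(image, sigma, size=3):
    # image: list of rows of grayscale floats.
    k = gaussian_1d(size, sigma)
    r = size // 2
    h, w = len(image), len(image[0])

    def clamp(v, lo, hi):
        return max(lo, min(hi, v))

    # Horizontal pass: blur each row independently.
    tmp = [[sum(k[i + r] * row[clamp(x + i, 0, w - 1)]
                for i in range(-r, r + 1))
            for x in range(w)]
           for row in image]
    # Vertical pass: blur each column of the intermediate result.
    return [[sum(k[i + r] * tmp[clamp(y + i, 0, h - 1)][x]
                 for i in range(-r, r + 1))
             for x in range(w)]
            for y in range(h)]
```

Because the 2D Gaussian is the product of two 1D Gaussians, the two passes together give the same result as convolving with the full 2D kernel.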
Quote:Original post by quasty
But with that kernel I blur a certain texel at 100%. What if I just want to blur a texel by, for example, 30%?

It's funny you should ask, because for my senior project I'm working on some terrain generation stuff (among other things) and had to ask myself a similar question. I wanted one of the color channels in a heightmap to encode the "smoothness" of the terrain at that point, so I had to modify the moving average filter I was using to account for variable degrees of smoothness.

What I came up with was a weighted moving average filter variant. You start with a regular (2N+1)x(2N+1) moving average filter kernel, then scale all the non-center weights by X/Xmax, where X is the value of your smoothness input and Xmax is its maximum value. Then you reassign the center weight to 1 - (sum of the non-center weights) to re-normalize. I don't know if such a technique already exists or has a name, but it produces good results. You might try it and see whether it works for Gaussian distribution filters as well (a moving average is technically a uniform distribution).
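That rescaling can be sketched in a few lines of Python (names are my own; the demo uses the sigma = 1.0 kernel posted above). It is also algebraically the same as the lerp proposed earlier in the thread: scaling the non-center weights by s and re-normalizing the center gives (1 - s) * identity + s * kernel, i.e. a 30% blur with s = 0.3.

```python
def variable_blur_kernel(kernel, amount):
    """Scale all non-center weights of a normalized kernel by `amount`
    (0.0 = no blur, 1.0 = full blur), then reassign the center weight
    so the kernel still sums to 1."""
    n = len(kernel)
    c = n // 2
    out = [[w * amount for w in row] for row in kernel]
    non_center = sum(sum(row) for row in out) - out[c][c]
    out[c][c] = 1.0 - non_center
    return out

# quasty's sigma = 1.0 kernel from earlier in the thread
full = [[0.075114, 0.123841, 0.075114],
        [0.123841, 0.204180, 0.123841],
        [0.075114, 0.123841, 0.075114]]
partial = variable_blur_kernel(full, 0.3)   # a "30% blur" kernel
```

With amount = 0.0 this degenerates to the identity kernel (center weight 1, everything else 0), so the texel passes through unchanged.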

This topic is closed to new replies.
