Generalisation algorithm


This is a simple generalisation algorithm.

First, you have the "learning" side of it. You simply add the input value and the correct output value to the storage array.

Now the generaliser. You are given an input; that can be a single value or a vector of values, it doesn't matter much either way (just different functions). You use the comparison function to compare the input with every input in the storage array. You then feed those similarities into a weighted average function, and you eventually arrive at a number. That number is your output.

The comparison function takes two arguments and returns how similar they are. The weighted average function takes a set of values and their corresponding weights, and returns an average in which each value's influence is proportional to its weight (e.g. an input with a weight of 30 would change the output much more than one with a weight of, say, 2).

Is this a nice little algorithm?

From,
Nice coder
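A minimal sketch of the idea described above, assuming 1-D numeric inputs and a Gaussian-style comparison function (all names here are made up for illustration):

```python
import math

# "Learning" just appends (input, output) pairs to the storage array.
storage = []

def learn(x, y):
    """Store a training example."""
    storage.append((x, y))

def similarity(a, b):
    """Comparison function: how similar are two inputs?
    Closer inputs get larger weights (a Gaussian-style kernel)."""
    return math.exp(-(a - b) ** 2)

def generalise(x):
    """Weighted average of stored outputs, weighted by similarity to x."""
    weights = [similarity(x, xi) for xi, _ in storage]
    total = sum(weights)
    return sum(w * yi for w, (_, yi) in zip(weights, storage)) / total

learn(0.0, 0.0)
learn(1.0, 1.0)
learn(2.0, 4.0)
print(generalise(1.0))  # roughly 1.42, pulled toward the nearby (1.0, 1.0) example
```

Every stored example contributes, but the exact-match example dominates, so the output lands near 1 rather than near the overall mean.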

Guest Anonymous Poster
It's a nice algorithm, but I'm afraid you're not the first person to think of it. In machine learning, what you described is called kernel regression ("kernel" referring to the comparison function). In statistics it is called Parzen window density estimation. There is also a neural network interpretation, called general regression neural networks.

There are lots of varieties as well. For example, instead of taking a weighted average of all of the instances, take the average of only the k closest instances (this is called the k-nearest-neighbours algorithm), or use a comparison function such that the weight for far-away instances is zero (restricting your average to a certain range).
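The k-nearest-neighbours variety mentioned above can be sketched in a few lines (names and the 1-D distance function are illustrative assumptions):

```python
# Stored (input, output) pairs, as in the original algorithm.
storage = [(0.0, 0.0), (1.0, 1.0), (2.0, 4.0), (10.0, 100.0)]

def knn_predict(x, k=2):
    """Average the outputs of the k stored inputs closest to x,
    ignoring everything else in the storage array."""
    nearest = sorted(storage, key=lambda pair: abs(pair[0] - x))[:k]
    return sum(y for _, y in nearest) / k

print(knn_predict(1.5))  # averages (1.0, 1.0) and (2.0, 4.0) -> 2.5
```

Note how the far-away (10.0, 100.0) example contributes nothing here, unlike in the all-instances weighted average.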

There are also ways to make this almost as fast as traditional learning algorithms by using clever data structures; see the work of Andrew Moore at Carnegie Mellon, especially his tutorials on KD-trees.
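To give a flavour of the kind of structure those tutorials cover, here is a rough, unoptimised sketch of a 2-D KD-tree with nearest-neighbour search (a hypothetical minimal implementation, not Moore's code): it avoids scanning every stored point by descending the tree and backtracking only when the splitting plane could hide a closer point.

```python
def build(points, depth=0):
    """Recursively split the points on alternating axes (x, then y)."""
    if not points:
        return None
    axis = depth % 2
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return (points[mid],
            build(points[:mid], depth + 1),
            build(points[mid + 1:], depth + 1),
            axis)

def nearest(node, target, best=None):
    """Return the stored point closest to target (squared Euclidean distance)."""
    if node is None:
        return best
    point, left, right, axis = node

    def dist2(p):
        return (p[0] - target[0]) ** 2 + (p[1] - target[1]) ** 2

    if best is None or dist2(point) < dist2(best):
        best = point
    diff = target[axis] - point[axis]
    near, far = (left, right) if diff < 0 else (right, left)
    best = nearest(near, target, best)
    # Only search the far side if the splitting plane is closer
    # than the best point found so far.
    if diff ** 2 < dist2(best):
        best = nearest(far, target, best)
    return best

tree = build([(2, 3), (5, 4), (9, 6), (4, 7), (8, 1), (7, 2)])
print(nearest(tree, (9, 2)))  # -> (8, 1)
```

With n stored examples, a query typically touches only a small fraction of the tree, which is what makes memory-based methods like the one in this thread practical at scale.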

