
Archived

This topic is now archived and is closed to further replies.

levendis

Support Vector Machines


Recommended Posts

Does anyone have any recommendations as to sites that overview Support Vector Machines and kernel based learning?

I've never heard of these Support Vector Machines or kernel-based learning (I'm assuming they're related). If somebody wouldn't mind, could you give a quick one-paragraph explanation of what these techniques are?

NB: I'm no machine learning expert but here goes:

A support vector machine is basically a way to classify instances using a separating hyperplane. This plane is the maximum-margin hyperplane, and it is defined by the instances that support it: the support vectors. So it's an instance-based classification scheme, but instances that don't affect the maximum margin can be discarded; you only keep the support vectors.

-edit
The reason you use a maximum-margin separating hyperplane is that you can show that maximizing the margin is equivalent to minimizing (a bound on) the classification error on unseen data. My memory is slightly rusty on the exact statement.
-edit
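A tiny numpy sketch of the margin idea, using toy data and a hand-picked hyperplane rather than a real SVM solver (a solver would search for the `w`, `b` that maximize the margin; here they're just assumed):

```python
import numpy as np

# Toy 2D training set: two linearly separable classes.
X = np.array([[1.0, 1.0], [2.0, 0.5], [2.5, 2.0],       # class +1
              [-1.0, -1.0], [-2.0, -0.5], [-1.5, -2.0]])  # class -1
y = np.array([1, 1, 1, -1, -1, -1])

# Assume a separating hyperplane w.x + b = 0 is already known
# (for this toy data, w = (1, 1), b = 0 separates the classes).
w = np.array([1.0, 1.0])
b = 0.0

# Geometric margin of each point: signed distance to the hyperplane,
# multiplied by the label so correctly classified points are positive.
margins = y * (X @ w + b) / np.linalg.norm(w)

# The support vectors are the points closest to the hyperplane;
# they alone pin down the maximum-margin solution -- every other
# point could be deleted without changing it.
support = X[np.isclose(margins, margins.min())]
print(margins.min())
print(support)
```

Here the two points `(1, 1)` and `(-1, -1)` come out as the support vectors: they sit closest to the plane on either side.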

A kernel function is a way to use a non-linear mapping to transform the instances so that the separating surface becomes non-planar. Think of a circular blob of black dots in the center of a grid (instance space), surrounded by red dots. It's impossible to separate the red dots from the black ones with a line/plane. But if you place this grid on a sphere and treat it like a piece of cloth, the black dots will sit on top of the sphere and the red ones will hang lower, so you can use a plane perpendicular to the up-axis to separate the black dots from the red ones.

The point of kernel functions is that they are computationally efficient: they let you keep computing in the original lower-dimensional space (the grid without the sphere) while still getting the effect of the non-linear mapping, because the kernel evaluates the inner product in the mapped space directly.
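A numpy sketch of that blob-on-a-sphere picture, with a hypothetical lifting map phi(x, y) = (x, y, x² + y²) standing in for the sphere (a common textbook choice, not anything specific from this thread):

```python
import numpy as np

# Black dots cluster near the origin; red dots form a ring around them.
# No straight line separates the two classes in 2D.
black = np.array([[0.2, 0.1], [-0.3, 0.2], [0.1, -0.25], [0.0, 0.3]])
red   = np.array([[2.0, 0.0], [0.0, 2.1], [-1.9, 0.4], [1.5, -1.5]])

# Explicit non-linear mapping phi(x, y) = (x, y, x^2 + y^2):
# each point is lifted by its squared distance from the origin.
def lift(points):
    return np.column_stack([points, (points ** 2).sum(axis=1)])

black3d, red3d = lift(black), lift(red)

# In 3D, the flat plane z = 1 now separates the classes:
# black points end up below it, red points above it.
print(black3d[:, 2].max(), red3d[:, 2].min())

# The kernel trick: k(a, b) = phi(a).phi(b) can be computed straight
# from the 2D coordinates, without ever building the 3D vectors:
# phi(a).phi(b) = a.b + ||a||^2 * ||b||^2
def kernel(a, b):
    return a @ b + (a @ a) * (b @ b)
```

So an SVM trained with `kernel` behaves as if it were finding a plane in the lifted 3D space, while only ever touching 2D points.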

I hope that made sense. It's kind of hard to explain without pictures. Anyway, this all assumes you at least have some experience with machine learning concepts. If you don't, you'll probably need more thorough explanations.

[edited by - GameCat on August 28, 2003 12:37:02 PM]

Guest Anonymous Poster
The trick, then, I suppose is how to come up with appropriately adaptive kernel functions, because a lot of the shape of your learning corpus would actually be encoded in the functions themselves. Fascinating!

Here's another informative link: http://www.support-vector.net/icml-tutorial.pdf

Also, I probably could have been clearer above. Choosing the maximum-margin hyperplane does not minimize classification error on training data (which is fairly trivial and only leads to overfitting); it minimizes error on test data.

Does anybody know the pros and cons of kernel methods and support vector machines versus neural nets?

Because the description GameCat gave sounds nearly the same as what neural nets do.

I'm a specialist in neural nets and am beginning to write my AI applications; would it be worth learning this?

Thanks

Crazyjul

