Support Vector Machines

Quote:Original post by Emergent
Quote:Original post by Steadtler
You just use a linear regression algorithm with your kernel, instead of a classifier. The biggest difference lies with the support: for classification, you want to consider the points closest to the solution's hyperplane ('inside' the support). For regression, however, you want to consider the points that are far enough from the solution ('outside' the support). Obviously, in regression you can ignore which side of the hyperplane a point lies on.


Oh! Ok! So you're not really talking about SVMs any more; you're talking about applying the kernel trick to a (different) linear regression algorithm...

So, got it; thanks.


Some call it Support Vector Machine for Regression, some just Support Vector Regression. As long as you use a hyperplane and a support, it's an SVM. It's strange how everyone (including me) has come to associate SVMs with the kernel trick rather than with the idea of the support.
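
To make the distinction concrete, here is a minimal sketch (assuming scikit-learn and NumPy, which nobody in the thread actually names) of epsilon-SVR fit once with a plain linear kernel and once with the kernel trick (an RBF kernel). In both cases the support vectors are the training points lying on or outside the epsilon-tube around the fitted function, i.e. the points "far enough from the solution", in contrast to classification, where the support vectors are the points closest to the separating hyperplane.

import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3.0, 3.0, size=(80, 1)), axis=0)
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=80)

# Same epsilon-SVR machinery, with and without the kernel trick.
linear_svr = SVR(kernel="linear", C=1.0, epsilon=0.1).fit(X, y)
rbf_svr = SVR(kernel="rbf", C=1.0, epsilon=0.1).fit(X, y)

# Training points with |y - f(x)| >= epsilon end up as support vectors
# ("outside" the epsilon-tube); points strictly inside the tube get zero
# dual coefficients and drop out of the solution entirely.
print("linear SVR support vectors:", len(linear_svr.support_))
print("RBF SVR support vectors:", len(rbf_svr.support_))

Shrinking epsilon pulls more training points outside the tube and so grows the support set, which is the regression-side analogue of the margin in classification.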
Quote:Original post by Steadtler
Some call it Support Vector Machine for Regression, some just Support Vector Regression.


That was a useful phrase, thanks; it led me to this tutorial paper (which others might be interested in reading too).
