Support Vector Machines


discodowney    102
Has anyone ever worked with one of these? They're like a neural network, as I understand it. If anyone has experience with them, would you recommend using one? I'm thinking about doing a Masters dissertation on using one to coordinate the AI of things like the police cars in GTA, so the cars don't just ram the player but try to intelligently take you down, or herd you into dead ends or the sea or something. They were suggested to me as an alternative to neural networks, which can be too much hassle. Any other suggestions are welcome. Cheers

Emergent    982
I think that something like an SVM would be at most a very small part of such an AI, and that planning algorithms are the more interesting (and useful) thing to be looking at for this problem. SVMs or the like might be useful for classifying certain configurations as "good" or "bad"; then you might plan to reach a "good" configuration (or avoid a "bad" one). I'm not sure that you need the function approximator (which is all an SVM is: an approximator of boolean-valued functions) in there at all, but maybe it would be useful for cutting down the length of the plans you consider.
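
To make that classification idea concrete, here is a minimal sketch using scikit-learn (an assumption on my part; the thread names no library). The two "configuration" features, distance to the player and closing speed, and all the labels are invented purely for illustration:

```python
# Hypothetical sketch: an SVM classifying pursuit "configurations" as
# good (1) or bad (0). The features and labels are made up.
from sklearn.svm import SVC

# Each row: [distance_to_player, closing_speed] -- invented features.
X = [[1.0, 2.0], [1.5, 1.8], [5.0, 0.2], [6.0, 0.1],
     [0.8, 2.5], [5.5, 0.3]]
y = [1, 1, 0, 0, 1, 0]  # 1 = "good" configuration, 0 = "bad"

clf = SVC(kernel="rbf", C=1.0)  # RBF kernel: the usual non-linear choice
clf.fit(X, y)

# A planner could query the learned boolean-valued function like this:
print(clf.predict([[1.2, 2.1]]))  # close and fast: expect "good" (1)
```

The planner, not the classifier, would still do the actual work of choosing actions; the SVM only prunes or scores candidate states.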

discodowney    102
It's for a dissertation, so it can't be a rehash of ideas already out there; it's more about seeing how they would work in the application. I also like the online-learning aspect of SVMs. I'm still not sure which application I'll be working with. I've heard RoboCup suggested as an idea, and I'm looking into that. Does anyone know of any others?

Predictor    198
Quote:
Original post by discodowney
Has anyone ever worked with one of these? They're like a neural network, as I understand it. If anyone has experience with them, would you recommend using one? I'm thinking about doing a Masters dissertation on using one to coordinate the AI of things like the police cars in GTA, so the cars don't just ram the player but try to intelligently take you down, or herd you into dead ends or the sea or something. They were suggested to me as an alternative to neural networks, which can be too much hassle. Any other suggestions are welcome.


I have worked on an SVM project, but the other person did most of the SVM work. My perception was that they are needlessly complex, but some people have claimed great success working with them on some applications. I don't think that neural networks need be as complicated as many people make them out to be. Consider my article on simplifying neural network configuration:

http://abbottanalytics.blogspot.com/2006/11/family-recipe-for-neural-networks.html

If you're looking for other modeling tools, there are too many to list. Take a look at the tools listed on KDnuggets:

http://www.kdnuggets.com/software/index.html

If you're looking for other applications, think about data mining. There are several annual competitions and practical, interesting applications abound.

Steadtler    220
They are definitely NOT like a neural network.

They are simply a linear maximum-margin classifier made non-linear by using the kernel trick. They can also be used for regression ('curve fitting') instead of classification. How conveniently can you express your problem as a classification or regression problem?

Neural networks are a representation of a cost (or 'fit', depending on how you see it) function as something closely resembling a polynomial.

I don't see how either of these has anything to do with collaborative AI. It seems like you want an AI that thinks ahead, which implies some form of planning.
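
As a sketch of that classification-versus-regression distinction, the same kernel machinery can be pointed at either kind of problem. Assuming scikit-learn, with synthetic data invented for illustration:

```python
# Sketch: the same RBF kernel used for classification (SVC) and for
# regression (SVR). All data here is synthetic.
import numpy as np
from sklearn.svm import SVC, SVR

X = np.linspace(0, 6, 40).reshape(-1, 1)

# Classification: label each point by which side of x = 3 it falls on.
y_class = (X.ravel() > 3).astype(int)
clf = SVC(kernel="rbf").fit(X, y_class)

# Regression: fit a smooth curve through noisy sine samples.
rng = np.random.RandomState(0)
y_reg = np.sin(X.ravel()) + 0.1 * rng.randn(40)
reg = SVR(kernel="rbf", epsilon=0.1).fit(X, y_reg)

print(clf.predict([[1.0]]), clf.predict([[5.0]]))  # opposite sides of boundary
print(reg.predict([[1.5]]))  # should land roughly near sin(1.5)
```

The practical question is the one raised above: whether "where should the police car go" is more naturally posed as a label, a curve, or a plan.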

Emergent    982
Quote:
Original post by Steadtler
[SVMs] can also be used for regression ('curve-fitting') instead of classification.


I'm only familiar with their use in classification... How does this work? Do you use the surface which separates "yes" from "no" as your curve, courtesy of the implicit function theorem? Or is this something else?

Steadtler    220
Quote:
Original post by Emergent
Quote:
Original post by Steadtler
[SVMs] can also be used for regression ('curve-fitting') instead of classification.


I'm only familiar with their use in classification... How does this work? Do you use the surface which separates "yes" from "no" as your curve, courtesy of the implicit function theorem? Or is this something else?


You just use a linear regression algorithm with your kernel instead of a classifier. The biggest difference lies in the support: for classification, you want to consider the points closest to the solution's hyperplane ('inside' the support). For regression, however, you want to consider the points that are far enough from the solution ('outside' the support). Obviously, in regression you can ignore which side of the hyperplane a point lies on.
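
This inside/outside distinction can be checked directly. In epsilon-SVR, the support vectors are exactly the samples lying on or outside the epsilon tube around the fitted function; points strictly inside the tube get zero dual weight and drop out. A sketch, assuming scikit-learn's SVR (which exposes the support indices as `support_`) and synthetic data:

```python
# Sketch: in epsilon-SVR, non-support points sit strictly inside the
# epsilon tube around the fit. Synthetic data; scikit-learn assumed.
import numpy as np
from sklearn.svm import SVR

rng = np.random.RandomState(1)
X = np.linspace(0, 5, 60).reshape(-1, 1)
y = np.sin(X.ravel()) + 0.2 * rng.randn(60)

eps = 0.15
reg = SVR(kernel="rbf", epsilon=eps).fit(X, y)

residuals = np.abs(y - reg.predict(X))
# Indices NOT in reg.support_ are the non-support samples.
non_sv = np.setdiff1d(np.arange(len(X)), reg.support_)

# Every non-support residual should be at most epsilon (up to solver
# tolerance) -- the opposite of the classification picture, where the
# support vectors are the points *closest* to the boundary.
print(residuals[non_sv].max(), "<= epsilon =", eps)
```
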

Emergent    982
Quote:
Original post by Steadtler
You just use a linear regression algorithm with your kernel instead of a classifier. The biggest difference lies in the support: for classification, you want to consider the points closest to the solution's hyperplane ('inside' the support). For regression, however, you want to consider the points that are far enough from the solution ('outside' the support). Obviously, in regression you can ignore which side of the hyperplane a point lies on.


Oh! Ok! So you're not really talking about SVMs any more; you're talking about applying the kernel trick to a (different) linear regression algorithm...

So, got it; thanks.

Steadtler    220
Quote:
Original post by Emergent
Quote:
Original post by Steadtler
You just use a linear regression algorithm with your kernel instead of a classifier. The biggest difference lies in the support: for classification, you want to consider the points closest to the solution's hyperplane ('inside' the support). For regression, however, you want to consider the points that are far enough from the solution ('outside' the support). Obviously, in regression you can ignore which side of the hyperplane a point lies on.


Oh! Ok! So you're not really talking about SVMs any more; you're talking about applying the kernel trick to a (different) linear regression algorithm...

So, got it; thanks.


Some call it Support Vector Machine for Regression, some just Support Vector Regression. As long as you use a hyperplane and a support, it's an SVM. It's strange how everyone (including me) has come to associate SVMs with the kernel trick rather than with the idea of the support.

Emergent    982
Quote:
Original post by Steadtler
Some call it Support Vector Machine for Regression, some just Support Vector Regression.


That was a useful phrase, thanks; it led me to this tutorial paper (which others might be interested in reading too).

