sidhantdash

How are NNs used in NEAT trained?


Recommended Posts

Hi, can someone please tell me how the NNs used in NEAT are trained? Just a pointer, please. You can find the paper here. They use a speciated GA to evolve a population of NNs. I am working on something similar to this and find it interesting, but I haven't been able to find out how such networks are trained. Is it recurrent BP?

Guest Anonymous Poster
It isn't recurrent BP, because NEAT doesn't use backpropagation. The genetic algorithm evolves the weights, although the paper doesn't go into much detail in this area. It just says that the weights can be perturbed, which probably means they are changed by a small amount when a mutation occurs.
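For illustration, a weight-perturbation mutation of the kind described above can be sketched like this. The probabilities and step size are made-up illustrative values, not NEAT's actual parameters:

```python
import random

def mutate_weights(weights, perturb_prob=0.9, step=0.1, reset_range=2.0):
    """Return a mutated copy of a weight list: each weight is usually
    nudged by a small Gaussian amount, and occasionally replaced
    outright with a fresh random value."""
    mutated = []
    for w in weights:
        if random.random() < perturb_prob:
            mutated.append(w + random.gauss(0.0, step))                # small perturbation
        else:
            mutated.append(random.uniform(-reset_range, reset_range))  # full reset
    return mutated
```

The GA applies something like this to an individual's weight genes during reproduction, with selection pressure deciding which perturbations survive.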

But do you think recurrent BP would be a good idea for training the individual nets in the population, using the GA only to evolve structure?

I don't believe any form of backpropagation is used in NEAT at all. All training of the weights and structure of NEAT networks comes from a very specialized genetic algorithm. The paper you mentioned does a fairly good job of explaining the details behind NEAT. Ken Stanley's homepage is http://www.cs.ucf.edu/~kstanley/ - he has a number of publications on NEAT, some of which go into more depth on the method itself. His dissertation probably gives you the most complete picture of NEAT.

-Kirk

Quote:
Original post by sidhantdash
But do you think recurrent BP would be a good idea for training the individual nets in the population, using the GA only to evolve structure?

You are trying to combine two methods here: backpropagation to adjust the weights, and a genetic algorithm to evolve the structure. From the paper you linked, it doesn't seem that NEAT uses backpropagation at all. NEAT uses a genetic algorithm to adjust the weights, the weight connectivity, and the structure.
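As a rough sketch of what "adjusting weights, connectivity, and structure" looks like in a NEAT-style encoding, here is a minimal connection-gene representation. The field names and the helper below are illustrative, not NEAT's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class ConnectionGene:
    in_node: int      # source node id
    out_node: int     # destination node id
    weight: float     # adjusted by weight mutation
    enabled: bool     # connections can be disabled rather than deleted
    innovation: int   # historical marking used to line up genomes in crossover

def add_connection(genome, in_node, out_node, weight, next_innovation):
    """Structural mutation: append a new connection gene and hand back
    the next unused innovation number."""
    genome.append(ConnectionGene(in_node, out_node, weight, True, next_innovation))
    return next_innovation + 1

# a tiny two-connection genome feeding nodes 0 and 1 into node 2
genome = [
    ConnectionGene(0, 2, 0.7, True, 1),
    ConnectionGene(1, 2, -0.5, True, 2),
]
```

Weight mutation perturbs the `weight` fields, toggling `enabled` changes connectivity, and structural mutations append new genes, so a single genetic representation covers all three kinds of adjustment.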

So NEAT doesn't use BP at all; I really hadn't been able to work that out from the paper. Thanks a lot, everyone.

But nevertheless, suppose I write my own version of the program (using ideas like speciation and minimal starting structures), use recurrent BP to train the individual nets in the population, and use the GA only to evolve structure. Do you think it will work? Actually, I am in the process of preparing a research proposal for a summer undergraduate research program, and I want to work on evolving neural nets. I have already done a course on NNs and am familiar with most forms of neural nets (except the evolving ones).

My proposal as of now is to apply a neuro-genetic evolutionary process to predict currency exchange rates and compare the results with those obtained using standard statistical methods (like ARCH and GARCH). For the neuro-genetic method, I referred to the NEAT paper, and because I will be writing my own code, I thought of using recurrent BP to train the individual nets (since I have already done some BP at college). Also, my project will be much simpler than NEAT (quite naturally, given that NEAT happens to be a PhD dissertation). Do you think it is a good (read: FEASIBLE) idea? Will it work, or does one have to stick to what the original paper says about training for the process to work?
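For what it's worth, the hybrid being proposed (BP refining the weights inside each individual, the GA handling structure) is essentially a Lamarckian scheme. A minimal sketch of one generation, where `train`, `fitness`, and the structural mutation are hypothetical stand-ins the experimenter would supply:

```python
import random

def lamarckian_step(population, train, fitness, elite_frac=0.5):
    """One generation of the hybrid: `train` (standing in for recurrent BP)
    refines each individual's weights and writes them back into the genome
    (Lamarckian inheritance); the GA then keeps the fittest structures and
    refills the population with structural mutants of the survivors."""
    trained = [train(ind) for ind in population]   # BP pass per individual
    trained.sort(key=fitness, reverse=True)        # rank by fitness
    n_survivors = max(1, int(len(trained) * elite_frac))
    survivors = trained[:n_survivors]
    children = [mutate_structure(random.choice(survivors))
                for _ in range(len(population) - n_survivors)]
    return survivors + children

def mutate_structure(ind):
    # illustrative structural mutation: grow the hidden layer by one node
    return {**ind, "hidden": ind["hidden"] + 1}
```

Whether writing trained weights back into the genome (Lamarckian) or only using them for fitness evaluation (Baldwinian) works better is itself an empirical question, which is part of what makes the hybrid a research topic rather than a solved recipe.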

Thanks!

You can definitely deviate from the NEAT methodology and attempt new methods. There are a number of publications out there covering a wide variety of ways to evolve neural net structure and weights. NEAT is one of the newest methods, and Ken Stanley has gotten a lot of press lately, which is well deserved in my opinion. But that does not mean other methods shouldn't be attempted.

If you're interested in other publications, you should look to Google Scholar (http://scholar.google.com/) and CiteSeer (http://citeseer.ist.psu.edu/). I've found numerous papers through CiteSeer on this topic. I haven't tried Google Scholar on this particular topic, but I've found it useful in many other cases.

-Kirk

Quote:
Original post by sidhantdash
But nevertheless, suppose I write my own version of the program (using ideas like speciation and minimal starting structures), use recurrent BP to train the individual nets in the population, and use the GA only to evolve structure. Do you think it will work? Actually, I am in the process of preparing a research proposal for a summer undergraduate research program, and I want to work on evolving neural nets. I have already done a course on NNs and am familiar with most forms of neural nets (except the evolving ones).

It's a huge research subject, worthy of a Ph.D. in itself. Think about it: if a neural network needs to be adjusted and you have both backpropagation and a genetic algorithm available, how should you adjust it? Should you adjust the weights, or should you adjust the structure and weight connectivity? How do you decide? Deciding which method to use is the tricky part. There might even be interaction between the two methods: how does BP affect the evolution, and vice versa? Chances are you might need to insert backpropagation into the evolving structure and let evolution decide which form of backpropagation works.
