Capacity of learning within an NN

Started by Dark Star · 14 comments, last by Dark Star 20 years ago
Hi guys, would you say that the number of neurons an NN has affects how much it can remember or learn from its training set? For example, if one NN had a small number of neurons per layer and another NN had a very large number of neurons per layer, would the larger NN be able to learn more, or learn better, besides being slower to process, of course?

Thanks in advance,
DarkStar UK
------------------------------- Loves cross-posting because it works
---------------------------------------------You Only Live Once - Don't be afraid to take chances.
quote:Original post by Dark Star
Would you say that the number of neurons an NN has affects how much it can remember or learn from its training set? For example, if one NN had a small number of neurons per layer and another NN had a very large number of neurons per layer, would the larger NN be able to learn more, or learn better, besides being slower to process, of course?


Generally, yes, although I'd clarify this by saying that a larger neural network will have the ability to learn more patterns, as opposed to examples. For example, if a neural network of a certain size effectively learns a simple pattern, then more examples of that simple pattern will not require a larger neural network.
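A minimal sketch of that distinction, assuming scikit-learn and XOR as the "simple pattern" (both my choices, not from the post): the same tiny network copes with 4 distinct examples and 400 repetitions of them equally well, because repetition adds no new patterns.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 0])  # XOR: one simple pattern, four distinct cases

# Repeat the same examples 100 times: more data, but no new patterns.
X_big, y_big = np.tile(X, (100, 1)), np.tile(y, 100)

for inputs, targets in [(X, y), (X_big, y_big)]:
    # The same small architecture serves both training sets.
    net = MLPClassifier(hidden_layer_sizes=(4,), activation='tanh',
                        solver='lbfgs', max_iter=2000, random_state=0)
    net.fit(inputs, targets)
    print(len(inputs), "examples -> training accuracy:", net.score(inputs, targets))
```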

-Predictor
http://will.dwinnell.com




[edited by - Predictor on January 7, 2004 9:06:56 AM]
If you have too many neurons you risk the problem of overfitting. The easiest way to describe this is that NNs don't really learn the underlying concept you think they might; they learn the training data. More neurons let your network learn the training data better, so theoretically, with enough neurons, you could train a network that gets 100% accuracy on any set of training data. However, this doesn't mean it is learning what it's supposed to; it's learning the data, including any errors that data contains. So if your data is not a truly perfect representation of exactly how things are in the problem you are trying to solve (as is normally the case in real-world problems), your network with a lot of nodes will actually perform worse than your network with fewer nodes.

For this reason, it is best to have training data and test data: two separate data sets. You train the network on the training data and test it on the test data. You can then repeat the process, altering the number of neurons, and select the network that performs best on the test data. There are many modifications of this process, but this is the simplest.
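A minimal sketch of that procedure, assuming scikit-learn and a synthetic noisy dataset (both illustrative choices, not from the post): train one network per candidate size, score each on the held-out test set, and keep the size that generalizes best.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# Noisy 1-D problem: the underlying pattern is sin(x); the noise is the
# kind of error an over-large network will happily memorize.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.3, size=200)

# Two separate data sets: train on one, judge on the other.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

best_size, best_score = None, -np.inf
for n_hidden in [1, 2, 4, 8, 16, 64, 256]:
    net = MLPRegressor(hidden_layer_sizes=(n_hidden,), max_iter=5000,
                       random_state=0).fit(X_train, y_train)
    score = net.score(X_test, y_test)  # R^2 on data the net never saw
    print(f"{n_hidden:4d} hidden units -> test R^2 = {score:.3f}")
    if score > best_score:
        best_size, best_score = n_hidden, score

print("Best-generalizing size:", best_size)
```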
Yeah, the larger net will have a better learning capacity. You thinking of making one?

My Kung Fu is stronger. May the Source be with you.
neo88
quote:Original post by neo88
Yeah, the larger net will have a better learning capacity.


neo88, I'm interested to hear what you mean by 'learning capacity'. Do you mean that a larger network has a larger capacity for knowledge, or that it has a faster rate of learning, or perhaps something else? Could you elaborate on your statement further, as I am genuinely interested.

Thanks,

Timkin
This needs saying, not necessarily to disagree with anyone in this thread, but to challenge a few myths: more neurons do not lead to better performance. With a large NN you get the potential for a bit more precision, but that usually affects performance negatively.

The best and fastest learning NN is the smallest that fits the patterns in question.

Alex


AiGameDev.com


You can also have a problem with overfitting if you have too many neurons in your hidden layer, along with too many hidden layers...

Most uses of neural nets only need one hidden layer, at most two, before reaching the output. So the issue simply becomes: one hidden layer or two, and how many neurons do I need in each hidden layer?

With not enough neurons, the net cannot form an accurate generalization of the input data; with too many, it will overfit, finding a solution that works with the training data but performs poorly on everything else.

Ultimately it boils down to its use: are you making an NN just to make one, or does it serve a more practical purpose? My greatest advice is to code up a neural net where you can add and remove neurons and layers, so you can test out which number works best for you...
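A rough numpy sketch of that advice (my own illustration, not the poster's code): keep the layer sizes in a single list, so adding or removing neurons and hidden layers is a one-line change.

```python
import numpy as np

class TinyMLP:
    def __init__(self, sizes, seed=0):
        # sizes like [2, 8, 1]: 2 inputs, one hidden layer of 8, 1 output.
        rng = np.random.default_rng(seed)
        self.W = [rng.normal(scale=0.5, size=(a, b))
                  for a, b in zip(sizes[:-1], sizes[1:])]
        self.b = [np.zeros(b) for b in sizes[1:]]

    def forward(self, x):
        # Keep every layer's activations; backprop needs them.
        acts = [x]
        for W, b in zip(self.W, self.b):
            acts.append(np.tanh(acts[-1] @ W + b))
        return acts

    def train_step(self, x, target, lr=0.1):
        # One gradient step on squared error, backpropagated layer by layer.
        acts = self.forward(x)
        delta = (acts[-1] - target) * (1 - acts[-1] ** 2)  # tanh derivative
        for i in reversed(range(len(self.W))):
            grad_W, grad_b = acts[i].T @ delta, delta.sum(axis=0)
            if i > 0:
                delta = (delta @ self.W[i].T) * (1 - acts[i] ** 2)
            self.W[i] -= lr * grad_W
            self.b[i] -= lr * grad_b

# Trying a different architecture is a one-line change:
# TinyMLP([2, 8, 1]) versus TinyMLP([2, 16, 8, 1])
```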

I personally try not to use NNs at all unless there is a true need for them, because often they are not needed and are only used because the name sounds cool, when more often than not you could replace a neural net with a few 'switch' statements or, if needed, a fuzzy inferencer. Plus they are too damned computationally expensive.

Read this; it is good reading on NNs by one of my professors, Dr. Beaty:
http://www.jodypaul.com/cs/ai/BeatyNN.pdf

-Lucas
Too many neurons and your NN starts to behave like a simple lookup table.
While it is true that blindly allowing a large neural network to train for too long will result in overfitting, any qualified analyst would constrain the model by early stopping or some other method.
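One such constraint, sketched with scikit-learn's built-in early-stopping option (a modern stand-in I'm assuming for illustration; the thread predates this library): training halts once the score on an internal validation split stops improving, so even a deliberately oversized network cannot keep fitting the training noise.

```python
from sklearn.neural_network import MLPRegressor

net = MLPRegressor(
    hidden_layer_sizes=(256,),  # deliberately oversized
    early_stopping=True,        # hold out part of the training data...
    validation_fraction=0.2,    # ...this fraction of it...
    n_iter_no_change=20,        # ...and stop after 20 epochs with no improvement
    max_iter=10000,
    random_state=0,
)
# net.fit(X_train, y_train)  # halts early once the validation score plateaus
```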

-Predictor
http://will.dwinnell.com



[edited by - Predictor on January 7, 2004 10:05:32 AM]
quote:Original post by Predictor
While it is true that blindly allowing a large neural network to train for too long will result in overfitting, any qualified analyst would constrain the model by early stopping or some other method.



I'm curious what the techniques for that are. You can't guess how well a network is going to perform, so just 'stopping' the training early could leave you with something that doesn't perform anywhere near where it should.

I think the superior method is to just try different numbers of nodes, train them until they converge, and then see which one performs best on a data set that you didn't train them on.

Or you could start with many nodes and remove nodes during training (there are methods of doing this).
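A rough sketch of one such method, magnitude-based unit pruning (my choice of technique; the poster doesn't name one), assuming a single-hidden-layer net stored as numpy arrays:

```python
import numpy as np

def prune_weakest_unit(W_in, b_hidden, W_out):
    """Drop the hidden unit whose outgoing weights have the smallest norm.

    W_in: (n_inputs, n_hidden), b_hidden: (n_hidden,), W_out: (n_hidden, n_outputs)
    """
    weakest = np.argmin(np.linalg.norm(W_out, axis=1))
    keep = np.delete(np.arange(W_in.shape[1]), weakest)
    return W_in[:, keep], b_hidden[keep], W_out[keep, :]

# Typical loop: train, prune one unit, retrain, and stop pruning
# once the error on the held-out test set starts to rise.
```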

[edited by - drjonesdw3d on January 7, 2004 11:09:50 AM]

