# ANN questions


## Recommended Posts

I was wondering: a) Does having multiple hidden layers require an exponentially larger amount of training, and how much does it improve accuracy? b) Which functions (sigmoid, etc.) do you use?

a) I would assume so, but I'm still working on my backpropagation network, so I can't give you any numbers yet!

b) I use a simple step (threshold) function for single neurons, since the activation function only needs to be differentiable for multi-layer networks, and it doesn't give "messy" numbers like the sigmoid function does.

For everything else so far, though, I've used the sigmoid function because it's cleaner than everything else I've come across, and its range is limited to (0, 1), which is very helpful when teaching boolean operations.
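To make the two options concrete, here is a minimal sketch of both activation functions in Python (the function names and the zero threshold are my own choices, not anything standard). The sigmoid's derivative, conveniently expressible in terms of the sigmoid itself, is what makes it usable with backpropagation:

```python
import math

def step(x, threshold=0.0):
    """Step (threshold) activation: fine for a single neuron,
    but not differentiable, so unusable for backpropagation."""
    return 1.0 if x >= threshold else 0.0

def sigmoid(x):
    """Logistic sigmoid: smooth, differentiable, output in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_derivative(x):
    """Derivative of the sigmoid: s'(x) = s(x) * (1 - s(x))."""
    s = sigmoid(x)
    return s * (1.0 - s)
```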

a) Yes. I'm not sure whether it's exponential or just some high power, but it's a lot for sure.

However, the accuracy may even get significantly worse. Neural network training is analogous to finding a function that passes through many points on a plane. The number of layers and neurons plays the role of the number of coefficients in the function: if it is too high, the function becomes overly complicated.
For two points the best fit is a straight line, but with a "more advanced" function you might end up with a strange curve that happens to cross those points. Accordingly, a big neural network needs a lot more training and tends to memorize the answers (overfitting), making it better on the exact training data but worse on similar-but-slightly-different data. And if you make the network too small, it might not have the capacity needed to match the data (you can't put a straight line through points on a curve).
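The curve-fitting analogy above can be demonstrated directly with polynomial fitting in Python (the data, noise values, and degrees here are illustrative choices). A degree-4 polynomial has enough coefficients to pass exactly through five slightly noisy points sampled from a line, "memorizing" the noise, but it then predicts worse at an unseen point than the plain straight-line fit:

```python
import numpy as np

# Five training points from the line y = 2x, with small fixed "noise"
x_train = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y_train = 2.0 * x_train + np.array([0.1, -0.1, 0.05, -0.05, 0.1])

line  = np.polyfit(x_train, y_train, deg=1)  # "small network": a straight line
curve = np.polyfit(x_train, y_train, deg=4)  # "big network": interpolates all 5 points

x_test = 5.0             # a point just outside the training data
true_y = 2.0 * x_test    # the underlying relationship

err_line  = abs(np.polyval(line, x_test) - true_y)
err_curve = abs(np.polyval(curve, x_test) - true_y)

# The big model fits the training data almost exactly,
# yet generalizes worse at the unseen point than the simple line.
print(err_line, err_curve)
```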
The best idea is to use a network as small as possible, but big enough for the job. For most applications a single hidden layer is enough.
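As a concrete illustration of a single hidden layer being enough, here is a minimal sketch of a one-hidden-layer network trained with plain gradient-descent backpropagation on XOR, the classic problem that needs a hidden layer at all. The layer size, learning rate, iteration count, and random seed are arbitrary choices for the demo, not recommendations:

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR truth table: not linearly separable, so a hidden layer is required
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 4 neurons (arbitrary size), one output neuron
W1 = rng.normal(size=(2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1)); b2 = np.zeros((1, 1))

def forward(X):
    h = sigmoid(X @ W1 + b1)
    return h, sigmoid(h @ W2 + b2)

_, out = forward(X)
initial_loss = np.mean((out - y) ** 2)

lr = 1.0
for _ in range(5000):
    h, out = forward(X)
    # Backpropagate the mean-squared error through both layers
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out / len(X)
    b2 -= lr * d_out.mean(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_h / len(X)
    b1 -= lr * d_h.mean(axis=0, keepdims=True)

_, out = forward(X)
final_loss = np.mean((out - y) ** 2)
print(initial_loss, final_loss)
```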
