
llvllatrix

Artificial intelligence


KoalaGerch
ANNs are pretty easy to write; it's understanding the maths behind them that is hard.

Pseudocode for a single pass of a single-layer perceptron network is as follows:
----------------
// network sizes (example values)
const int numInputs  = 4;
const int numOutputs = 3;

// the values at each stage
float inputs[numInputs]   = {0};
float outputs[numOutputs] = {0};

// the weighting values: every input connects to each output
float weights[numOutputs][numInputs] = {{0}};

for (int a = 0; a < numOutputs; a++)      // for all outputs
    for (int b = 0; b < numInputs; b++)   // for all inputs
        outputs[a] += inputs[b] * weights[a][b];

----------------

For a multilayer perceptron network you simply turn the current output layer into a "hidden" layer and add a new output layer on top. That means another set of weights between the hidden and output neurons, and another for loop around what is already there.
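A minimal sketch of that extension, assuming a single hidden layer (the sizes and array names below are placeholders, not from the post above):
----------------
// example sizes - purely illustrative
const int numInputs = 4, numHidden = 5, numOutputs = 3;

float inputs[numInputs]   = {0};
float hidden[numHidden]   = {0};
float outputs[numOutputs] = {0};

float weightsIH[numHidden][numInputs]  = {{0}};  // input  -> hidden
float weightsHO[numOutputs][numHidden] = {{0}};  // hidden -> output

// input layer -> hidden layer
for (int h = 0; h < numHidden; h++)
    for (int i = 0; i < numInputs; i++)
        hidden[h] += inputs[i] * weightsIH[h][i];
// (a real MLP would also squash each hidden[h] with an activation
//  function such as a sigmoid at this point)

// hidden layer -> output layer
for (int o = 0; o < numOutputs; o++)
    for (int h = 0; h < numHidden; h++)
        outputs[o] += hidden[h] * weightsHO[o][h];
----------------
Squashing the sums with a non-linear activation function between the layers is what makes the extra layer worthwhile; without it, the two weight matrices collapse into a single linear map.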
As for training... don't. It's evil.



vincoof
Too bad. Training is the most interesting part
(from a mathematical point of view, of course; from a programmer's point of view, it's... evil)
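For the single-layer snippet earlier in the thread, the simplest training scheme is the delta rule: nudge each weight in proportion to the error at its output. A minimal sketch, continuing the array names from that snippet (the learning rate and targets array are assumptions, not anything posted here):
----------------
// assumed training parameters
const float learningRate = 0.1f;
float targets[numOutputs] = {0};   // desired outputs for this input pattern

// delta rule: move each weight a little in the direction that
// reduces the error (target - actual) at its output
for (int a = 0; a < numOutputs; a++)
{
    float error = targets[a] - outputs[a];
    for (int b = 0; b < numInputs; b++)
        weights[a][b] += learningRate * error * inputs[b];
}
----------------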

edotorpedo

I coded a fully dynamic backpropagation neural network. It can create, say, a 323*34*3*342*87 network and fully train it using backpropagation (although a network that size would take a lot of time to train). Once trained, it can be given a new input pattern and perform a forward sweep.

It really wasn't that difficult (apart from the debugging); you just have to read up on NNs on the numerous AI sites (there's a section right here on gamedev.)
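Nothing in this post shows the actual code, but for a single hidden layer with sigmoid units, one backpropagation training step can be sketched roughly as follows (all names, sizes, and the learning rate are illustrative assumptions, not Edo's implementation; biases are omitted for brevity):
----------------
#include <cmath>

const int numInputs = 4, numHidden = 5, numOutputs = 3;
const float learningRate = 0.1f;

float inputs[numInputs]   = {0};
float hidden[numHidden]   = {0};
float outputs[numOutputs] = {0};
float targets[numOutputs] = {0};

float weightsIH[numHidden][numInputs]  = {{0}};  // input  -> hidden
float weightsHO[numOutputs][numHidden] = {{0}};  // hidden -> output

float sigmoid(float x) { return 1.0f / (1.0f + std::exp(-x)); }

void trainOnePattern()
{
    // forward sweep
    for (int h = 0; h < numHidden; h++)
    {
        float sum = 0.0f;
        for (int i = 0; i < numInputs; i++)
            sum += inputs[i] * weightsIH[h][i];
        hidden[h] = sigmoid(sum);
    }
    for (int o = 0; o < numOutputs; o++)
    {
        float sum = 0.0f;
        for (int h = 0; h < numHidden; h++)
            sum += hidden[h] * weightsHO[o][h];
        outputs[o] = sigmoid(sum);
    }

    // backward sweep: output-layer deltas
    float deltaOut[numOutputs];
    for (int o = 0; o < numOutputs; o++)
        deltaOut[o] = (targets[o] - outputs[o]) * outputs[o] * (1.0f - outputs[o]);

    // hidden-layer deltas (error propagated back through weightsHO)
    float deltaHid[numHidden];
    for (int h = 0; h < numHidden; h++)
    {
        float err = 0.0f;
        for (int o = 0; o < numOutputs; o++)
            err += deltaOut[o] * weightsHO[o][h];
        deltaHid[h] = err * hidden[h] * (1.0f - hidden[h]);
    }

    // weight updates
    for (int o = 0; o < numOutputs; o++)
        for (int h = 0; h < numHidden; h++)
            weightsHO[o][h] += learningRate * deltaOut[o] * hidden[h];
    for (int h = 0; h < numHidden; h++)
        for (int i = 0; i < numInputs; i++)
            weightsIH[h][i] += learningRate * deltaHid[h] * inputs[i];
}
----------------
A fully dynamic version like the one described above generalizes this by looping the same delta computation backwards through each pair of adjacent layers.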


Edo.

echeslack
Generation5 has some excellent articles (look under the essays section), but they aren't exactly for dummies: you have to know a good amount of math (or at least the symbols; they tend to use Greek). Anyway, the hard part, as was said, is understanding the math behind them and why they work, but that understanding is what makes them powerful.

I'm not saying there is anything wrong with it, but just because you can create a 323*34*3*342*87 network doesn't mean it is going to be the best solution for your problem. Understanding the math gives you the ability to apply networks where they are appropriate, and to do it well. Unfortunately it generally takes a while to get the basic math down and really understand it, but once you do, the rest comes easily.

ewen

edotorpedo

echeslack,

This network dimension was just an example. I totally agree that you have to know about the math. With neural networks, however, there are some rule-of-thumb functions that can give you a hint about what some of your parameters should be (for example, the number of neurons in the hidden layer).
The math of an NN isn't that hard; only fully understanding backpropagation (not how it works, but why it works) is a bit tricky.
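(The post doesn't say which functions are meant; one folklore rule of thumb, an assumption here rather than anything from this thread, is to start the hidden layer near the geometric mean of the input and output counts.)
----------------
#include <cmath>

// Hypothetical helper: a rough starting point for the hidden-layer size,
// using the geometric mean of input and output counts. A heuristic only -
// it says nothing about what will actually work best for a given problem.
int suggestHiddenNeurons(int numInputs, int numOutputs)
{
    return (int)std::ceil(std::sqrt((double)numInputs * (double)numOutputs));
}
----------------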
Personally I would recommend reading the article by fup first.
Edo
