# Weighing Neural Nets


## Recommended Posts

On AI_junkie I learned about neural nets and how to train them using genetic algorithms. How would I train them manually? In one tutorial on GameDev they programmed a neural net without a genetic algorithm. Is it just a process of tinkering with values?

##### Share on other sites
Another popular way of adjusting weights is backpropagation.
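At its core, backpropagation adjusts each weight a small step against the gradient of the error. A minimal sketch of that idea for a single linear neuron (the delta rule); the function names and learning rate here are illustrative assumptions, not from any particular tutorial:

```python
# Delta rule for one linear neuron: nudge each weight against the
# gradient of the squared error. Names and learning rate are
# illustrative assumptions.

def train_step(weights, bias, inputs, target, lr=0.01):
    # Forward pass: weighted sum of the inputs plus a bias.
    output = sum(w * x for w, x in zip(weights, inputs)) + bias
    error = output - target
    # Backward pass: gradient of squared error w.r.t. each weight
    # is error * input, so step each weight the other way.
    new_weights = [w - lr * error * x for w, x in zip(weights, inputs)]
    new_bias = bias - lr * error
    return new_weights, new_bias

weights, bias = [0.0, 0.0], 0.0
for _ in range(1000):
    weights, bias = train_step(weights, bias, [1, 5], 9)
```

Repeating the step drives the output toward the target; a full backprop implementation chains the same gradient computation through the hidden layers.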

##### Share on other sites
I have now read about this method, but what if I have two training examples with different desired outputs, like:

Inputs: 1,5 Desired Output: 9
Inputs: 2,5 Desired Output: 6

How would backpropagation work here?

##### Share on other sites
Quote:
 Original post by brwarner
 I have now read this method but what if I have two desired outputs for the network, like:
 Inputs: 1,5 Desired Output: 9
 Inputs: 2,5 Desired Output: 6
 How would backpropagation work here?

##### Share on other sites
I wasn't sure if you could use two values, going from the graphical explanation on Wikipedia.

##### Share on other sites
Is there any major speed difference between the two?

##### Share on other sites
I don't understand what you mean by manually training them. In a fully connected network, every weight change affects all values downstream; unless you have something trivial like two input nodes and one output node, this is an impossible task except by the methods found in the literature: backpropagation, second-order methods (conjugate gradient, Levenberg-Marquardt, etc.), and genetic algorithms. For the latter to work, you'll need some kind of measure of performance. Simulated annealing is another option (though it is basically a random search, as is the GA, by the way).

For a network to produce 6 or 9 as outputs, you'll need linear output units; normally the output is between 0 and 1 or -1 and 1 (you could scale it up, of course).
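If you keep a sigmoid output unit instead of a linear one, the scaling the post mentions can be done by mapping targets into (0, 1) before training and mapping predictions back afterwards. A small sketch; the output range 0 to 10 is an illustrative assumption chosen to cover the targets 6 and 9:

```python
# Map targets into a sigmoid's (0, 1) range and back. The assumed
# range [0, 10] is illustrative; pick one that covers your targets.

def scale_to_unit(y, y_min=0.0, y_max=10.0):
    # e.g. target 9 becomes 0.9, target 6 becomes 0.6
    return (y - y_min) / (y_max - y_min)

def scale_back(y_unit, y_min=0.0, y_max=10.0):
    # Invert the mapping after the network produces its output.
    return y_unit * (y_max - y_min) + y_min
```

You would train against `scale_to_unit(target)` and report `scale_back(output)` to the rest of your program.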

Quote: I have now read this method but what if I have two desired outputs for the network, like:

Inputs: 1,5 Desired Output: 9
Inputs: 2,5 Desired Output: 6

How would backpropagation work here?

With the right structure (see above), the input 1,5 would produce 9 and 2,5 would produce 6 after training. For an input between them, such as 1.5,5, you would probably get something like 7.5 (that's called generalisation, and it is the objective of BP learning). But since in your training you kept the second input constant at 5 (and there are no other samples), something like 1,3 would give an unpredictable (and utterly useless) result.
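This behaviour is easy to check with gradient descent on a linear output unit (the degenerate case of backprop with no hidden layer), which is enough structure to fit these two samples exactly. A sketch, with the learning rate and epoch count as assumptions:

```python
# Gradient descent on a linear unit, trained on the thread's two
# samples. Learning rate and epoch count are illustrative assumptions.

def predict(w, b, x):
    return w[0] * x[0] + w[1] * x[1] + b

samples = [((1, 5), 9), ((2, 5), 6)]
w, b, lr = [0.0, 0.0], 0.0, 0.01

for _ in range(5000):
    # Batch gradient descent on the summed squared error.
    gw, gb = [0.0, 0.0], 0.0
    for x, t in samples:
        err = predict(w, b, x) - t
        gw[0] += err * x[0]
        gw[1] += err * x[1]
        gb += err
    w = [wi - lr * gi for wi, gi in zip(w, gw)]
    b -= lr * gb

# After training, (1, 5) maps near 9 and (2, 5) near 6, and the
# midpoint input (1.5, 5) lands near 7.5 by interpolation.
```

An input like (1, 3), which varies the second coordinate the training data held fixed, gets no such guarantee, matching the point above about generalisation.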
