# My neural net totally misunderstood the meaning of XOR!


## Recommended Posts

Quite funny, actually. I made the simplest neural net imaginable: two inputs, one output, with the goal of learning XOR. However, after around 20,000 training runs, it had learned this:

1 ^ 1 = 1
1 ^ 0 = 0
0 ^ 1 = 0
0 ^ 0 = 1

Not quite what was intended. :P I set up the initial weights like this:
```cpp
Neuron::Neuron() {
    // Note: this reseeds the RNG every time a Neuron is constructed;
    // calling srand() once at program startup is the safer pattern.
    srand((unsigned)time(NULL));

    // random initial weights in [-1.0, 1.0)
    w1 = (float)((rand() % 2000) - 1000) / 1000;
    w2 = (float)((rand() % 2000) - 1000) / 1000;
    wb = (float)((rand() % 2000) - 1000) / 1000; // bias weight

    cout << w1 << endl << w2 << endl << wb << endl;
}
```


I train it like this:
```cpp
int Neuron::Train(int i1, int i2, int correctOutput) {
    // weighted sum of inputs plus bias, pushed through the threshold
    float sum = (i1 * w1) + (i2 * w2) + (1 * wb);
    int output = hardlimiter(sum);

    // training: the perceptron learning rule
    int error = correctOutput - output;
    //cout << error << endl;

    w1 = w1 + (LEARNRATE * error * i1);
    w2 = w2 + (LEARNRATE * error * i2);
    wb = wb + (LEARNRATE * error * 1);

    return output;
}
```


LEARNRATE is defined as 0.0001. Then in main I do:
```cpp
for (int i = 0; i < TRAINTIMES; i++) {
    cout << "(1, 1) --- " << Net.Train(1, 1, 0) << endl;
    cout << "(1, 0) --- " << Net.Train(1, 0, 1) << endl;
    cout << "(0, 1) --- " << Net.Train(0, 1, 1) << endl;
    cout << "(0, 0) --- " << Net.Train(0, 0, 0) << endl;
}
```


So... can anyone tell me why my net got the meaning of XOR the complete opposite way? :P There isn't really much more to it than what's in the code above, but ask and I shall post the entire thing.

##### Share on other sites
You can look at these links to see why:

Scroll down a bit: Click
See how to overcome this: Click

##### Share on other sites
Oh... hehe... I thought I set out to do the simplest thing imaginable, but instead I took on some kind of notorious legend. [smile] Way to go.

I'll make something else then.

By the way, big thanks to you, NickGeorgia, it was the tutorials in your journal that finally got me started on this. Gave me a great laugh too. [smile] Big thanks!

##### Share on other sites
No problem, perceptrons are fun. But remember, when you need a "universal approximator" you need at least one hidden layer. Now if you ask me how many nodes in each layer and such I will plead the 5th.

##### Share on other sites
Maybe I don't completely understand this... but couldn't you make a network of 3 (or more) of these perceptrons and make it learn XOR? It seems like this would be possible, because you could teach it Z = (A · !B) + (!A · B) = A ^ B, and perceptrons can learn NOT, AND, and OR (first link). Am I missing something here?

##### Share on other sites
Good idea, but a perceptron has one node and thus one output.
Suppose we did as you said and had 3 perceptrons. Then you would have 3 outputs. How would you combine those outputs? Ah ha... see, the hidden layer emerges.

Which will work by the way (but have a hidden layer): see second link diagram from above.

[Edited by - NickGeorgia on January 4, 2006 1:19:19 PM]

##### Share on other sites
Quote:
 Original post by NickGeorgia
 Good idea, but a perceptron has 1 node = 1 output. Suppose we did like you said, have 3 perceptrons. Then you would have 3 outputs. How would you combine these outputs? ah ha... see, the hidden layer emerges. Which will work by the way: see the second link.

I think what foreignkid means is to have more than 2 inputs in your perceptron, so you should get XOR working if you feed it A, !A, B, !B: that's 4 inputs, 1 output...

T2k

##### Share on other sites
Well in that case, sounds good to me. But XOR usually takes 2 inputs. Clever nonetheless.

##### Share on other sites
Um, shouldn't outputs of

1 ^ 1 = 1
1 ^ 0 = 0
0 ^ 1 = 0
0 ^ 0 = 1

for a neural network be just as difficult as a correctly working XOR? What weights and transfer functions result in that output? XOR can't be learned without a hidden layer, and I don't think that output can either.

##### Share on other sites
Yep, I see. Still the linear-separability problem. Good catch. The hidden layer is our friend again.
