My neural net totally misunderstood the meaning of XOR!



Quite funny, actually. I made the simplest neural net imaginable: two inputs, one output, with the goal of learning XOR. However, after around 20,000 training runs, it had learned this:

1 ^ 1 = 1
1 ^ 0 = 0
0 ^ 1 = 0
0 ^ 0 = 1

Not quite what was intended. :P I set up the initial weights like this:
Neuron::Neuron() {
	// seed the RNG; note this reseeds on every construction, so with
	// more than one neuron you'd want to call srand() once in main() instead
	srand( (unsigned)time( NULL ) );

	// random initial weights in roughly [-1, 1)
	w1 = (float)((rand()%2000)-1000)/1000;
	w2 = (float)((rand()%2000)-1000)/1000;
	wb = (float)((rand()%2000)-1000)/1000; // bias weight

	cout << w1 << endl << w2 << endl << wb << endl;
}

I train it like this:
int Neuron::Train(int i1, int i2, int correctOutput) {
	float output = (i1*w1) + (i2*w2) + (1*wb);	// weighted sum of inputs plus bias
	output = hardlimiter(output);	// threshold the sum to 0 or 1

	// training: perceptron learning rule

	int error = correctOutput - output;
	//cout << error << endl;

	// nudge each weight in proportion to the error and its input
	w1 = w1+(LEARNRATE*error*i1);
	w2 = w2+(LEARNRATE*error*i2);
	wb = wb+(LEARNRATE*error*1);

	return output;
}
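hardlimiter is just a step function that thresholds the sum to 0 or 1; roughly this (give or take the exact threshold convention):

float Neuron::hardlimiter(float sum) {
	// step activation: fire (1) when the weighted sum is non-negative
	// (assumed convention; the original body isn't shown in this post)
	return (sum >= 0) ? 1.0f : 0.0f;
}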

LEARNRATE is defined as 0.0001. Then in main I do:
for(int i = 0; i < TRAINTIMES; i++) {
	cout << "(1, 1) --- " << Net.Train(1, 1, 0) << endl;
	cout << "(1, 0) --- " << Net.Train(1, 0, 1) << endl;
	cout << "(0, 1) --- " << Net.Train(0, 1, 1) << endl;
	cout << "(0, 0) --- " << Net.Train(0, 0, 0) << endl;
}

So... can anyone tell me why my net got the meaning of XOR completely backwards? :P There isn't really much more to it than the code above, but ask and I shall post the entire thing.

Oh... hehe... I thought I set out to do the simplest thing imaginable, but instead I took on some kind of notorious legend. [smile] Way to go.

I'll make something else then.

[Edit] By the way, big thanks to you, NickGeorgia; it was your tutorials in your journal that finally got me started on this. Gave me a great laugh too. [smile] Big thanks!

No problem, perceptrons are fun. But remember, when you need a "universal approximator" you need at least one hidden layer. Now if you ask me how many nodes go in each layer and such, I will plead the 5th.
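To see why one hidden layer is enough for XOR, here's a rough sketch (hand-picked weights, not trained, and just one choice of many) of a 2-2-1 net of step-function perceptrons that computes it:

#include <iostream>
using namespace std;

// step activation at threshold 0
int step(float x) { return (x >= 0) ? 1 : 0; }

// 2-2-1 net: two hidden perceptrons feed one output perceptron
int XorNet(int a, int b) {
	int h1 = step(1.0f*a + 1.0f*b - 0.5f);	// hidden unit 1 computes a OR b
	int h2 = step(1.0f*a + 1.0f*b - 1.5f);	// hidden unit 2 computes a AND b
	return step(1.0f*h1 - 2.0f*h2 - 0.5f);	// output computes h1 AND NOT h2, i.e. a XOR b
}

int main() {
	for(int a = 0; a <= 1; a++)
		for(int b = 0; b <= 1; b++)
			cout << a << " ^ " << b << " = " << XorNet(a, b) << endl;
	return 0;
}

The hidden units carve out the OR and AND regions, and the output subtracts the AND region from the OR region; that's the nonlinear split a single perceptron can't make.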

Maybe I don't completely understand this... but couldn't you make a network of 3 (or more) of these perceptrons and make it learn XOR? It seems like this would work because you could teach it Z = (A . !B) + (!A . B) = A ^ B, which should be possible since perceptrons can learn NOT, AND, and OR (first link). Am I missing something here?

Good idea, but a perceptron has 1 node = 1 output.
Suppose we did as you said and had 3 perceptrons. Then you would have 3 outputs. How would you combine those outputs? Ah ha... see, the hidden layer emerges.

Which will work, by the way (but it has a hidden layer): see the second link's diagram above.



Quote:
Original post by NickGeorgia
Good idea, but a perceptron has 1 node = 1 output.
Suppose we did as you said and had 3 perceptrons. Then you would have 3 outputs. How would you combine those outputs? Ah ha... see, the hidden layer emerges.

Which will work, by the way: see the second link.


I think what foreignkid means is to have more than 2 inputs in your perceptron, so you should get XOR working if you feed it A, !A, B, !B: that's 4 inputs and 1 output...


T2k

Guest Anonymous Poster
Um, shouldn't outputs of

1 ^ 1 = 1
1 ^ 0 = 0
0 ^ 1 = 0
0 ^ 0 = 1

for a neural network be just as difficult as a correctly working XOR? What weights and transfer functions result in that output? XOR can't be learned without a hidden layer, and I don't think that output can be, either.
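Quick sanity check, assuming a step threshold at zero like the OP's net: that table would need

w1 + w2 + wb >= 0	(1 ^ 1 = 1)
wb >= 0			(0 ^ 0 = 1)
w1 + wb < 0		(1 ^ 0 = 0)
w2 + wb < 0		(0 ^ 1 = 0)

Adding the first two gives w1 + w2 + 2*wb >= 0, while adding the last two gives w1 + w2 + 2*wb < 0. Contradiction, so no single-layer weights can produce that table, just like XOR.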
