
## Neural Network Genome Help Please :'(

Old topic!

Guest, the last post of this topic is over 60 days old and at this point you may not reply in this topic. If you wish to continue this conversation start a new topic.

13 replies to this topic

### #1 CryoGenesis (Members)

Posted 20 March 2012 - 01:55 PM

Hello, sorry for another post (the other post's title didn't fit right).
I'm having problems with merging two separate neural networks into one.
I'm completely stumped at the moment.
The program creates two separate neural networks with random weights.
A class called Genome holds a neural network and can create a completely new Genome from another Genome's network.
The aim is to have bots whose neural networks can seek out food. The most successful networks (bots) reproduce to create an even better one (including mutations).

Anyone who posts an answer gets +1 rep for at least trying to help me out.

I have literally no idea why I'm getting these errors.
Error:

at AI.Node.input(Node.java:43)
at AI.NeuralNetwork.input(NeuralNetwork.java:103)
at AI.Running.keyPress(Running.java:57)

The Code:
package AI;

import java.awt.Graphics;
import java.awt.event.KeyEvent;
import java.awt.event.MouseEvent;

import Resource.*;

public class Running extends State {

    static WindowCreator win;
    NeuralNetwork n;
    NeuralNetwork n2;
    float[] in;

    public Running() {
        win = new WindowCreator(this);
    }

    public static void main(String[] args) {
        Running running = new Running();
    }

    public void init() {
        in = new float[1];
        in[0] = 10;
        fullscreen = false;
        printFPS = false;
        n = new NeuralNetwork(1, 1, 10, 2);
        n2 = new NeuralNetwork(1, 1, 10, 2);
        n.createNetwork();
        n2.createNetwork();
        n.input(in);
        n2.input(in);
        System.out.println("Neural 1: " + n.getOutput(0));
        System.out.println("Neural 2: " + n2.getOutput(0));
    }

    @Override
    public void keyPress(KeyEvent e) {
        int key = e.getKeyCode();

        if (key == KeyEvent.VK_SPACE) {
            Genome a = new Genome(n);
            Genome b = new Genome(n2);

            n = a.returnConvergedNet(b);
            n2 = b.returnConvergedNet(a);
            n.input(in);
            n2.input(in);
            System.out.println("Neural 1: " + n.getOutput(0));
            System.out.println("Neural 2: " + n2.getOutput(0));
        }

        if (key == KeyEvent.VK_ESCAPE) {
            win.shutdown();
        }
    }

    @Override
    public void keyRell(KeyEvent arg0) {
    }

    @Override
    public void mouse(int arg0, int arg1) {
    }

    @Override
    public void mousePress(MouseEvent arg0) {
    }

    @Override
    public void mouseRell(MouseEvent arg0) {
    }

    @Override
    public void render(Graphics arg0) {
    }

    @Override
    public void shutdown() {
    }

    @Override
    public void update() {
    }
}

package AI;

import java.util.Random;

public class Genome {

    public final int mutationRate = 1;
    public int splitn;
    public int splitno;
    public int spliti;

    NeuralNetwork net;

    public Genome(NeuralNetwork n) {
        net = n;
        splitn = net.node[0].length; // nodes per hidden layer
        splitno = net.out.length;    // output nodes
        spliti = net.in.length;      // input nodes
    }

    public NeuralNetwork returnConvergedNet(Genome g) {
        Random a = new Random();
        int split = a.nextInt(splitn);
        Node[][] nod = new Node[net.node.length][net.node[0].length];
        Node[] ot = new Node[net.out.length];
        Node[] ni = new Node[net.in.length];

        // Hidden layers: nodes before the split come from this genome,
        // nodes at and after it from the other genome. Note that the
        // child shares Node objects with its parents by reference.
        for (int i = 0; i < net.node.length; i++) {
            for (int b = 0; b < split; b++) {
                nod[i][b] = net.node[i][b];
            }
            for (int b = split; b < net.node[i].length; b++) {
                nod[i][b] = g.net.node[i][b];
            }
        }

        // Output nodes.
        split = a.nextInt(splitno);
        for (int i = 0; i < split; i++) {
            ot[i] = net.out[i];
        }
        for (int i = split; i < net.out.length; i++) {
            ot[i] = g.net.out[i];
        }

        // Input nodes.
        split = a.nextInt(spliti);
        for (int i = 0; i < split; i++) {
            ni[i] = net.in[i];
        }
        for (int i = split; i < net.in.length; i++) {
            ni[i] = g.net.in[i];
        }

        NeuralNetwork network = new NeuralNetwork(net.in.length, net.out.length,
                net.node.length, net.node[0].length);
        // CreateFromArrays expects (hidden, in, out) in that order.
        network.CreateFromArrays(nod, ni, ot);
        return network;
    }

    public Genome converge(Genome g) {
        return new Genome(returnConvergedNet(g));
    }
}

package AI;

import java.util.Random;

public class NeuralNetwork {

    public Node[] in;
    public Node[] out;
    public Node[][] node;

    public NeuralNetwork(int ins, int outs, int layers, int num) {
        in = new Node[ins];
        out = new Node[outs];
        node = new Node[layers][num];
    }

    public float[][] returnInWeights() {
        float[][] ini = new float[in.length][node[0].length];
        for (int i = 0; i < in.length; i++) {
            for (int b = 0; b < node[0].length; b++) {
                ini[i][b] = in[i].weight[b];
            }
        }
        return ini;
    }

    public float[][][] returnNodeNormWeights() {
        float[][][] weight = new float[node.length][node[0].length][node[0][0].weight.length];
        for (int i = 0; i < node.length - 1; i++) {
            for (int b = 0; b < node[i].length; b++) {
                for (int a = 0; a < node[i][b].weight.length; a++) {
                    weight[i][b][a] = node[i][b].weight[a];
                }
            }
        }
        return weight;
    }

    public float[][] returnOutNodeWeights() {
        int length = node.length - 1;
        float[][] nodes = new float[node[length].length][node[length][node[length].length - 1].weight.length];
        for (int i = 0; i < node[length].length; i++) {
            for (int b = 0; b < node[length][node[length].length - 1].weight.length; b++) {
                nodes[i][b] = node[length][i].weight[b];
            }
        }
        return nodes;
    }

    // Random weights in roughly [-1, 1).
    public float[] returnRanWeights(int amount) {
        Random a = new Random();
        float[] weight = new float[amount];
        for (int i = 0; i < amount; i++) {
            weight[i] = a.nextFloat() + a.nextFloat() - 1;
        }
        return weight;
    }

    public void CreateFromArrays(Node[][] hidden, Node[] in, Node[] out) {
        this.node = hidden;
        this.in = in;
        this.out = out;
    }

    public void createNetwork() {
        Random a = new Random();
        float w = a.nextFloat() + a.nextFloat() - 1;
        for (int i = 0; i < in.length; i++) {
            in[i] = new Node(returnRanWeights(node[0].length));
        }
        for (int i = 0; i < node.length; i++) {
            for (int b = 0; b < node[i].length; b++) {
                if (i < node.length - 1) {
                    node[i][b] = new Node(returnRanWeights(node[i + 1].length));
                } else {
                    node[i][b] = new Node(returnRanWeights(out.length));
                }
                node[i][b].setBiasWeight(w);
            }
        }
        for (int i = 0; i < out.length; i++) {
            // Output nodes have no outgoing weights.
            out[i] = new Node(null);
        }
    }

    public void input(float[] inp) {
        for (int i = 0; i < in.length; i++) {
            in[i].value = inp[i];
        }

        for (int i = 0; i < node.length; i++) {
            for (int b = 0; b < node[i].length; b++) {
                if (i == 0) {
                    node[i][b].input(in, b);
                } else {
                    node[i][b].input(node[i - 1], b);
                }
            }
        }

        for (int i = 0; i < out.length; i++) {
            out[i].input(node[node.length - 1], i);
        }
    }

    public float getOutput(int num) {
        return out[num].getOutput();
    }

    public float[] getOutput() {
        float[] a = new float[out.length];
        for (int i = 0; i < a.length; i++) {
            a[i] = getOutput(i);
        }
        return a;
    }
}

package AI;

public class Node {

    public float[] weight;
    public float value;
    public float activation;
    public final float e = 2.7183f;
    public final float p = 0.5f;
    public final float bias = 1.0f;
    public float biasWeight = 0f;

    public Node(float[] weight) {
        this.weight = weight;
    }

    public void setBiasWeight(float w) {
        biasWeight = w;
    }

    // Logistic sigmoid: 1 / (1 + e^(-activation / p)).
    public float activationSigmoidMethod(float activation) {
        double exponent = -activation / p;
        double denominator = 1 + Math.pow(e, exponent);
        return (float) (1 / denominator);
    }

    // Sum the weighted inputs from the previous layer (plus bias),
    // squash the sum, and store the result in 'value'.
    public void input(Node[] node, int num) {
        activation += bias * biasWeight;
        for (int i = 0; i < node.length; i++) {
            activation += node[i].value * node[i].weight[num];
        }
        value = activationSigmoidMethod(activation);
        activation = 0;
    }

    public float getOutput() {
        return value;
    }
}


Gen.

### #2 wuut (Members)

Posted 21 March 2012 - 06:30 AM

Look at "neuralNullPointer2.jpg". There you try to access a weight array that's not initialized.

More null pointer issues are shown in "neuralNullPointer1.jpg".

Please: try to use a runtime debugger!

Good luck!

### #3 CryoGenesis (Members)

Posted 21 March 2012 - 10:04 AM

> Look at "neuralNullPointer2.jpg". There you try to access a weight array that's not initialized.
>
> More null pointer issues are shown in "neuralNullPointer1.jpg".
>
> Please: try to use a runtime debugger!
>
> Good luck!

Oh, thanks. I never knew I could do that.
Cheers!

### #4 wuut (Members)

Posted 23 March 2012 - 02:46 PM

> Oh, thanks. I never knew I could do that.
> Cheers!

Sorry for the lack of explanation. I was in a hurry.

Maybe I'm wrong, but you may need some more experience with programming techniques before building such complex systems.

I suggest you get "Eclipse IDE for Java Developers"; that's my favourite IDE when I program in Java. You can get it from "http://www.eclipse.org/downloads/".
I've also heard that "NetBeans IDE" is good, but I've never tried it.

Learn how to use it, and learn how to use the debug function.

Good luck!

Joe

### #5 CryoGenesis (Members)

Posted 23 March 2012 - 04:04 PM

> > Oh, thanks. I never knew I could do that.
> > Cheers!
>
> Sorry for the lack of explanation. I was in a hurry.
>
> Maybe I'm wrong, but you may need some more experience with programming techniques before building such complex systems.
>
> I suggest you get "Eclipse IDE for Java Developers"; that's my favourite IDE when I program in Java. You can get it from "http://www.eclipse.org/downloads/".
> I've also heard that "NetBeans IDE" is good, but I've never tried it.
>
> Learn how to use it, and learn how to use the debug function.
>
> Good luck!
>
> Joe

I have Eclipse, and I believe my knowledge of neural networks is good.
The neural network I have programmed works fine, but unfortunately I can't seem to merge two separate neural networks.
I'm doing this in the first place so the neural networks won't need any backpropagation algorithm; they are trained using a genetic algorithm instead.
In doing this I can get creatures to search for food.
The most successful creature gets to reproduce.
The creatures that reproduce with each other merge each other's networks, plus any random mutation that happens.

Would you know how to merge two separate networks? That is the problem I am having :/.
First I tried copying the weight floats themselves, but now I'm trying to split the nodes instead, and I'm getting problems with copying pre-initialised weights.

### #6 wuut (Members)

Posted 23 March 2012 - 05:16 PM

> I'm doing this in the first place so the neural networks won't need any backpropagation algorithm.

I don't get the point of this. What does this have to do with a GA?

> The most successful creature gets to reproduce.

What about the almost "most successful" creatures? Aren't they worth a try?

> Would you know how to merge two separate networks? That is the problem I am having :/.
> First I tried copying the weight floats themselves, but now I'm trying to split the nodes instead, and I'm getting problems with copying pre-initialised weights.

Oh man, that's hard to read. You don't use a translator, do you?

Anyway:

I'm not very familiar with genetic algorithms, but as far as I know it goes like this:
1. Make a population of neural networks and generate the weights randomly (there are also algorithms that produce optimized starting weights for quicker results, but they aren't necessary).
2. Calculate a performance factor (cost, productivity, or whatever you want to call it) for each network.
3. Define a gene (binary or float (stochastic?)) for each network.
4. Choose some networks with the highest performance.
5. Combine them by their genes and make a new population of networks out of it.
6. GOTO: 1 { UNTIL CPU REACHES MAX_TEMPERATURE OR WIFE GETS HOME; }

So I think your problem may be in step 2, 3, and/or 5.

If you have a constant network size, then I would prefer combining the nets by their weights.

J
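Combining the nets by their weights, as suggested above, can be sketched as a small GA loop. Everything here is illustrative: the fitness function, the mutation step size, and keeping the two fittest are stand-ins for whatever the simulation actually measures.

```java
import java.util.Arrays;
import java.util.Random;

public class GaSketch {
    static final Random RNG = new Random();

    // Illustrative fitness only: prefer weight vectors close to zero.
    static float evaluate(float[] genome) {
        float sum = 0;
        for (float w : genome) sum += w * w;
        return -sum; // higher is better
    }

    // Single-point crossover of two equal-length genomes at 'split'.
    static float[] crossover(float[] a, float[] b, int split) {
        float[] child = a.clone();
        for (int i = split; i < b.length; i++) child[i] = b[i];
        return child;
    }

    // Small random perturbation with probability 'rate' per weight.
    static void mutate(float[] genome, float rate) {
        for (int i = 0; i < genome.length; i++)
            if (RNG.nextFloat() < rate)
                genome[i] += (RNG.nextFloat() - 0.5f) * 0.2f;
    }

    // One generation: keep the two fittest, refill by crossover + mutation.
    static float[][] nextGeneration(float[][] pop) {
        float[][] sorted = pop.clone();
        Arrays.sort(sorted, (x, y) -> Float.compare(evaluate(y), evaluate(x)));
        float[][] next = new float[pop.length][];
        next[0] = sorted[0].clone();
        next[1] = sorted[1].clone();
        for (int i = 2; i < next.length; i++) {
            next[i] = crossover(sorted[0], sorted[1],
                    1 + RNG.nextInt(sorted[0].length - 1));
            mutate(next[i], 0.05f);
        }
        return next;
    }
}
```

With a real fitness measure (e.g. food found per lifetime) plugged into `evaluate`, repeating `nextGeneration` is the whole training loop.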

### #7 wuut (Members)

Posted 23 March 2012 - 05:27 PM

Can you describe where your problem is, in terms of my last post?

### #8 CryoGenesis (Members)

Posted 23 March 2012 - 08:04 PM

> > I'm doing this in the first place so the neural networks won't need any backpropagation algorithm.
>
> I don't get the point of this. What does this have to do with a GA?
>
> > The most successful creature gets to reproduce.
>
> What about the almost "most successful" creatures? Aren't they worth a try?
>
> > Would you know how to merge two separate networks? That is the problem I am having :/.
> > First I tried copying the weight floats themselves, but now I'm trying to split the nodes instead, and I'm getting problems with copying pre-initialised weights.
>
> Oh man, that's hard to read. You don't use a translator, do you?
>
> Anyway:
>
> I'm not very familiar with genetic algorithms, but as far as I know it goes like this:
> 1. Make a population of neural networks and generate the weights randomly (there are also algorithms that produce optimized starting weights for quicker results, but they aren't necessary).
> 2. Calculate a performance factor (cost, productivity, or whatever you want to call it) for each network.
> 3. Define a gene (binary or float (stochastic?)) for each network.
> 4. Choose some networks with the highest performance.
> 5. Combine them by their genes and make a new population of networks out of it.
> 6. GOTO: 1 { UNTIL CPU REACHES MAX_TEMPERATURE OR WIFE GETS HOME; }
>
> So I think your problem may be in step 2, 3, and/or 5.
>
> If you have a constant network size, then I would prefer combining the nets by their weights.
>
> J

The whole point is to make a program that has creatures that can learn.
The way the creatures reproduce is based on their fitness level: the higher the fitness level, the more times they reproduce (a creature can reproduce more than once each generation).
Each creature has a Genome.
Each Genome holds the information for the creature's neural network (its brain).
Every time two creatures reproduce, their Genomes split into two pieces (the size of the pieces is a random integer).
This then creates a new Genome, which is passed on to the next generation.

The problem is that I cannot find a way to decode the neural network into the Genome and then merge two Genomes to create a new neural network.
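One way to sidestep the node-copying trouble described above is to make the genome literally a flat array of weights: flatten each layer's weight matrix into one array, do the split at that level, and rebuild the matrices for the child. A minimal sketch (these helpers are illustrative and not part of the poster's classes):

```java
public class FlatGenome {
    // Flatten a layer's [node][weight] matrix into one array.
    static float[] flatten(float[][] layer) {
        int cols = layer[0].length;
        float[] flat = new float[layer.length * cols];
        for (int n = 0; n < layer.length; n++)
            for (int w = 0; w < cols; w++)
                flat[n * cols + w] = layer[n][w];
        return flat;
    }

    // Rebuild the matrix; the shape must match the one used to flatten.
    static float[][] unflatten(float[] flat, int rows, int cols) {
        float[][] layer = new float[rows][cols];
        for (int n = 0; n < rows; n++)
            for (int w = 0; w < cols; w++)
                layer[n][w] = flat[n * cols + w];
        return layer;
    }

    // Single-point merge of two equal-length flat genomes.
    static float[] merge(float[] a, float[] b, int split) {
        float[] child = a.clone();
        for (int i = split; i < b.length; i++) child[i] = b[i];
        return child;
    }
}
```

Because the child gets fresh float values rather than shared Node references, merging can never leave a null or half-initialized node behind.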

### #9 CryoGenesis (Members)

Posted 23 March 2012 - 08:05 PM

So my problem is Number 5.

### #10 Tournicoti (Prime Members)

Posted 25 March 2012 - 06:38 AM

Hello Gen

Is it possible to consider the list of weights as the genome itself?
Then you can alter and combine these lists to get new altered or combined genomes.
Honestly, I don't know how to combine two genomes here, but I would first try some kind of average of the genomes?

Good luck
Nico
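If the genome is the flat list of weights, the averaging Nico suggests is one line per weight. Generalized to an interpolation parameter `t` (an assumption for illustration), `t = 0.5` gives the plain average:

```java
public class AverageCrossover {
    // Blend two equal-length weight vectors; t = 0.5f is the plain average,
    // t = 0 returns a copy of 'a', t = 1 a copy of 'b'.
    static float[] blend(float[] a, float[] b, float t) {
        float[] child = new float[a.length];
        for (int i = 0; i < a.length; i++)
            child[i] = (1 - t) * a[i] + t * b[i];
        return child;
    }
}
```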

### #11 Dagz (Members)

Posted 25 March 2012 - 06:36 PM

> So my problem is Number 5.

Okay, tell me if I'm wrong, but your problem is that you want to take two parent networks and combine them to form a child network, as a sort of simulation of sexual reproduction? How you do this depends entirely on what kind of neural network you're using.

If you're using a fixed topology (meaning the graph structure never changes, so every creature's neural network has the same shape), then it's super easy to build the child's neural network (or genome). All you have to do is run through the child's genome and randomly select each gene from its father or mother, filling the same location the gene would have filled in the parent, like:

Father: A, B, C, D
Mother: 1, 2, 3, 4
Child: A, 2, 3, D or 1, B, C, 4 or any random configuration.
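For a fixed topology, that per-gene coin flip is only a few lines of Java (a sketch; genes are just floats here):

```java
import java.util.Random;

public class UniformCrossover {
    // For each position, take the gene from mum or dad with a coin flip.
    static float[] child(float[] mum, float[] dad, Random rng) {
        float[] child = new float[mum.length];
        for (int i = 0; i < child.length; i++)
            child[i] = rng.nextBoolean() ? mum[i] : dad[i];
        return child;
    }
}
```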

If the topology of the neural network evolves along with the weights, then things are a little more difficult. You would need something like the NEAT algorithm; it's a little tedious to read up on, but if you just Google it you should find PDF documents explaining it. I would suggest just sticking with fixed topologies. I don't know Java, so I can't actually help you out with code, but I could maybe explain the algorithms in more detail.

### #12 laztrezort (Members)

Posted 25 March 2012 - 08:45 PM

Artificial life simulators often use two mechanisms called "crossover" and "mutation" when combining genomes; they are simplifications of the similarly named biological processes.

Crossover simply splits the genome (or gene, or chromosome) of each parent at a random position and recombines the pieces. An example of a simple crossover:

Parent 1: ABAB|BBAA
Parent 2: AAAB|BBBA

where the "|" is a randomly determined position, could produce 2 children:

Child 1: ABABBBBA
Child 2: AAABBBAA

Mutation, which usually has only a small chance of occurring, is just a single random change (for example, flipping an A to a B somewhere in the genome).

The point of crossover is to keep successful genes in the population, while mutation creates entirely new genes.
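The crossover and mutation just described can be sketched directly on strings of A/B genes (illustrative only):

```java
import java.util.Random;

public class SimpleCrossover {
    // Split both parents at 'pos' and swap tails (single-point crossover).
    static String[] crossover(String p1, String p2, int pos) {
        String c1 = p1.substring(0, pos) + p2.substring(pos);
        String c2 = p2.substring(0, pos) + p1.substring(pos);
        return new String[]{c1, c2};
    }

    // With small probability 'rate', flip one random character A<->B.
    static String mutate(String genome, double rate, Random rng) {
        if (rng.nextDouble() >= rate) return genome;
        int i = rng.nextInt(genome.length());
        char flipped = genome.charAt(i) == 'A' ? 'B' : 'A';
        return genome.substring(0, i) + flipped + genome.substring(i + 1);
    }
}
```

Running `crossover("ABABBBAA", "AAABBBBA", 4)` reproduces exactly the two children in the example above.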

I'm no expert on NNs, but I would guess one of the challenges is implementing the crossover function properly. This probably depends on whether your goal is training (in which case this is probably a bad method) or alife simulation. Assuming the latter, I would personally start here: http://en.wikipedia....able_simulators and see how other NN-based simulators do it. Framsticks in particular, IIRC, has pretty thorough documentation.

### #13 kirkd (Members)

Posted 26 March 2012 - 10:18 AM

Cryo,

There are a number of problems that you're going to run into along the way using this approach. I think that other posts have addressed most of the issues quite well, but I want to add a few details.

First, refer to Dagz's post regarding the topology of the networks. If the network structure does not change, then it should be quite easy to recombine two networks to get a child network. If you are also trying to evolve the network structure - number of nodes, specific connections, and weights - follow Dagz's advice to look at NEAT here: http://www.cs.ucf.edu/~kstanley/ He has a number of very good publications that describe NEAT itself, applications of NEAT, and extensions. The reading is somewhat technical, though.

Assuming that your network structure is not changing, crossover is only part of the story. If you only use crossover to recombine your networks, you will only be shuffling the existing weights around from place to place. You will want some sort of additional mechanism to apply changes to those weights. You could do crossover to generate your child network and then run backpropagation to optimize it, keeping the resulting weights for future generations. You could apply a mutation operator that modifies a very few weights slightly. You could specify a crossover operator that does this weight modification for you - maybe averaging some of the parent weights instead of copying them directly. There are many, many different ways to approach this, and you'll need to do a lot of experimenting to get the one that works for you.
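The "modify a very few weights slightly" operator mentioned above can be sketched like this (the rate and step size are illustrative, not tuned values):

```java
import java.util.Random;

public class WeightMutation {
    // With probability 'rate' per weight, add small Gaussian noise of
    // standard deviation 'step'. Returns a copy; parents stay untouched.
    static float[] mutate(float[] weights, float rate, float step, Random rng) {
        float[] out = weights.clone();
        for (int i = 0; i < out.length; i++)
            if (rng.nextFloat() < rate)
                out[i] += (float) rng.nextGaussian() * step;
        return out;
    }
}
```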

Speaking of crossover, there are lots of different ways to do it. The one mentioned by laztrezort is the most common method, but in practice it does not perform very well. A better option is 2-point crossover, where you pick two random positions within each genome and swap the middles. Note that the sizes must be the same. Something like this:

ABAB | ABABA | BABAB
CDCD | CDCDC | DCDCD

gives ABAB | CDCDC | BABAB
and CDCD | ABABA | DCDCD
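That two-point swap, as a sketch over gene strings (the cut points are passed in explicitly here rather than drawn at random):

```java
public class TwoPointCrossover {
    // Swap the middle segment [lo, hi) between two equal-length genomes.
    static String[] cross(String p1, String p2, int lo, int hi) {
        String c1 = p1.substring(0, lo) + p2.substring(lo, hi) + p1.substring(hi);
        String c2 = p2.substring(0, lo) + p1.substring(lo, hi) + p2.substring(hi);
        return new String[]{c1, c2};
    }
}
```

Calling `cross` with the two parent strings above and cut points 4 and 9 reproduces the two children shown.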

Another commonly used crossover operator randomly swaps single elements between genomes. Using the first two genomes I mentioned above, you might get something like this (with the genes at two random positions swapped between the parents):

ABAB | ADABA | BCBAB
CDCD | CBCDC | DADCD

Finally, the behavior you're shooting for - entities that learn to seek food - will take a fairly long time to evolve. You'll need to be very patient, and it may come down to finding just the right crossover and mutation operators and running them for an adequately long time.

Feel free to ask questions if you have them.

-Kirk

### #14 CryoGenesis (Members)

Posted 26 March 2012 - 01:33 PM

Thanks for all the replies, guys. They really helped. I'm going to have to rewrite the code, so I'll get back to you if I run into any more problems.
I'll be sure to give you guys +1 rep.
