GA Question
Let's say I run 10,000 iterations on some data, twice. The first time, I use both single-point crossover and mutation: I choose three genes at random and set each to a random value within a certain range.
The second time, I run the algorithm for 10,000 iterations using only the three-gene mutation method. Should my answers be similar but slightly better (which is what I am getting), or dramatically worse?
I was wondering if someone could tell me whether I am implementing my GA correctly.
Thanks
Sounds about right. I'm no GA expert, but it's been my experience that mutation does most of the work. Try it without the mutation; you probably won't get anywhere. I would try some more experiments, like running it with fewer generations to see if the crossover helps then, or trying different crossover operators.
So yeah, if you ask me it sounds like things are working ok.
The power of the GA is in crossover. If you aren't leveraging crossover, then you might as well use a single-member stochastic hill climber, because that's all mutation is going to do for you.
Population diversity defines the effectiveness of crossover. If mutations are doing most of the work, then that is evidence that your population isn't diverse enough (probably because it's too small, or because you are allowing multiple copies of the same successful member to fill the majority of the population).
A GA can be enhanced with mutation, but mutation is not strictly necessary with a highly diverse population.
There have always been two camps in the GA debate. Some say mutation is the driving force, while others say it's crossover. They're both right, depending on the type of problem you're solving. For some problems, mutation will get you where you need to go faster; for others, crossover will do it better.
Try to run your experiments more than once or twice before coming to a conclusion. GAs are stochastic to a certain extent, so the starting population matters as well. The usual method is to run the algorithm at least 30 to 100 times on the same problem and take the average performance. So, if you are testing which operator is better, run the GA with crossover about 100 times, then do the same with the mutation-only version (which really isn't a GA) and compare their average performance.
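That comparison harness is easy to write. Below is a minimal sketch; `run_ga` here is a hypothetical stand-in (it just simulates noisy scores so the harness runs), and you would replace its body with your actual GA call:

```python
import random
import statistics

def run_ga(use_crossover, seed):
    # Hypothetical stand-in for a real GA run; returns the best fitness
    # found. Here it only simulates noisy results for illustration.
    rng = random.Random(seed)
    base = 0.90 if use_crossover else 0.85
    return base + rng.gauss(0, 0.02)

def average_performance(use_crossover, runs=100):
    # Run the same configuration many times with different seeds
    # (i.e. different starting populations) and summarize.
    scores = [run_ga(use_crossover, seed) for seed in range(runs)]
    return statistics.mean(scores), statistics.stdev(scores)

mean_x, sd_x = average_performance(use_crossover=True)
mean_m, sd_m = average_performance(use_crossover=False)
print(f"crossover + mutation: {mean_x:.3f} +/- {sd_x:.3f}")
print(f"mutation only:        {mean_m:.3f} +/- {sd_m:.3f}")
```

Comparing the two means (and their spreads) over 100 seeds is far more trustworthy than eyeballing one run of each.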
Also, when you say single-point crossover, do you mean fixed-point single-point crossover (the same crossover point every time) or random-point single-point crossover? In general, uniform crossover often works better than plain single-point crossover. However, if you use uniform crossover, your mutation rate may have to be lower.
Now, another way to look at the mutation-vs-crossover debate is through the idea of exploration and exploitation. Mutation is a much more local operator, which exploits the current candidate solution's location. Crossover, on the other hand, given a diverse population, gives you exploration, since the candidate solution is capable of jumping longer distances. So, to attain good performance on any search, exploitation and exploration have to be well balanced.
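To make the operator distinction concrete, here is a minimal sketch of both crossovers on bit-string genomes (the parent values are made-up for illustration):

```python
import random

rng = random.Random(42)

def single_point_crossover(a, b):
    # Child takes a's genes up to a random cut point, b's genes after it.
    point = rng.randrange(1, len(a))
    return a[:point] + b[point:]

def uniform_crossover(a, b):
    # Each gene is drawn from either parent with equal probability.
    return [x if rng.random() < 0.5 else y for x, y in zip(a, b)]

parent1 = [0] * 8
parent2 = [1] * 8
print(single_point_crossover(parent1, parent2))  # contiguous 0s then 1s
print(uniform_crossover(parent1, parent2))       # 0s and 1s interleaved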
Quote:Original post by WeirdoFu
There has always been two camps in the GA debate. Some say mutation is the driving force, while others say its crossover. They're both right...
Says who?
Mutation only:
Repeat many times
..For each member
....try a small change to the member
....if "mutated" member is better than the original
......replace the original.
No GA junk! But it accomplishes the same thing as 'mutation', doesn't it?
It's called stochastic hill climbing, and there's an entire field of study around it. The above algorithm is *exactly* what a GA without crossover accomplishes, and it is almost the worst hill-climbing algorithm possible.
You have been misinformed.
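For reference, the pseudocode above as a runnable sketch. The fitness function is a made-up stand-in (peak at all genes equal to 0.5); the loop structure is the point:

```python
import random

def fitness(member):
    # Stand-in objective to maximize; the global optimum is all genes = 0.5.
    return -sum((g - 0.5) ** 2 for g in member)

def hill_climb(member, iterations=10000, step=0.05, seed=0):
    rng = random.Random(seed)
    best = fitness(member)
    for _ in range(iterations):
        # Try a small change to the member...
        candidate = member[:]
        i = rng.randrange(len(candidate))
        candidate[i] += rng.uniform(-step, step)
        # ...and replace the original only if the "mutated" member is better.
        score = fitness(candidate)
        if score > best:
            member, best = candidate, score
    return member, best

member, best = hill_climb([0.0, 1.0, 0.2])
print(best)  # climbs toward 0.0, the global optimum
```

Note there is no population interaction at all, which is exactly the claim being made about a crossover-free GA.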
True, but you still get somewhere. If you just have crossover, you'd need either a really huge population or a really tiny problem space to get decent results. So, like WeirdoFu said, it kind of depends on the problem.
I guess the reason I think mutation does most of the work is that, in my experience, I've seen noticeable differences in performance when adjusting the mutation rate, and I've had a hard time getting improvements when just changing the crossover (other than 1-point, they all seem to do about the same, depending on the problem). But like I said, I'm no GA expert.
However, if you ask me, I can't really see using a GA if you aren't going to have both crossover and mutation.
Quote: Original post by Rockoon1
Quote: Original post by WeirdoFu
There has always been two camps in the GA debate. Some say mutation is the driving force, while others say its crossover. They're both right...
Says who?
Mutation only:
Repeat many times
..For each member
....try a small change to the member
....if "mutated" member is better than the original
......replace the original.
No GA junk! But it accomplishes the same thing as 'mutation', doesn't it?
It's called stochastic hill climbing, and there's an entire field of study around it. The above algorithm is *exactly* what a GA without crossover accomplishes, and it is almost the worst hill-climbing algorithm possible.
You have been misinformed.
You misunderstand. When I say "driving force" I don't mean take one and leave the other. What I mean is that some believe a GA can work on crossover only, with little or no mutation, while others believe in using a simple crossover, such as fixed single-point, together with a higher mutation rate. So saying that mutation or crossover is the driving force in a GA doesn't mean you take the other out, but rather that you feel varying one will affect the results for the better more than varying the other. In the end, it's all about the capability to create new schemata.
Quote:Original post by Rixter
True, but you still get somewhere. If you just have crossover, you'd need either a really huge population or a really tiny problem space to get decent results. So like WeirdoFu said, it kinda depends on the problem.
I'm not saying that mutation isn't a part of the common GA strategy, but I am saying that it's not necessary; really, it's a totally different methodology piggybacked onto crossover, and it has far better alternatives.
When the population diversity approaches zero (they are all on the same hill), you can either watch the terribly inefficient mutation-based hill climber built into the GA, or you can stop using the GA, take your best member, and use a decent hill climber that will get the job done in a fraction of the time.
Often these other algorithms are highly likely to find the local maximum. That's important. With mutation, you never know.
Quote:
I guess the reason I think mutation does most of the work is because in my experience I've seen noticeable differences in performance when messing with the mutation rate, and had a hard time getting improvements when just messing with the crossover (other than 1-point, they all seem to do about the same, depending on the problem).
You see, the added search space from mutation is controlled by the mutation rate...
...but the added search space from crossover is controlled by population diversity!
Changing the crossover function won't help much in this regard. The crossover function should be tailored to the search space; that's all.
To increase diversity, you need to increase the size of the parenting population. There is really no other way around it.
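One way to actually watch this happen is to track a diversity measure over generations. A minimal sketch (assuming bit-string genomes; the two example populations are made-up):

```python
from itertools import combinations

def average_pairwise_distance(population):
    # Mean Hamming distance over all pairs of members; near zero means
    # crossover can no longer combine anything new.
    pairs = list(combinations(population, 2))
    total = sum(sum(a != b for a, b in zip(x, y)) for x, y in pairs)
    return total / len(pairs)

diverse = [[0, 1, 0, 1], [1, 0, 1, 0], [1, 1, 0, 0], [0, 0, 1, 1]]
converged = [[1, 1, 0, 0], [1, 1, 0, 0], [1, 1, 0, 0], [1, 1, 0, 1]]
print(average_pairwise_distance(diverse))    # high: crossover still useful
print(average_pairwise_distance(converged))  # low: crossover is nearly a no-op
```

When this number collapses, that is the point at which the "switch to a real hill climber" argument above kicks in.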
Quote:Original post by Rockoon1
To increase diversity, you need to increase the size of the parenting population. There is really no other way around it.
Often just increasing the population size doesn't help very much. Speciation with fitness sharing, on the other hand, provides a nice mechanism to increase population diversity.
Also, I'm no expert, but I don't believe equating mutation to hill climbing is correct. In particular, neutral and slightly deleterious mutations are sometimes preserved by a mutation-only GA (often for quite some time, if they happen to occur on an otherwise-fit individual), but not in strict hill climbing. Some have argued that the power of GAs comes from their ability to explore nearly neutral fitness spaces in this way.
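For anyone unfamiliar with fitness sharing: the idea is to divide each individual's raw fitness by a "niche count" that grows with the number of nearby neighbors, so crowded regions are penalized. A minimal sketch, assuming bit-string genomes and a hypothetical raw-fitness value:

```python
def shared_fitness(raw, population, index, sigma=2.0):
    # Divide raw fitness by the niche count: a sum, over all members
    # within Hamming distance sigma, of a triangular sharing function.
    me = population[index]
    niche = 0.0
    for other in population:
        d = sum(a != b for a, b in zip(me, other))
        if d < sigma:
            niche += 1.0 - d / sigma  # closer neighbors count more
    return raw / niche

population = [[0, 0, 0, 0], [0, 0, 0, 1], [1, 1, 1, 1]]
print(shared_fitness(1.0, population, 0))  # crowded member is penalized
print(shared_fitness(1.0, population, 2))  # isolated member keeps raw fitness
```

Selection then acts on the shared values, so an isolated niche survives even when a crowded peak has higher raw fitness.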