noppanit

genetic programming or evolutionary algo for game AI

38 posts in this topic

willh, I am not sure what you are saying. Your arguments are correct but your conclusion doesn't fit. You say you always implement crossover, yet you are trying to argue that crossover and mutation are the same thing, when that is clearly not the case and certainly *should* not be the case. Otherwise you are doing it wrong. The only way they could be the same is if you took a broad, sweeping view in which both merely create new lists, say. But that is missing the forest for the trees: the point of maintaining multiple solutions is crossover (recombining elements across lists, as opposed to selecting elements near a point in a single list), which increases the chance of stumbling near a global optimum.

As for the original paper: original papers are good to read because they are much less dry and provide more insight and motivation than later distillations, but they are often much more bloated, since they contain the raw, unpolished inspirations. As Kylotan said, the biology is just a metaphor that guided the original work, just as immune systems inspire the Negative Selection Algorithm. I could just as easily call crossover 'cross-list recombination' and mutation 'rearrangement via random neighbourhood point sampling'. That doesn't make a GA biological, any more than playing Operation prepares you for heart surgery.

Quote:
Original post by Kylotan
Without crossover you're not sharing useful information across the population, so there is little point maintaining that population. If you have a problem where there is no useful information to share, that's fine, but I can't think of one myself. If such a problem did exist then it would probably be better to run many instances of a cheaper single-member optimisation algorithm serially rather than dressing it up as a GA.


It's a matter of runtime. Clearly both would eventually arrive at the same result given enough time, but there are some problems where maintaining multiple solutions is a distraction. One example from personal experience is tuning a model where there is only one 'answer' and where taking even a few steps away results in wildly divergent answers. Here, keeping multiple solutions is detrimental and increases runtime drastically, whereas something like optimizing a schedule is better suited to the GA approach.

Nonetheless, to get around the weakness of not having multiple solutions in SA, I retain a second-best solution, and the probability of restarting from that point increases the longer I go without finding a better one. Attenuating the range of the search as the temperature cools is also useful.
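A minimal sketch of those two tweaks (the restart probability, cooling schedule, and neighbourhood scaling below are illustrative choices, not the poster's actual implementation):

import math
import random

def anneal(initial, energy, neighbour, t0=1.0, t_min=1e-3, alpha=0.995):
    # Simulated annealing that also retains the best solution seen so far.
    # The chance of restarting from that retained solution grows with the
    # number of iterations since the last improvement, and the neighbour
    # step size shrinks as the temperature cools.
    current = best = initial
    e_cur = e_best = energy(initial)
    t, stale = t0, 0

    while t > t_min:
        step = t / t0                      # attenuate search range as we cool
        candidate = neighbour(current, step)
        e_cand = energy(candidate)

        if e_cand < e_cur or random.random() < math.exp((e_cur - e_cand) / t):
            current, e_cur = candidate, e_cand
        if e_cur < e_best:
            best, e_best = current, e_cur
            stale = 0
        else:
            stale += 1

        # Restart from the retained solution; more likely the longer we stall.
        if random.random() < min(0.5, stale / 1000.0):
            current, e_cur = best, e_best
            stale = 0

        t *= alpha

    return best, e_best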
Quote:
Original post by Daerax
willh I am not sure what you are saying. Your arguments are correct but your conclusion doesn't fit.


If we agree that:
a. Mutation can give the same results as crossover.

b. Omitting crossover does not necessarily invalidate some of the other positive (and fruitful) aspects of GA/EP/'Dice rolling' for a specific problem.

then I don't think we have any disagreement. I didn't say 'never use crossover'. I just said that it's not always necessary. (I implement it in my own frameworks out of habit, BTW.)

I think the argument originated with the claim that 'it's not a GA if there is no crossover'. I disagree with that claim for reasons stated previously, but will elaborate:

Bob is using a GA to solve some problem.
1. Bob has implemented crossover and 5 types of mutation.
2. By generation 100 an acceptable fitness has been achieved for Bob's problem.
3. In all 100 generations, crossover did not once produce any fruitful result and did not actually contribute to the solution.

Given 1, 2, and 3, did Bob use a GA to reach his solution? Would it have been any less a GA if Bob had simply omitted crossover, given that the results would be identical?

You could argue that the crossover/genome was poorly implemented; for reasons I've already provided, this is not always the case.

Again, no argument. You make sense. This is a matter of semantics. If you strip away the metaphors of mutation and crossover and instead look at it at a purely algorithmic level, then what you have is either

a) something similar to, but not equivalent to, a genetic algorithm, with cycles and memory wasted on a superfluous function (crossover) and on maintaining multiple solutions, or

b) a badly implemented crossover function.

Thus, in conclusion, it appears your definition of GA is broader than that of everyone who disagrees with you (and of most literature I have run into); you use it more as a covering term rather than focusing on the specific algorithmic properties.
Quote:
Original post by Daerax
If you strip away the metaphors of mutation and crossover and instead look at it at a pure algorithmic level then you see that what you have is either

a) something similar but not equivalent to a genetic algorithm with cycles and memory wasted on a superfluous function (crossover) and maintaining multiple solutions

b) a badly implemented crossover function.


I haven't said anything to that effect.

Quote:
Original post by Daerax
Thus in conclusion it appears your definition of GA is broader than everyone who disagrees with you (and most literature I have run into), you use it more as a covering term rather than focusing on the specific algorithmic properties.


I can assure you that GA w/o crossover is very much in the literature; it has been a topic of debate for some time. Maybe you're not reading the right material? Pretty much everything I've written as part of this thread has already been talked about in 'the literature'.

You're insisting on a rigid definition that necessarily includes crossover. I reject this definition while maintaining that the baby is still safely in the proverbial tub. You're also insisting that this view is somehow concrete and the only legitimate one; while that may be true within the GD community, it is certainly not the case!

In your view, what type of crossover is required for something to qualify as a GA? How flexible are you willing to be with your crossover operator while still having, what you would consider, a GA?

In terms of your own GA implementation, what is the goal of crossover?

[Edited by - willh on July 26, 2010 1:34:15 PM]
See kids, this is where GA/DP/ANN leads: semantics arguments over ill-defined methods that give poor results and take so much time, so many hacks, and so many tweaks just to make them work that you could have coded a straightforward solution a hundred times over instead.
Quote:
Originally posted by Daerax:
excellent post! one qualm, GA without cross over is not really crippled and is really not so different from Simulated Annealing - the latter of which I have used to tune an SVM with good result



@Daerax:

Off topic, but... why couldn't you just solve a quadratic program? I mean, from my point of view, that's the whole advantage to SVMs; they're built using convex optimization, so you don't need stochastic methods like SA/GA/etc...
Quote:
Original post by Steadtler
See kids, this is where GA/DP/ANN leads: semantics arguments over ill-defined methods that gives poor results and takes so much time and hacks and tweaks just to make it work that you could have coded a straightforward solution a hundred times over instead.

If you don't know what you're doing, sure.

If you know what you're doing, you can get awesome results with a GA on the right problems. Personally, I think GAs are perfectly straightforward.

We don't quibble over the need for or the worth of a decent heuristic for A*, or tell people to stick with DFS because it's more straightforward. So why do we do so over the need for decent operators for GAs? Perhaps it's because we don't have so many concrete applications with concrete heuristics, unlike A* where we have the pathfinding scenario with geographical estimates of distance.
Quote:
Original post by willh
I can assure you that GA w/o crossover is very much in the literature; it has been a topic of debate for some time. Maybe you're not reading the right material? Pretty much everything I've written as part of this thread has already been talked about in 'the literature'.

You're insisting on a rigid definition that necessarily includes crossover. I reject this definition while maintaining that the baby is still safely in the proverbial tub. You're also insisting that this view is somehow concrete and the only legitimate one; while that may be true within the GD community, it is certainley not the case!

In your view, what type of crossover is required for something to qualify as a GA? How flexible are you willing to be with your crossover operator while still having, what you would consider, a GA?

In terms of your own GA implementation, what is the goal of crossover?


Originally, I believe the argument was that crossover is equivalent to mutation. This is correct enough, but only for a definition of these terms so general as to offer little information; any meaningful definition must partition them. For example, when considering the mathematical aspects of trying to minimize the probability of destroying schemata, we want to reduce the length of the 'chromosomes' to reduce the probability p_c(S) that crossover will destroy some schema S, and reduce the size of the 'gene' with regard to mutation affecting S.
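For reference, this is the trade-off captured by Holland's schema theorem, with m(S,t) the number of individuals matching schema S at generation t, f(S) the schema's average fitness, \bar{f} the population average fitness, \delta(S) the schema's defining length, o(S) its order, l the chromosome length, and p_c, p_m the crossover and mutation rates:

    E[\,m(S,\,t+1)\,] \;\ge\; m(S,t)\,\frac{f(S)}{\bar{f}}\left[\,1 - p_c\,\frac{\delta(S)}{l-1} - o(S)\,p_m\,\right]

Short defining lengths keep the crossover term small and low order keeps the mutation term small, which is the reduction being described above.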

Now you are stating that a GA does not require crossover. My contention is that, while this may make sense, a GA without crossover loses the main benefit and core idea of what makes it a GA: unlike gradient descent and the like (which often take a long time to converge and get stuck in valleys), it benefits when you can get good improvements to the search by combining good results from different locations.

Without this the GA loses its bite, enough so that it reduces to a poor imitation of an algorithm such as simulated annealing; you may as well use a method such as best-first search. This is not to say that this method (no crossover) won't work. It will, especially if combining results from different locations is detrimental, but it will converge more slowly than, say, momentum-based methods or SA, and may well get stuck at a local extremum.
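To make "combining good results from different locations" concrete, here is a minimal, generic sketch (bit-string genomes, one-point crossover and per-gene mutation; purely illustrative, not anyone's actual implementation):

import random

def one_point_crossover(parent_a, parent_b):
    # Recombine two candidate solutions at a random cut point,
    # mixing building blocks found in different parts of the population.
    cut = random.randint(1, len(parent_a) - 1)
    return parent_a[:cut] + parent_b[cut:], parent_b[:cut] + parent_a[cut:]

def point_mutation(genome, rate=0.01):
    # Flip each bit independently with a small probability,
    # sampling only the neighbourhood of a single solution.
    return [1 - g if random.random() < rate else g for g in genome]

Crossover can jump to a point that mixes large blocks discovered by two different individuals, which is exactly what a single-solution method like SA cannot do in one step.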

I think I have exhausted what I can say on the matter. I will leave with this: if you consider the biology metaphor that originated the GA, then the benefit of crossover versus mutation is the benefit of sexual versus asexual reproduction. And then this argument reduces to a subject that is older even than computers.
Quote:
Original post by Emergent
Quote:
Originally posted by Daerax:
excellent post! one qualm, GA without cross over is not really crippled and is really not so different from Simulated Annealing - the latter of which I have used to tune an SVM with good result



@Daerax:

Off topic, but... why couldn't you just solve a quadratic program? I mean, from my point of view, that's the whole advantage to SVMs; they're built using convex optimization, so you don't need stochastic methods like SA/GA/etc...


Excellent question; maybe I was not clear. We are talking about two different things. Using a stochastic method would be very dumb of me when trying to optimize the margin around the dividing hyperplane, but that isn't what I was talking about.

See, SVMs are extremely robust: if you define the problem well and have selected and transformed features appropriately, they are very, very hard to beat. They can classify even more complex spaces than an MLP neural net. But their main weakness is that they don't generalize so well without the correct selection of parameters; they live and die by correct tuning. Selecting a kernel function can be done by understanding the problem domain, but after that you have to search for the parameters (e.g. cost and gamma for an RBF kernel) that lead to the best results. This is what I used SA for.

See http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=1333843 for more
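The post doesn't show the actual code; a minimal sketch of the general idea is annealing over (log C, log gamma) with cross-validated error as the objective. Scikit-learn is assumed for the SVM and cross-validation, and the cooling schedule and step sizes are illustrative choices:

import math
import random
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def cv_error(X, y, log_c, log_gamma):
    # Cross-validated error of an RBF-kernel SVM at the given (log C, log gamma).
    clf = SVC(kernel='rbf', C=math.exp(log_c), gamma=math.exp(log_gamma))
    return 1.0 - cross_val_score(clf, X, y, cv=5).mean()

def tune_svm(X, y, iters=200, t0=1.0, alpha=0.98):
    # Simulated annealing in (log C, log gamma) space.
    state = np.array([0.0, 0.0])
    err = cv_error(X, y, *state)
    best, best_err, t = state.copy(), err, t0
    for _ in range(iters):
        cand = state + np.random.normal(scale=t, size=2)   # step shrinks as t cools
        cand_err = cv_error(X, y, *cand)
        if cand_err < err or random.random() < math.exp((err - cand_err) / t):
            state, err = cand, cand_err
        if err < best_err:
            best, best_err = state.copy(), err
        t *= alpha
    return math.exp(best[0]), math.exp(best[1]), best_err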
I just can't resist the urge... I have to say something.

There really is no argument when it comes to defining terms like crossover, mutation, selection, etc., because they have all been defined in the literature and they all fall under the umbrella of evolutionary computation. There is also an agreed-upon definition of an evolutionary algorithm (EA) out there (dating back to 1993).

Just to point out a few things: a GA without crossover is not a GA; it is actually an evolution strategy (ES). And the cool thing about ES is that it varies the amount of mutation based on the success of previously applied mutations. However, crossover was later added into ES in one form or another to increase its robustness. Other additions to ES include complex covariance matrices to properly guide the search, which at that point I find pretty much overkill.

Now, as to the merits of crossover, that is arguable, but for all EAs to work properly it all boils down to exploitation versus exploration. The point of crossover is to promote exploration of the search space by incorporating partial solutions from other candidate solutions. Mutation is simply a form of exploitation of the current solution and the local solution space. However, depending on the convergence of the population, the roles of the two can flip around. So, going back to a mutation-only EA: to make it work well, you must somehow counterbalance the exploitative nature of mutation with some other mechanism that promotes exploration. In the case of the original ES, it varied the amount of mutation to try to promote exploration, but comparably, crossover seems to create more exploration, which is why few stuck with the original ES structure.
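As an illustration of that original ES idea, here is a minimal (1+1)-ES with the classic 1/5 success rule; the adaptation constants and window size are illustrative, not any canonical values:

import numpy as np

def one_plus_one_es(f, x0, sigma=1.0, iters=1000, window=20):
    # (1+1)-ES minimising f, adapting mutation strength via the 1/5 success rule.
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    successes = 0
    for i in range(1, iters + 1):
        child = x + sigma * np.random.standard_normal(x.shape)
        fc = f(child)
        if fc < fx:                      # mutation succeeded
            x, fx = child, fc
            successes += 1
        if i % window == 0:              # adapt sigma every `window` steps
            rate = successes / window
            sigma *= 1.5 if rate > 0.2 else 0.82   # grow/shrink around the 1/5 target
            successes = 0
    return x, fx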

Back to the original poster's question about genetic programming: as cool as it would be to have GP in games, the chances are slim that it will become practical, though there may be some use for it somewhere in the development pipeline. Also, note that GP does not have to evolve actual pieces of code; most practical applications of GP involve using it for curve fitting by evolving the actual mathematical equation for the curve or data in question. More useful for game AI would probably be Evolutionary Programming (EP), which, if memory serves me correctly, was originally designed to evolve finite state machines, which are much more relevant to practical game AI: for example, evolving the state machine that defines an AI agent's behavior on the fly through gameplay feedback, as sketched below.
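A rough sketch of what that might look like; the states, events, actions, and selection scheme here are all hypothetical, and the fitness function would come from gameplay feedback:

import random

STATES = ["patrol", "chase", "flee"]
EVENTS = ["see_player", "low_health", "lost_player"]
ACTIONS = ["wander", "attack", "run_to_cover"]

def random_fsm():
    # Transition table: (state, event) -> (next state, action).
    return {(s, e): (random.choice(STATES), random.choice(ACTIONS))
            for s in STATES for e in EVENTS}

def mutate_fsm(fsm, rate=0.1):
    # EP-style mutation: randomly rewrite a few transitions.
    child = dict(fsm)
    for key in child:
        if random.random() < rate:
            child[key] = (random.choice(STATES), random.choice(ACTIONS))
    return child

def evolve(fitness, pop_size=20, generations=50):
    # Mutation-only evolutionary programming loop over FSMs.
    pop = [random_fsm() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        parents = scored[:pop_size // 2]          # keep the better half
        pop = parents + [mutate_fsm(p) for p in parents]
    return max(pop, key=fitness)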

In the past decade or so, people have been trying to improve on EAs by pretty much mixing and matching everything they can come across, so no one really works exclusively with any of the 4 originals anymore (GA, ES, EP, GP). At this point, the lines are so blurred that it is quite pointless in arguing whether something is a GA or not. The fact is that they're all evolutionary algorithms in one shape or form.

Oh yeah, and for people who are obsessed with gradient ascent/descent, there are the Society of Hill-Climbers and the evolutionary society of hill-climbers, which are definitely worth checking out.
Quote:
Original post by noppanit
I'm just curious that has anybody used genetic programming for Game AI before? I mean I'm doing my project and I just need some experience.

The paper-and-dice game Traveler has ship combat in which players design their own ships, with each ship's stats encoded as a sequence of numbers. Someone programmed a genetic algorithm to generate ship design sequences and pit them against each other.

The winning combination was a swarm of medium fighters with heavy armor and no capital ships. The game designer was pissed that his game had been 'solved' by computer simulation, and the next year he released the updated rule books just three days before the annual competition to try to prevent computerized solutions.
Quote:
Original post by Daerax
[...]search for what parameters (e.g. Cost and gamma for a RBF kernel) lead to the best results. This is what I used SA for.

See http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=1333843 for more


Oh, ok; that makes sense. Have you seen the work that directly optimizes the kernel matrix subject to the constraint that it remain positive definite (e.g. this)? That keeps the problem convex, and is more general than tuning, say, RBF parameters. (Of course, I appreciate that "more general" is a double-edged sword in learning, so I understand why tuning as in your case might be preferred.)
Quote:
Original post by noppanit
I'm just curious that has anybody used genetic programming for Game AI before? I mean I'm doing my project and I just need some experience.


I'm not going to get into the genetic programming vs genetic algorithms vs gradient descent argument, but I will say that as I am typing this I am running genetic algorithms to learn strategies for the base set of the card game Dominion: http://en.wikipedia.org/wiki/Dominion_%28card_game%29 . The game has a lot of cards with varying effects and relatively simple effective combos. It would be quite a pain to write intelligent AI for every card, and doing it this way lets me generate multiple AI players that all play differently while still playing fairly well.

In my experience genetic algorithms are not good at finding the "best" answer to a problem, but they are good at finding okay answers. What makes them fun is that they are good at finding unexpected answers.
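The post doesn't describe the encoding, but one plausible sketch is a buy-priority ordering over the card pool, evolved against a fitness function that plays simulated games; the card list and helper names here are hypothetical:

import random

CARDS = ["Province", "Gold", "Duchy", "Silver", "Smithy", "Village", "Estate"]

def random_strategy():
    # A strategy is simply a buy-priority ordering over the card pool.
    genome = CARDS[:]
    random.shuffle(genome)
    return genome

def crossover(a, b):
    # Order-preserving crossover: keep a prefix of one parent, fill from the other.
    cut = random.randint(1, len(a) - 1)
    head = a[:cut]
    return head + [c for c in b if c not in head]

def mutate(genome, rate=0.2):
    # Swap two priorities with some probability.
    g = genome[:]
    if random.random() < rate:
        i, j = random.sample(range(len(g)), 2)
        g[i], g[j] = g[j], g[i]
    return g

def evolve(fitness, pop_size=30, generations=100):
    # fitness(strategy) would play simulated games and return a win rate.
    pop = [random_strategy() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]
        children = [mutate(crossover(*random.sample(parents, 2)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return pop[0]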
Quote:
Original post by Emergent
Quote:
Original post by Daerax
[...]search for what parameters (e.g. Cost and gamma for a RBF kernel) lead to the best results. This is what I used SA for.

See http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=1333843 for more


Oh, ok; that makes sense. Have you seen the work that directly optimizes the kernel matrix subject to the constraint that it remain positive definite (e.g. this)? That keeps the problem convex, and is more general than tuning, say, RBF parameters. (Of course, I appreciate that "more general" is a double-edged sword in learning, so I understand why tuning as in your case might be preferred.)


No, I haven't, but it looks really interesting. The ability to learn model classes without local minima is great! And the fact that they "have also shown how optimizing a linear combination of kernel matrices provides a novel method for fusing heterogeneous data sources", plus another paper from MSR I just scanned that shows how to account for missing data in sources, is excellent.

But I can't really justify implementing it; it looks like it would take weeks to a month or more of full-time work for an unquantifiable gain. I mean, the technique itself is more general, but there is nothing in the paper that leads me to believe it learns a more general space. In particular, they say its error rate is generally lower than typical, but how do I know it is not general only for the specific data they tested it on? And IMHO the improvement is marginal enough not to be worth the effort. Do you have any advice?

See, the problem I've faced in implementing a bunch of these advanced methods is that when you are not looking at the iris data set, or rain prediction, or amino-acid data, the methods tend to crumble and barely outperform the simpler methods out of the box. The methods then aren't so performant (in time, space, and generalization) when dealing with fairly large datasets *for a single typical computer*, leading to a lot of time spent "improving by compromising" the method to get better results on your problem. Machine learning is currently part art, isn't it?

I've been considering moving a bunch of these matrix-based methods to a GPU, but my target audience doesn't have the latest in GPGPUs (I already parallelize, so when 8-16 CPUs become the norm I expect to reap those benefits, but with 2 as the average the speed gains are not very nice).

Anyway, you know a lot about this stuff: do you have any specific advice on learning kernels via SDP versus typical methods?

EDIT: To clarify, I am not saying that the more advanced methods aren't worth it. Far from it: they tend to outperform the simpler methods, but only after a lot of work is done to optimize them for your problem. Whereas naive Bayes, a decision tree, or, say, kNN just work, Bayesian nets, SVMs, or NMF require a significant amount of time "adjusting". I find myself reading the proofs of the methods to gain a better understanding for tweaking, whereas the other stuff is intuitive; they require a lot more work but then blow the simpler ones out of the water, most of the time. So I always find myself questioning whether the improvement will justify the extra work.

[Edited by - Daerax on August 16, 2010 12:17:41 AM]