About this blog
An ending and a new beginning
Entries in this blog
I did some evaluations on the time I need to spend on AI. It is not that difficult to create AI. It’s not even difficult to create impossibly difficult AI. What is really hard is to create interesting AI. So – I started researching neural evolution. I spent a day reading through the NeuroEvolution of Augmenting Topologies (NEAT) paper by Kenneth O. Stanley a couple of times. The paper can be found here:
The algorithm is not that difficult to understand, but you need a bit of knowledge about neural networks and how they work to follow the paper. Luckily, I had already taken a course in Machine Learning by Andrew Ng at Stanford University. I would really recommend it to anyone interested in the field.
Concretely, I wanted to see if implementing this for the Moose in this game (and potentially all other species) would be a viable way to create some interesting AI. I could then run a learning simulation for some time to create a baseline for these creatures, ship them with some mediocre intelligence to the end-users, but keep the ability to evolve the creatures’ genomes alive in the game.
NEAT in short
For those who don’t know much about learning algorithms, I just want to explain a little about how they work and what they are for. A neural network is a computer model imitating the neurons, axons and dendrites that the brain consists of. Every neuron is a kind of tiny computer that does a single calculation based on the input it gets. It then sends the calculated value to all the neurons it is connected to, scaled by the connection weights.
Here’s a picture of a classical neural network:
This network can be trained over many iterations by measuring how far the network’s output is from the goal (called the error or cost value). You then alter the connection weights to try to improve the performance toward what you want. This could be looking at an image of a letter or number and trying to classify which one it is. Or it could be analyzing an image of a road and training a car to drive itself. Common to all these purposes is that you have a large list of training data that tells the system how far off from the actual case it is. Here, the pixels of a picture are the input values on the left, and the prediction of what the hand-written number is comes out on the right.
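To make the "every neuron is a tiny computer" idea concrete, here is a minimal sketch in Python. The weights are made up for illustration; in a real network they would be found by training (or, later in this post, by evolution).

```python
import math

def neuron(inputs, weights, bias):
    """One neuron: a weighted sum of its inputs, squashed by a sigmoid."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))  # sigmoid keeps the output in (0, 1)

def tiny_network(inputs):
    """Two hidden neurons feeding one output neuron. All weights arbitrary."""
    h1 = neuron(inputs, [0.5, -0.3], 0.1)
    h2 = neuron(inputs, [-0.2, 0.8], 0.0)
    return neuron([h1, h2], [1.2, -0.7], 0.2)

out = tiny_network([0.9, 0.1])  # some value strictly between 0 and 1
```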
For the case where you don’t have any training data, you can use something like neural evolution. The requirement then is that you supply the system with a fitness function. The fitness function awards “points” to the fitness of the performing genome every time it has randomly evolved to do something that helps it survive.
For a very “simple” example, I’d recommend you watch Seth Bling’s version on YouTube:
Here, he creates a system to teach a computer to play Mario. The fitness function is just a value for how far Mario has moved over time. That works fine for this case. The inputs are the fields of blocks on the screen, which he has classified into two types (standable and enemy).
It is a rather impressive implementation.
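A fitness function in that spirit can be almost trivially small. This is a sketch of the idea only – not Seth Bling's actual code – and the two parameters are hypothetical names for values the simulation would track:

```python
def mario_style_fitness(distance_travelled, time_alive):
    """Reward forward progress relative to time spent, so standing still
    scores poorly. A sketch of the 'distance over time' idea from the video."""
    if time_alive <= 0:
        return 0.0
    return distance_travelled / time_alive  # average forward speed

score = mario_style_fitness(300, 60)  # 300 units covered in 60 seconds -> 5.0
```

The point is that no training data exists anywhere; the only feedback the system ever gets is this single number per genome.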
So – an evolution algorithm works by creating a genome, mutating it to add random neurons and random connections between them, then measuring how well the genomes do before they die (or hit some other kill condition). We then take the best few of the saved genomes and fuse them together. The fusion compares the genes in the genomes and adds the matching genes with a value from either parent. It then takes the genes that don’t match up from the best-performing parent (the one with the highest fitness number) and puts them into the new child.
What is essential to get NEAT to work is understanding how a genome is built up. Kenneth Stanley has some quite smart ideas on tracking where the different genes come from. A genome here is two lists describing the neurons in the system and the connections between those neurons.
The connections have an extra feature: an Innovation number, which records when in the simulation the connection first appeared. It is this value that is used to compare genes when fusing genomes.
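The innovation-number bookkeeping is the clever part, so here is a simplified sketch of how the fusion described above could look. This is my own condensed reading of the paper's crossover scheme, not the full NEAT algorithm (it skips disabled genes and excess/disjoint distinctions, for instance):

```python
import random
from dataclasses import dataclass

@dataclass
class ConnectionGene:
    in_node: int
    out_node: int
    weight: float
    innovation: int  # global counter: when this connection first appeared

def crossover(parent_a, parent_b, fitness_a, fitness_b):
    """Fuse two genomes. Matching genes (same innovation number) take their
    weight from a random parent; genes that don't match up are inherited
    from the fitter parent, as described in the paragraph above."""
    best, other = (parent_a, parent_b) if fitness_a >= fitness_b else (parent_b, parent_a)
    other_by_innov = {g.innovation: g for g in other}
    child = []
    for gene in best:
        match = other_by_innov.get(gene.innovation)
        child.append(random.choice([gene, match]) if match else gene)
    return child
```

Because both parents label each connection with the moment it was invented, two genomes with wildly different topologies can still be lined up gene by gene.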
As the simulation runs and you breed the best-performing creatures/genomes, you get better- and better-performing creatures. How long the simulation takes depends on the complexity of the behavior you want.
Implementing the System
Running the Neural Network
I started by planning the interface for the system that should be run.
To make the evolution system generic, the “brainstem” must be defined for each creature. Analogous to the real brain, this system will need a coupling between the actual brain and the inputs/outputs that each creature can have.
The brainstem works as a relay, interpreting the signals from the body and other senses and forwarding them to the brain. Simultaneously, these signals are dampened or enhanced depending on the severity of the signal, and possibly also per specific signal. The brainstem also converts signals from the brain into nerve impulses to the body and muscles.
It is this module that should be defined for each creature in order to make a generic learning algorithm that all creatures can be run through.
So a general model for how the brain iteration could work is something like this:
Every iteration starts with the creature “sensing” what is around it. For the Moose, I wanted to include things like its hunger and thirst level, because I wanted to see if it could evolve to be smart enough to find food or water when it was almost dead. I ended up with a long list of sensory inputs.
The sensory data is gathered and sent to an adapter that feeds these values into the neural network. The network is iterated through, and the output values are again sent to an adapter that interprets each of the output signals. The output signals are translated into actions and sent to a Resonator module. Its function is to save the last output from the neural network and perform the given actions until the next output arrives. The reason I created this module is that I wasn’t sure my little laptop could keep up with running the network every frame. This means I can turn the network down to something like ten iterations per second, and the creature will still act in between iterations.
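The sense → adapter → network → Resonator loop can be sketched like this. It's a toy version outside Unity, with the two adapters folded into `sense_fn` and `brain_fn` and with made-up action names, just to show why the cached output keeps the creature moving between brain ticks:

```python
class Resonator:
    """Caches the brain's last output so the creature keeps acting between
    network iterations (the network might run at 10 Hz while frames run at 60 Hz)."""
    def __init__(self):
        self.last_actions = {"move": 0.0, "turn": 0.0}  # placeholder actions

def simulate(frames, brain_interval, sense_fn, brain_fn):
    """Run the frame loop: the brain only fires every `brain_interval` frames,
    but the Resonator replays its last output on every frame in between."""
    resonator = Resonator()
    history = []
    for frame in range(frames):
        if frame % brain_interval == 0:            # brain tick
            resonator.last_actions = brain_fn(sense_fn())
        history.append(resonator.last_actions)     # actions apply every frame
    return history
```

With `brain_interval` set to 6, a 60 fps frame loop only pays for the network ten times per second.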
The Evolution System
When the creatures die or are terminated for some reason, the evolution system comes into play. I started by making what you could call a cradle for the evolution to start in. This was just a terrain where I put up four walls to create an area of something like a square kilometer. To start with, I made 26 copies of the same creature that were “protected” by the evolution system – meaning that when they die, their bodies aren’t deleted, only their brains. This way, I made sure that no matter how much bad luck they had, at least 26 entities would be simulating at any given moment.
When the creatures did nothing for a set amount of time, I terminated their brains, created/fused/mutated new ones, and injected these into the creatures along with resetting their vitals.
The Neural Network Manager attached to each creature records the creature’s performance and saves its fitness number to a file when it dies. The Neural Evolution Manager is then responsible for finding the genomes for the specific creature, breeding the best of them, mutating them, and instantiating the new brain and injecting it into the creature again. The same algorithm is used when a natural birth occurs; only these individuals die for real when they run out of food or water.
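The breeding step of that manager can be sketched in a few lines. Everything here is an assumption about shape, not my actual class: `genome_records` stands in for the fitness files loaded from disk, and the crossover/mutation functions are passed in rather than spelled out:

```python
import random

def next_brain(genome_records, mutate_fn, crossover_fn, top_n=4):
    """Sketch of the Neural Evolution Manager's job: sort the recorded
    genomes by fitness, breed two of the best, and mutate the child.
    `genome_records` is a list of (fitness, genome) tuples."""
    best = sorted(genome_records, key=lambda r: r[0], reverse=True)[:top_n]
    (fit_a, a), (fit_b, b) = random.sample(best, 2)  # pick two parents
    child = crossover_fn(a, b, fit_a, fit_b)
    return mutate_fn(child)
```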
For debugging any system like this, testing and unit tests are necessary. However, with real-time simulations, you cannot necessarily test every scenario the system will experience, mainly because it can be hard to imagine them all. So visualizing the actual neural network is also vital.
Here’s a video of the performing system from the beginning.
And another session: https://drive.google.com/file/d/0BxnLa_qsqQBoVGpQd3ZEMTBsNzg/view?usp=sharing
What you can see is me starting by setting some parameters for the evolution system. These are values such as how long a creature can stand still and do nothing before termination, or how much its fitness has to change to be recorded. There’s also a value that controls how many loop iterations the running neural network can take. This cap is needed because two nodes can feed each other, creating an infinite loop that never returns.
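That iteration cap is worth showing, because it's what makes arbitrary (even cyclic) evolved topologies safe to run. This sketch propagates values a fixed number of passes instead of following the graph to completion; the data layout is an assumption for the example, not my in-game representation:

```python
import math

def evaluate(connections, input_values, output_ids, max_iterations=10):
    """Push activations through a possibly-cyclic network, capped at
    max_iterations passes so mutually-feeding nodes can't loop forever.
    `connections` is a list of (from_id, to_id, weight) triples."""
    values = dict(input_values)            # node_id -> current activation
    for _ in range(max_iterations):
        sums = {}
        for src, dst, w in connections:
            sums[dst] = sums.get(dst, 0.0) + values.get(src, 0.0) * w
        for node, s in sums.items():
            if node not in input_values:   # never overwrite the input nodes
                values[node] = 1.0 / (1.0 + math.exp(-s))
    return [values.get(o, 0.0) for o in output_ids]
```

A cycle like "node 1 feeds node 2 feeds node 1" simply settles (or oscillates) within the allotted passes instead of hanging the frame.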
The next thing that happens is… nothing. Nothing happens for the first 10-20 seconds. Eventually, a creature starts reproducing, running, turning or eating, which is awarded fitness points. So the next generation of creatures all inherit these features, and half of them start doing the same.
The green orbs in the system are input values, shown for the creature currently being viewed on the left. The red orbs are that creature’s output values. The yellow orbs that begin to appear in between are “hidden neurons”. The lines between them are connections, and each connection’s weight is represented by how much color it is drawn in (black/grey means close to zero).
Eventually they develop some behavior that gets them forward. At generation 10, they start to be able to detect obstacles in front of them, turn, and conserve energy based on the steepness of the surface they stand on. This is very interesting to me. I really feel like this has great potential once I clean up the input values and normalize them.
If any of you are interested in seeing this early build and reading the source code, I’d be happy to share it.
I need to add a hunter/predator to the training simulation. This is just going to be a placeholder – most likely an animated red box with the “Player” tag – a hunter that follows the creature and kills it when it gets close enough. This will eventually train the creatures to flee, turn or attack at the right moment so that they don’t get killed, and thus survive and increase their fitness. This requires the creatures to know the relative direction to the predator, and perhaps the distance. Another critical optimization for this system is normalizing the input values to be between zero and one. Some of these values are vectors and require vector normalization, and I fear these calculations may be hard on the system and force me to turn down the iterations per second.
I will optimize this further and begin some game tests when I have the time.
So. I've been away for a while. I had promised some people to take a vacation with them. It was fine. Problem is, though, that I get bored when I don't develop for a while. So I made sure my short vacation was active enough to occupy my mind.
The good thing about breaks is that you get a chance to gain perspective. I got a lot of new ideas that I made sure to write down.
I started animating the Tiger-Hawk hybrid (which I still need a good name for), and it took me a couple of days. I still need a few animations – Swimming and Diving – but otherwise I think I’m there!
The animation list looks as follows:
Name              frameStart  frameStop  doesLoop  speed
Idle1             40          60         1         0.5
Idle2             65          110        0         0.5
Idle3             115         185        0         0.5
WalkSlow          195         215        1         0.5
WalkNormal        195         215        1         0.5
Run               220         240        1         1.0
Sprint            245         270        1         ~1.8
Break             285         300        1         ~0.6
Sneak             305         325        1         0.25
IdleLowSneaking   330         350        1         1
Sit               355         375        1         0.5
LieDown           380         395        1         0.5
RightPawAttack    400         413        0         2.0
LeftPawAttack     415         423        0         2.0
BiteAttack        430         440        0         1.0
BraceLanding      445         460        1         0.5
Hover             465         480        1         0.5
VerticalHang      485         505        1         0.5
VerticalCrawl     510         530        1         1.0
FlyingPose        540         550        1         1.0
Landing           555         566        0         0.8
Jump              680         685        0         0.5
FoldedFull        575         580        1         0.1
FoldedSemi        585         590        1         0.1
FoldedTip         595         600        1         0.1
Spread            570         572        1         0.1
LandingFlapping   445         460        1         0.5
Flapping          605         625        1         1.0
Flying            630         670        1         0.7
StillFlyingPose   670         675        1         0.1
I know - Quite the handful. As you can see, I planned the layering a bit in advance.
There's a bit of animation coded in real time as well. For example, I wanted the wing and feather bones to bend correctly when turning while flying:
Here, the red lines are the normal direction of the feather bones, and the blue lines are the coded, extra turning of the bones. This is applied when yawing and pitching to give an extra feeling of control, and it looks really smooth. There's a link to a demo video of the controls at the bottom of this post.
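The extra coded turning boils down to one formula per feather bone. This is a sketch of the idea with made-up constants (the real version runs on Unity bone transforms, and the blend weights and max angle here are placeholders, not my tuned values):

```python
def feather_bend(base_angle_deg, yaw, pitch, feather_index, feather_count,
                 max_extra_deg=15.0):
    """Add an extra rotation on top of the animated feather pose, scaled by
    how hard the player is turning and stronger toward the wing tip
    (the blue lines in the picture). All constants are illustrative."""
    tip_factor = (feather_index + 1) / feather_count   # 0..1 toward the tip
    control = 0.7 * yaw + 0.3 * pitch                  # blend of turn inputs
    return base_angle_deg + max_extra_deg * tip_factor * control
```

With no turn input the animation plays untouched; at full yaw the outermost feathers pick up the largest offset, which is what sells the feeling of control.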
So I spent some time coding the controls and flying physics, and I really think I've hit some sweet spots.
A big game-play-relevant topic for me is the limits you are presented with and forced to abide by in a game. I'm talking about the mechanics that make a game fun and force you to act a specific way. I want to talk a bit about this.
An example of a simple game where the entire game-play is decided by mechanics is Counter-Strike. Very simple game. It essentially consists of the player knowing the different weapons (bullet count, reload time, spread from recoil, projectile power, move speed with specific weapons) and the levels, plus how to act in each situation (which varies, because you often play against humans, who adapt to other players' styles). This makes a simple game very dynamic, with very simple mechanics. The entire game relies on the player feeling like they're getting better at the game, which is down to their brain's capacity to create synapses and reduce reaction time – developing hand-eye coordination. So the player is limited by their weapon, and all of these parameters that make up the game-play are actually hidden – except for bullet count. The game-play is made fun by adding damage depending on where you are hit, and bullet force. When you are out of HP, you die.
Many players keep coming back to this game because the player's level is essentially decided by the ability of the actual player. In contrast, let's look at almost every MMORPG ever. In most of these games you reach a higher level just by investing enough time and pushing the same button a million times. I'm not saying that's a bad thing – but it is very different from a game where being good is actually caused by the player being better at the game.
So since the resources I can put into this project are limited (I can't spend thousands of hours building beautiful levels and rich dialogue), I am going to focus on building something where the player's level is decided by their reaction time and skill in using the game mechanics at hand. This means I have to come up with some intuitive game mechanics. I am still developing ideas, and this is definitely one of the fun parts of being a game developer.
The most obvious mechanics are things like Life/Health and Energy/Stamina. Since I am not a fan of numbers in graphical experiences, I am toying with different ways to represent these internal numbers without doing it explicitly. The one I have come up with so far is a heart icon for health. As the health deteriorates, so does the heart.
I placed this in the lower right corner.
To denote the energy level, I made the heart beat. When the energy is very low, the heart beats very fast; when the energy is high, the heart beats slowly. By doing this, I combined two indicators into one icon. Also, to emphasize the energy level, I added the sound of a single beat to the heart. This gives the player some audio feedback, which I feel is necessary to give the player a chance to become good at the game. Like in Counter-Strike, where a player's skill relies partly on their decision of when to spend time reloading. Doing this strategically gives the player an advantage, and knowing when to reload relies on the player's knowledge of how many rounds are still in the magazine. Good players will know this from the type of gun and how many rounds they have already fired. Here, taking breaks at the right time while hunting could decide between failure and victory.
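The energy-to-heartbeat mapping is just a linear interpolation between two intervals. The interval values below are placeholders, not the game's actual tuning:

```python
def beat_interval(energy, fast_interval=0.3, slow_interval=1.5):
    """Seconds between heartbeats: full energy -> slow, calm beat;
    empty energy -> fast, urgent beat. Energy is expected in [0, 1]."""
    e = min(1.0, max(0.0, energy))          # clamp out-of-range values
    return fast_interval + e * (slow_interval - fast_interval)
```

Driving both the icon's pulse animation and the beat sound from this one number is what lets the single heart icon carry two stats at once.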
Other topics I have debated with myself include the terrain. Naturally, the vision in my mind is a mix of a beautiful jungle and a vast savanna, both rich with foliage and vegetation. This is a priority, but not as much as getting a running game.

What I have made so far is a spatially partitioned terrain with two levels of detail each. There's a script on each partition deactivating the collision detection on the rigid-body whenever there's no living entity in it. This is just an optimization, but it is necessary if trees and other landmarks should ever find their way into the physical world of this simulation.

Also, while Unity is a nice and neat engine that provides so much functionality it feels like not moving a finger compared to the shit one has to set up and code when creating your own engine from the bottom up, it does have its performance issues. I definitely have to develop a shader to render the grass and other foliage. Normally, I would go with Unity's own Terrain editor using height-maps, but there are some disadvantages to this: the built-in terrain suffers from a limit in scaling, and it does not render foliage fast enough for this project. It has to update very fast, because the player can move very fast over the terrain. So I'm designing other algorithms, but it will take time – especially when it's not in the main development time.
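The chunk-collider optimization can be sketched outside Unity so the idea stands on its own. In the real thing this would be a MonoBehaviour toggling a Collider; here the bounds layout and method names are my own for the example:

```python
class TerrainChunk:
    """One cell of the spatially partitioned terrain. Collision is only
    kept active while a living entity is inside the cell's bounds."""
    def __init__(self, bounds):
        self.bounds = bounds            # (min_x, min_z, max_x, max_z)
        self.collision_active = False

    def contains(self, x, z):
        min_x, min_z, max_x, max_z = self.bounds
        return min_x <= x < max_x and min_z <= z < max_z

    def update(self, entity_positions):
        """Called periodically: enable collision only if someone is here."""
        self.collision_active = any(self.contains(x, z)
                                    for x, z in entity_positions)

chunk = TerrainChunk((0, 0, 100, 100))
chunk.update([(250, 10)])   # nothing inside -> collider stays off
chunk.update([(50, 50)])    # a creature walked in -> collider on
```

The physics engine then never pays for collision tests in empty parts of the world, which is what makes per-tree colliders affordable later.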
The next objective is to implement the hunting-aspect of the game.
Firstly, this requires the AI aspect of learning and evolving for all prey. So this means I have to sit down and program some evolution algorithms for all these creatures. I think I want to try neural evolution (specifically NeuroEvolution of Augmenting Topologies (NEAT), invented by Kenneth O. Stanley), even though I predict that the learning curve is too mild for use in this application. In return, if I can get it to work, I get a map of the development of the different brains, which is worth it, I think.
Another thing is functionality for camouflage and tracking of specific prey. This could be smell, night vision, or heat/infrared vision when hunting at night. Also, I want to implement the auditory aspect of the hunt as much as the visual. So I'm looking forward to all of this.
Even though I would like to spend all my time on this project, there are a lot of things I need to take care of too. Currently, I'm looking for a new place to live, and I have to do so by the 15th of November at the latest. I have to keep a job so that I can live, and even though I want to keep that at a minimum so I can develop this, I have to earn some money for moving and to get by. The next update may be a little delayed – but what the heck.
I hope you enjoy this little video (there's not much to show yet, apart from the general flying and animations. As you can see, the ragdoll mode needs some serious work, as it just spazzes out extremely, but the idea is that when the player hits something very hard, he loses life and becomes numb for a short period of time).
One thing I want to redo or renovate is the player model and its animations. The first model was made on a time limit and was mostly for showing off skills in modelling and character design. The assignment I handed in looks like this, in a mandatory pose to show that it is skinned:
The body model and skinning I am fine with. It has enough bones and links to be flexible enough for what I want. What I absolutely hated (but got very good grades for anyway) is the wings.
Notice how sloppily the wings are made. The main problem for me is the look, but the look is actually not the only problem. Let me show you an x-ray of how I built this model:
Notice the bones going through the wings. For every feather there's a bone controlling it. That means approximately 30 bones, which is not so bad since there's not necessarily a need for every feather to be skinned, but it is hell to animate. The only good thing about having every feather controlled individually is that you can have 3-5 different feather textures and make them very hi-res without much memory cost. A good engine would batch the rendering, and performance would not be a problem.
But no feathered wing looks like this. If an ornithologist saw this, he would rip my guts out or castrate me on the spot – everyone knows how vicious they are. So, to be safe, I wanna change this.
Every bird wing has a number of layers and approximately 90 different feather types and shapes. So to make the wing look more real and voluminous, I wanted to do it "right".
For reference, here's a drawing of the wing anatomy of a beautiful little bird called the wallcreeper (Tichodroma muraria):
There are many different feather types and wing shapes, tailored to the requirements of the specific bird. As you can see, there are the feathers that provide lift (the secondaries and the primaries) and some covering feathers that help micromanage the flight.
So I wanted to try and model something that I liked the look of and matched more what a real wing looked like. Here's an initial model:
Here you can see what it looks like in the viewport (don't mind the depth errors of the secondaries on the right wing). I have two bones to control the spread of the primaries (the outermost feathers) and two bones to control the secondaries. The coverts are controlled by the inner wing bones. There's also an alula on each wing to make them look more like real wings.
These feathers were taken from a reference picture of red-tailed hawk feathers, since they have more camouflage in their colouring. Since most of the hunting is gonna be in a desert/savannah/jungle environment, I wanted something that could blend in when sneaking around in tall grass.
Please note that I am not an artist and I don't pretend to be. But I mostly know what I want.
Here's a rendering so that you can get a feel without the depth errors in the viewport:
In my eyes, this is much more pleasing to look at and if I were playing this character, I'd be less irritated.
So the next few steps for me are animating the thing and then coding some solid but fluent controls for it in a suitable test scene. So I'm gonna do that now.
The next step is implementing a learning system for the creatures using a neural evolution algorithm.
After that I think I'll continue on the world design. More on that later.
Thanks for reading.
So - the last couple of weeks I have been working on building a framework for some AI.
In a game like the one I'm building, this is rather important. I estimate 40% of my time is gonna go into the AI. What I want is a hunting game where the AI learns from the player's behaviour. This is actually what is gonna make the game fun to play. It will require some learning from the creatures that the player hunts, and some collective intelligence per species. Since I am not going to spend oceans of time creating dialogue, tons of cut-scenes, an epic story-line and multiple levels (I can't make something interesting enough to make it worth the time – I need more man-power for that), what I can do is create some interesting AI and the feeling of being an actual hunter who has to depend on analysis of the animals and experimentation with where to attack from.

So, to make it as generic as possible, I mediated everything, using as many interfaces as possible for the system. You can see the general system in the UML diagram. I customized it for Unity, so all the scripts are required to be added to GameObjects in the game world. This gives a better overview, but requires some setup – not that bothersome.
If you add some simple Game Objects and some colors, it could look like this in Unity3D:
Now, this system works beautifully. The abstraction of the Animation Controller and Movement Controller assumes some standard stuff that applies to all creatures – for example, that they can all move, have eating, sleeping and drinking animations, and have a PathFinder script attached somewhere in the hierarchy. It's very generic and easy to customize. At some point I'll upload a video of the flocking and general behavior of this creature. For now, I'm gonna concentrate on finishing the player model and creating a partitioned terrain for everything to exist in. Finally, and equally important, I have to design a learning system for all the creatures. This will be integrated into the Brain of all the creatures, but I might separate the collective intelligence between the species.
It's taking shape, but I still have a lot of modelling to do, terrain to generate, and trees and vegetation to model/generate.
Thanks for reading,
Through my adventures in Computer Science and my voyage through learning about electrical engineering, software architecture, modelling, and animation, I have been testing everything in various real-time engines to satisfy my curiosity. One of the bigger test scenarios has become so interesting that I decided to make a game out of it.
The idea sprang from nothing more than a character model I was required to make for the Game Institute (tm). It was just a model with some skinning, but I wanted to try to animate it and put it into a real-time engine. The model is a tiger-bird hybrid, and the idea of flying and hunting interested me, so I started programming some effects and controls for flying. Everything of course got pushed aside by other courses, and I forgot about it for a while. I have so many ideas sometimes that it is a bit hard to keep track of them all. I write them down, and then they are forgotten.
A year later, and I have more time on my hands. This project is small but requires good programming of the controls and the AI. The work that has to be done is manageable enough that I can produce it myself. The world requires vegetation, terrain, and some wildlife. Everything requires sound effects and atmosphere. I need some very solid controls on the actual character and some good collision handling/animations (falls, crashes, catches and take-downs). Last, but not least, I need some cut-scenes to lead the player in and immerse them (all my favorite games have some good cut-scenes!). Story is everything, and even though I can't afford voice actors or mo-cap animations, I can definitely create lively cut-scenes using the graphics and sound I can muster up for my requirements.
I have everything planned out and half the assets done. The rest should come swiftly as I get into routine, and I look forward to the first tests!
I will keep posting here on GameDev, partly as a diary and partly as a way to get critique and new ideas. Test versions might also come up for better feedback, but I'm still green and, I must admit, shy about what the internet might spout in my face about my baby (why should it matter, though? The point is to have fun).
It's not gonna be 50 hours a week – 30 is more likely. I have a full-time job to attend, a search for a new job to keep up, and pheromones to apply to a hungry female of my species. The time-frame is about 2 months of work, and I intend to keep as close to the deadline as my head will allow.