Long-time readers might remember the automatic door in the "moonbase" test scene. That door used a ProximitySensor behaviour to open when anything came near it. Well, that's only partially true: I decided I didn't want bullets to open the door, so I put a size threshold on the ProximitySensor that made it ignore anything smaller than a certain size.
Now, let me admit that, even if you remember that door and how it was implemented, I had totally forgotten. The linker wasn't about to let me off that easily though, and was happy to spit out a duplicate symbol error when I tried to add my new AI version of a ProximitySensor. So my first task was to bring the old ProximitySensor class over to the new AI Task framework in a way that still lets it work on non-AI things (like the door). This turned out to be pretty nasty in itself, but I won't bore you with the details.
Anyway, with the refactoring done, here's a little ProximitySensor running as a Task in the Elephant's AI:
The observant amongst you may notice there's no longer a "Threshold" parameter on the ProximitySensor object. So, going back to our door example, how would we stop the door opening for bullets? And while we're at it, how would I stop my Elephant attacking other Elephants? Or make this pressure pad only work when you drop one of those 3 crates on it? Or set up a trigger volume that only triggers for the player, not any NPCs?
Where I'd originally started to build some of these criteria into the sensor itself (using the size threshold), it was starting to look like it would be quite cool to be able to plug in lots of different criteria here - and even combine them to form more complex criteria (e.g. find the closest enemy, or the heaviest object, or the most valuable treasure). Based on this, I decided to split my sensor system into two halves: the sensor itself (which is totally unfiltered) and a network of filters that describes the types of things this program is interested in. The system is designed to be pretty extensible - you can add game-specific sensors and/or filters (or perhaps even write scripted filters which just evaluate some script code to decide what to do) to build game-specific AI (e.g. filtering on hit points, or mana, or faction, etc.).
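To make the sensor/filter split concrete, here's a rough sketch of how it might fit together. None of these names come from the actual engine - the GameObject, Filter, SizeFilter, and ProximitySensor types here are my own illustrative stand-ins - but the shape is the same: the sensor reports everything, and pluggable filters decide what's interesting (including the old door's size threshold, now expressed as just another filter).

```cpp
#include <string>
#include <vector>

// Hypothetical stand-in for a detected game object.
struct GameObject {
    std::string name;
    float size;
};

// A filter decides whether a sensed object is interesting.
struct Filter {
    virtual ~Filter() = default;
    virtual bool Accepts(const GameObject& obj) const = 0;
};

// Example filter: reject anything below a size threshold (the old
// hard-coded door behaviour, now a pluggable criterion).
struct SizeFilter : Filter {
    float minSize;
    explicit SizeFilter(float s) : minSize(s) {}
    bool Accepts(const GameObject& obj) const override {
        return obj.size >= minSize;
    }
};

// The sensor itself is totally unfiltered: it reports everything in
// range, and the attached filters narrow that down.
struct ProximitySensor {
    std::vector<const Filter*> filters;
    bool Detects(const GameObject& obj) const {
        for (const Filter* f : filters)
            if (!f->Accepts(obj)) return false;
        return true;
    }
};
```

With this split, a door that ignores bullets is just a ProximitySensor with a SizeFilter attached, rather than a special-cased sensor.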
Currently, I have filters for object type, object name, object dynamics (size/weight/movability), and a group filter that checks whether it's one of N specific objects. Filters can be assembled into a tree (so you can branch on one condition, then another, then another - attaching behaviours to as specific a condition as you want). And I'm planning on adding higher-level filters that pick the best of the current inputs based on some criterion (like Weakest, Strongest, Closest, Furthest, Newest, Oldest) - which would let your AI use a Type filter followed by a Closest filter to target the closest enemy, say.
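The "chain a Type filter into a Closest filter" idea might look something like the sketch below. Again, these names (Sensed, TypeFilter, Closest) are illustrative assumptions rather than the engine's actual API - the point is that an ordinary filter narrows the candidate set, and a "best of" selector then reduces it to a single target.

```cpp
#include <cmath>
#include <string>
#include <vector>

// Hypothetical sensed-object record: a type tag plus a position.
struct Sensed {
    std::string type;
    float x, y;
};

// An ordinary filter: keep only objects of the given type.
std::vector<Sensed> TypeFilter(const std::vector<Sensed>& in,
                               const std::string& type) {
    std::vector<Sensed> out;
    for (const auto& s : in)
        if (s.type == type) out.push_back(s);
    return out;
}

// A higher-level "best of" filter: pick the single input closest to
// the sensor owner's position (ox, oy). Returns null if none remain.
const Sensed* Closest(const std::vector<Sensed>& in, float ox, float oy) {
    const Sensed* best = nullptr;
    float bestDist = 0.0f;
    for (const auto& s : in) {
        float d = std::hypot(s.x - ox, s.y - oy);
        if (!best || d < bestDist) { best = &s; bestDist = d; }
    }
    return best;
}
```

To chain them, filter first and keep the result alive, then select: `auto enemies = TypeFilter(sensed, "Enemy");` followed by `Closest(enemies, ox, oy)` gives you the nearest enemy.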
Here's a cow-phobic Elephant that turns around whenever any game object called "Milkshake" gets too close to him:
With this many "events" defined, I can "crash" the AI when both sensors detect an event at the same time, making two turning "Moove" blocks fight with each other. To go much further with this one I really need to make some progress on the Task scheduler... but more on that next time, I think.
And as an added bonus (not really something I thought about when designing it), all these filters work on non-AI elements too - so you can now build a filter that specifies who/what the door will open for. Here's a little test scene where the door is hooked up to only open for the Space Prawn (as my son calls him).
And here it is in action:
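For the curious, the door hookup in that test scene might look something like this sketch - where a name-based filter sits between the door's proximity sensor and its open logic. The NameFilter and Door types here are assumptions of mine for illustration, not the engine's actual classes.

```cpp
#include <string>

// Illustrative name filter: accepts only one specific object name.
struct NameFilter {
    std::string allowed;
    bool Accepts(const std::string& name) const { return name == allowed; }
};

// A non-AI element driven by the same filter machinery: the door's
// proximity callback only opens it when the filter accepts the object.
struct Door {
    NameFilter filter;
    bool open = false;
    void OnProximity(const std::string& objectName) {
        if (filter.Accepts(objectName)) open = true;
    }
};
```

Hooked up with `NameFilter{"Space Prawn"}`, the door stays shut for bullets, Elephants, and everything else.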