LifeAI open source C++ library

LifeAI is an open source C++ library for Artificial Intelligence decision-making. It can be integrated into computer games, robotics, or anything that requires simulated intelligence.

Entities and actions are created in C++ and text files with LifeAI's syntax. Entities have characteristics that describe them called "stats". To assess the "pros" and "cons" of an action, entities weigh how much they value each stat against how much the action modifies it. They also weigh how much they value the other entities involved in the action and how it affects them. From this they determine an overall value for performing the action. As stats in the simulation change, LifeAI updates the best action options for the entities. The actions can then be implemented as needed in a simulation.
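
To make that scoring idea concrete, here is a minimal sketch of value-weighted stat deltas. This is not LifeAI's actual API -- every name below is made up for illustration:

[code]
#include <map>
#include <string>

struct Entity {
    std::map<std::string, float> stats;      // current stat levels, e.g. {"health", 40}
    std::map<std::string, float> statValues; // how much this entity values each stat
};

struct Action {
    std::map<std::string, float> statDeltas; // how the action would modify each stat
};

// Pros are positive weighted deltas, cons are negative ones; their sum is the
// overall value of performing the action for this entity.
float scoreAction(const Entity& e, const Action& a) {
    float total = 0.0f;
    for (const auto& [stat, delta] : a.statDeltas) {
        auto it = e.statValues.find(stat);
        float value = (it != e.statValues.end()) ? it->second : 0.0f;
        total += value * delta;
    }
    return total;
}
[/code]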

LifeAI is licensed under the MIT X11 license. The library and a demonstration application can be downloaded at this link:

[color="#000000"]http://www.lifeai.com/download.html

An EmotionAI would be pleasant too.
So what all is in this library? Goal hierarchies? Fuzzy logic modules? Or is this everything wrapped into one package?
Thanks for the responses.

@ VJo1 - LifeAI can already simulate emotions to some extent. :) You can make stats that represent emotions such as "happy, sad, fear, anger", etc. You can then design actions that involve or modify their emotion levels. LifeAI also contains a function laiGetSat() that attempts to retrieve an overall satisfaction level of an entity based on their stat levels and values of those stats.
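
For illustration only -- this is not laiGetSat()'s real signature, just a sketch of the idea behind it (a value-weighted sum of stat levels, with emotion stats treated like any other stat):

[code]
#include <map>
#include <string>

// Hypothetical stand-in for the idea behind laiGetSat(): overall satisfaction
// as the sum of each stat's level weighted by how much the entity values it.
float overallSatisfaction(const std::map<std::string, float>& statLevels,
                          const std::map<std::string, float>& statValues) {
    float sat = 0.0f;
    for (const auto& [stat, level] : statLevels) {
        auto it = statValues.find(stat);
        if (it != statValues.end())
            sat += it->second * level; // e.g. weight "happy" positively, "fear" negatively
    }
    return sat;
}
[/code]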

@ _orm_ - LifeAI is not based on a common approach to AI, but it probably has similarities. A basic description is provided in the original post, but a more detailed description can be obtained here: http://www.lifeai.co...es/general.html I recommend viewing the video of the demonstration application here: http://vimeo.com/28839032
You can also download it from http://www.lifeai.com/download.html if you'd like to try it. It includes a text file that demonstrates how entities, stats, values, and actions all contribute to the logic of the system.

[quote name='Jeremiahg']@ VJo1 - LifeAI can already simulate emotions to some extent. You can make stats that represent emotions such as "happy, sad, fear, anger", etc. You can then design actions that involve or modify their emotion levels. LifeAI also contains a function laiGetSat() that attempts to retrieve an overall satisfaction level of an entity based on their stat levels and values of those stats.[/quote]

I think the right word is "emulate".

[quote name='Jeremiahg']@ VJo1 - LifeAI can already simulate emotions to some extent. :) You can make stats that represent emotions such as "happy, sad, fear, anger", etc. You can then design actions that involve or modify their emotion levels. LifeAI also contains a function laiGetSat() that attempts to retrieve an overall satisfaction level of an entity based on their stat levels and values of those stats.[/quote]


Always an interesting subject (probably because it ended up as my diploma thesis). Unfortunately the biggest challenge wasn't coming up with a model that would combine personality, mood, and emotion (and different "spaces"), but the part where the system needs to know how to evaluate an event. The concept of "something with negative consequences happening to an agent I have high sympathy for causes pity" is easy, but judging the consequences of events was lots of tedious work with highly arbitrary weights. It's even worse when you have to consider the cause of an event or morally judge an action.
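
As a toy example of such a rule (every name, range, and weight below is arbitrary -- picking them well is exactly the tedious part):

[code]
struct Event {
    float consequence; // -1 (very bad for the affected agent) .. +1 (very good)
};

// sympathy: -1 (antipathy) .. +1 (strong sympathy) toward the affected agent.
// Returns a pity intensity in [0, 1]: pity requires a negative event and
// positive sympathy; everything else yields none.
float appraisePity(const Event& ev, float sympathy) {
    if (ev.consequence >= 0.0f || sympathy <= 0.0f)
        return 0.0f;
    return -ev.consequence * sympathy; // one arbitrary way to combine them
}
[/code]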

To get an idea what I'm talking about: http://www.uni-ulm.de/~hhoffman/emotions/downloads/simplex.pdf
It's a heavily edited and shortened version for some book.
f@dz - http://festini.device-zero.de

[quote][quote name='Jeremiahg' timestamp='1317444128' post='4867872']@ VJo1 - LifeAI can already simulate emotions to some extent. :) You can make stats that represent emotions such as "happy, sad, fear, anger", etc. You can then design actions that involve or modify their emotion levels. LifeAI also contains a function laiGetSat() that attempts to retrieve an overall satisfaction level of an entity based on their stat levels and values of those stats.[/quote]

Always an interesting subject (probably because it ended up as my diploma thesis). Unfortunately the biggest challenge wasn't coming up with a model that would combine personality, mood, and emotion (and different "spaces"), but the part where the system needs to know how to evaluate an event. The concept of "something with negative consequences happening to an agent I have high sympathy for causes pity" is easy, but judging the consequences of events was lots of tedious work with highly arbitrary weights. It's even worse when you have to consider the cause of an event or morally judge an action.

To get an idea what I'm talking about: http://www.uni-ulm.de/~hhoffman/emotions/downloads/simplex.pdf
It's a heavily edited and shortened version for some book.[/quote]


The OCC model is good, and it can be implemented straightforwardly and without mistakes for video games and RPG books.

The term SIMPLEX may invite trademark problems, because it is famous in linear programming.

About Fig. 2:
imo, Memory -> Mood-state is more accurate than Memory -> Appraisal.
The latter implies self-study of the mind with reflection.
[quote name='Jeremiahg']Entities have characteristics that describe them called "stats". To assess the "pros" and "cons" of an action, entities weigh how much they value each stat against how much the action modifies it. They also weigh how much they value the other entities involved in the action and how it affects them. From this they determine an overall value for performing the action. As stats in the simulation change, LifeAI updates the best action options for the entities. The actions can then be implemented as needed in a simulation.[/quote]

This is the basis for what many of you have heard referred to as utility theory around here.


[quote name='VJo1']An EmotionAI would be pleasant too.[/quote]

Utility can represent anything that can be put on a continuum, whether it be bullets in your gun, hunger, or a level of happiness, anger, etc. It's all numbers.
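
In sketch form (purely illustrative, not from any particular library): whatever the source quantity, it gets normalized onto a common scale before scores are compared.

[code]
// Heterogeneous quantities (bullets, hunger, happiness...) mapped onto the
// same 0..1 utility scale so they can be compared and combined.
float normalize(float value, float min, float max) {
    if (max <= min) return 0.0f;           // degenerate range
    float t = (value - min) / (max - min);
    if (t < 0.0f) t = 0.0f;                // clamp to [0, 1]
    if (t > 1.0f) t = 1.0f;
    return t;
}
[/code]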


[quote name='Jeremiahg']@ _orm_ - LifeAI is not based on a common approach to AI, but it probably has similarities.[/quote]

I seriously disagree with this premise. While it is not in as wide usage as FSMs, BTs, etc., utility architectures have been around in game AI for a long time. In fact, there are entire games built on this foundation... e.g. The Sims. Admittedly, it has only recently been coming to the forefront... mostly in the past 3 years -- coincidentally since the release of my book, which is almost entirely about modeling behavior through utility.


[quote]Always an interesting subject (probably because it ended up as my diploma thesis). Unfortunately the biggest challenge wasn't coming up with a model that would combine personality, mood, and emotion (and different "spaces"), but the part where the system needs to know how to evaluate an event. The concept of "something with negative consequences happening to an agent I have high sympathy for causes pity" is easy, but judging the consequences of events was lots of tedious work with highly arbitrary weights. It's even worse when you have to consider the cause of an event or morally judge an action.[/quote]

You are correct in pointing out that this is the major pitfall. With no slight to the OP's project, putting together a routine to crunch numbers is actually a trivial process. Many AI programmers can build a utility-based decision architecture in a day or two. The trick is how to model what numbers to use. It's somewhat of a dark art to many people.

While dealing in concrete numbers is relatively simple (e.g. "how many bullets do I have left?"), taking a concrete number and overlaying an abstract "meaning" onto it is more difficult... e.g. "how does the number of bullets I have left relate to my necessity for reloading?" Further still, attempting to codify something that is, in essence, an abstract concept from the start is even more amorphous -- e.g. "how angry am I?" The ultimate is converting an abstract into another abstract -- "how does my overall perception of danger affect my anxiety level?" Now THAT is the part that is tough for people to get a handle on.
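
A small sketch of that bullets-to-reload mapping (the curve shape and the quadratic falloff are arbitrary choices for illustration):

[code]
// Maps the concrete number "bullets left" onto the abstract "necessity for
// reloading" via a response curve: urgency stays low while the magazine is
// mostly full and rises sharply as it empties.
float reloadUrgency(int bulletsLeft, int magazineSize) {
    float fill = static_cast<float>(bulletsLeft) / static_cast<float>(magazineSize);
    float empty = 1.0f - fill;  // 0 when full, 1 when empty
    return empty * empty;       // quadratic: full mag -> 0.0, empty mag -> 1.0
}
[/code]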

So again, while the tool presented might shave a few hours off your workload, it is just a tool. And tools don't create believable AI... they only enable it. That said, many people sure could use some more tools.

Dave Mark - President and Lead Designer of Intrinsic Algorithm LLC
Professional consultant on game AI, mathematical modeling, simulation modeling
Co-founder and 10-year advisor of the GDC AI Summit
Author of the book, Behavioral Mathematics for Game AI
Blogs I write:
IA News - What's happening at IA | IA on AI - AI news and notes | Post-Play'em - Observations on AI of games I play

"Reducing the world to mathematical equations!"

Sounds to me like it is being lauded as a silver-bullet approach to AI then, to which I would be inclined to agree with IADaveMark; while being able to model emotions and overall human satisfaction could greatly enhance any AI, it still has to complement other AI techniques such as goal- or state-driven systems in order to turn these numbers into meaningful action on the part of the agent.

I like the work you are doing, and it could actually prove to be a viable alternative to fuzzy data sets and other forms of decision making, being based on emotions rather than on sets of data such as aggression, guts, and determination. I can actually see much more human-like behavior coming from such techniques.

Why does that prospect send a chill down my spine? :blink:

@IADaveMark - Ah, okay. So there IS an established term for this approach, "Utility". I appreciate the info.


