# Getting your AI opinions on what makes a good AI


## Recommended Posts

So I really like AI: not pathfinding, but the part that's most amazing to me, personality. I've started developing/coding AIs, and my goal is something along the lines of Fallout's AIs, but something more than that. I've put together some ideas, and was wondering about this kind of idea:

In the game, AIs are assigned random personalities. This is just the first version, for testing:

```cpp
float Math, Prediction, Creative, Geography;                               // Smarts
float Arms, Legs, Core, Endurance, Speed;                                  // Physical
float Respect, Empathy, WillPower, Reactive, Leadership, Greed, Paranoia;  // Personality
```

There are also going to be random events; I really love the concept of the unknown. To explain the idea, let's say the AI has high empathy but low will-power. If it gets a message that someone is in need of help, the AI's brain will use these values (with some noise) to choose whether or not to help. There are a lot of possibilities, too, once you add more variables. Does anyone else like the concept of realistic personalities in a game?
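As a rough illustration of that help-or-not decision, here is a minimal Python sketch. The trait weights, the 0.5 threshold, and the Gaussian noise term are all made-up assumptions, not a tuned model:

```python
import random

def decides_to_help(empathy, will_power, noise_sd=0.1, rng=random):
    """Score the 'help' option from personality traits plus noise.

    Traits are assumed to lie in [0, 1]; the 0.7/0.3 weighting and the
    0.5 cutoff are purely illustrative.
    """
    score = 0.7 * empathy + 0.3 * will_power + rng.gauss(0, noise_sd)
    return score > 0.5
```

With the noise term enabled, the same agent can occasionally make an "out of character" choice, which is exactly the unpredictability described above.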

##### Share on other sites
I like the idea, although I'm not sure Fallout is the best example in terms of AI; most of the AI in those games seems scripted to me.

I like the idea of the different factors interacting, and of being able to appeal to those different qualities through your actions or words. Those should probably be base values, with faction/person or situation modifiers. For example, if told that either they or someone else must die, perhaps some aspects of smarts would decrease from the fear, while willpower and paranoia would increase. Based on that, they may or may not betray the other person and just look out for themselves. Perhaps the set of challenges a person faces, and how those challenges affect the attributes, is part of the personality too; e.g. some people buckle under pressure, while for others it makes their will stronger.

##### Share on other sites
Exactly. If they had low courage, those kinds of things would be affected. I want it to be quite realistic, but I'd also like to hear what others think about it.

##### Share on other sites
I would go for a low-dimensional personality model. Perhaps 5 numbers is all you need. But I would really start with maybe 2 to mess around. Assigning personalities to agents only makes sense to the extent that your player can perceive them. So the game should involve decisions that show character. If your model is too realistic, players might not be able to accurately gauge the personalities of the agents, and the way the agents act might end up seeming random.

On the other hand, if you managed to make complex personalities and communicate them to the player, you would be able to make games about why Mark left Sally, which is something completely unattainable with current methods, and which would tap into a huge reservoir of people that don't currently play games.

You may want to read Chris Crawford on Interactive Storytelling.

##### Share on other sites
*wince* Fallout AI? Seriously?

If you want to incorporate personality, mood, and emotion in your AI, The Sims (especially The Sims 3) is the go-to standard. This utility-based method can easily be transferred over to RPGs or even shooters, with the caveat that it needs to be perceivable by the player, as Alvaro noted.

##### Share on other sites

> *wince* Fallout AI? Seriously?

I assume the OP is talking about the conversations with AIs in Fallout (which are entirely scripted), not the (atrocious) combat/movement AI.

##### Share on other sites
XD Well, by Fallout AI I mean Bethesda's AIs: they always have tasks assigned and are doing things even when you're not there... if I'm correct. Also, I'm trying to create the AI so it can store basic memories, typed things like reputations or past events. These come into play in decision-making later.

EDIT:

And conversations like Fallout's, but completely unscripted (meaning different personalities have different word banks for the same meaning).

##### Share on other sites
Hah, Fallout 3 != Fallout.

Bethesda's version of Fallout is substantially different from the original two games.


You're talking about basic knowledge-representation concepts. This is a very broad field and you can spend a lot of time digging into it; but if you can clarify your design goals a bit, we can probably guide you to some common/useful techniques.

Also, we seem to have different definitions of "scripted." Just because Bob says "potato" and Joe says "potahtoe" doesn't mean they aren't using some kind of conversation script.

##### Share on other sites

> XD Well, by Fallout AI I mean Bethesda's AIs: they always have tasks assigned and are doing things even when you're not there... if I'm correct. Also, I'm trying to create the AI so it can store basic memories, typed things like reputations or past events. These come into play in decision-making later.

You're really going to want to scale this back a bit until you get your hands around it. A utility-based AI is one thing; one based on longer-term data, such as memories of past actions, can tip out of balance quickly. Get a UBAI system a la Sims 3 up and running first.
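For a feel of what "utility-based" means here, a toy Python sketch. The needs, actions, and the linear urgency-times-satisfaction scoring are invented for illustration; the real Sims scoring is far more involved:

```python
def pick_action(needs, actions):
    """Pick the action whose advertised satisfaction best matches current needs.

    needs:   dict of need name -> urgency in [0, 1]
    actions: dict of action name -> dict of need name -> satisfaction in [0, 1]
    Utility is the sum of urgency * satisfaction (deliberately simple).
    """
    def utility(offered):
        return sum(needs.get(n, 0.0) * v for n, v in offered.items())
    return max(actions, key=lambda a: utility(actions[a]))
```

So an agent with high hunger and low fatigue would score "cook a meal" above "take a nap", without any scripting of which action follows which.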

> And conversations like Fallout's, but completely unscripted (meaning different personalities have different word banks for the same meaning).
There's been quite a lot of work done on this already; however, it is more complicated than word choice. I believe Baylor Wetzel has an article on it in AI Game Programming Wisdom 4 or Game Programming Gems 8; I can't remember which.

##### Share on other sites
XD Yeah, I know it's going to be huge. By scripted, I mean... let's say a meteor lands. Someone with low intelligence says "Oh my god, is that a alien or somethin?!", and someone with higher intelligence says "There seems to have been an impact." Both have the same meaning, but differ in word choice.

##### Share on other sites
Well, it depends... are those two whole sentences synonyms for "a meteorite hit"? Is "alien" the low-intelligence person's word for "meteorite"? That could get awkward. Myself, I see those as two different concepts, not just words: the first NPC is spinning a theory which is only tenuously connected to the facts. If, on the other hand, the low-intelligence NPC said "ma beating stick" instead of "my baseball bat"... well, fair enough (assuming baseball bats are primarily used as weapons in the game).

##### Share on other sites
That's exactly what I meant.

That's a better example than the one I gave. Would anyone like to have this kind of AI in a game? I really like the concept of being "god", if you will.

EDIT:

...and the fact that things can occur that you did not plan to happen, or that you couldn't have planned on happening.

##### Share on other sites
bennetbugs,
Sounds like we (you & I) are trying to achieve basically the same objective: 'realistic' NPC actions determined by individual personality traits and dynamic environment variables. In exploring this thought, I have determined, so far, that a hierarchical task network based on a version of Maslow's hierarchy of needs (modified by personality traits) can determine a small list of 'goals', which are then run through GOAP over an action/skill tree to come up with a plan (a series of specific actions to undertake to achieve a particular goal), which an agent/NPC then uses to figure out 'what do I do now'.
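The goal-selection half of a pipeline like that could be sketched in Python roughly as below. The need ordering, the 0.1 bias step, and the paranoia modifier are all invented numbers, and the GOAP step is left out entirely:

```python
# Need levels ordered roughly after Maslow: lower index = more basic.
NEED_ORDER = ["food", "safety", "social", "esteem"]

def top_goals(deficits, traits, k=2):
    """Rank candidate goals by deficit, biased toward more basic needs
    and modified by personality (here, only paranoia inflates 'safety').

    deficits: dict need -> how unsatisfied it is, in [0, 1]
    traits:   dict trait -> value in [0, 1]
    Returns the k highest-priority needs; a planner (e.g. GOAP) would
    then turn each one into a concrete action sequence.
    """
    def priority(need):
        base_bias = 1.0 - 0.1 * NEED_ORDER.index(need)   # basics first
        trait_mod = 1.0 + traits.get("paranoia", 0.0) if need == "safety" else 1.0
        return deficits[need] * base_bias * trait_mod
    return sorted(deficits, key=priority, reverse=True)[:k]
```

The point of the trait modifier is that two agents with identical deficits can still produce different goal lists purely from personality.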

Though this is all based on known AI routines and techniques, I have come up against several obstacles which really hinder 'realism'. First and foremost is 'agent memory': having each agent retain, and have access to, its own perception of world objects/dangers/etc. This, by itself, greatly increases the amount of physical memory each agent consumes, as well as the processing time to access/interpret the data. But without memory, agents will unrealistically repeat mistakes (choosing to go through 'dangerous' areas, etc.). I have yet to solve these memory issues.

I also recommend Dave Mark's book, "Behavioral Mathematics for Game AI", which has helped me in a variety of ways, mostly in figuring out alternate ways to use the personality traits through different math functions, so that responses are more 'human' and less linear.
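One of the simplest examples of that "less linear" idea is running a trait or stimulus through a logistic curve instead of using it raw. A Python sketch (the steepness and midpoint values are arbitrary):

```python
import math

def logistic_response(x, steepness=10.0, midpoint=0.5):
    """Map a raw trait/stimulus value in [0, 1] through a logistic curve,
    so the response stays flat near the extremes and swings quickly
    around the midpoint, rather than changing linearly.
    """
    return 1.0 / (1.0 + math.exp(-steepness * (x - midpoint)))
```

Swapping in different shaping functions (logistic, quadratic, step-like) per trait is one cheap way to give agents distinct "feels" from the same underlying numbers.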

##### Share on other sites
Thanks, I'll take a look at that book. On actions, I thought about how small steps can be combined into bigger steps that form goals. What came to mind first is how a computer works: using basic commands (move, wait) you can form actions and goals easily. So far, using simple tasks has worked really well.

I think one way to save memory for memories (XD) is to use basic IDs for generic memories. For example, there could be types like "Place", "Person", and "Event", each with an ID. Based on that general ID, you could store a few numbers that describe it. Take a place: it has a location, and a reaction to it. I would say a place only needs one number, ranging from "OK to go" to "never go there". Later on, you can use that simple description of the memory to do the rest of the AI calculations. I would probably add a few more descriptors, but I have yet to start coding the memories. That reminds me... I should probably also do stress tests before I code too much of the AI all at once...
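A possible Python sketch of those compact memory records. The type tags, the single reaction number, and the 0.7 avoidance threshold are all illustrative assumptions:

```python
from collections import namedtuple

# One compact record per memory: a type tag, a generic ID, and a single
# reaction number (0 = fine, 1 = "never go there"), as described above.
Memory = namedtuple("Memory", ["kind", "ident", "reaction"])

PLACE, PERSON, EVENT = 0, 1, 2

def avoid(memories, place_id, threshold=0.7):
    """Should the agent avoid this place, based on its stored reaction?"""
    for m in memories:
        if m.kind == PLACE and m.ident == place_id:
            return m.reaction >= threshold
    return False  # no memory of it -> no reason to avoid it
```

Keeping each memory down to a few small numbers like this is one way to attack the per-agent memory cost mentioned earlier in the thread.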

EDIT:

I also thought about how the speaking works. The only time they need to use the word bank is when communicating with the player; otherwise, the AIs just send concepts to each other, like numbers and such.

##### Share on other sites

> I also thought about how the speaking works. The only time they need to use the word bank is when communicating with the player; otherwise, the AIs just send concepts to each other, like numbers and such.

Humans communicate to each other via concepts too. Language is just a carrier medium for concepts.

The sentence "I am hungry" is just a collection of words that maps to the conceptual chain 'current entity' -> 'possesses' -> 'hunger'.

If your concepts are sufficiently well defined, your language support can pretty much be a front-end filter that takes AI concepts and expresses them as words.

Unfortunately, it's doubtful if your AI will track more than a handful of concepts, which won't make for very interesting conversation.
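Still, even a handful of concepts is enough to show the front-end-filter idea. A Python sketch, with entirely made-up word banks and a single intelligence cutoff:

```python
# Hypothetical word banks: the same concept tuple rendered differently
# depending on the speaker's intelligence. Phrasings are invented.
WORD_BANKS = {
    "low":  {("event", "impact"): "somethin' just crashed out there!"},
    "high": {("event", "impact"): "There appears to have been an impact."},
}

def verbalize(concept, intelligence):
    """Front-end filter: pick a word bank from a trait, then render
    the concept tuple as surface text for the player."""
    bank = "high" if intelligence >= 0.5 else "low"
    return WORD_BANKS[bank][concept]
```

Agent-to-agent traffic would skip this function entirely and pass the concept tuples directly, as the thread suggests.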

##### Share on other sites
That's also exactly what I mean. And it doesn't have to be the raw conversation; you can script in (or default to) words that fill out the sentence.

##### Share on other sites
Gawd... if only I could show you what I'm working on for my current client. Personality, mood, emotion, event memory... *sigh*

##### Share on other sites
I have totally avoided the entire 'speech' issue because it seemed way too overwhelming at this point. I was viewing player interaction as being restricted to: combat, trading, and jobs. But the "jobs" part would be built dynamically from the NPC's current goals... if it (the NPC) had a goal of "harvest the field" or "build a cottage" or "protect/defend an area", then a "job" could be offered to the player based on the value the NPC had determined through its GOAP search. I don't see 'conversation' as really game enhancing, but it could be I guess.

##### Share on other sites

> Gawd... if only I could show you what I'm working on for my current client. Personality, mood, emotion, event memory... *sigh*

frickin' tease!

##### Share on other sites
XD. Maybe when I'm done coding the basics of my AIs, I can post the code here for anyone who wants to see an example.

##### Share on other sites

> But the "jobs" part would be built dynamically from the NPC's current goals... if it (the NPC) had a goal of "harvest the field" or "build a cottage" or "protect/defend an area", then a "job" could be offered to the player based on the value the NPC had determined through its GOAP search. I don't see 'conversation' as really game enhancing, but it could be, I guess.

Sims 3 originally tried to use a planner for the Sims' career paths, jobs, etc., until one time they couldn't figure out why a Sim was cooking fish all day. They finally figured out that he had determined it was the best way at the time to work toward his "life goal". They certainly could have balanced it out better than that example, but they decided to go with a simpler, more utility-based method of forward chaining rather than the inherent back-chaining of a planner.

##### Share on other sites
XD Cook fish for a living...

Conversations include giving goals and such; I should have said "interaction". But the way they'd express interaction the most would be through conversation.

EDIT:

I'm also not liking the idea of having to use a different thread for each AI... I should try as hard as possible to use triggers rather than checking everything constantly.

##### Share on other sites

> Sims 3 originally tried to use a planner for the Sims' career paths, jobs, etc., until one time they couldn't figure out why a Sim was cooking fish all day. They finally figured out that he had determined it was the best way at the time to work toward his "life goal". They certainly could have balanced it out better than that example, but they decided to go with a simpler, more utility-based method of forward chaining rather than the inherent back-chaining of a planner.

Obviously such a situation (cooking fish all day toward a 'life goal') indicates a goal-balancing issue, where life goals should carry less immediate need. It seems to me (and I would in NO way compare my inept skills to any in the professional realm) that back-chaining is superior to forward chaining in 'realism', and also in accomplishing multiple goals simultaneously... by using a utility/desire weighting system that would encourage the following example:
An NPC has a strong need/desire to satisfy hunger. In addition, it has a weaker desire to build a shelter. Since the food goal outweighs the shelter goal, the NPC plans accordingly and sets off on a path to the nearest berry bush. Near the path lie some tree branches which would satisfy a portion of the 'build shelter' goal, so the AI incorporates a short detour on its way to the berries and picks up the branches.

Being able to do such multi-tasking is important in my own AI design goals, but probably not so much to the AI developers of the Sims - their agents seem to do a single goal at a time which can be interrupted by higher priority events, etc.
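That berry-bush-and-branches detour could be sketched in Python like this. One-dimensional positions stand in for real pathfinding costs, and the detour budget is an invented number:

```python
def plan_with_detours(start, main_goal, side_goals, max_detour=3.0):
    """Head for the highest-priority goal, but pick up any side goal
    whose extra travel cost stays under max_detour.

    main_goal:  (name, position) of the dominant goal
    side_goals: dict of name -> position for lesser goals
    Positions are 1-D numbers so the distance math stays trivial.
    """
    def dist(a, b):
        return abs(a - b)
    route = []
    for name, pos in side_goals.items():
        extra = dist(start, pos) + dist(pos, main_goal[1]) - dist(start, main_goal[1])
        if extra <= max_detour:
            route.append(name)      # cheap detour: fold it into the trip
    route.append(main_goal[0])
    return route
```

A side goal directly on the way costs zero extra travel and always gets picked up; one far off the path gets skipped, which matches the "opportunistic multi-tasking" described above.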

##### Share on other sites
I should have stated that the aforementioned Sim had a life goal of becoming a Master Chef. His cooking the fish was trying to increase his cooking skill.

That said, back-chaining is far more efficient for planning purposes... however, what was happening was that players could not immediately intuit why a Sim was taking a particular action, because the action was often many steps removed from the goal it was trying to solve. Therefore, despite being "correct", it looked out of place and inexplicable.

Regarding triggers vs. threads... the two are not mutually exclusive. That said, constant polling of the environment is computationally expensive if done incorrectly but also yields some more subtle behavior than having things entirely trigger-based. There are pros and cons. You can also have "immediate action" triggers show up as high-priority decisions in the landscape (so to speak) so that they can't be ignored.

##### Share on other sites
Well, this is how I think of it:

There is a lot in the environment, and only a fraction of it is what the AI uses. If you can flag certain things in the environment as needing to be checked, you might be able to have both in one.
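A small Python sketch of that hybrid (the 'watch' flag and the priority numbers are invented): triggered events always win, and routine polling only touches flagged entities:

```python
def choose_focus(entities, triggered):
    """Pick what the AI attends to this tick.

    entities:  list of dicts, each with 'priority' and an optional
               'watch' flag; only flagged ones are polled routinely.
    triggered: list of event dicts with 'priority'; a trigger always
               outranks routine polling results.
    Returns the chosen entity/event dict, or None if nothing qualifies.
    """
    if triggered:
        return max(triggered, key=lambda e: e["priority"])
    watched = [e for e in entities if e.get("watch")]
    if not watched:
        return None
    return max(watched, key=lambda e: e["priority"])
```

This keeps the per-tick cost proportional to the flagged fraction of the world, while still letting rare events interrupt immediately, roughly the combination discussed above.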