Getting your AI opinions on what makes a good AI

Started by
33 comments, last by BennettSteele 12 years ago
Well, it depends... are those two whole sentences synonyms for "a meteorite hit"? Is the low-intelligence person's word for "meteorite" "alien"? Could get awkward. Myself, I see those as two different concepts, not just words. The first NPC is spinning a theory which is only tenuously connected to the facts. If, on the other hand, the low-intelligence NPC said "ma beating stick" instead of "my baseball bat"... well, fair enough (assuming baseball bats are primarily used as weapons in the game).
That's exactly what I meant. :P

That's a better example than what I said. Would anyone like to have this kind of AI in a game? I really like the concept of being "god", if you will.

EDIT:

The fact that things can occur that you did not plan to happen, or that you couldn't have planned on happening.
bennetbugs,
Sounds like we (you & I) are trying to achieve basically the same objective: 'realistic' NPC actions determined by individual personality traits and dynamic environment variables. In exploring this thought, I have determined, so far, that a hierachial task network based on a version of Maslov's hierarchy of needs (modified by personality traits) to determine a small list of 'goals' that are then GOAPed through an action/skill tree to come up with a plan (series of specific actions to undertake to achieve a particular goal) which an agent/npc then uses to figure out 'what do I do now'.

Though this is all based on known AI routines and techniques, I have come up against several obstacles which really hinder 'realism'. First and foremost is that of 'agent memory' - having each agent retain and have access to their own perception of world objects/dangers/etc. This, by itself, greatly increases the amount of physical memory each agent consumes, as well as processing time to access/interpret the data. Without memory, agents will continually repeat mistakes (choosing to go through 'dangerous' areas, etc) in an unrealistic fashion. I have yet to solve these types of memory issues.

I also recommend Dave Mark's book, "Behavioral Mathematics for Game AI", which has helped me in a variety of ways, mostly to figure out alternate ways to use the personality traits through different math functions so that responses are more 'human' and less linear.
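The "less linear" idea can be illustrated with a couple of response curves that remap a 0..1 trait or stimulus value. These particular functions are just common examples, not taken from the book:

```python
import math

def logistic(x, steepness=10.0, midpoint=0.5):
    """S-curve: nearly flat at the extremes, sensitive near the midpoint,
    so an NPC ignores small stimuli but reacts sharply past a threshold."""
    return 1.0 / (1.0 + math.exp(-steepness * (x - midpoint)))

def diminishing(x):
    """Quadratic ease-out: quick early gains that flatten later,
    useful for things like satisfaction from repeated rewards."""
    return 1.0 - (1.0 - x) ** 2
```

Feeding a personality trait through a curve like this, instead of using it raw, is one way to get responses that feel less mechanical.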
Thanks, I'll take a look at that book. On actions, I thought of how steps can be used to make bigger steps that form goals. What came to my mind first is how a computer works: using basic commands (move, waiting) you can form actions and goals easily. So far it has worked really well to use simple tasks. I think a way to save memory for memories (XD) is by having basic IDs for generic memories. For example, there could be types, like "Place", "Person", "Event", each of which has an ID. Based on that general ID, you could have a few numbers that describe it. An example would be a place: it has the location, and the reaction to it. I would say a place only needs one number, from "ok to go" to "never go there". And later on, you can use the simple description of that memory to do the rest of the AI calculations. I would probably make a few more descriptors, but I have yet to start coding the memories. That reminds me... I should probably also do stress tests before I code too much of the AI all at once...
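That compact memory scheme could look something like the following sketch. The type tags, field layout, and threshold are assumptions made up for illustration:

```python
# Each memory is keyed by (type tag, ID) and stores a few small numbers.
PLACE, PERSON, EVENT = range(3)

def remember_place(memories, place_id, x, y, attitude):
    """attitude: 0 = 'ok to go' ... 3 = 'never go there'."""
    memories[(PLACE, place_id)] = (x, y, attitude)

def is_avoided(memories, place_id, threshold=2):
    """Later AI calculations only need the one attitude number."""
    rec = memories.get((PLACE, place_id))
    return rec is not None and rec[2] >= threshold

mem = {}
remember_place(mem, 7, x=12, y=40, attitude=3)
print(is_avoided(mem, 7))   # True
```

Keeping each memory down to a handful of integers like this is one plausible way to chip away at the per-agent memory cost mentioned earlier in the thread.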

EDIT:

I also thought of how the speaking works. The only time they need to use the word bank is for communicating with the player. Otherwise, the AIs just send concepts to each other, like numbers and such.

I also thought of how the speaking works. The only time they need to use the word bank is for communicating with the player. Otherwise, the AIs just send concepts to each other, like numbers and such.

Humans communicate to each other via concepts too. Language is just a carrier medium for concepts.

The sentence "I am hungry" is just a collection of words that maps to the conceptual chain 'current entity' -> 'possesses' -> 'hunger'.

If your concepts are sufficiently well defined, your language support can pretty much be a front-end filter that takes AI concepts and expresses them as words.

Unfortunately, it's doubtful if your AI will track more than a handful of concepts, which won't make for very interesting conversation.
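A minimal sketch of that front-end filter, using the "I am hungry" chain from above. The lexicon and function names are invented for the example, and a real filter would need grammar rules rather than word-for-word substitution:

```python
# AIs exchange concept tuples; words are only produced for the player.
LEXICON = {
    "self": "I",
    "possesses": "am",
    "hunger": "hungry",
}

def verbalize(chain):
    """Map a conceptual chain to a (very rough) English sentence."""
    return " ".join(LEXICON.get(c, c) for c in chain) + "."

print(verbalize(("self", "possesses", "hunger")))   # I am hungry.
```

This also shows the limitation in the post above: with only a handful of concepts in the lexicon, the output space stays very small.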

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

That's also exactly what I mean. And it doesn't have to be the raw conversation; you can script in (or provide by default) words that fill out the sentence.
Gawd... if only I could show you what I'm working on for my current client. Personality, mood, emotion, event memory... *sigh*

Dave Mark - President and Lead Designer of Intrinsic Algorithm LLC
Professional consultant on game AI, mathematical modeling, simulation modeling
Co-founder and 10 year advisor of the GDC AI Summit
Author of the book, Behavioral Mathematics for Game AI
Blogs I write:
IA News - What's happening at IA | IA on AI - AI news and notes | Post-Play'em - Observations on AI of games I play

"Reducing the world to mathematical equations!"

I have totally avoided the entire 'speech' issue because it seemed way too overwhelming at this point. I was viewing player interaction as being restricted to: combat, trading, and jobs. But the "jobs" part would be built dynamically from the NPC's current goals... if it (the NPC) had a goal of "harvest the field" or "build a cottage" or "protect/defend an area", then a "job" could be offered to the player based on the value the NPC had determined through its GOAP search. I don't see 'conversation' as really game enhancing, but it could be I guess.
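The dynamic-jobs idea might be sketched like this, with the value of each goal coming from the GOAP search. Everything here (function names, the reward formula, the cutoff) is a made-up illustration of the approach, not the poster's actual design:

```python
def offer_jobs(goals, min_value=0.3):
    """goals: list of (name, planner_value in 0..1) pairs from the NPC's
    GOAP search. Goals the NPC values enough become jobs offered to the
    player, with a reward proportional to that value."""
    return [
        {"job": name, "reward": round(value * 100)}
        for name, value in goals
        if value >= min_value
    ]

goals = [("harvest the field", 0.8), ("sweep the porch", 0.1)]
print(offer_jobs(goals))
```

Tying the reward to the planner's own valuation means the job board updates itself as the NPC's goals shift, with no hand-authored quest list.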

Gawd... if only I could show you what I'm working on for my current client. Personality, mood, emotion, event memory... *sigh*


frickin' tease!
XD. Maybe when I'm done coding the basics of my AIs I can post the code here for anyone who wants to see an example.

