
Adaptive Virtual Game Worlds: Where to Begin?


Old topic!
Guest, the last post of this topic is over 60 days old and at this point you may not reply in this topic. If you wish to continue this conversation start a new topic.

  • You cannot reply to this topic
90 replies to this topic

#61 Nathaniel Hammen   Members   -  Reputation: 136


Posted 04 April 2004 - 06:17 PM

Another thing that you need to do is combine similar memories. For example, Jack hears that Empire A has attacked a small village on the outskirts of Empire B. His memory now contains "Empire A attacked town of whatever in Empire B." The next day, Empire A makes another attack. And the next day. And the next. And pretty soon Jack's memory will contain "Empire A frequently attacks Empire B." Sure, if the village of Asdfgh was attacked, he would remember, and tell other people for the next few days. But only while this memory is in his short-term memory. When it enters his long-term memory, it joins with "Empire A frequently attacks Empire B." He can't keep track of every single little town that was attacked, after all.
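A minimal sketch of this consolidation idea. The class name, the deque-based short-term store, and the threshold of three repeats are all illustrative assumptions, not anything specified in the thread:

```python
from collections import deque, Counter

class NpcMemory:
    """Keep specific events briefly; collapse repeats into a general fact."""
    def __init__(self, short_term_size=5, generalize_after=3):
        self.short_term = deque(maxlen=short_term_size)  # exact recent events
        self.generalize_after = generalize_after
        self.general = Counter()                         # (actor, victim) -> count

    def observe(self, actor, victim, town):
        # "Empire A attacked town X in Empire B" -- specific while fresh
        self.short_term.append((actor, victim, town))
        self.general[(actor, victim)] += 1               # the condensed fact

    def recall(self, actor, victim):
        # Prefer a specific memory if one is still in short-term storage
        towns = [t for a, v, t in self.short_term if (a, v) == (actor, victim)]
        if towns:
            return f"{actor} attacked {towns[-1]} in {victim}"
        # Otherwise only the generalized long-term fact survives
        if self.general[(actor, victim)] >= self.generalize_after:
            return f"{actor} frequently attacks {victim}"
        return None

m = NpcMemory(short_term_size=2)
for town in ["Asdfgh", "Qwerty", "Zxcvb"]:
    m.observe("Empire A", "Empire B", town)
# While fresh, Jack names the last attacked town; once short-term memory
# is emptied, only "frequently attacks" remains.
```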

--------------------------------------
I am the master of stories.....
If only I could just write them down...


#62 Timkin   Members   -  Reputation: 864


Posted 04 April 2004 - 07:45 PM

quote:
Original post by irbrian
I.E., when an NPC stores a memory, he technically stores it in small chunks, and some of those chunks may be recalled more easily than others. If a chunk seems to be missing as the NPC Agent isn't able to recall the complete memory, the NPC either conveys the incomplete information as is, or treats the memory as an observation and tries to piece together what might have happened. Maybe the resulting belief will be correct, maybe not. Over time, these beliefs may even begin to replace the original beliefs, thus causing different NPCs to recall events with subtle variations.



You might be interested to know that accepted cognitive theory regarding memory holds that sensory information is not stored in memory as a faithful reproduction of the original information. Indeed, experiments have shown that memories display reordering, reconstruction and condensation of information: subjects interpret the original material so that it makes sense upon recall. It is believed that recall involves a process whereby representations of past experiences are used as clues in reconstructing a model of the event. Different cognitive strategies, like comparison, inference, guesses and suppositions, are used to generate a consistent and coherent memory.

This partly explains why memory changes with age, since our methods of recall and our ability to utilise them change with our experiences. So a memory of your third Christmas, recalled at age 10, would probably differ from that same memory recalled at age 50.

As to whether you want to build such a model into an NPC? Well, it would be an extremely complex system, even for basic memories. It would also need to be individual, or tunable to each agent. That's a LOT of work and a lot of computational load required to train and utilise the system.

Cheers,

Timkin

#63 Jotaf   Members   -  Reputation: 280


Posted 05 April 2004 - 01:21 AM

Actually, it wouldn't be that hard. You don't need to simulate a real brain: even though some parts really need to be close to how a brain works, others can be roughly emulated. Especially in a game world, where information is not as varied as in the real world, some approximations will work just fine. Take the difficult topic of inference and logic. With a database of a few hundred logic rules (assuming that you have a structure that models knowledge correctly), you can achieve almost the same results if you just compare pieces of knowledge with these logic rules. Like: Empire A attacked Town B. Town B is close to our town. So we could be attacked by Empire A at any moment! With a powerful scripting engine this wouldn't be that hard to do IMHO =)
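The rule-database idea described above could be sketched roughly like this. The triple representation, the fixed `?x`/`?y`/`?z` variable names, and the two-pattern rule shape are assumptions made purely for illustration:

```python
# Knowledge as simple (subject, relation, object) triples.
facts = {
    ("Empire A", "attacked", "Town B"),
    ("Town B", "is_near", "our town"),
}

# Each rule: if both patterns match with a shared middle term,
# assert the conclusion. "X attacked Y, Y is near Z => Z may be attacked by X."
rules = [
    (("?x", "attacked", "?y"), ("?y", "is_near", "?z"),
     ("?z", "may_be_attacked_by", "?x")),
]

def infer(facts, rules):
    derived = set()
    for p1, p2, concl in rules:
        for a, r1, b in facts:
            if r1 != p1[1]:                  # first pattern's relation
                continue
            for c, r2, d in facts:
                if r2 == p2[1] and c == b:   # chain on the shared ?y
                    bind = {"?x": a, "?y": b, "?z": d}
                    derived.add(tuple(bind.get(t, t) for t in concl))
    return derived
```

Running `infer(facts, rules)` derives `('our town', 'may_be_attacked_by', 'Empire A')` without the NPC "understanding" anything: the hand-written rule carries all the intelligence.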

#64 Timkin   Members   -  Reputation: 864


Posted 05 April 2004 - 02:50 AM

quote:
Original post by Jotaf
With a powerful scripting engine this wouldn't be that hard to do IMHO =)


Actually, it's by no means a trivial feat. One of the problems with such systems is that they cannot deal with contradictory facts. Additionally, once something has been asserted as true, it's nigh impossible to draw the conclusion that it is false given new information. That is, you cannot retract things that you currently know. There are lots of other problems, but I won't go into them... any decent AI text dealing with First Order Logic in knowledge bases should give good coverage and an explanation of why such systems are no longer in use.

Furthermore, the example you just gave involves induction as well as deduction, which is computationally difficult, particularly in FOL systems.

Probability models using Bayesian calculus solve many of these problems, however such systems (such as Bayesian Belief Networks) are computationally expensive to operate.
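A minimal sketch of the Bayesian updating mentioned above: evidence can both raise and lower a belief, which is exactly the retraction a pure first-order knowledge base cannot do. The likelihood numbers are invented for illustration:

```python
def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """P(H|E) via Bayes' rule for a binary hypothesis H."""
    numerator = likelihood_if_true * prior
    return numerator / (numerator + likelihood_if_false * (1.0 - prior))

p = 0.5                          # prior belief: John robbed the store
p = bayes_update(p, 0.9, 0.2)    # a witness claims John did it -> ~0.82
p = bayes_update(p, 0.3, 0.8)    # John produces an alibi       -> ~0.63
```

The second piece of evidence pulls the belief back down smoothly; nothing ever has to be "un-asserted".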

Finally, I think you might be underestimating the complexity of the task of even approximating how memories are stored and retrieved in the human brain, even for a limited domain such as an artificial world.

Cheers,

Timkin

#65 irbrian   Members   -  Reputation: 130


Posted 05 April 2004 - 03:20 AM

quote:
Original post by Timkin
One of the problems with such systems is that they cannot deal with contradictory facts. Additionally, once something has been asserted as true, it's nigh impossible to draw the conclusion that it is false given new information. That is, you cannot retract things that you currently know.


Contradictory facts should be allowable, and should be the basis for questions. If the NPC believes that John robbed the store, and then stores a new belief that John DIDN'T rob the store, all other factors being equal, he now has a question: Did John rob the store or not?

For this to work though, it seems to me that NPCs should rarely be given enough information to assert something as "True" (as in Universal Law). Even we humans cannot say that something is universally true (except under very specific circumstances of manifestation, which I will not get into here).

In fact, I would suggest that Universal Truths are to be "hard-wired" by the programmer, and the NPC should be incapable of forming beliefs about new Truths on his own. All NPC beliefs should be based on fuzzy logic -- "I believe that someone has robbed my store." Sure, it seems pretty obvious when the NPC walks into his store and finds things missing -- or, let's say, when the NPC personally witnesses someone break into his store and steal things. That would be a pretty strong belief. But ultimately this is all based on perception/perspective. While it is probably true, it could also theoretically be the case that the whole thing is an act, and the robber was only pulling a prank, and will return the items at some point in the future. It's a long shot, sure, but the NPC cannot KNOW with absolute certainty that it is not the case, or for that matter he cannot know that it is not an illusion or delusion.

This way, if an NPC ever is informed that something is false when he has already assumed it to be true, he will not "break." It would take a lot of evidence to convince him, of course.

quote:
Finally, I think you might be underestimating the complexity of the task of even approximating how memories are stored and retrieved in the human brain, even for a limited domain such as an artificial world.


Everything we've been discussing here is, I believe, monumentally difficult, especially considering the sheer amount of research that has gone into getting us to our current level of collective knowledge on AI. As I stated originally, I don't expect this whole thing to come into being in the very near future, and certainly not without vast efforts made by many highly knowledgeable individuals. Additionally, the amount of processing power that I think would be required for a system like this would even today be incredibly expensive.

My idea is to begin to model small parts of this system with one and two NPCs for experimentation.

****************************************

Brian Lacy
ForeverDream Studios

Comments? Questions? Curious?


"I create. Therefore I am."

#66 Jotaf   Members   -  Reputation: 280


Posted 05 April 2004 - 08:57 AM

Hmm... Timkin, you didn't get my point. I know it's really hard to have an AI like that. The example I gave should be very complex for an AI agent to come up with by itself, but that's on purpose. What I'm suggesting is that there's a database with lots of rules like this, and the NPC just has to access these rules and use them to make inferences and draw conclusions. Of course they wouldn't cover everything... but it would take a lot less effort than trying to get the same results with emergent behaviour, especially if you're using neural networks. It would also solve the robbery problem: "evidence A" -> "It's likely that [John robbed the store]"; "evidence B" -> "It's not likely that [John robbed the store]". So "John robbed the store" has two conflicting pieces of information attached to it; when the NPC wants to know whether it's true, it sees the contradiction and answers "I'm not sure".
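The evidence-annotation scheme sketched in this post might look roughly like the following; the function names and the simple +1/-1 vote representation are assumptions for illustration:

```python
from collections import defaultdict

evidence = defaultdict(list)   # proposition -> list of +1 / -1 votes

def add_evidence(prop, supports):
    """Attach a 'likely' (+1) or 'not likely' (-1) annotation."""
    evidence[prop].append(1 if supports else -1)

def ask(prop):
    votes = evidence[prop]
    if not votes:
        return "I don't know"
    if all(v > 0 for v in votes):
        return "Probably true"
    if all(v < 0 for v in votes):
        return "Probably false"
    return "I'm not sure"      # conflicting evidence detected

add_evidence("John robbed the store", True)    # evidence A
add_evidence("John robbed the store", False)   # evidence B
# ask("John robbed the store") now reports the contradiction.
```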

#67 irbrian   Members   -  Reputation: 130


Posted 05 April 2004 - 09:42 AM

I'm afraid I still don't understand your point. Could you give a more detailed example?

#68 Timkin   Members   -  Reputation: 864


Posted 05 April 2004 - 02:52 PM

quote:
Original post by Jotaf
Hmm... Timkin, you didn't get my point.



Actually, I did get your point. The sort of system you are describing has long been discarded in AI as insufficient as a knowledge base for an agent. Please go and look up production systems and first order logic, rule-based systems specifically.

irbrian, you used the term fuzzy logic in your last post. If you meant Fuzzy Logic (as in Fuzzy Set Theory) then this is an inappropriate tool for this task. Fuzzy Logic is an alternative to Aristotelian (First Order) Logic and deals with set membership. FL does not offer a representation of uncertainty. If, on the other hand, you simply meant fuzzy to mean uncertain, then a) you shouldn't really use the words fuzzy logic, because they mean something else; and b) you need a representation like Dempster-Shafer Theory or Probability Theory, both of which enable the representation of uncertainty in beliefs, the incorporation of evidence to update beliefs, the ability to hold paradoxical beliefs, etc.

Given the recent discussions, I think you really should take a look at NAG. It might be a good first project to create the argumentation and inference mechanism for the NPC.

Cheers,

Timkin

#69 Nathaniel Hammen   Members   -  Reputation: 136


Posted 05 April 2004 - 04:28 PM

Actually, Fuzzy Logic could work, if you had the certainty represented by a fuzzy variable: 1.0 meaning absolutely certain and 0.0 meaning not certain at all. But I do agree that fuzzy logic is not very useful for the task. I just wanted to point out that it COULD be used.

You know what? More and more, I want to get my hands on a copy of AI Game Programming Wisdom. I want to buy it, but I don't want to pay $50 for a book when I am living off of my parents' goodwill.

--------------------------------------
I am the master of stories.....
If only I could just write them down...

#70 irbrian   Members   -  Reputation: 130


Posted 05 April 2004 - 04:35 PM

quote:
Original post by Timkin
irbrian, you used the term fuzzy logic in your last post. If you meant Fuzzy Logic (as in Fuzzy Set Theory) then this is an inappropriate tool for this task. Fuzzy Logic is an alternative to Aristotelian (First Order) Logic and deals with set membership. FL does not offer a representation of uncertainty. If, on the other hand, you simply meant fuzzy to mean uncertain, then a) you shouldn't really use the words fuzzy logic, because they mean something else; and b) you need a representation like Dempster-Shafer Theory or Probability Theory, both of which enable the representation of uncertainty in beliefs, the incorporation of evidence to update beliefs, the ability to hold paradoxical beliefs, etc.
I have always heard (and read) the term Fuzzy Logic used simply to refer to logic that is not simply True or False. I understood this to be the basic definition of FL. If that is incorrect, then so be it... it just means I've been reading a whole lot of inaccurate material, including a rather official-looking textbook on Set Theory (the name of which I cannot recall but can locate if you care).
quote:
Given the recent discussions, I think you really should take a look at NAG. It might be a good first project to create the argumentation and inference mechanism for the NPC.
I have no idea what NAG is or where to find it. Google, here I come...!

#71 irbrian   Members   -  Reputation: 130


Posted 05 April 2004 - 04:37 PM

quote:
Original post by Nathaniel Hammen
Actually, Fuzzy Logic could work, if you had the certainty represented by a fuzzy variable: 1.0 meaning absolutely certain and 0.0 meaning not certain at all. But I do agree that fuzzy logic is not very useful for the task. I just wanted to point out that it COULD be used.
How is what you're describing as Fuzzy Logic any different from the way I was describing FL as logical uncertainty?
quote:
You know what? More and more, I want to get my hands on a copy of AI Game Programming Wisdom. I want to buy it, but I don't want to pay $50 for a book when I am living off of my parents' goodwill.
Step 1) Get a Job
Step 2) www.amazon.com


#72 irbrian   Members   -  Reputation: 130


Posted 05 April 2004 - 04:40 PM

What is this, my fourth post in a row? Sorry guys.
quote:
Original post by Timkin
Given the recent discussions, I think you really should take a look at NAG. It might be a good first project to create the argumentation and inference mechanism for the NPC.
As luck would have it, NAG just isn't a specific enough term for Google to give me any useful results. Can't imagine why. Can you give me more info? Or a URL, if that's not too much to ask?

#73 darookie   Members   -  Reputation: 1437


Posted 05 April 2004 - 05:38 PM

quote:
Original post by irbrian
I have always heard (and read) the term Fuzzy Logic used simply to refer to logic that is not simply True or False. I understood this to be the basic definition of FL. If that is incorrect, then so be it... it just means I've been reading a whole lot of inaccurate material, including a rather official-looking textbook on Set Theory (the name of which I cannot recall but can locate if you care).

Your interpretation might have been wrong.


#74 fup   Members   -  Reputation: 463


Posted 05 April 2004 - 06:14 PM

A lot of people misunderstand FL. Fuzzy logic is deterministic. The membership value (between 0 and 1) of an element to a fuzzy set represents the confidence of its membership to that set, not the probability.



My Website: ai-junkie.com | My Book: AI Techniques for Game Programming

#75 Timkin   Members   -  Reputation: 864


Posted 05 April 2004 - 09:30 PM

irbrian, FL is about logic... and you are correct, it's a logic in which there are other values besides 0 and 1. But this is all set theory... As fup pointed out, the membership value is not a probability or likelihood of membership in a set. It's a degree as to how much the item belongs to that set. In Aristotelian logic, items belong to one set or another. Statements are either true or false. In Fuzzy Logic, statements can be partly true and partly false at the same time. This, however, has absolutely nothing to do with uncertainty in a statement. Fuzzy Logic makes statements about things in the world. Uncertainty formalisms - such as Bayesian probabilities - make statements about what we believe to be true or false in the world. Consider an example statement: John is a thief. Uncertainty in this statement might be represented by saying that there is a 70% chance that this statement is true. Fuzzy Logic however would say that either John is a Thief, John is not a Thief, or, to some degree, John is both a Thief and not a Thief.

Do you see the distinction?
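The distinction can be made concrete in code. The membership function and the witness model below are invented for illustration; only the contrast between the two quantities matters:

```python
def thief_membership(thefts, shady_deals):
    """Fuzzy: to what DEGREE does John belong to the set 'thieves'?
    A deterministic property of John, not a belief about him."""
    return min(1.0, 0.2 * thefts + 0.05 * shady_deals)

def p_thief(prior, witness_reliability, witness_says_thief):
    """Probability: how strongly do WE BELIEVE 'John is a thief' is true?"""
    if witness_says_thief:
        num = witness_reliability * prior
        return num / (num + (1.0 - witness_reliability) * (1.0 - prior))
    return prior

# Same John, two different numbers answering two different questions:
degree = thief_membership(thefts=1, shady_deals=4)       # 0.4: "sort of a thief"
belief = p_thief(prior=0.5, witness_reliability=0.7,
                 witness_says_thief=True)                # 0.7: "70% chance he's a thief"
```

The first number would not change if we learned more about John's past; the second would, which is exactly why it is a statement about us rather than about him.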

As to Nathaniel's comment that you could use Fuzzy Logic to represent uncertainty... well, I could use a hammer to crack an egg, but would it be the right tool for the job? Fuzzy Logic suffers from the same problems as all other Truth Maintenance Systems, in that the Fuzzy Calculus doesn't produce the results we expect when performing inference.

As to NAG: Check out the websites of Ingrid Zuckerman and Kevin Korb, both from Monash University. You should be able to find sufficient information and pointers there to get you going.

Cheers,

Timkin

#76 irbrian   Members   -  Reputation: 130


Posted 06 April 2004 - 07:28 AM

quote:
Original post by Timkin
irbrian, FL is about logic... and you are correct, it's a logic in which there are other values besides 0 and 1. But this is all set theory... As fup pointed out, the membership value is not a probability or likelihood of membership in a set. It's a degree as to how much the item belongs to that set. In Aristotelian logic, items belong to one set or another. Statements are either true or false. In Fuzzy Logic, statements can be partly true and partly false at the same time. This, however, has absolutely nothing to do with uncertainty in a statement. Fuzzy Logic makes statements about things in the world. Uncertainty formalisms - such as Bayesian probabilities - make statements about what we believe to be true or false in the world. Consider an example statement: John is a thief. Uncertainty in this statement might be represented by saying that there is a 70% chance that this statement is true. Fuzzy Logic however would say that either John is a Thief, John is not a Thief, or, to some degree, John is both a Thief and not a Thief.

Do you see the distinction?
I think I'm beginning to see the distinction now... Fuzzy Logic is about propositions being both true and false to some degree. I.E., "It is sort of hot outside" can be interpreted as "It is hot outside AND it is not hot outside." To account for degree of truthfulness, the statements are represented by a value between 0.0 and 1.0; I.E., the first proposition "It is hot outside" might be a 0.6 and the second "It is not hot outside" might have a value of 0.4.

Thus, considering the following two statements:
A) "John is sort of a thief."
B) "John might be a thief."

You are suggesting that proposition A is Fuzzy Logic, because John is both a thief and not a thief; and proposition B is more of, I dunno, a boolean probability I guess you could say.

If I'm understanding so far, I'll re-evaluate my original statement:
"I believe someone has robbed my store."

Perhaps then this would be best broken into two statements:
A) "I believe that my store was robbed."
B) "Someone robbed my store."

Seems to me the following is true:
1. The evaluation of B is predicated upon the truthfulness of A.
2. A is a boolean probability:
There is a high probability the store was robbed.
3. B is neither probability nor fuzzy logic, because it's not a true or false value. It is simply an unknown, a variable -- a Question Needing an Answer in the mind of the NPC.

Alright... I think I get it now. Someone please tell me I'm wrong... otherwise, thanks for the clarification.

#77 irbrian   Members   -  Reputation: 130


Posted 06 April 2004 - 07:42 AM

Anyway so my original point was that NPCs never have enough information to form absolute conclusions about things.

That said, does Fuzzy Logic still come into play? How exactly do NPCs handle the situation when they acquire conflicting information?

Given that the NPC Frank has no opinion yet on the subject, consider the following Observations:
A) "Sal believes that John is a thief."
B) "Joe believes that John is not a thief."

So how would this be best handled in order for the NPC to begin to form an opinion?
1) "It is believed that John is a thief (50%) AND It is believed that John is not a thief. (50%)" (True FL)
2) "John might be a thief. (50%)" (Probability)
3) "I trust Joe more than I trust Sal. THUS, John is a thief (0.4) AND John is not a thief (0.6)." (FL, with weighted inputs)
4) "I trust Joe more than I trust Sal. THUS, it is probable that John is not a thief." (Probability, weighted)
etc.
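Option 4 above (probability with trust-weighted inputs) could be sketched as follows; the trust values and the simple normalized-weight combination rule are illustrative assumptions, not an endorsed design:

```python
def combine_opinions(reports, trust):
    """reports: informant -> True/False claim about a proposition.
    trust:   informant -> weight in [0, 1].
    Returns the NPC's probability that the proposition is true."""
    total = sum(trust[who] for who in reports)
    if total == 0:
        return 0.5                      # no usable testimony: undecided
    pro = sum(trust[who] for who, claim in reports.items() if claim)
    return pro / total

trust = {"Sal": 0.4, "Joe": 0.6}
reports = {"Sal": True, "Joe": False}   # Sal: thief; Joe: not a thief
p = combine_opinions(reports, trust)    # 0.4 -> "John is probably not a thief"
```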


#78 Nathaniel Hammen   Members   -  Reputation: 136


Posted 06 April 2004 - 11:41 AM

I would use fuzzy logic, but not for the beliefs or the memories -- for the personality. "The pre-generation of John's character defined him as greedy, lazy, opportunistic, sneaky, and cowardly, yet charismatic." In other words, John has a greediness of 0.99, an activeness of 0.05, etc. Values closer to 0.5 would represent more normal people, and extremes would represent people who would have a larger impact on this "civilization." Since almost all of John's stats are extreme, it is expected that he would have a large impact on this city, which is just what happened in your scenario.
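A rough sketch of trait-driven behaviour along these lines. The trait names, numbers, threshold, and the use of min() as a fuzzy AND are all illustrative assumptions:

```python
# Fuzzy personality traits in [0, 1], per the post above.
john = {"greed": 0.99, "activeness": 0.05, "courage": 0.10, "charisma": 0.90}

def will_rob_store(npc, store_guarded):
    """Decide via a fuzzy AND (min) of desire and nerve."""
    desire = npc["greed"]
    nerve = npc["courage"] if store_guarded else 1.0  # no guard, no fear needed
    return min(desire, nerve) > 0.5

# John is greedy but cowardly: he robs only unguarded stores.
```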

--------------------------------------
I am the master of stories.....
If only I could just write them down...

#79 Timkin   Members   -  Reputation: 864


Posted 06 April 2004 - 01:36 PM

quote:
Original post by irbrian
Thus, considering the following two statements:
A) "John is sort of a thief."
B) "John might be a thief."

You are suggesting that proposition A is Fuzzy Logic, because John is both a thief and not a thief;



Yes.

quote:
Original post by irbrian
and proposition B is more of, I dunno, a boolean probability I guess you could say.



No. Boolean suggests only one of two mutually exclusive values. That would be First Order (Aristotelian) logic. Probabilities are values in the range [0,1] of a variable that satisfies the axioms of probability.

quote:
Original post by irbrian
If I''m understanding so far, I''ll re-evaluate my original statement:
"I believe someone has robbed my store."

Perhaps then this would be best broken into two statements:
A) "I believe that my store was robbed."
B) "Someone robbed my store."



This is a hard example to deal with, because it's really hard to describe how a store was both robbed and not robbed. It's a little nonsensical. However, given this example, I would personally write it as:

A) My store was sort of robbed
B) Someone may have robbed my store.

A) Is clearly now a statement suggesting that the store was both robbed and not robbed.
B) Is now a clear statement relating a belief held by an agent.

quote:

Seems to me the following is true:
1. The evaluation of B is predicated upon the truthfulness of A.



Not necessarily. While it may be true that the store was not robbed, an agent can hold false beliefs. That is, they can believe something to be true, even though in reality it is not true (and vice versa).

I hope this helps to further clarify the issue for you.

Cheers,

Timkin

#80 irbrian   Members   -  Reputation: 130


Posted 06 April 2004 - 06:12 PM

quote:
Original post by Timkin
quote:
Original post by irbrian
and proposition B is more of, I dunno, a boolean probability I guess you could say.

No. Boolean suggests only one of two mutually exclusive values. That would be First Order (Aristotelian) logic. Probabilities are values in the range [0,1] of a variable that satisfies the axioms of probability.
Alright, so boolean probability is a contradiction. You didn't state it, but it seems clear that prop. B in that case was an issue of probability. Incidentally, unless it's common usage, I wouldn't define probability as a range of values [0,1], as that really causes some confusion with the whole Fuzzy Logic 0.0-1.0 thing. Can't we just use percentages for probability like the rest of the world and make the distinction as clear as possible?
quote:
quote:
Perhaps then this would be best broken into two statements:
A) "I believe that my store was robbed."
B) "Someone robbed my store."
This is a hard example to deal with, because it's really hard to describe how a store was both robbed and not robbed. It's a little nonsensical. However, given this example, I would personally write it as:

A) My store was sort of robbed
...
A) Is clearly now a statement suggesting that the store was both robbed and not robbed.
Ugh, now you're trying to turn it back into Fuzzy Logic. I thought we agreed to stay AWAY from Fuzzy Logic in this case, as I now agree that it really doesn't apply. Seems to me it's a simple case of probability. What you're saying up there simply doesn't make any sense at all -- FL strikes me as a statement of fact just as Boolean logic is a statement of fact. Of course it's possible I'm totally off-base (and I'm sure you'll correct me if I am), but I don't think even FL should allow two mutually exclusive conditions to co-exist.

Going back to probability-based beliefs, we could say that at a certain level of probability, an NPC forms a belief that something is true, even though the NPC will never understand something to be 100% true. For instance:

0-10% --- Invalid Range for Belief formed by AI
11-20% -- NPC Believes the proposition is FALSE
21-40% -- NPC Believes the proposition is "probably FALSE"
41-60% -- NPC Believes the proposition is EITHER True OR False -- NOT True AND False.
61-80% -- NPC Believes the proposition is "probably TRUE"
81-90% -- NPC Believes the proposition is TRUE
91-100% - Invalid Range for Belief formed by AI
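The table maps directly onto a small lookup function. This is only a sketch: the labels come from the table, while raising an error for the "invalid" bands is an assumed way of handling them:

```python
def belief_label(p):
    """Map a probability in percent to the NPC's verbal belief,
    clamping away from absolute certainty as proposed above."""
    if p <= 10 or p > 90:
        raise ValueError("NPCs never reach (near-)certainty")
    if p <= 20:
        return "FALSE"
    if p <= 40:
        return "probably FALSE"
    if p <= 60:
        return "undecided (either true or false)"
    if p <= 80:
        return "probably TRUE"
    return "TRUE"
```

In practice the invalid bands might instead be clamped to the nearest valid band rather than rejected; that is a design choice the table leaves open.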

quote:
quote:
Seems to me the following is true:
1. The evaluation of B is predicated upon the truthfulness of A.
Not necessarily. While it may be true that the store was not robbed, an agent can hold false beliefs. That is, they can believe something to be true, even though in reality it is not true (and vice versa).
I agree that NPCs can believe something to be true or false. I meant that the ultimate reality of statment B is predicated upon the TRUE OR FALSE value of A.
quote:
I hope this helps to further clarify the issue for you
I sort of understand -- let's call it a 0.7.



