Artificial Intuition

Started by Paul Cunningham; 56 comments, last by Paul Cunningham 23 years, 1 month ago
I'm trying to get a good explanation of what intuition is and how we could understand it. I'd like to work out how it can be artificially created and implemented, but it's hard enough just trying to understand how it works. My definition of intuition is the ability to make decisions or conclusions without the use of knowledge on the subject.

I love Game Design and it loves me back. Our Goal is "Fun"!
quote:Original post by Paul Cunningham

Definition of intuition is the ability to make decisions or conclusions without the use of knowledge on the subject.



I don't agree with the definition, so I'll start by stating my interpretation of intuition:

Intuition is the ability to make decisions or conclusions without the conscious use of knowledge on the subject.


How do you simulate it? Hmm, I'm not sure you can, but you can EMULATE it. That's what expert systems were designed to do: they "capture the implicit knowledge and intuition of an expert into an automated system."
The problem with this is that you have to know WHAT you are trying to store intuition about.

What exactly is it that you want the computer to be intuitive about?


Give me one more medicated peaceful moment.
~ (V)^|) |<é!t|-| ~
ERROR: Your beta-version of Life1.0 has expired. Please upgrade to the full version. All important social functions will be disabled from now on.
It's only funny 'till someone gets hurt. And then it's just hilarious. Unless it's you.
I think the important word with intuition is 'association', because you never have NO knowledge at all about something.

Suppose you have some sort of creature in an RPG and you want to give it intuition. At some moment it becomes hungry. If it has never encountered this state and has no knowledge at all, it might try sleeping, but that wouldn't work. Talking wouldn't work either. Now if this creature has some form of association patterns (based on neural nets, for example) like

HUNGER ~ FOOD ~ EAT

then it might do something useful. This way you don't have to program the fact that it has to eat when it is hungry; you only say that FOOD is associated with HUNGER, something we do a lot every day. The exact actions might be going to a bakery and buying food, then eating it. Again you might build a pattern like

STORE ~ BUY ~ MONEY ~ FOOD

After some time the creature would be sufficiently trained, once it has found the correct associations. (The intuition is verified and it becomes knowledge.)
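The association patterns above could be sketched as a small weighted graph. Everything here is an illustrative assumption: the concept names, the symmetric links, and the "follow the strongest link" lookup are one possible reading of the idea, not a fixed design.

```python
# A minimal sketch of the HUNGER ~ FOOD ~ EAT association idea.
class AssociationNet:
    def __init__(self):
        self.links = {}  # concept -> {concept: strength}

    def associate(self, a, b, strength=1.0):
        # Associations are symmetric: HUNGER ~ FOOD also means FOOD ~ HUNGER.
        self.links.setdefault(a, {})[b] = strength
        self.links.setdefault(b, {})[a] = strength

    def reinforce(self, a, b, delta=0.1):
        # "Verified intuition becomes knowledge": strengthen a link that worked.
        self.associate(a, b, self.links.get(a, {}).get(b, 0.0) + delta)

    def chain(self, start, steps=3):
        # Follow the strongest link at each step: HUNGER -> FOOD -> EAT.
        path, current = [start], start
        for _ in range(steps):
            neighbours = self.links.get(current, {})
            # Avoid walking straight back the way we came.
            candidates = {c: s for c, s in neighbours.items() if c not in path}
            if not candidates:
                break
            current = max(candidates, key=candidates.get)
            path.append(current)
        return path

net = AssociationNet()
net.associate("HUNGER", "FOOD")
net.associate("FOOD", "EAT")
net.associate("HUNGER", "SLEEP", strength=0.2)  # a weak, wrong guess
net.reinforce("HUNGER", "FOOD")                  # eating worked, so strengthen
print(net.chain("HUNGER"))  # -> ['HUNGER', 'FOOD', 'EAT']
```

The weak HUNGER ~ SLEEP link loses to the reinforced HUNGER ~ FOOD link, which is the "training" the post describes.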

Maybe my explanation looks a little confusing, but I hope you get the idea.


******************************
Stefan Baert

On the day we create intelligence and consciousness, mankind becomes God.
On the day we create intelligence and consciousness, mankind becomes obsolete...
******************************
In the example above, when the creature is hungry it may just as well try to eat itself. That would be a correct intuitive solution, except that it creates new problems to solve: pain and a missing part of the body. I think intuition can't be effective without experience, experience without mistakes, and the ability to learn from them.
To be more specific: if I have an RPG creature which has n possible actions to choose from (some of which it has never tried) and the creature is given some task, it can check whether it has ever tried to solve that task before, and if not, try to go through every possible combination of actions to see if that would solve the problem. It's similar to a game of chess, where the easiest solution would be to go through every possible move and choose the best one. Of course that would take a long time, but here is the fun part, where you can program the AI to make judgments. Maybe like this:
will trying one action give a similar result to trying actions[k] + action[j] + ...? In other words, it can try to group things in categories.
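The brute-force-with-grouping idea could look something like this. The toy "world model" dictionary, the action names, and the similarity rule (actions count as similar if they affect the same variables) are all invented for illustration.

```python
# Sketch: try combinations of actions, but skip combinations whose
# category (set of affected variables) has already been judged.
from itertools import permutations

EFFECTS = {  # toy model: action -> what it changes
    "eat":   {"hunger": -5},
    "sleep": {"energy": +5},
    "walk":  {"position": +1, "energy": -1},
    "run":   {"position": +2, "energy": -2},
}

def category(action):
    # Actions affecting the same variables count as "similar", so trying
    # one of them tells us roughly what the others would do.
    return frozenset(EFFECTS[action])

def solve(goal_var, actions, max_len=2):
    tried_categories = set()
    for length in range(1, max_len + 1):
        for combo in permutations(actions, length):
            key = tuple(category(a) for a in combo)
            if key in tried_categories:
                continue  # a similar combination was already judged
            tried_categories.add(key)
            if any(goal_var in EFFECTS[a] for a in combo):
                return combo
    return None

print(solve("hunger", list(EFFECTS)))    # -> ('eat',)
print(solve("position", list(EFFECTS)))  # 'run' is skipped: same category as 'walk'
```

Grouping prunes the search the same way the chess analogy suggests: whole families of "similar" moves are judged once instead of one by one.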

I have no words of wisdom.

AI v.2000.07.24.17:36
The goal I'm trying to achieve by using computer intuition is more a solution to the problems arising from conventional AI routines in games. The primary problem I see with the current AI approach is that the player in a game will eventually adapt strategies and skills which are derived more from weaknesses and predictability in the computer AI than from using his/her creative and logical smarts to beat the computer.

Although I am mainly speaking from my own personal experience, the one problem I have with playing games that rely on computer AI is that once I realise a way to beat the computer, I know that I can always beat the computer at this game. So I've come to the conclusion that the AI in these games relies more on environmental forces to make its decisions and less to none on internal forces (computer intuition).

To wrap up, I'm trying to work out a way for the computer to behave differently / unpredictably with each new start of a game, rather than just being beaten because of the computer's ability to calculate faster than me.

I love Game Design and it loves me back.

Our Goal is "Fun"!
What you could do is give the entities in the game an "alphabet" of basic actions. These are probably very similar to what the player has, things like "Move, Take, Use, Drop". Then you might have some basic "words" with definitions such as "Eat[ Use Food ]".

To generate new kinds of behaviours you might train the entity to try out different things, and it could ask you to name them if you consider something good... or it could evaluate its own actions and keep learning during the game, but then you'd have to set up an evaluation system as well, which could be based on pain or something.
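The alphabet-of-actions idea could be sketched like this. The primitives come from the post; the "Gather" word, the field/food names, and the pain-based score are invented placeholders.

```python
# Sketch: an alphabet of primitive actions, "words" composed from them,
# and a crude pain-based self-evaluation signal.
PRIMITIVES = {"Move", "Take", "Use", "Drop"}

# "Words" are named sequences built from the alphabet, like Eat[ Use Food ].
WORDS = {
    "Eat":    [("Use", "Food")],
    "Gather": [("Move", "Field"), ("Take", "Food")],
}

def expand(word):
    # Flatten a word back into its primitive actions.
    steps = WORDS[word]
    assert all(verb in PRIMITIVES for verb, _ in steps)
    return steps

def evaluate(pain_before, pain_after):
    # Behaviours that reduce pain score positively and are kept.
    return pain_before - pain_after

print(expand("Gather"))                      # -> [('Move', 'Field'), ('Take', 'Food')]
print(evaluate(pain_before=5, pain_after=2)) # -> 3, a behaviour worth naming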


Since we're talking here about a game with hardcoded rules, there is not much space for intuition; rather, the computer has to have a library of various strategies. Strategy should be based, as some wrote above, on the basic actions. You also need a satisfaction criterion: how is one chain of basic actions more successful than another? Let's consider an example:
Unit A has an empty personal experience buffer, a library of precoded actions (set by the programmer), and some formula for calculating its success (some units have different goals and thus different formulas) in the range from 0.0 to 1.0. Unit A tries randomly, or selects a scenario from the library (say the chain of actions A, B, C), and achieves a satisfaction result of, say, 0.7. Unit A writes that chain of actions with its satisfaction result into the experience buffer. If the human adapts, the satisfaction result will eventually fall to, say, 0.5. Then Unit A will start looking for another strategy, iterating the process. On top of that there should be another AI specializing in processing the existing experience buffer.
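The experience-buffer loop above could be sketched as follows. The chains, the 0.0-1.0 satisfaction scores, and the 0.5 switch threshold follow the example in the post; the class and its method names are invented.

```python
# Sketch: a unit that remembers how well each chain of actions scored
# and switches strategy when the human adapts and the score falls.
import random

class Unit:
    def __init__(self, library, threshold=0.5):
        self.library = library   # precoded chains, e.g. ("A", "B", "C")
        self.experience = {}     # chain -> last satisfaction score
        self.threshold = threshold

    def pick_strategy(self):
        # Prefer the best-scoring remembered chain that still works;
        # otherwise fall back to an untried (or random) one from the library.
        good = {c: s for c, s in self.experience.items() if s > self.threshold}
        if good:
            return max(good, key=good.get)
        untried = [c for c in self.library if c not in self.experience]
        return random.choice(untried or self.library)

    def record(self, chain, satisfaction):
        self.experience[chain] = satisfaction

unit = Unit([("A", "B", "C"), ("A", "C"), ("B", "C")])
unit.record(("A", "B", "C"), 0.7)   # worked well at first
print(unit.pick_strategy())          # -> ('A', 'B', 'C')
unit.record(("A", "B", "C"), 0.5)   # the human adapted; score fell
print(unit.pick_strategy())          # now tries an untried chain instead
```

The "another AI on top" the post mentions would be whatever process prunes or generalises the experience buffer between games.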

Cheers
Dictionaries are wonderful sources for reflection. Intuition means in Latin "to look, examine carefully".
Paul, you seem to think that intuition is a sixth sense that gives you the answer to a problem; you seem to imply that it should come "from the inside".
I'll agree partly with this. According to the dictionary again, intuition is a means of understanding/solving a problem by sheer empirical method, that is, without using reason or logic. You'll agree with me that this is a bit the opposite of what a computer is.
So I was going to stop my reasoning here, but I thought that maybe we could emulate this somehow.
I agree a bit with MadKeith in that intuition is a non-conscious phenomenon, if you will. We find a solution but we don't seem to know where it came from.
The way I see it, we have five senses; we receive continuous electrical stimuli from the outside, and our brain is constantly processing. But all this raw data that we receive has to be filtered, and reasoning is such a filter. Instead of reacting to stimuli directly, we add a step of processing that usually brings a solution without the trial and error that a non-reasoning species would go through (a bird desperately trying to fly out through my window, instead of reasoning that, if it gets stopped suddenly, there must be an invisible obstacle ahead).
Now, intuition is a shortcut to the normal "human" process. The raw information suddenly "connects" to its solution without having to go through the normal pipeline. We duck when we hear a frightening noise; we extend our arms to prevent a fall if we are pushed backwards (try not to). These would be physical intuitions, wouldn't they?
More intellectual intuition would be found in the domain of human relationships. For instance, the way someone says "hello", the way they look at you. The eyes are the mirror of the soul, they say, and I firmly believe in this. You know someone is lying to you, but how do you tell? Because of the tone of the voice, the way the eyes look at you... you don't really know, but you KNOW. That's intuition. You can't explain it, because it's not reasoning; it's perception at its best.

Now how do you emulate this? Well, I hope I have given enough keywords to give you ideas, but here are mine.

I think the main problem is that our AIs are built in a way where they just CAN'T have intuition, because they process ALL the data they receive, since they only receive the data they want to process.
What I am saying is that the AI should be listening to the greatest number of things possible. For example, we only "listen" to the position of the mouse cursor, because that is all we deal with. But now, if we were "hearing" the position, the speed of the mouse movements, the frequency at which the player clicks, the frequency at which the mouse is moved, and whether the mouse is moved for nothing or to issue an order, we could discover things like the nervousness of the player, impatience (typically, I move the mouse in circles when the system begins to slow down), or the fact that he is doing pretty much nothing...
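"Hearing" the raw mouse data instead of just the cursor position might look like this. The thresholds and the nervousness rule are invented placeholders; real samples would come from the input loop.

```python
# Sketch: derive player-state signals from raw per-frame mouse samples.
import math

def analyse(samples):
    """samples: list of (x, y, clicked) tuples captured each frame."""
    moves = [math.hypot(x2 - x1, y2 - y1)
             for (x1, y1, _), (x2, y2, _) in zip(samples, samples[1:])]
    clicks = sum(1 for _, _, clicked in samples if clicked)
    distance = sum(moves)
    return {
        "distance": distance,
        "click_rate": clicks / len(samples),
        # Lots of motion with no clicks: the mouse moved "for nothing",
        # which the post reads as nervousness or impatience.
        "nervous": distance > 50 and clicks == 0,
    }

# The "circles while the system slows down" case: motion, zero orders.
idle_circles = [(50 + 10 * math.cos(t), 50 + 10 * math.sin(t), False)
                for t in range(30)]
print(analyse(idle_circles)["nervous"])  # -> True
```

The point is the extra channels: nothing here requires the AI to know in advance what the data will mean, only to keep it around for associations to form.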

We already have the filtering part: the brain, the reasoning. But we don't have much to filter; everything is meaningful. If we begin to have more *RAW* data, then we can have things like intuition appearing.
We also need to give our AI a memory.
Say we have a very well coded AI for an RTS game. We have some harvesters (resource collectors) in a field, working peacefully. Enemy tanks pass nearby, approach to close combat distance, then start to attack. The superbly coded AI decides to retreat its harvesters and call in support. OK, I think we can have this already.
Now let's add intuition. The next time the tanks pass nearby (or any enemy, for that matter), the harvesters' AI, if it had access to data such as the direction, the apparent aggressiveness, and the nature of those units, would intuitively deduce an incoming attack (association of the "ENEMY" concept with the "DANGER" concept) and either flee (if the direction of the attack is towards them) or warn commanding units, passing along the direction and nature of the enemy units (the AI would then decide upon an appropriate answer to the attack).
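The harvester example could be sketched as below. The memory format, the unit fields, and the three responses are invented for illustration; the ENEMY/DANGER association is the one from the post.

```python
# Sketch: once ENEMY has been associated with DANGER, incoming units
# trigger an intuitive reaction before any full reasoning runs.
def intuitive_reaction(unit, memory):
    """unit: dict with 'side', 'heading_towards_us', 'aggressive'."""
    if "DANGER" not in memory.get(unit["side"], set()):
        return "IGNORE"        # no association yet: first encounter
    if unit["heading_towards_us"] and unit["aggressive"]:
        return "FLEE"          # the attack is aimed at us
    return "WARN_COMMAND"      # pass direction/nature up the chain

memory = {"ENEMY": {"DANGER"}}  # learned after the first raid
tank = {"side": "ENEMY", "heading_towards_us": True, "aggressive": True}
print(intuitive_reaction(tank, memory))  # -> FLEE
```

On the first encounter the memory is empty and the harvesters ignore the tanks, which is exactly the "needs a memory" point: the intuition only exists after the raid has been recorded.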

I think this is a bit what avoden offers in his first answer, actually. Make associations when they create consequences (hence the need for a memory), and try new associations when you have the time/opportunity.

As well, you need rules to make correct associations between concepts.
The association of DANGER and ENEMY would provoke the civilian units to run away, which is an intuitive answer. But a soldier shouldn't!
Maybe follow the "pyramid of needs" system (Maslow's hierarchy of needs). First a human has to survive. Then he has to feed, which helps survival. Then he has to find shelter (it helps feeding, if you have a fixed source of food nearby). There are other needs I forget, the last one being the need for self-accomplishment.
A soldier will prioritize the protection of his nation over his own survival. Of course, morale issues can change this behaviour drastically.
But I'll stop going off track now... I hope my ideas are a bit helpful. At least, this thread just gave me new ideas.

youpla :-P

(PS: I know I am a bit lengthy, but I believe I've answered your questions, Paul... well, I hope.)
-----------------------------Sancte Isidore ora pro nobis !
I was going to write a reply to this yesterday, but everything I wrote was crap; don't you hate that? Ah, I couldn't have written a better post than your last one. I wish it was mine.

What about treating artificial intuition like a plugin? We know the goal, but to make it happen maybe we could develop plugins to achieve it. Could the first plugin be "Deduction"? The controlling logic that handles all the plugins could revolve around a "relativity logic", which could work like this: the "Deduction" plugin is used relative to the "size of the computer's army".
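The plugin idea might be sketched like this. The `DeductionPlugin`, its toy rule, and the weight-by-state "relativity" scheme are all invented examples of the shape, not a proposed design.

```python
# Sketch: plugins advise the controlling logic, weighted by game state.
class DeductionPlugin:
    def advise(self, state):
        # Toy deduction: being outnumbered suggests retreating.
        return "RETREAT" if state["army_size"] < state["enemy_size"] else "HOLD"

class Controller:
    def __init__(self):
        self.plugins = []

    def register(self, plugin, weight_key):
        # "Relativity logic": how much a plugin counts depends on state.
        self.plugins.append((plugin, weight_key))

    def decide(self, state):
        advice = [(p.advise(state), state[key]) for p, key in self.plugins]
        # Take the advice backed by the largest relative weight.
        return max(advice, key=lambda a: a[1])[0]

ctrl = Controller()
ctrl.register(DeductionPlugin(), "army_size")
print(ctrl.decide({"army_size": 3, "enemy_size": 10}))  # -> RETREAT
```

Adding a second plugin is just another `register` call, which is the "more plugins, better intuition" scaling the post suggests.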

The more plugins, the better the artificial intuition.

I love Game Design and it loves me back.

Our Goal is "Fun"!
quote:Original post by Paul Cunningham
I was going to write a reply to this yesterday, but everything I wrote was crap; don't you hate that? Ah, I couldn't have written a better post than your last one. I wish it was mine.


Now, THANKS =) *blushes*

But I have to criticize your proposition. Well, maybe I am just being picky. But as I said, intuition by itself IS the basis, and then reason (logic, whatever) IS the plugin!

What we have at the moment is:
- filtered inputs
This prevents us from having intuition, because it prevents us from having things (data) we don't expect.

What we need is more input data, a better way to filter it, and a mechanism that can make relevant and fast connections between ideas.
The decision to switch to reasoning comes when nothing conclusive shows up at the low level (the instinctive level, if you wish). Then we make complex associations of ideas, which takes more time, but can be more directed.

Let's take the example of the harvesters again.

Our harvesters are on the field for the first time. When they see enemy units they get info such as "appearance of units", "heading", "speed", "distance", which by themselves don't mean anything.
Trying to connect Unit(unknown) with anything won't be very meaningful, so the harvesters ignore those units. If those units come and attack, an association Unit(some very specific id) -> Danger is made.

Fleeing from the incoming enemy is an intuition for the harvesters, one that they'll refine after a while to include a set of IDs (all enemy units, not just the ones that attacked).
Just switch to a Runaway mode or something.

Fleeing and calling in support, or transmitting the units' positions, is a reasoning process; it takes a bit more time.

Maybe intuition is just a match of two ideas provoking a reaction,
while reasoning is a match of more ideas/concepts (ARTILLERY + ENEMY + I_AM_CLOSE_COMBAT_UNIT + ARTILLERY_CANT_FIGHT_CLOSE_COMBAT -> ATTACK),
using more complex operations such as induction, deduction, and other basic logic statements (OMG, my maths lectures ARE useful!):
RED_TANK -> DANGER, RED_TANK -> ENEMY, hence ENEMY -> DANGER.
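The RED_TANK example could be sketched as a tiny induction step over observed links. Note this is induction rather than strict deduction, as the surrounding maths aside hints; the function name and fact format are invented.

```python
# Sketch: two properties observed on the same concept get linked to
# each other, e.g. RED_TANK -> DANGER plus RED_TANK -> ENEMY induces
# ENEMY -> DANGER (and the reverse).
def induce(facts):
    # facts: set of (concept, implies) pairs observed so far.
    induced = set()
    for a, x in facts:
        for b, y in facts:
            if a == b and x != y:
                induced.add((x, y))
    return induced

facts = {("RED_TANK", "DANGER"), ("RED_TANK", "ENEMY")}
print(("ENEMY", "DANGER") in induce(facts))  # -> True
```

This is exactly the kind of leap that can be wrong (not every enemy is dangerous), which is why the post treats verified intuitions, not raw ones, as knowledge.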

Does that make any sense?

I am just thinking that you are right, Paul. We can make intuition a plugin.
Vanilla AIs just hear what they want to listen to, which is very effective, but non-creative.
Add a plugin that listens carefully to much more data, makes fast connections between ideas, and builds a sort of database of ideas, and there you go: you can have a more "creative" AI.

In fact, up to now our AI was intuitive, because it only connected data and executed actions, just like reflexes. Very branched, but reflexes nonetheless.
When we add the "concept matching" process, we get a creative AI,
hence a reasoning one.

I have to save that post; I am getting ideas for my project.

youpla :-P
-----------------------------Sancte Isidore ora pro nobis !

This topic is closed to new replies.
