What is really AI?

Interesting experiment: The programmed behavior of following the trail left by other ants seems intelligent, but only if we consider the survival of the colony, or perhaps of the species. From a single ant's perspective, it might not be so intelligent.
--"I'm not at home right now, but" = lights on, but no ones home
Quote:Original post by AngleWyrm
Sorry this is gonna meander a bit, but I don't know how else to put it.

Theorem 1, on Self Direction and Choice, may be overrated. I offer two examples that point to the extremes:

1). Ants gathering food follow paths drawn by other ants. They don't have a choice; they are programmed to do so, and yet it seems like intelligent behavior.

2). People often try to lose weight/quit smoking/drugs/gambling. They say "for real, this time". Again. Like they were kidding last time. Like maybe they weren't sincere 'enough' last time. And even prayer and tears don't help. The suffering person wishes to be free, and decides to do something about it. Yet they aren't.

Hm. Predictable, and also somewhat short of reasonable. There must be something different going on: I, for instance, wouldn't repeatedly promise to take better care of the [whatever] next time.


I do not quite understand what you are trying to say, but the framework I gave can handle your ant extreme. I stated:

Here the notion of direction is weak. What is meant by this is that the entity in question believes it can direct its actions. It does not matter whether it actually can, simply that it feels that it can and that there is some other entity that can agree with it. This belief must be observable (per axiom 2) in a way that satisfies axiom 1, in that the belief is emergent and not built in.
Quote:Original post by owl
Quote:Original post by AngleWyrm

Theorem 1, on Self Direction and Choice, may be overrated. I offer two examples that point to the extremes:

1). Ants gathering food follow paths drawn by other ants. They don't have a choice; they are programmed to do so, and yet it seems like intelligent behavior.


You might want to review this experiment I performed some time ago.


hehe I like how the story ends as kind of a fable. The moral of the story is...
...Self direction is a belief system?

Part two of my extremes example above illustrates cases where people believe that they have free will, that they direct their actions. Actions their friends and society also believe they have control of, and even hold them accountable for. Actions that directly impact their personal health and social status. And yet even in spite of their own desires on the matter, they still fail to accomplish inaction -- just NOT doing something.

As for the ant: What happens if instead of shampoo, we draw a circle of ant-scent?

[Edited by - AngleWyrm on March 30, 2008 4:42:49 PM]
--"I'm not at home right now, but" = lights on, but no ones home
Quote:Original post by AngleWyrm
...Self direction is a belief system?

Part two of my extremes example above illustrates cases where people believe that they have free will, that they direct their actions. Actions their friends and society also believe they have control of, and even hold them accountable for. Actions that directly impact their personal health and social status. And yet even in spite of their own desires on the matter, they still fail to accomplish inaction -- just NOT doing something.

As for the ant: What happens if instead of shampoo, we draw a circle of ant-scent?


No, an "intelligent" (sentient, sapient, cogent, fat,etc) self directed entity in my system must have a set of beliefs. This is because this entity cannot know everything due to physical limits. Now within these set of beliefs must be an emergent belief in which this entity can think that it has the ability to make free choices and also there must be some entity with which it can communicate such. Borrowing from modal logic again, each set of entities encompasses a local world . Thus for example with respect to any given entity the proposition this entity is intelligent is a contingent one.
Quote:Original post by Rixter
Quote:Original post by Timkin
Quote:Original post by Rixter
AI is search.


Ignorance is bliss


I figure while we're assigning arbitrary definitions to an apparently ill-defined concept, why not take the simplest?


Except that saying that "AI is search" is like saying that a house is a hammer. Search is a tool that can be used to create the end result (along with other tools, skill and creativity), but that doesn't magically transform it into the final product.

...and having said that, we must accept that a 'so-called AI' that uses only search to solve a problem is not 'AI', but rather just an intelligently designed implementation of a solution to a computational problem.

This is the most common objection raised about AI: that it's just intelligent design, rather than an embodiment of intelligence... but then, are we anything more? (And here I state that I believe in 'design by evolution' rather than 'design by God', just to make my position unequivocally clear.) So where do we draw the line?
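
To make concrete what "search" means in this exchange, here is a minimal breadth-first search over a toy graph; the graph and the goal node are invented for the example. The procedure itself is purely mechanical, which is the point above: whatever intelligence is involved went into formulating the problem, not into the search.

from collections import deque

graph = {"start": ["a", "b"], "a": ["c"], "b": ["c", "goal"],
         "c": ["goal"], "goal": []}

def bfs(graph, start, goal):
    # Explore paths in breadth-first order; return the first one reaching the goal.
    frontier = deque([[start]])
    seen = {start}
    while frontier:
        path = frontier.popleft()
        node = path[-1]
        if node == goal:
            return path
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(path + [nxt])
    return None

print(bfs(graph, "start", "goal"))  # ['start', 'b', 'goal']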
Quote:Original post by Daerax
No, an "intelligent" (sentient, sapient, cogent, fat,etc) self directed entity in my system must have a set of beliefs. This is because this entity cannot know everything due to physical limits.
Beliefs as a behavior/knowledge heuristic?

Quote:Original post by Timkin
...and having said that we must accept that a 'so-called AI' that uses only search to solve a problem is not 'AI', but rather just an intelligently designed implementation of a solution to a computational problem.

This is the most common objection raised about AI: that it's just intelligent design, rather than an embodiment of intelligence...

and
Quote:Original post by Daerax
Now within these set of beliefs must be an emergent belief in which this entity can think that it has the ability to make free choices and also there must be some entity with which it can communicate such.

This brings up an interesting point: What exactly is a free choice? Is selecting the best option a free choice, or is it simply an optimized relationship to the environment? Is choosing randomly from a probability distribution of personal biases over the options a free choice?

Something I've noticed: when people are presented with a set of alternatives, they often express their intellect by attempting to step 'out' of the alternatives and view the problem from a more global perspective, searching for a solution on a higher plane or composing new alternatives from past experience with similar problems.

Also, sometimes people will choose an alternative that is clearly suboptimal by my implied score system, but could be ranked as superior by their scoring system. For instance, by dispensing with some presumed code of conduct. When done well, it comes across as victorious, proving that my scoring system could be better if it were unburdened by needless rules.

[Edited by - AngleWyrm on April 2, 2008 6:48:17 AM]
--"I'm not at home right now, but" = lights on, but no ones home
Quote:Original post by AngleWyrm
This brings up an interesting point: What exactly is a free choice? Is selecting the best option a free choice, or is it simply an optimized relationship to the environment? Is choosing randomly from a probability distribution of personal biases over the options a free choice?

We have a pretty hardwired dualistic view of the world, where all objects obey the laws of physics, but some seem to have "souls", or "behaviour". This gives us an illusion of free will that probably has nothing to do with how the world really works, but it's a powerful metaphor that helps us understand and predict events around us. I don't think this illusion necessarily has to be present in an agent for us to be able to call it intelligent. It's just a byproduct of the way we are implemented.
A tall cool glass of beer calling out "drink me, driiiinnnk meeee" -- an anthropomorphism in jest.
Sometimes my computer doesn't want to cooperate -- an implied metaphor used to simplify what is likely a tangle of dreary detail.
Long ago, the word 'angel' meant 'messenger' -- an artistic license that should have been revoked for malpractice.
--"I'm not at home right now, but" = lights on, but no ones home
What is AI, really? A misnomer!

Intelligence doesn't exist. It is an abstract human concept that tries to poetically add some mystery to the idea that the world is a series of chemical reactions dictated by the laws of physics (and maybe some mystical forces from another dimension).

The definition of intelligence is as flimsy as the definition of life. Anything that grows, including a crystal, can be considered alive to some; others will tell you it has to have DNA and reproduce, or that it has to have at least one cell...

So that's the problem. There's no set definition; you have to pick a side.

I would say you can't create an artificial version of what doesn't exist. Call me a nihilist if you will :P

But in practice... AI is a set of patterns that attempt to emulate behaviors. Those behaviors can be predictable or not. Their purpose is simply to allow non-software things, like humans, to interact with a machine in a certain context.

The Start menu and the Office paperclip are in fact artificially intelligent. In their own way.
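
In the spirit of "a set of patterns that attempt to emulate behaviors", here is a minimal finite-state machine for a hypothetical NPC. The states, events, and transitions are invented for the example; the point is that a small table of scripted reactions is enough to produce behavior a player can interact with.

transitions = {
    ("idle",  "player_seen"): "chase",
    ("chase", "player_lost"): "idle",
    ("chase", "low_health"):  "flee",
    ("flee",  "healed"):      "idle",
}

def next_state(state, event):
    # Look up the scripted reaction; stay in the current state if no rule matches.
    return transitions.get((state, event), state)

state = "idle"
for event in ["player_seen", "low_health", "healed"]:
    state = next_state(state, event)
    print(event, "->", state)  # chase, then flee, then idle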

