[Slightly Offtopic] John Searle and computational intelligence

Started by
85 comments, last by MDI 20 years, 2 months ago
quote:Original post by Timkin
Thus, I would argue that the agent did perceive something when its axon weight changed and that this perception did alter some beliefs (even if they are beliefs about internal state), even though it didn't alter the agent's behaviour.


Which is fine. As I said, perception only requires change; it's only people's theories of other creatures' perceptions that require behavioural change. That is a problem with approaching a definition from an ethological perspective: since there are infinitely many theories of how a given behaviour originated, you can call almost any behaviour intelligent or non-intelligent depending on how you wish to argue.

So, is there some form that this change has to take to be considered perception? Presumably we'd generally agree that a rock being broken by the fall of a tree isn't perception. Does the change have to be linked to some form of behaviour to be perception? Could perception be defined as any change to the processing mechanism of an input-output mapping?
Argus2, under those circumstances I would say that yes, the contraption does perceive something. The sensor model (as engineers would call it) is hidden in the proportional response of the platform to pressure waves in the ground. Essentially, you have a binary state seismic sensor!

I think it's possible that we could classify all possible contraptions as either sensors or not sensors. One question though: is perception different to sensing? I certainly think sensing is at least a subset of perception. Let's, for the moment, just consider what sensing is as opposed to causal reactions...

quote:Original post by MikeD
So, is there some form that this change has to take to be considered perception?


As I said earlier in this thread, I think it has to be an informative change. That is, the sensing system has to decode information from the signal and encode it in its internal state (resulting in a change of internal state). This, though, is not sufficient for a definition of perception, for the rock broken by the falling tree stores some information about the mass of the tree in the fracture properties and the properties of the pieces of the rock. Of course, this information is not available without other knowledge of the rock's original state and the way in which massive objects damage rocks (cause fractures).

So, perhaps sensing requires not only an internal state change that encodes information about the percept (signal) received, but also an internal model of the possible percepts and the possible external states that generate them. The perceiving entity can then relate the percept back to the real world and, via an internal model (the inverse of this evidence-state model), form an opinion as to the state of the world it sensed.

This sort of makes sense from the AI perspective. In Bayesian systems, we use a sensor model of the form P(Evidence|External_State) and then invert this to arrive at P(External_State|Evidence), which becomes the internal model of the external state (and is combined with the prior probability over states to arrive at the posterior beliefs over the states).
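To make the inversion concrete, here is a minimal sketch of that Bayes-rule step. The states, probabilities, and the "thud" evidence are purely illustrative assumptions of mine, not anything from the discussion:

```python
# A minimal sketch of inverting a sensor model P(Evidence|State) into
# P(State|Evidence) via Bayes' rule. All numbers are made up for illustration.

def invert_sensor_model(likelihood, prior):
    """Return P(State|Evidence) given P(Evidence|State) and the prior P(State)."""
    # Unnormalised posterior: P(e|s) * P(s) for each state s
    unnorm = {s: likelihood[s] * prior[s] for s in prior}
    z = sum(unnorm.values())  # P(Evidence), the normalising constant
    return {s: p / z for s, p in unnorm.items()}

# Hypothetical seismic example: we heard a "thud"; which world state caused it?
prior = {"tree_fell": 0.1, "no_tree": 0.9}          # P(State)
likelihood = {"tree_fell": 0.95, "no_tree": 0.05}   # P(thud | State)

posterior = invert_sensor_model(likelihood, prior)
# The posterior over states is the agent's updated internal model of the world.
```

The resulting posterior is exactly the "internal model of the external state" described above: the percept (the thud) forces a redistribution of belief over the states the sensor model can represent.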

So now, I think I want to say that a percept is actually a (possibly forced) selection of one of the possible states described by a sensor model, based on the reception of an input signal from the outside world.

So perception is more than just transmission of information. The information must be informative in such a way as to permit the identification of a state of the world from an internal model of it. Of course, this means that Argus2's contraption senses, but the WebCam doesn't! Mmmm...

Thoughts...???

Cheers,

Timkin
quote:Essentially, you have a binary state seismic sensor!
Yes, exactly. Now, the question is : are you going to permit the existence of unary state seismic sensors?
quote:So perception is more than just transmission of information. The information must be informative in such a way as to permit the identification of a state of the world from an internal model of it.
I think I get it: An object that perceives must contain some internal representation of the world (its environment). Information which changes this representation is 'perceived' by the object.

This sounds like a good start, but if we really want a rock-solid definition, we have to ask : how complex does this internal representation have to be, and how do we recognise one? For example, a rock seems to contain some information about its environment in its physical structure (its temperature/pressure phase etc).
I think what this teaches us is that the delineation between perceiving and changing is an arbitrary one defined by an external intelligence and projected onto the system, rather than being an intrinsic property of that system. I don't think an ANN has beliefs in the sense I have beliefs (though I'm sure Timkin's definition of "beliefs" is quite specific, care to elaborate?), so I can argue that it is simply like a rock in most senses but with a different domain of interactions with its environment.

I think the definition of perception suffers the same flaws as the definition of intelligence. It's subjective to the intelligence defining it and wholly relies on that intelligence's inability to deconstruct it (and their affinity with its behaviour), hence considering it complex enough to be perceiving/intelligent.

IMO everything that changes is somewhere on the scale of perception, and everything that exhibits behaviour (a term which itself suffers the same problems as the other definitions) is somewhere on the scale of intelligence; it all depends on the definitions of the observer.
MikeD, I have to agree with you.

Take starfish, for example; to most people they don't perceive or react to anything. It's only when you watch them via time-lapse that it becomes evident that they are very, very active, hunting and evading, just in extremely slow motion. It's quite amazing how, to me anyway, they became much more intelligent when I finally got to see them living in starfish-time.

quote:
So perception is more than just transmission of information. The information must be informative in such a way as to permit the identification of a state of the world from an internal model of it. Of course, this means that Argus2's contraption senses, but the WebCam doesn't! Mmmm..


Timkin: I think the webcam still meets your criteria. The webcam has an internal model of the world by way of firmware. It would 'know' things like light intensity, contrast, brightness, exposure, and dimension (spatial and color). Having modified a few webcams for astronomical purposes, I can tell you that small changes to the webcam's version of reality (at a hardware level) can drastically affect the way the world is perceived.

Cheers,
Will
------------------http://www.nentari.com
quote:Original post by Argus2
Yes, exactly. Now, the question is : are you going to permit the existence of unary state seismic sensors?


No, I don't think so, since it would only ever be in one state and hence its state could not be affected by an external signal.

quote:Original post by Argus2
I think I get it : An object that perceives must contain some internal representation of the world (its environment). Information which changes this representation is 'perceived' by the object.

That is what I said, but I think that goes too far as an explanation... there should be a simpler explanation of sensing, since it implies that sensing requires some level of intelligence. Perhaps this is the case... perhaps only intelligent entities sense... but I don't like where that takes us, so I want to consider the alternative... otherwise AI has no hope (or we need a drastically new way of measuring intelligence).

quote:Original post by Argus2
This sounds like a good start, but if we really want a rock-solid definition, we have to ask : how complex does this internal representation have to be...


Agreed. Personally, I think that we should be able to accept that the internal representation need only be as complex as is necessary to represent the information in the signal that the entity will use. So, for your seismic sensor, it only needs to have a binary state model, since it is only concerned with trees having greater than or less than a specific mass. What's the simplest sensor? I would say something that is binary state.


quote:Original post by Argus2 ...and how do we recognise one? For example, a rock seems to contain some information about its environment in its physical structure (its temperature/pressure phase etc).


Only if you know how to read that information (sense it)!

quote:Original post by MikeD
I think what this teaches us is that the delineation between perceiving and changing is an arbitrary one...


I'm not convinced that this is the truth. One should be able to imagine an objective notion of perception, since many different entities with different evolutionary paths can do it, using many different sorts of sensor systems. There must be a universal property of all of those systems that could be said to be the basis of any sensing system. I think to say that it's arbitrary is a bit of a cop out.


As to the question of beliefs... in AI they are typically taken to mean an internal model of the domain (environment plus agent state), which might also include predictions about future states along with estimates of the current state and past state. In some AI systems, beliefs may also extend to cover the underlying evolutionary dynamics of the domain. In humans, beliefs are just everything we know or 'believe' to be true and factual statements about the universe. This might not be the best definition, but it's worked for me for a very long time!

Personally, I think there is worth in pursuing the question of objective perception and the properties that a sensing system must have, since perception is a fundamental part of any agent that wants to interact with its environment, be it a game agent, a robot, an ant or a human.

Cheers,

Timkin

[edited by - Timkin on February 5, 2004 8:29:48 PM]
quote:Original post by RPGeezus
Timkin: I think the webcam still meets your criteria. The webcam has an internal model of the world by way of firmware. It would 'know' things like light intensity, contrast, brightness, exposure, and dimension (spatial and color).


It's not that I was saying that the WebCam doesn't sense (because I think it does), but that the definition I gave above didn't include the WebCam. In hindsight, though, I think it does, simply because the CCD receptors are designed to change output state given a particular input state (signal intensity). So my definition is still acceptable... but needs improvements, for the reasons I gave in my previous post.

quote:Original post by RPGeezus
Having modified a few webcams for astronomical purposes


Hehe... sounds like we have a similar background... I came from an astrophysics and astronomy background before getting into AI... I worked on early CCDs as part of my observing experiments... some of my work included photographing (with band-limited CCDs) the Shoemaker-Levy 9 impacts on Jupiter! That was a great time to be doing astronomy!!!

Cheers,

Timkin
