quote:Original post by Argus2
Yes, exactly. Now, the question is: are you going to permit the existence of unary-state seismic sensors?
No, I don't think so, since it would only ever be in one state and hence its state could not be affected by an external signal.
quote:Original post by Argus2
I think I get it: an object that perceives must contain some internal representation of the world (its environment). Information which changes this representation is 'perceived' by the object.
That is what I said, but I think it goes too far as an explanation... there should be a simpler account of sensing, since this one implies that sensing requires some level of intelligence. Perhaps that is the case... perhaps only intelligent entities sense... but I don't like where that takes us, so I want to consider the alternative. Otherwise AI has no hope (or we need a drastically new way of measuring intelligence).
quote:Original post by Argus2
This sounds like a good start, but if we really want a rock-solid definition, we have to ask: how complex does this internal representation have to be...
Agreed. Personally, I think we should be able to accept that the internal representation need only be as complex as is necessary to represent the information in the signal that the entity will use. So your seismic sensor only needs a binary-state model, since it is only concerned with whether a tree's mass is greater or less than a specific threshold. What's the simplest sensor? I would say something with a binary state.
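To make the idea concrete, here is a minimal sketch (not from the original discussion; the class, threshold, and signal names are all hypothetical) of a binary-state sensor whose entire internal representation is one bit, which is only ever changed by an external signal crossing the threshold:

```python
class BinarySeismicSensor:
    """Simplest possible sensing entity: one bit of internal state."""

    def __init__(self, mass_threshold: float):
        self.mass_threshold = mass_threshold  # hypothetical tuning parameter
        self.tripped = False                  # the entire internal representation

    def sense(self, falling_mass: float) -> None:
        # The external signal is 'perceived' only insofar as it can
        # change the one-bit internal state.
        if falling_mass > self.mass_threshold:
            self.tripped = True


sensor = BinarySeismicSensor(mass_threshold=50.0)
sensor.sense(falling_mass=10.0)   # below threshold: state unchanged
print(sensor.tripped)  # False
sensor.sense(falling_mass=120.0)  # above threshold: state flips
print(sensor.tripped)  # True
```

Note that a unary-state version of this class (one with no `tripped` flag at all) could not register the signal in any way, which is exactly the objection above.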
quote:Original post by Argus2
...and how do we recognise one? For example, a rock seems to contain some information about its environment in its physical structure (its temperature/pressure phase etc).
Only if you know how to read that information (sense it)!
quote:Original post by MikeD
I think what this teaches us is that the delineation between perceiving and changing is an arbitrary one...
I'm not convinced that this is the truth. One should be able to imagine an objective notion of perception, since many different entities with different evolutionary histories can perceive, using many different sorts of sensor systems. There must be a universal property of all of those systems that could be said to be the basis of any sensing system. To say that it's arbitrary is a bit of a cop-out.
As to the question of beliefs... in AI they are typically taken to mean an internal model of the domain (environment plus agent state), which might include predictions about future states along with estimates of the current and past states. In some AI systems, beliefs may also extend to cover the underlying evolutionary dynamics of the domain. In humans, beliefs are simply everything we hold, or 'believe', to be true: factual statements about the universe. This might not be the best definition, but it's worked for me for a very long time!
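The AI sense of 'belief' described above can be sketched as a small data structure (an illustrative toy, not any particular system's API; the names and the trivial update rule are my own assumptions):

```python
from dataclasses import dataclass, field


@dataclass
class BeliefState:
    """Internal model of the domain: past and current state estimates,
    plus an assumed dynamics model used for prediction."""

    past: list = field(default_factory=list)  # remembered state estimates
    current: float = 0.0                      # estimate of the current state
    drift: float = 1.0                        # assumed domain dynamics (hypothetical)

    def update(self, observation: float) -> None:
        # Fold a new observation into the model. A real agent would
        # typically use something like Bayesian filtering here;
        # a simple overwrite keeps the sketch short.
        self.past.append(self.current)
        self.current = observation

    def predict(self) -> float:
        # Predicted next state under the assumed dynamics.
        return self.current + self.drift


belief = BeliefState()
belief.update(3.0)
print(belief.past)       # [0.0]
print(belief.predict())  # 4.0
```

The point of the sketch is only the shape of the thing: estimates of past and current state, plus a model of the domain's dynamics that yields predictions.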
Personally, I think there is worth in pursuing the question of objective perception and the properties that a sensing system must have, since perception is a fundamental part of any agent that wants to interact with its environment, be it a game agent, a robot, an ant or a human.
Cheers,
Timkin
[edited by - Timkin on February 5, 2004 8:29:48 PM]