Programming the 5 Senses


I need some feedback on something I call a Senses System. Someone said that the way I am doing it is wrong, so I need some suggestions on how to do it right. This is how it works:

The senses system is a physics-based system. Each sense is made up of one sensing object, one function, and two variables. All of the senses will eventually be part of a Senses class.

SIGHT
Sight is just a collision test between a cone primitive that represents the eyes and the object being looked at. If there is a collision between the cone and an object, a Boolean named seen is set to true. An argument is used so that you can easily make something see whatever object you pass in the parentheses. The transparency of the cone represents the quality of sight. The width of the cone represents the field of view. The height scale of the cone represents nearsightedness or farsightedness. The eyes variable is the cone primitive.

Here is a pseudocode skeleton of sight.


function See(object)    // call every frame
{
    seeing = isCollisionBetween(eyes, object);   // true only while the cone overlaps the object
    if (seeing) { seen = true; }                 // stays true once the object has been noticed
    if (seen) { /* ...react here... */ }
}
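
As a rough illustration (not part of the skeleton above), the cone's properties could be set up something like this; setScale and setAlpha are hypothetical stand-ins for whatever setters the engine actually provides:

-- Hypothetical sketch: tune the eye cone to represent the sight attributes.
-- setScale and setAlpha are placeholders for the engine's real setters.
function ConfigureEyes(eyes, fieldOfView, range, quality)
    setScale(eyes, fieldOfView, fieldOfView, range)   -- width = field of view, length = near/farsightedness
    setAlpha(eyes, quality)                           -- transparency = quality of sight
end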

SMELL
Smell is also a collision test, represented by a sphere primitive, but instead of detecting a collision with an object, it detects a collision with a particle system. The number of particles that get inside the sphere represents the strength of the smell. The type name of the particle system determines the type of smell. The nose variable is the sphere primitive.


function Smell(object)    // call every frame; object is a particle system
{
    smelling = isCollisionBetween(nose, object);   // true while particles overlap the sphere
    if (smelling) { smelled = true; }              // stays true once the smell has registered
    if (smelled) { /* ...react here... */ }
}
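
The skeleton doesn't show the particle-count idea, so here is a small self-contained sketch of how the strength could be measured; the {x, y, z} position tables, the threshold, and the particles/nose variables in the usage line are only illustrative:

-- Count how many particles are inside the nose sphere; positions are plain {x, y, z} tables.
function SmellStrength(particlePositions, noseCenter, noseRadius)
    local count = 0
    for _, p in ipairs(particlePositions) do
        local dx, dy, dz = p.x - noseCenter.x, p.y - noseCenter.y, p.z - noseCenter.z
        if dx * dx + dy * dy + dz * dz <= noseRadius * noseRadius then
            count = count + 1
        end
    end
    return count   -- more particles inside the sphere = stronger smell
end

-- Illustrative usage: the smell registers once enough particles are inside the sphere.
smelled = SmellStrength(particles, nose.position, nose.radius) >= 10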

TOUCH
Touch is also a collision test, but it requires no primitive shape: the character or object itself detects the collision. It can be used with multiple game objects:




function Touch(object1, object2, object3)    // call every frame
{
    touching = isCollisionBetween(toucher, object1)
            || isCollisionBetween(toucher, object2)
            || isCollisionBetween(toucher, object3);
    if (touching) { touched = true; }        // stays true after the first contact
    if (touched) { /* ...react here... */ }
}

HEARING
I decided to make hearing work a little differently. Hearing is just a volume and distance test. It checks the distance between the hearing object and the sound object, and it checks whether the volume is above or below a certain value.


function Hear(object)    // call every frame
{
    hearing = (volume >= 30 && distance <= 3);   // example thresholds: loud enough and close enough
    if (hearing) { heard = true; }               // stays true once the sound has been heard
    if (heard) { /* ...react here... */ }
}
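
Since the skeleton leaves distance and volume as bare numbers, here is a small self-contained sketch of how the distance test could be computed; the {x, y, z} position tables and the 30/3 thresholds are just the example values from above:

-- Distance between the listener and the sound source, using plain {x, y, z} tables.
function DistanceBetween(a, b)
    local dx, dy, dz = a.x - b.x, a.y - b.y, a.z - b.z
    return math.sqrt(dx * dx + dy * dy + dz * dz)
end

-- A sound is heard if it is loud enough and close enough (example thresholds).
function Hear(listenerPos, soundPos, volume)
    return volume >= 30 and DistanceBetween(listenerPos, soundPos) <= 3
end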

TASTE
Taste works just like touch, except it has a primitive shape that does the collision with another game object.



function Taste(object)    // call every frame
{
    tasting = isCollisionBetween(tongue, object);   // true while the tongue overlaps the object
    if (tasting) { tasted = true; }                 // stays true once the object has been tasted
    if (tasted) { /* ...react here... */ }
}

This system is easy to implement in any language or game engine that has particle systems, collisions, and sounds. I was even able to implement it in the Little Big Planet game using their tag system. When there was a collision, it would display a speech bubble that said "Seen," "Heard," "Touched," etc.
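
As a rough sketch of how these might eventually be grouped into the Senses class mentioned above (this is only an illustration, reusing the isCollisionBetween helper from the skeletons, not a final design):

-- One possible grouping of the senses into a single class-like table.
-- eyes, nose, and tongue are the primitives described above; body is the toucher.
-- Hear is left out because it works on volume and distance rather than a primitive.
Senses = {}
Senses.__index = Senses

function Senses.new(eyes, nose, tongue, body)
    return setmetatable({eyes = eyes, nose = nose, tongue = tongue, body = body}, Senses)
end

function Senses:See(object)   return isCollisionBetween(self.eyes, object)   end
function Senses:Smell(object) return isCollisionBetween(self.nose, object)   end
function Senses:Taste(object) return isCollisionBetween(self.tongue, object) end
function Senses:Touch(object) return isCollisionBetween(self.body, object)   end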

Update:

Okay, so this was so easy to implement. Only took about 3 lines of code:



if isCollisionBetween(eyes, box) then
	setText(text, "I see a " .. readout)
else
	setText(text, "I see nothing")
end


function See(object)
	if isCollisionBetween(eyes, object) then
		readout = getName(object)              -- name of whatever the eyes are colliding with
		setText(text, "I see a " .. readout)
	else
		setText(text, "I see nothing")
	end
end

The above code is how the final function looks. And the way you use it is simple:


See(box)

They call me the Tutorial Doctor.


Are you accounting for obstructions, like a wall preventing you from seeing, smelling, etc. what is on the other side of it? I can't tell, since you might or might not have it in your collision detection. Other than that, the only issue I see is that your solution won't scale very well. There's nothing wrong with poor scaling, since it's always a balancing act between how much detail you want and how many objects you can support.

You're also not supporting things like the wind that impacts how far you can smell or a sound masking another sound. None of that necessarily matters since you always have to choose what details to ignore.

Yeah, the system won't scale well.

I had a system for sight that was also slow but did account for obstruction. I would basically render to a texture from the viewpoint of the character, but with the objects color-coded. (And in a game with high-poly objects, one could render them as low-poly versions of themselves.) Then I would check that texture for what the character could see. It wasn't too bad as long as the render texture was small and the number of objects was small, but it obviously wouldn't scale well for real-time games. (I could get away with it because I was using it for a turn-based game; I was mimicking the TLOS systems in various tabletop games like Warhammer 40k.)
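
The "check the texture" part can be sketched independently of the engine: assuming the color-coded view has already been rendered into a small buffer of object IDs (that engine-specific render step isn't shown), collecting what the character can see is just:

-- idBuffer is a 2D array (table of rows) of object IDs, one per pixel,
-- produced by an engine-specific render-to-texture step that isn't shown here.
function VisibleObjects(idBuffer)
    local visible = {}
    for _, row in ipairs(idBuffer) do
        for _, id in ipairs(row) do
            if id ~= 0 then        -- 0 = background / nothing rendered at this pixel
                visible[id] = true
            end
        end
    end
    return visible                 -- set of object IDs the character can currently see
end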

You're also not supporting things like the wind that impacts how far you can smell

Actually, if his particle system is affected by the wind, that would work for smell.

Thanks! I didn't consider obstruction for sight, but that should be easy, since I can check collisions against a wall object, and if a wall is seen, make the object not seen.
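
One common way to do that kind of obstruction test (just a sketch, not something I have working yet) is a straight line-of-sight check from the eyes to the object; raycastFirstHit below is a hypothetical engine helper that returns the first thing a ray hits, or nil if the path is clear:

-- Hypothetical line-of-sight test: the object only counts as seen if nothing
-- (such as a wall) is hit before it along the ray from the eyes.
function CanSee(eyesPosition, object)
    local hit = raycastFirstHit(eyesPosition, object.position)   -- hypothetical engine call
    return hit == nil or hit == object
end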

In the Little Big Planet game, I was able to put tags on every object along with a label (this is sort of how we do it in real life), so it actually looks like the character is seeing, hearing, touching, etc., because the puppet was getting all sorts of collision flags going off.

When he was touching the ground, a "ground" label kept going off.

The cool thing about wind is that it also looks like he is sensing real wind, because the particle count determines how strongly he can smell it. If the number of particles in the sphere reaches a certain value, it is "smelled." Each particle system also has a label.

However, I need to really think about this obstruction thing more, because I can see where things won't work. I'll post an update when I think of a way.

What does it mean that it won't "scale" well? I need to work on that too.

They call me the Tutorial Doctor.

Not strictly feedback, but this is very interesting. I've been thinking about something similar recently (though only for vision at the moment), and I'd be really happy to see the results you arrive at. Please do update us on your progress through this thread or a developer journal. :)

“If I understand the standard right it is legal and safe to do this but the resulting value could be anything.”

Good idea bacterius! I can do a developer journal. Didn't think about that.

I want it to be sort of uncanny. Right now it works surprisingly well, and actually feels more like an AI system. But now that I think of it, all of our intelligence is processed in a labeling sort of way.

I was also thinking about letting the character use the material properties of an object to describe it. So not only can it detect the object, it can form a description of it based on its material. Hmm. Getting more ideas as I type.

They call me the Tutorial Doctor.

What does it mean that it won't "scale" well? I need to work on that too.

It means you won't be able to use this solution to handle real-time calculations for large numbers of interactions. Like you probably wouldn't be able to have 1000 different people smelling and seeing 1000 different objects and still be generating results quickly enough to get 60+ fps. As I said, that's not really a bad thing. It is just the trade-off you are making.

Oh okay. Hm, I need to ask the developer of the engine I'm using about this then. There is a feature in the engine that allows you to make an object a "ghost object." As a ghost object, it goes through other objects but can still detect collisions. I have used ghost objects in the past to check collisions.

His engine does run at a smooth 60fps for most things I have tried so far. I'll see what he says about the scaling. Thanks for the tip.

They call me the Tutorial Doctor.

What does it mean that it won't "scale" well? I need to work on that too.

It means you won't be able to use this solution to handle real-time calculations for large numbers of interactions. Like you probably wouldn't be able to have 1000 different people smelling and seeing 1000 different objects and still be generating results quickly enough to get 60+ fps. As I said, that's not really a bad thing. It is just the trade-off you are making.

The sense of smell doesn't have terribly good time resolution, so it's probably OK if you only update an agent's smelling input every couple of seconds.

Here's another idea for smell: make it closer to reality by having objects drop chemicals on the scene and agents pick them up. Since smell also doesn't have terribly good spatial resolution, you can divide your scene into chunks (1m x 1m square blocks, say) and keep a small array of densities in each chunk, each density corresponding to a different chemical. Objects can drop chemicals, chemicals can decay over time, they can be carried by wind... Then agents simply look at the densities in the chunk where they stand. This is a classic space-time tradeoff, where the use of the additional data structure allows you to scale linearly with the sum of the number of objects and agents, instead of their product.
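
Here's a minimal self-contained sketch of that chunked idea (the names and the numbers are only illustrative): a grid of chunks, each holding a density per chemical, with deposit, decay, and sampling steps.

-- Sketch of a chunked smell grid: each chunk stores a density per chemical.
SmellGrid = {}
SmellGrid.__index = SmellGrid

function SmellGrid.new(width, height)
    local grid = setmetatable({width = width, height = height, chunks = {}}, SmellGrid)
    for x = 1, width do
        grid.chunks[x] = {}
        for y = 1, height do
            grid.chunks[x][y] = {}   -- maps chemical name -> density
        end
    end
    return grid
end

-- Objects drop chemicals into the chunk they occupy.
function SmellGrid:Deposit(x, y, chemical, amount)
    local cell = self.chunks[x][y]
    cell[chemical] = (cell[chemical] or 0) + amount
end

-- Called periodically (every couple of seconds is plenty); densities fade over time.
function SmellGrid:Decay(factor)
    for x = 1, self.width do
        for y = 1, self.height do
            for chemical, density in pairs(self.chunks[x][y]) do
                self.chunks[x][y][chemical] = density * factor
            end
        end
    end
end

-- An agent simply reads the densities of the chunk it is standing in.
function SmellGrid:Sample(x, y)
    return self.chunks[x][y]
end

Wind could then be approximated in the decay step by shifting a fraction of each chunk's densities into the downwind neighbor.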

How many chemicals to keep track of would depend a lot on the game. The coolest application of this to a game I can think of is dogs: detection dogs (they detect explosives, drugs and blood), rescue dogs (they can find survivors after a disaster) and tracking dogs (which can track individual people).

