Chase and evade AI applied to facial tracking


What I have to say here is an idea for tracking in 3D space, for use with the Oculus VR.

Now bear in mind that I'm learning C# and Unity right now, and in a year's time I should be able to try some of the things I want to do. So this is really just a few questions to more experienced developers about what is possible.

I have thought through what I am trying to do and will describe it now.

A cube sits below a mirror; the mirror faces the cube, so the mirror is facing downwards.

A light source, such as a laser pen, shines upwards. The laser pen sits beside the cube and below the mirror, so the mirror receives the light from the laser pen and reflects it down onto the cube.

It looks like this:

See picture A.jpg

Now the laser is shining onto the cube, and the cube also has a mirror of its own, so the laser beam is reflected onto a different surface. This is where it gets tricky.

See picture B.

The surface the cube reflects the laser light onto is called "destination".

"Desctination" has a moving dot that is chased by the laser light from the cube, so that the laser light from the cube sits on top of the dot on the "destination".

See picture C to see the dot;

So pictures A, B, and C show the overall mechanism I want to create.

Now the tricky part is what happens when the cube redirects the laser beam.

In order for the cube to redirect the laser beam onto the "destination", the cube must have a surface that moves along the x and y axes, so this moving surface sits on top of the cube.

See picture D.

Now, as the moving surface redirects the laser light onto the "destination", the moving surface's xy coordinates change.

What I want to know is this: is it possible to create this moving surface so that it chases the dot on the "destination" and, as it chases the dot, reports accurate xy coordinates that update every time the surface moves?

That is the basic design I have in mind.
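To make the question more concrete, here is a rough sketch of how I imagine the chasing part might look in Unity C#. I'm still learning, so this is only a guess at the approach; the names (MovingSurface, dotTarget, chaseSpeed) are placeholders I made up, not anything from an existing library or asset.

```csharp
using UnityEngine;

// Rough sketch: the "moving surface" is a GameObject that chases the dot
// on the "destination" in the xy plane. dotTarget would be assigned in the
// Inspector. All names here are my own placeholders.
public class MovingSurface : MonoBehaviour
{
    public Transform dotTarget;    // the dot on the "destination"
    public float chaseSpeed = 2f;  // units per second

    void Update()
    {
        // Chase only in the xy plane, keeping the current z.
        Vector3 goal = new Vector3(dotTarget.position.x, dotTarget.position.y, transform.position.z);
        transform.position = Vector3.MoveTowards(transform.position, goal, chaseSpeed * Time.deltaTime);

        // The xy coordinates I'm asking about would just be the surface's
        // position, which updates every frame the surface moves.
        Vector2 xy = new Vector2(transform.position.x, transform.position.y);
        Debug.Log("Moving surface xy: " + xy);
    }
}
```

As far as I can tell, reading the transform's position like this would give the changing xy coordinate data I want, but I'd appreciate confirmation that this is a sensible way to do it.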

Now the "destination" has a dot that the moving surface chases with the laser beam it is reflecting onto the "destination".

That moving dot on the "destination" is a facial point that is being tracked with facial-recognition tracking.

So the problem now is to film a face, apply facial tracking points to it, feed the face with the points on it to the "destination", and then have the moving surface send the laser light onto whichever facial tracking point I choose.

As the facial point moves, the moving surface keeps sending light onto that point, and its xy coordinates change accordingly, so the movement of the face ends up as xy coordinate data in the moving surface while it reflects the mirror's light onto the "destination".
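Again, just to make the question concrete, here is a guess at how a tracked facial point could be turned into the dot on the "destination" for the surface to chase. I'm assuming some face-tracking library would hand me the chosen point as normalized (0 to 1) coordinates each frame; GetTrackedFacePoint() below is a made-up placeholder for that, not a real API.

```csharp
using UnityEngine;

// Sketch of turning a tracked facial point into the dot the moving surface
// chases. The face-tracking source is hypothetical; only the coordinate
// mapping onto the "destination" quad is shown.
public class FaceDot : MonoBehaviour
{
    public Renderer destinationQuad; // the "destination" surface showing the face video

    void Update()
    {
        Vector2 uv = GetTrackedFacePoint(); // e.g. tip of the nose, an eye corner

        // Convert the 0..1 point into a world position on the quad.
        Bounds b = destinationQuad.bounds;
        Vector3 world = new Vector3(
            Mathf.Lerp(b.min.x, b.max.x, uv.x),
            Mathf.Lerp(b.min.y, b.max.y, uv.y),
            b.center.z);

        transform.position = world; // this object is the dot the moving surface chases
    }

    Vector2 GetTrackedFacePoint()
    {
        // Placeholder: a real implementation would come from a face-tracking
        // plugin. Here it just returns the centre of the video.
        return new Vector2(0.5f, 0.5f);
    }
}
```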

I was thinking of using Unity and C# to get the facial points, then somehow using a Unity MovieTexture to show the face with the points on it in a game environment, then setting up the cube, shining the laser onto the MovieTexture, and somehow getting xy coordinates from this setup.

But I'm not sure a MovieTexture can be used like this. Since I'm learning C# and Unity and will put a lot of time and effort into this, I thought I would ask whether what I want to do is possible before I put too much effort into the project.
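For the video part, this is the kind of thing I had in mind with the MovieTexture (the legacy desktop-only video texture Unity has), though I'm not sure it's the right tool for this:

```csharp
using UnityEngine;

// Sketch: put the face video on the "destination" quad using a MovieTexture
// assigned in the Inspector.
public class DestinationVideo : MonoBehaviour
{
    public MovieTexture faceVideo;

    void Start()
    {
        // Show the video on this object's material so the "destination"
        // quad displays the face.
        GetComponent<Renderer>().material.mainTexture = faceVideo;
        faceVideo.loop = true;
        faceVideo.Play();
    }
}
```

From what I understand, a MovieTexture only displays the video; the facial tracking points themselves would have to come from a separate face-tracking library, with its results fed into something like the earlier sketches. That is the part I'm least sure about.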

