Real time motion capture

Started by eMove
7 comments, last by eMove 14 years, 2 months ago
Hi everyone! My name's Cash and I'm part of a team working on a project at the University of Sussex here in the UK called eMove. It's a real-time motion capture system for use in gaming. We're far enough along now that we think it's a good time to start getting developer feedback from people like yourselves :) We've got a few demonstration videos which should help give a feel for what we're doing:
Suit run down (bit of a commercial video, sorry about that!)
Motorcycle tech demo (created using the Unity engine)
UDK tech demo
UDK development progress 1 (over the shoulder iphone video, please excuse the quality!)
UDK development progress 2

Basically what we're doing at the moment is controlling the rig of our robot character using the data from the eMove suit. The suit we're using is an early prototype, so there are still a few software issues to iron out, but we're getting there and I think you'll get the idea. The first production suits are in the final design stages and will look a lot like the suit you see in-game during the first UDK tech demo. We first started the eMove project a few months back using the Panda3D engine, then moved to the Unity engine for a while, where we put together the motorcycle demo. We were over the moon when we discovered the Unreal 3 engine was now free for non-commercial use though... it's been great working with it :) We'd love your feedback, positive or negative. Ideas, suggestions, whatever :) Myself and my colleague Jake (who you see using the suit in most of the videos) are the guys working on the games side of things, so it's only a small team, and the more brain-picking we can do the better! Thanks for reading! - Cash & the eMove team
Quote:Original post by eMove

It's a real-time motion capture system

Real time is such a misused term.

What is input latency of your project? What is the sampling resolution? What is sampling jitter?
Quote:Original post by Antheus
Real time is such a misused term.

What is input latency of your project? What is the sampling resolution? What is sampling jitter?


The underlying software captures 120 samples per second, at <5ms latency (using a serial connection to the suit). This software is currently run separately from the game itself, in the background. Local TCP is used to communicate the mocap data between the game and the background software, which ups the latency a bit, to approximately 50ms.
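To make the architecture concrete, here's a minimal stdlib-Python sketch of that localhost hand-off: a capture process packs joint angles into a binary frame and a consumer unpacks them on the other end of a local socket. The frame layout (16-bit joint count, then float32 angles in radians) is an illustrative assumption, not the actual eMove wire format.

```python
import socket
import struct

# Hypothetical frame layout: a 16-bit joint count followed by one
# 32-bit float per joint angle (radians). The real eMove wire format
# isn't public, so this layout is an illustrative assumption.
def pack_frame(angles):
    return struct.pack("<H%df" % len(angles), len(angles), *angles)

def unpack_frame(data):
    (count,) = struct.unpack_from("<H", data, 0)
    return list(struct.unpack_from("<%df" % count, data, 2))

# Loopback demo: ship one frame over a local socket pair, the same way
# a background capture process could feed the game over local TCP.
tx, rx = socket.socketpair()
tx.sendall(pack_frame([0.0, 0.5, -1.25]))
received = unpack_frame(rx.recv(1024))
print(received)  # [0.0, 0.5, -1.25]
```

Note that real TCP code would also need to buffer until a full frame has arrived, since TCP is a byte stream and a single `recv` may return a partial frame.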

Over the coming months, the background software will be collapsed into a DLL and used directly by the game, bringing the latency back down to <5ms give or take.

Basically, game framerate is our primary bottleneck, and the last development video uses a spring-driven ragdoll to follow the mocap. That currently suffers from high latency (100ms or more) due to the heavy physics; we're working hard to bring that down. Non-physical animation is quick though, and in most cases this is what's used.

Edit: Regarding sample jitter, with a properly calibrated suit (which is something we do manually at the moment, but will be automatic in future) this is very low. The suit's joints use high-accuracy rotation sensors, which haven't given us any problems with jitter (unless we drop the suit on the floor; it doesn't like that!)
A little bulky-looking rig, but in general it looks good. What about non-gaming apps? Have you prototyped anything with fine motor control? What is the degree of accuracy? And why no data gloves? Is this because of the gaming focus?

In general what are your thoughts on the different engines used?

I'm not sure I understand the complaint about 'real time'. He isn't capturing and then processing offline, i.e. it is real time.
Quote:Original post by stake


I'm not sure I understand the complaint about 'real time'. He isn't capturing and then processing offline, i.e. it is real time.


That is on-the-fly processing.

Real-time means that each sample will be provided within 5 ms, and never more than 5 ms. Imagine controlling a scalpel or a crane with such a tool. If you get lag while making a precision move, you might cut an artery, or swing the load hanging from the crane for 150 ms longer, where positive feedback creates a pendulum.
Quote:Original post by stake
A little bulky-looking rig, but in general it looks good. What about non-gaming apps? Have you prototyped anything with fine motor control? What is the degree of accuracy? And why no data gloves? Is this because of the gaming focus?

In general what are your thoughts on the different engines used?

I'm not sure I understand the complaint about 'real time'. He isn't capturing and then processing offline, i.e. it is real time.


The prototype suit is indeed bulky, but the production suit (which we'll be moving to in the next 2-3 months) should be much lighter and more pleasant to use.

There are certainly many non-gaming applications for the suit, but gaming is something we're focusing on for this project.

The joysticks on the suit are designed to be interchangeable on the production suit, so there is potential to swap them out for data gloves or other controllers in future. We use joysticks because they offer a nice solid grip and work well when controlling player movement (with 2 analog sticks), as well as having triggers for firing weapons and whatnot.

In terms of engines, I found Unity very nice to use. It has an extremely rapid asset pipeline, a choice of scripting languages (C# and JavaScript) and a very intuitive way of using those scripts to control game objects. It is however lacking on the really high-end side of things: advanced shader/post-processing control and low-level animation (which was what really hurt us).

UDK (the free Unreal 3 engine) has a steep learning curve, what with its purely UnrealScript codebase and an at times painfully slow asset pipeline. But it's an incredible engine, and I can't recommend it enough. The animation system in particular is superb, and offers very low-level bone control (which has made our lives much much easier). The Kismet visual scripting language also makes it really easy to expose your lower-level code to quick changes by non-coder types in the editor.
Quote:Original post by Antheus
That is on-the-fly processing.

Real-time means that each sample will be provided within 5 ms, and never more than 5 ms. Imagine controlling a scalpel or a crane with such a tool. If you get lag while making a precision move, you might cut an artery, or swing the load hanging from the crane for 150 ms longer, where positive feedback creates a pendulum.


There is very minimal cleanup or post-processing done to the mocap data as it comes in, since it's in effect just a pile of joint angles. In the simplest case, you can literally just dump these angles onto your character rig and you're away.

Any latency in the system is purely at a communication or physics level.
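That "dump the angles onto your character rig" step could be sketched roughly like this. The bone and joint names are invented for illustration; a real engine would expose its own bone-transform API.

```python
import math

# Minimal sketch of copying captured joint angles straight onto a rig,
# as described above. Joint/bone names here are made up; a real engine
# (Unity, UDK, etc.) exposes its own bone transform API.
class Bone:
    def __init__(self, name):
        self.name = name
        self.rotation = 0.0  # joint angle, radians

def apply_mocap_frame(rig, frame):
    """Copy each captured joint angle directly onto the matching bone."""
    for joint_name, angle in frame.items():
        if joint_name in rig:
            rig[joint_name].rotation = angle

# Build a tiny rig and apply one frame of mocap data.
rig = {n: Bone(n) for n in ("l_elbow", "r_elbow", "l_knee")}
apply_mocap_frame(rig, {"l_elbow": math.pi / 2, "r_elbow": 0.1})
print(rig["l_elbow"].rotation)  # 1.5707963267948966
```

With no physics or filtering in the loop, this per-frame copy is essentially free, which matches the claim that any remaining latency lives in the communication or physics layers.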
50 ms is quite high latency! Are you running these on loopback or across a LAN? You should be able to get much lower latency across a LAN. Are you using VRPN or straight-up TCP? Maybe you could easily switch to UDP and bring the latency down, if you want to try that out.

Also with a high amount of jitter the sampling rate goes way down unless you plan on doing a lot of buffering.
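The UDP suggestion above fits mocap well because each sample supersedes the last: the receiver can drain its queue and keep only the newest datagram, trading occasional loss for lower latency than a TCP stream. A rough stdlib sketch, with a made-up port and frame layout (sequence number plus three float32 angles):

```python
import socket
import struct
import time

# Sketch of the UDP latest-sample-wins approach. Port number and frame
# layout (uint32 sequence + three float32 angles) are assumptions.
PORT = 50007

rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", PORT))
rx.setblocking(False)

tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for seq in range(3):  # simulate three suit samples arriving in a burst
    tx.sendto(struct.pack("<I3f", seq, 0.1, 0.2, 0.3), ("127.0.0.1", PORT))
time.sleep(0.05)  # let loopback delivery settle

latest = None
while True:  # drain the socket, keeping only the most recent datagram
    try:
        latest = rx.recv(64)
    except BlockingIOError:
        break

seq, a, b, c = struct.unpack("<I3f", latest)
print(seq)  # 2 (the newest sample won; older ones were discarded)
```

Unlike TCP, a dropped or reordered datagram never stalls newer samples behind it, which is exactly the head-of-line-blocking property that hurts a TCP stream at 120 Hz.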
Quote:Original post by stake
50 ms is quite high latency! Are you running these on loopback or across a LAN? You should be able to get much lower latency across a LAN. Are you using VRPN or straight-up TCP? Maybe you could easily switch to UDP and bring the latency down, if you want to try that out.

Also with a high amount of jitter the sampling rate goes way down unless you plan on doing a lot of buffering.


We're just doing vanilla TCP localhost comms, no network involved.

50ms is a conservative figure, and the main reason it's not much lower is UnrealScript's sockets aren't that hot (Epic have actually recommended against their use for any real-time applications).

We could pull this TCP code out into a compiled library to speed it right up, but this need for TCP communication is only a temporary measure until everything lives in a library and the serial comms to the suit are done directly by the game process. At that stage our latency becomes negligible.
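Reading the suit's serial stream directly in-process mostly comes down to frame synchronisation: scan for a sync byte, then read one length-prefixed frame. A sketch of that parsing step, using `io.BytesIO` to stand in for the real serial handle; the sync byte (0xAA) and frame layout are invented for illustration:

```python
import io
import struct

# Sketch of reading suit frames straight from a serial stream, as the
# planned single-process design would. The framing (0xAA sync byte,
# uint8 joint count, float32 angles) is a made-up example; io.BytesIO
# stands in for a real serial handle (e.g. a pyserial Serial object).
SYNC = 0xAA

def read_frame(port):
    """Scan to the sync byte, then read one length-prefixed frame."""
    while True:
        b = port.read(1)
        if not b:
            return None           # stream exhausted
        if b[0] == SYNC:
            break                 # found start of frame
    (count,) = struct.unpack("<B", port.read(1))
    return list(struct.unpack("<%df" % count, port.read(4 * count)))

# Simulated serial stream: two bytes of noise, then one two-joint frame.
stream = io.BytesIO(b"\x00\x13" + bytes([SYNC, 2]) + struct.pack("<2f", 0.5, -1.25))
print(read_frame(stream))  # [0.5, -1.25]
```

At 120 Hz a frame like this is a few dozen bytes, well within what a serial link handles with sub-millisecond transfer time, so the <5ms end-to-end figure quoted above is plausible once the TCP hop is gone.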

