Low cost motion capture


Hello guys! 

I've recently been researching motion capture systems and found many methods and hardware options; some are expensive, some are cheap. As we are indie, we are looking for mocap for around $2,000 or so. I know we will need to clean up the animations anyway, so please don't dwell on that. What I want to hear is: which approach will give us cleaner animations?

 

The first option we have is to use a few Kinect v2 sensors + PS Eye and Move. It will cost around $500, plus $1,000 for iPi Soft, which can record and convert all the animations.

 

Another option is to use 8+ cheap Full HD 60 fps cameras (~$500), any black sports suit (~$100), small balls painted with reflective paint (~$50), and some lights (~$50), and use them the way studios do, with MotionBuilder ($1,000).

I know there are special cameras made for mocap, but they are really expensive.

 

Some say Kinects + iPi are the better choice, since there is no need to wear a suit or work with markers.

My guess is that using cameras with a suit and markers will give us cleaner animations, but which is better in your opinion?

1 hour ago, Flakky said:

The first option we have is to use a few Kinect v2 sensors + PS Eye and Move. It will cost around $500, plus $1,000 for iPi Soft, which can record and convert all the animations.

This is a good path, but you don't need expensive software. Blender is free, and lots of motion capture enthusiasts have built tools for it.

The quality you get depends on your capture device and your artist, but you will only capture large, broad movements, not fine detail. That works well enough for games.

1 hour ago, Flakky said:

Another option is to use 8+ cheap Full HD 60 fps cameras (~$500), any black sports suit (~$100), small balls painted with reflective paint (~$50), and some lights (~$50), and use them the way studios do, with MotionBuilder ($1,000).

A few pain points here: eight cameras means a lot of work cleaning up their output. Without a full production team this is just going to be a pain.

You don't want reflective surfaces, not for your clothing and not for your trackers. The more reflective an object is, the less it keeps its original color.

 

Here is what I recommend for this kind of work:

Download Blender.

Get some colored stickers, the large dot kind. Ping pong balls work even better; I use home-made markers for mine (colored paper with hooks).

Get a long-sleeve shirt in a single color, no pictures or anything on it, and long pants in a single color. The color should be an odd one, like neon pink. Cheaper cloth is better.

You also need a huge cloth of a single color; again, the color should be odd. Don't get silk, a cheap cloth works best. When selecting the cloth, it should reflect as little light as possible and shouldn't show very white edges when you fold the material.

You need 3-4 cameras. They don't need to be the same quality; your mobile phone will work. You can also use a single camera if you do 2D animation, or if you're willing to perform the same action 3-4 times and clean up the differences later. I use the mobiles I keep around for testing my games.

Some skin-safe markers are also needed for face capture, but you will find this is a huge pain to do; even with pro tools it's a pain.

 

This will be enough to get started. Start small, like capturing just your arm, and work up as you perfect it.
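A side note on why the suit and marker colors should be odd: the whole point is that nothing else in the frame shares that color, so the markers are trivial to pick out. A quick way to sanity-check a test frame is a plain HSV threshold, for example with OpenCV; the file name and color range below are only guesses you'd tune to your own footage.

```python
# Quick sanity check: does the marker color stand out? Threshold one frame in HSV.
# The file name and HSV range are placeholders to tune for your own footage.
import cv2
import numpy as np

frame = cv2.imread("test_frame.png")               # placeholder capture frame
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

# Rough "neon pink" band in OpenCV's 0-179 hue scale; adjust to your markers.
lower = np.array([150, 80, 120])
upper = np.array([175, 255, 255])
mask = cv2.inRange(hsv, lower, upper)

count = cv2.countNonZero(mask)
print(f"{count} pixels matched the marker color")
cv2.imwrite("marker_mask.png", mask)               # white = detected marker pixels
```

If the mask lights up anywhere other than your markers, your suit or background color is too close and you'll fight it later.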

 

I really recommend Blender over any other software for this. It's used all over the world in professional productions, and because of its open-source nature it works with all kinds of formats and files.

Blender also has a huge community of motion capture hobbyists, which means it has tools for dealing with less-than-perfect capture results.

@Scouting Ninja 

Thank you for your answer!

What about software? Are you talking about Blender for cleaning animations? Because iPi is made for recording, cleaning and exporting, as is MotionBuilder. For other purposes we use Maya.

I am talking about Blender for everything after you've captured the data.

You clean up the video and boost contrast using Blender's compositor.
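To make that concrete, here's a rough Python sketch of the kind of compositor setup I mean: load the footage as a movie clip and push the contrast up so the markers pop. The clip name is a placeholder, and normally you'd build these nodes by hand in the Compositor rather than by script.

```python
# Rough sketch: boost contrast on capture footage in Blender's compositor.
# Assumes the footage has already been loaded as a movie clip named
# "capture.mp4" (that name is an assumption/placeholder).
import bpy

scene = bpy.context.scene
scene.use_nodes = True
tree = scene.node_tree
tree.nodes.clear()

clip_node = tree.nodes.new(type='CompositorNodeMovieClip')
clip_node.clip = bpy.data.movieclips['capture.mp4']   # placeholder clip name

contrast = tree.nodes.new(type='CompositorNodeBrightContrast')
contrast.inputs['Contrast'].default_value = 25.0      # exaggerate marker/suit separation

out = tree.nodes.new(type='CompositorNodeComposite')

tree.links.new(clip_node.outputs['Image'], contrast.inputs['Image'])
tree.links.new(contrast.outputs['Image'], out.inputs['Image'])
```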

You then track the footage and convert it to the 3D data of your choice. Because Blender has motion tracking built into the software, you go straight from tracked data to 3D. You can use paths, direct tracking, and lots of other options.
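If you want to see that as a script, here's a rough sketch of driving Blender's tracker from Python. In practice you'd just click Detect Features and Track Markers in the Movie Clip Editor; the operators below are the same thing, and the file path is a placeholder.

```python
# Rough sketch: load footage, auto-detect features and track them across the clip.
# "capture.mp4" is a placeholder path. Clip operators need a Movie Clip Editor
# context, e.g. run this from that editor's Python console.
import bpy

clip = bpy.data.movieclips.load("//capture.mp4")

bpy.ops.clip.detect_features()                 # place markers on high-contrast spots
bpy.ops.clip.track_markers(sequence=True)      # track them forward through the clip

for track in clip.tracking.tracks:
    print(track.name, "tracked over", len(track.markers), "frames")
```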

Here's a video I found quickly. In it you can see it working with just a single capture and no extras. That is how easy Blender is to use.

 

Then you retarget your rig using Blender's bone constraints and an IK rig and get your basic animation. Bake this and clean up the result.
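A minimal sketch of that constraint-and-bake step, assuming your tracked markers have already been converted to empties; the rig, bone and empty names below are placeholders for your own setup.

```python
# Minimal sketch: point pose bones at tracked empties, then bake the result to
# keyframes so it can be cleaned up by hand. All names are placeholders.
import bpy

rig = bpy.data.objects['Armature']        # assumed rig name
bone_to_empty = {
    'hand.L': 'track_hand_L',             # placeholder marker empties
    'forearm.L': 'track_elbow_L',
}

for bone_name, empty_name in bone_to_empty.items():
    pbone = rig.pose.bones[bone_name]
    con = pbone.constraints.new(type='COPY_LOCATION')
    con.target = bpy.data.objects[empty_name]

# Bake the constrained motion into plain keyframes (frame range is a placeholder).
bpy.context.view_layer.objects.active = rig
bpy.ops.object.mode_set(mode='POSE')
for bone_name in bone_to_empty:
    rig.pose.bones[bone_name].bone.select = True
bpy.ops.nla.bake(frame_start=1, frame_end=250,
                 only_selected=True, visual_keying=True,
                 clear_constraints=True, bake_types={'POSE'})
```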

@Scouting Ninja, thanks for the info!

 

Can anyone tell me more about using iPi + Kinects and the PS Eye? There are a lot of videos showing good results, like this:

or this:

 

1 hour ago, Flakky said:

Can anyone tell me more about using iPi + Kinects and the PS Eye?

http://docs.ipisoft.com/Main_Page

" rel="external">

The basic setup is that you need a way to read the capture data from the Kinect. That's what the recorder is for; it renders the scene as a brightly colored depth pass for the iPi Soft mocap.
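If "depth pass" sounds abstract: the recorder stores per-pixel distance instead of color, and it gets remapped to a bright color gradient so you can eyeball coverage and holes. This toy sketch shows that remap on a saved 16-bit depth frame; it is not iPi's actual pipeline, and the file name is a placeholder.

```python
# Toy sketch: turn a raw 16-bit depth frame into the bright false-color image
# that depth-based recorders typically show. "depth_frame.png" is a placeholder.
import cv2
import numpy as np

depth = cv2.imread("depth_frame.png", cv2.IMREAD_UNCHANGED)   # uint16 millimetres
depth = depth.astype(np.float32)

# Normalize a useful range (0.5 m to 4.5 m, roughly the Kinect v2 depth range)
# down to 0-255 so it can be color mapped.
near, far = 500.0, 4500.0
norm = np.clip((depth - near) / (far - near), 0.0, 1.0)
gray = (norm * 255).astype(np.uint8)

colored = cv2.applyColorMap(gray, cv2.COLORMAP_JET)            # bright depth pass
cv2.imwrite("depth_colored.png", colored)
```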

 

Then you need to calibrate the data so it's synced, so that all inputs work in the same space and can fill in each other's missing info. This is where most automatic motion capture software like iPi Soft is weakest.

You will need to run the calibration multiple times just to sync two data inputs; more inputs will take longer to fix. For iPi Soft to work you also need to make your actions very clear, doing the baboon arms and broad leg stances you can see in the video.
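Most of what "calibration" means here is getting every input into the same time base and the same 3D space. Here's a very stripped-down illustration of just the time-sync part: find the frame offset between two cameras by cross-correlating one marker's motion. The signals below are synthetic stand-ins for exported marker data.

```python
# Stripped-down illustration of time-syncing two capture streams: find the frame
# offset that best aligns one marker's motion as seen by two cameras.
# The signals are synthetic stand-ins for real exported marker data.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 240)                 # 240 frames of fake motion
cam_a = np.sin(t) + 0.05 * rng.standard_normal(t.size)
cam_b = np.roll(cam_a, 17)                         # camera B started 17 frames late
                                                   # (wrap-around ignored in this toy)

# Cross-correlate the two signals; the peak gives the frame offset.
a = cam_a - cam_a.mean()
b = cam_b - cam_b.mean()
corr = np.correlate(a, b, mode='full')
offset = corr.argmax() - (len(b) - 1)
print("camera B lags camera A by", -offset, "frames")
```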

 

Last, you need to pass the animation to the model. iPi Soft does this on a per-point basis, so if any data is missing, that part just stops moving. This is where automatic software is at its best: you just point it and it will follow.
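To make the "per point" behaviour concrete: each marker drives one joint independently, and when a marker drops out its joint simply holds the last known position, which is why that part "just stops moving". A toy sketch of that fallback, with made-up marker data:

```python
# Toy sketch of per-point retargeting: each tracked marker drives one joint,
# and frames where the marker was lost fall back to the last known position.
# The marker data here is made up; real data would come from your tracker.

# frame -> (x, y, z), with None where the tracker lost the marker
hand_marker = {0: (0.0, 1.0, 0.5), 1: (0.1, 1.1, 0.5), 2: None, 3: None, 4: (0.4, 1.3, 0.6)}

def resolve(marker_frames):
    """Fill dropped frames by holding the last known position."""
    resolved, last = {}, None
    for frame in sorted(marker_frames):
        pos = marker_frames[frame]
        if pos is not None:
            last = pos
        resolved[frame] = last          # stays frozen while the marker is lost
    return resolved

for frame, pos in resolve(hand_marker).items():
    print(frame, pos)
```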

