In my opinion, getting into 'entry level' motion capture isn't something you can pick up instantly, at least not if you want to do it well and your goal is 'pro-level' production animation. It's one thing to shoot random motions with a Kinect and retarget them to a rig, but knocking out specific actions for a character on a scheduled deadline takes solid knowledge of the gear and its limitations, a good understanding of how to perform to get the desired result, and knowing how to edit the data to make it work for your scenes. 'Home brew' mocap is a lot of work, but if you have the will to do it, it can also be a lot of fun.
Here's the process in a nutshell:
1. Capture your performance.
2. If necessary, process and edit the data.
3. Retarget it to your character rig. If the character has different body proportions, you may need to edit the motion again.
4. Import the character (or just the motion data) into your scene, and edit the motion for the scene if necessary.
Here's my mocap workflow in more detail. I use iPi Mocap Studio, Motion Builder and LightWave. I also use Maya for a small part of the process but it's not absolutely necessary:
1. Capture the performance using iPi Mocap Studio. In my setup, I use two or three Microsoft Kinect for Windows devices and three PS Move controllers. Two of the PS Move devices capture the hand rotations and the third is attached to the head to capture head rotation. You can see my head rig here:
Mocap Helmet Update.
2. Track the motion and make corrections. iPi Mocap Studio is not a realtime process. Realtime capture may seem ideal but, IMO, the results are not very smooth. This is subjective of course, and it also depends on the motions you're trying to capture. The advantage of a 'post process' system like iPi Mocap Studio is that your computer and GPU can dedicate their power to accuracy rather than speed. I've heard some people complain that it's too 'slow' for them, but my computer, which is fairly old and has a modest GTX 460 card, can process the mocap data at 0.6 seconds per frame, and that's been fast enough for our productions. Naturally, newer computers with more recent graphics cards can do this much faster. After tracking the motion to the body, you run a final pass where you apply the wrist and head motions from the PS Move controllers. Finally, if you wish, iPi Mocap Studio lets you apply finger animation using the new hand animation tools.
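If you want a feel for what 0.6 seconds per frame means in practice, the math is simple. Here's a quick back-of-the-envelope sketch; the clip length and frame rate below are made-up example values, only the per-frame figure comes from my setup:

```python
# Rough estimate of offline tracking time for a post-process mocap system.
# 0.6 seconds of processing per frame is the figure from my old GTX 460;
# the clip length and frame rate are hypothetical example values.
def tracking_time_seconds(clip_seconds, fps, sec_per_frame=0.6):
    """Total processing time for a clip tracked offline, in seconds."""
    return clip_seconds * fps * sec_per_frame

# A 10-second take captured at 30 fps:
print(tracking_time_seconds(10, 30))  # 180.0 seconds, i.e. 3 minutes
```

So a typical short take processes in minutes, not hours, even on older hardware.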
When you're done, you have the option to import a rigged character and have iPi Mocap Studio retarget the motion to the character, or you can export the motion data as a BVH or FBX file. I prefer to export the motion as a BVH for Motion Builder.
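If you haven't worked with BVH before, it's just a plain-text format: a HIERARCHY section describing the skeleton and a MOTION section with one line of channel values per frame. Here's a minimal parsing sketch; the two-joint skeleton is a made-up example, not an actual iPi Mocap Studio export:

```python
# Minimal BVH example and parser. The skeleton below is a hypothetical
# two-joint chain, just to show the structure of the format.
SAMPLE_BVH = """HIERARCHY
ROOT Hips
{
  OFFSET 0.0 0.0 0.0
  CHANNELS 6 Xposition Yposition Zposition Zrotation Xrotation Yrotation
  JOINT Spine
  {
    OFFSET 0.0 10.0 0.0
    CHANNELS 3 Zrotation Xrotation Yrotation
    End Site
    {
      OFFSET 0.0 10.0 0.0
    }
  }
}
MOTION
Frames: 2
Frame Time: 0.033333
0.0 90.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
0.0 90.5 0.0 0.0 0.0 0.0 5.0 0.0 0.0
"""

def bvh_frame_info(text):
    """Return (frame_count, frame_time) from a BVH's MOTION header."""
    frames = frame_time = None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("Frames:"):
            frames = int(line.split(":")[1])
        elif line.startswith("Frame Time:"):
            frame_time = float(line.split(":")[1])
    return frames, frame_time

print(bvh_frame_info(SAMPLE_BVH))  # (2, 0.033333)
```

Because it's plain text, BVH is easy to inspect or batch-process, which is part of why it travels so well between Motion Builder and other tools.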
3. Prior to capture, I prepare my character in LightWave. The rig I use is a standard joints-based mocap rig that's compatible with Motion Builder. One problem with LightWave's joints system is that the weight mapping is not compatible with third-party programs because each weight map has to be offset to a child joint. For example, in LightWave the arm weight map needs to be applied to the forearm joint, the forearm weight map needs to be applied to the hand joint, and so on. No, it doesn't make any sense, but that's the way it's been since LightWave 9.6.1. In other programs, like Motion Builder and Maya, the arm weight map 'shockingly' goes on the arm joint, the forearm weight map goes on the forearm joint, and so on. Imagine that!
Anyway, this means you need to prepare a separate LightWave rig for third-party programs to read properly, with all the weight maps applied to the correct joints. My shortcut is to simply export the rig via FBX to Maya, auto-weight it, and make any necessary corrections with Maya's weight-painting brushes. Once you get the hang of it, this takes just a few minutes. It doesn't need to be perfect, just good enough to see correct deformations in Motion Builder. When I'm done, I export the rig from Maya as an FBX, which I can then use in Motion Builder for retargeting and editing.
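To make the weight-map offset concrete: in LightWave each map sits on the child of the joint it actually deforms, so building a rig other programs read correctly amounts to shifting every map up to its parent joint. Here's a tiny sketch of that remapping; the joint names are a hypothetical arm chain, not any particular rig:

```python
# Sketch of the LightWave weight-map offset described above: each weight
# map is assigned to the CHILD of the joint it deforms, so converting to
# the standard convention means shifting every map to its parent joint.
# The joint names are a made-up example chain.
PARENT = {           # child joint -> parent joint in the hierarchy
    "Forearm": "Arm",
    "Hand": "Forearm",
}

def shift_weights_to_parent(lw_assignments):
    """Convert LightWave-style assignments (map on child joint) to the
    standard convention (map on the joint it actually deforms)."""
    return {PARENT[joint]: weight_map
            for joint, weight_map in lw_assignments.items()}

lw_style = {"Forearm": "Arm_Weights", "Hand": "Forearm_Weights"}
print(shift_weights_to_parent(lw_style))
# {'Arm': 'Arm_Weights', 'Forearm': 'Forearm_Weights'}
```

In practice I do this fix-up by re-weighting in Maya rather than scripting it, but the remapping above is the whole idea in a nutshell.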
Naturally, I only need to run through step 3 once for each character. (In the case of the Brudders shorts projects, the last time I had to do this was for
'Happy Box' a few years ago. We're still using these rigs in our current Brudders production, we just added FiberFX fur and hair to them.)
4. In Motion Builder, I import the BVH and the rigged character, retarget the motion to the character and make corrections. I might import the audio track and the camera and any essential props from my LightWave scene, to make sure the motion is exactly what I need for LightWave. When I'm done, I export an FBX of only the character rig with the animation.
5. Finally, in LightWave, I open my original LightWave rigged character and then open the FBX with the motion data using Load Items from Scene, enable Merge Only Motion Envelopes, and click okay. If everything has been done correctly, the motion is transferred to the LightWave rigged character and it's ready for rendering. If you build your rig in LightWave with layered controls, you can edit the motion again if you need to. I typically just go back to Motion Builder for changes and tweaks.
There are many variations of the above steps, but that's what works for me. Other users may use Rebel Hill's rigging system (which has a mocap rig you can edit in LightWave), Nevron Motion and Genoma, IK Booster, or any combination of these tools. Some alternatives to Motion Builder are the online service Ikinema Webanimate, Mixamo, or possibly iClone Pipeline with 3D Exchange.
For our productions, the iPi-MB-LW workflow works great. Here's a peek at the process:
Sister Mocap Test. Right now, we're using it for a music video production and you can see a small piece of it here:
'B2' Excerpt
I hope this is helpful. There's more info with regards to our Little Green Dog productions in the
'B2' thread. There are also other mocap threads in this forum with lots of good info. Here are a few search terms to try: motion capture, mocap, Nevron Motion, iPi Mocap Studio or Desktop Motion Capture, Motion Builder, Kinect, PS3 Eye.
G.