View Full Version : The motion capture library DISCUSSION thread

01-12-2010, 11:08 AM
The motion capture library DISCUSSION thread

This thread is intended for general discussion on motion capture, performance capture, lip sync, and motion capture related hardware.

There's also a mocap library thread, which I/we hope to keep clean and limited to links for motion capture libraries.
You can find it here >>>>

The main idea here is to talk about motion capture in all its aspects, from full body capture to lip sync, to setting up devices and techniques for head movement/eye blinks, and also plugins and scripts....with Avatar, I think it's about time we all took a deeper look into motion capture.

so....chatter away! :D

01-12-2010, 11:34 AM
Of course, to use mocap you may want to use a third-party app/plugin with LightWave.
You can set up mocap in LightWave without plugins, but it's just a bit painful
if you mix and match mocaps from different libraries, as the rigs will vary quite a bit.

So to simplify, there's this >>


01-12-2010, 11:49 AM
It would be good to see some links to plugins and training vids in this thread, as well as ideas/tutorials on getting mocap in, plus lip sync, eye blinks, creating morph targets, etc.

01-12-2010, 12:38 PM
Are the links for any and all mocap or LW friendly/compatible ones?

edit--- also... camera tracking/motion capture as well?

01-12-2010, 12:53 PM
Are the links for any and all mocap or LW friendly/compatible ones?

edit--- also... camera tracking/motion capture as well?

I've edited the links thread to give info on LightWave import now.

As for this thread, yeah, add in chat on all things, including tracking and hardware.:thumbsup:

01-12-2010, 01:39 PM
And to simplify it further:


This is a FREE product, as opposed to a $150 or $300 product that will get motions into LW. It doesn't have the auto-weight-mapping feature - or many of the other features of JimmyRig - but then it doesn't cost anything. ;)

Cool!...yeah! I was trying to remember its name! :thumbsup:

01-12-2010, 02:22 PM
Well here's a link with much food for thought:


This is specifically about how to import BVH into LightWave in a PRACTICAL way (and maybe just stick to the subject and forget about IKB).

01-12-2010, 04:30 PM
But isn't IKBoost the ONLY way to seriously modify mocap within LW?

Yes, I believe this is true. But I'm thinking about getting it to work properly within LightWave first, for which there are absolutely no instructions for a working method, and no tutorial.

Also, I think it may be much easier and more efficient to alter the actual BVH file outside LightWave. I don't really know.

01-13-2010, 02:18 AM
Hi Cresshead, and All,

Thanks for the thread and info.
Just a thought with all of this.

Since LW has issues handling and editing this data natively, what needs to be added (in CORE) to better address them?
Do we need another product like MotionBuilder, or something that is integrated directly into CORE?

With so many LW users shy of using IKBoost, this also brings up the need for new ways to use this data.

This is an interesting topic...and one that matters whether you're a traditional animator or someone who uses motion capture/performance capture.

These techniques are here to stay...and will be part of production and effects work, with more and more of it becoming part of mainstream storytelling and action.


01-13-2010, 11:48 AM
But isn't IKBoost the ONLY way to seriously modify mocap within LW?

Of course not.

Shows like Roughnecks were done predominantly with mocap, and that was long before IKB existed.

In my opinion, one of the key issues with modifying mocap is being able to layer modifications on top of the base mocap data. While there's a bit of setup involved, I would rather build a rig that uses expressions to connect the character's skeleton both to the imported BVH skeletal structure, as well as control nulls that allow you to animate additively.

Working this way means you can keep the integrity of your capture data, and have quick and easy tools to make the final animation conform to the needs of the scene.

If you wanted to take it a step further, you could create switches for key elements of the character so you could turn the mocap on or off for those elements at any point in the scene. That would be really cool.
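
A minimal sketch of what that kind of additive, switchable layering computes per channel. This is illustrative Python only, not LightWave expression syntax; all names here (solve_channel, the lambdas) are hypothetical, and inside LW the same arithmetic would live in per-channel expressions on the rig:

```python
# Illustrative sketch of additive mocap layering with a per-element switch.
# Final channel value = (imported BVH value, if the mocap layer is enabled)
#                       + the animator's additive control-null offset.

def solve_channel(frame, bvh_curve, control_curve, mocap_on=True):
    """Evaluate one channel of the final rig at a given frame."""
    base = bvh_curve(frame) if mocap_on else 0.0
    return base + control_curve(frame)

# Example: a heading channel driven by BVH data plus a control null.
bvh_heading = lambda f: 30.0    # degrees from the capture at this frame
null_heading = lambda f: 5.0    # animator's additive tweak

print(solve_channel(0, bvh_heading, null_heading))                   # 35.0
print(solve_channel(0, bvh_heading, null_heading, mocap_on=False))   # 5.0
```

Because the tweak lives on the control curve, deleting its keys restores the untouched capture, and flipping `mocap_on` is the "switch" described above.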


01-13-2010, 11:55 AM
Things that would help edit mocap:

data thinning
animation layers
animation filters, so you can turn off, say, the arms or one arm
blending of two mocaps, to use, say, the pelvis/legs/feet of one mocap while taking only the upper body from another
clip assembly, using clips from many mocaps to make a new edited clip, e.g. combining sitting-to-standing, walking, and sneezing clips into a new master clip
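
The blending item above can be sketched very simply: pick, per joint, which clip supplies the data. This is a hypothetical Python illustration (joint names and the one-float-per-frame representation are made up for clarity), not any particular tool's API:

```python
# Hypothetical sketch: blend two mocap clips per joint -- lower body from
# clip A, upper body from clip B. A "clip" here is just a dict mapping a
# joint name to its per-frame values (simplified to one float per frame).

LOWER_BODY = {"pelvis", "l_leg", "r_leg", "l_foot", "r_foot"}

def blend_clips(clip_a, clip_b):
    """Take lower-body joints from clip_a, everything else from clip_b."""
    out = {}
    for joint in clip_a:
        out[joint] = clip_a[joint] if joint in LOWER_BODY else clip_b[joint]
    return out

walk = {"pelvis": [0, 1, 2], "l_leg": [5, 6, 7], "spine": [0, 0, 0]}
wave = {"pelvis": [9, 9, 9], "l_leg": [9, 9, 9], "spine": [1, 2, 3]}

mixed = blend_clips(walk, wave)
print(mixed["pelvis"])  # [0, 1, 2]  (legs from the walk)
print(mixed["spine"])   # [1, 2, 3]  (upper body from the wave)
```

Real tools would also have to retarget and smooth the seam between layers, but the joint-masking idea is the core of the "animation filters" and "blending" items.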

01-13-2010, 01:59 PM
While I respect your point of view, it's worth pointing out that expressions really aren't complicated or difficult to understand, especially at the level they would be needed for something along these lines.

It's really just saying "take the value of the heading channel for this bone, add the value of the heading channel of this null, and send the result to the heading channel of this bone in my character". You do have to repeat the process several times, but it's not like it's rocket science or anything.
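
As a rough illustration, in LightWave's bracketed channel-expression style that would look something like the fragment below. The item and channel names here are hypothetical, and the exact syntax should be checked against your LW version's documentation:

```
[BVH_Spine.Rotation.H] + [SpineControl.Rotation.H]
```

Applied to the character bone's heading channel, this feeds it the imported BVH bone's heading plus the control null's additive offset.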

It may take more time to set up than IKB, but in the grand scheme of things, I'll bet you'll save a bunch more time in the long run by creating a setup that's easier and more powerful to use.

Specifically, imagine having a setup where you can modify the final animation of the character without degenerating the original mocap. This gives you the freedom to experiment and refine your touchups without worrying about having to start over from scratch if things go off track.

01-13-2010, 02:35 PM
Saying you don't want to understand expressions is like saying you don't want to understand texture mapping, so you just use procedurals instead. How limiting is that?
Why would anyone NOT want the knowledge of how to use something that has great potential benefits? That's just odd to me - I'd love to know how to use every software package to expert level and be able to write and program whatever came into my head.

Expressions are just ways of you telling parameters in LW to change based upon other values and they are very useful.

I suspect that intimidation is the real reason behind the reluctance to use them.

01-13-2010, 03:45 PM
To be fair, I do understand your point of view, as I found LW's expressions highly confusing, and I only started using expressions when I moved to XSI, where they are much simpler to use and yet far more powerful than LW's.

I couldn't really operate without them now. Maybe wait till CORE, but don't rule them out.

01-13-2010, 04:09 PM
About modifying the final animation... from what I have seen with IKB, there is no degeneration of the mocap at all. The mocap continues to work just fine after you've made a slight alteration with IKB. What I will be using IKB for, mostly, is fine-tune positioning.

Okay, here's a hypothetical scenario: You've got your mocapped character in the scene, and he's supposed to walk to a certain spot and pick something up. But the mocap doesn't line up with the set, so you've got to shift something around. It could be his entire body or perhaps just the position of his hand.

If you move his hand on one frame using IKB, it will of course jump, and you'll have to use the "apply keys" function. This will smooth things out, but it will modify the base mocap that you imported. The more you do this, the more you'll lose the integrity of the original capture. It's kind of like working with the smudge tool in Photoshop: after a few iterative strokes, you can irreparably damage the image.

On the other hand, if you had a control null that allowed you to additively adjust the position of the hand without changing the original mocap keys, this would provide several advantages. First, you could work and rework your adjustments extensively, and go immediately back to the original by just deleting the keyframes on the control null. Also, let's say you need to shift the hand two inches to the left for 50 frames (instead of just one). You can easily key that kind of adjustment with a control null by creating keys at the beginning and end of the frame range. Ultimately, you have complete control over how your adjustments are blended into the original mocap simply by how you adjust the keyframes on the control null.
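
A small sketch of that "shift the hand two inches for a range of frames" idea, purely as illustrative Python (the frame numbers, ramp length, and function names are all made up; in LW the ramp would just be the keyframe interpolation on the control null):

```python
# Hypothetical sketch: an additive control-null offset over a frame range.
# Shift a hand -2.0 units in X for frames 100-150, with 10-frame ease
# ramps created by keys at either end of the range. The base mocap curve
# is never modified; the offset is simply added on top.

def lerp(a, b, t):
    return a + (b - a) * t

def offset_at(frame, start=100, end=150, ramp=10, amount=-2.0):
    """Additive X offset: 0 outside the range, ramping to `amount` inside."""
    if frame <= start - ramp or frame >= end + ramp:
        return 0.0
    if frame < start:                          # ease in
        return lerp(0.0, amount, (frame - (start - ramp)) / ramp)
    if frame > end:                            # ease out
        return lerp(amount, 0.0, (frame - end) / ramp)
    return amount                              # full offset inside the range

base_hand_x = 10.0                             # value from the untouched mocap
print(base_hand_x + offset_at(125))            # 8.0  -- mid-range, fully shifted
print(base_hand_x + offset_at(50))             # 10.0 -- original mocap untouched
```

Deleting the offset (here, making `amount` 0) recovers the original capture exactly, which is the whole point of keeping the adjustment on a separate control.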

When you look at it this way, the "apply keys" method using IKB is actually quite limiting and cumbersome in comparison. Although there may be other ways of smoothly adjusting ranges of keyframes on every frame that I'm not aware of.

Honestly, I would LOVE to learn how to create expressions and be able to wrap my head around them, but I don't have the time, and I don't want to make the time since I can use it better elsewhere. At least at the moment.

This is an understandable point of view, but in reality, it might take you 15 minutes to get the basics of how to create and connect an expression, and it could save you many many hours down the line. So in fact, using that 15 minutes elsewhere could very well mean not being able to spend those hours elsewhere when you are bogged down doing the work that the computer could be doing for you. If I get some time, I'll try to put together a quick tutorial or video that demonstrates how simple this can be.


01-14-2010, 01:07 AM

Here's some info on how we use mocap at the Box, and also what I've been dabbling with at home for my own personal projects.

At the Box, our mocap is generally done by Giant (the people who did a lot of Avatar's mocap). What they deliver is very clean, but we also add our own tweaks and keyframe animation on top to suit our shot requirements and personal taste. This is done by our animators using Maya, and the animation is exported to LightWave via PointOven.

There was a time when the whole process stayed in LightWave. To start, we would provide the mocap studio with a layered rig that basically had three skeletons. The first layer was for the mocap, the second for keyframe animation, and the third was for the actual deformation. When we got the rig back with mocap applied, we would tweak the second layer as needed. We also used LightWave's Motion Mixer to create new animations and loops. This system worked pretty well for the time.

At home, I've been playing around with a lot of low-cost and homebrew systems. So far, this has been experimental and not quite ready-for-primetime, but it's been very interesting. Here's a list of the tools I'm toying with (in no particular order):

iPi Desktop Motion Capture
BVH Hacker
Poser Pro and Poser 8
DAZ Studio 3
IK Boost

LightWave, of course, is my 'target' program for the mocap. Everything else on this list is for creating or editing mocap. Here's a little about each:

iPi Desktop Motion Capture (http://www.ipisoft.com/index.php). This is a markerless motion capture system that uses a laptop and up to four webcams. It's relatively inexpensive: currently you can buy into the beta program for $495 and get a free upgrade to the full Standard four-camera version when it's released, which saves you about $1000 off the release price. The webcams cost about $70 to $80 each, so in total the system is about $850 or so.

There are two parts to this system: iPi Recorder and iPi DMC.

iPi Recorder will run on a laptop with a modest graphics card, and is used to capture synchronized video from up to four webcams. They recommend a particular Logitech camera for Recorder, but with the latest beta, you can use the Playstation 3 Eye Camera. Being a 'markerless' system, no special suit is required, but for the best results you should wear close fitting jeans, a long sleeved red or green shirt, and a short sleeved black T-shirt over the long sleeved shirt. The idea is that the black shirt will separate the arms from the body and help keep shadows from confusing the tracker. (I found that you can buy ideal clothing for this system from Old Navy.)

iPi DMC reads the video footage, and basically matchmoves an IK enabled rig to the footage. It requires a beefier desktop computer and a fairly decent graphics card because it uses the GPU to calibrate the virtual cameras and to solve the motion capture data. It currently exports .bvh, .smd, and .dae (Collada).

This system works in theory, but it's still very much a beta product. The video capture is pretty reliable but the calibration process is a little tricky and time consuming. The matchmoving process either works or it doesn't. When it doesn't, you can do supervised tracking using the IK rig, but for now this feature is very limited. Also, the software currently doesn't track the head or hands, but the developers say it will before the final version is released. Recently, the developers have been focusing exclusively on adapting the code for the GPU acceleration, but they've said they will be working on adding and improving specific features soon.

I haven't used the iPi DMC system in a few months because of my work schedule, but now that I have some personal time again, I plan to get back into it this week. They recently released a major update, so I'll write more when I get to try it.

I got Jimmy|Rig (http://www.origamidigital.com/typolight/index.php/home.html) with the intention of using the Pro version as a 'cheap' alternative to MotionBuilder. The current version lets you import a model, and it will rig it for you automatically. Then, you can apply and mix from a large list of canned mocap data. The current 'Lite' version doesn't allow you to import your own mocap, so it doesn't quite serve my purpose; the 'Pro' version, however, will let you import your own data, but it's not available yet. The program is fun and very easy to use, and I'm looking forward to seeing where Origami Digital takes it with the Pro version.

Animeeple (http://www.animeeple.com/) shares some functionality with Jimmy|Rig, and it does allow you to import mocap. It doesn't have the cool 'auto-rigging' capability or the useful Grounder modifier of J|R, but it's surprisingly capable for a free program. The developer seems very interested in serving LightWave users, which is a big plus. I haven't used it very much yet, but I will be taking a crack at it later this week, after I've worked with the latest iPi software release.

BVH Hacker (http://davedub.co.uk/bvhacker/) is pretty basic, but it's also free.

I wasn't especially interested in Poser (http://my.smithmicro.com/mac/poserpro/index.html) before, but then I read about how some iPi DMC users were using it to edit mocap, along with a commercially available Python script called BVH Constraint for Poser (http://market.renderosity.com/mod/bcs/index.php?ViewProduct=66571&TopID=10626.) to fix the foot-slipping problem that can occur when applying mocap to a character with different physical proportions from the original mocap actor. After playing with the current version of Poser, I was surprised to see how far it's come in the last 10 or so years. Poser Pro has a very nice character model editor and a decent walk generator, and of course it can edit and output mocap data. And there are a number of tools available for getting Poser data into LightWave. As with J|R and Animeeple, I'm just dabbling with Poser at this stage, but I will have more to say about it in a few weeks.

DAZ Studio 3 (http://www.daz3d.com/i/3d_art_resource/free_software/?cjref=1) has some similar capabilities and it's free, but I haven't really tried it yet. It has some realtime 'puppetry' animation which I'd like to try at some point.

Finally, we have LightWave's own IK Boost. The last time I tried IK Boost was several years ago when it was first introduced, and back then the plug-in was so buggy that I abandoned it immediately. Recently, however, I was reading about Larry Schultz's IK Boost method for retargeting mocap data to a character in LightWave and editing it there. I haven't tried this workflow myself yet, but it sounds easy and straightforward with usable results, so I'm looking forward to taking another crack at IK Boost after my next iPi DMC session. Glendalough already posted this link on the topic, but it's worth repeating: What is the workflow with Bvh? (http://www.newtek.com/forums/showthread.php?t=96947). Lots of good IK Boost mocap info there.

If anybody here is interested, I'll be happy to continue reporting on the above software as I work through this 'hobby' project over the next few weeks.


01-14-2010, 10:57 AM
Nice info Greenlaw.

How goes things at the Box?
Are they keeping you busy?


01-14-2010, 09:01 PM
Are they keeping you busy?

We've been insanely busy but things finally quieted down yesterday. I hope I don't regret saying this but I'm looking forward to a long vacation. Of course, I'll spend a lot of it 'working' as usual. :)


01-21-2010, 09:56 AM
Just looking into face rigging for mocap/hand keying...and found this news item on a plugin over on CGTalk.

Currently only for Max.