View Full Version : Mo cap and Lightwave

09-22-2009, 03:50 PM
Ok so I've been looking around and have yet to find any motion capture systems that don't cost thousands of dollars. So my question would be: what kind of home-made motion capture setups have people successfully used with LightWave? Or at least a link to a place that covers a lot of mocap options.

09-22-2009, 05:02 PM
I'm experimenting with a markerless system called iPi Studio. It's about $495 and uses up to four webcams, which cost about $80 apiece, for a total cost of about $815.

While I'm making progress with the system, I have to warn you that this is still a beta product. After iPi Studio is out of beta, the developers intend to raise the price to $1495 (or $995 for a three-camera version), which is still a good price for what it will do.

Right now I'm having a few issues with calibrating the virtual cameras in iPi Studio, but I think I'm close to solving that. The problem started when I moved up from three to four cameras. While four cameras should, in theory, give me more precise data, they're also a lot more difficult to set up. My guess is that the space I'm using may be too small for this system, so I'm arranging to shoot a session in a much larger space at work.

If I can get past this next round of tests (hopefully this weekend), I'll pass the output data on to the Jimmy|Rig guys so they can test for compatibility.

Go here for the official website: iPi Software (http://www.ipisoft.com/)

Here are some examples of what it does by a user who goes by the name LightDog: Basketball Movie (http://www.youtube.com/watch?v=o8pqjUVlcLg&NR=1) and Tai Chi (http://www.youtube.com/watch?v=glgRFJauBTE&NR=1)

You might have noticed that iPi Studio isn't tracking the head in these videos, but it will in the final release, and the developers intend to add face capture too.

So why so cheap? You should understand that this is not a realtime system. Here's the basic workflow:

1. Capture a calibration session. This means you set up the cameras and walk around your space with a Maglite.

2. Capture your action sessions. Exactly what it sounds like.

3. Calibrate the virtual cameras. This means having the software automatically position the virtual cameras based on your calibration video with the Maglite.

4. Tracking. In this stage, you position a 3D rig over the video views of the actor, and then the software will automatically track the rig and joints for you. (Currently there is no supervised tracking, but they will be adding this feature.)

5. Export your data. Currently, iPi Studio supports .bvh and Collada, but they will be adding more formats. The .bvh format seems to work with LightWave, but I'm still early in my tests. I don't think I've tried Collada yet.
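For anyone curious what's inside a .bvh file before it hits LightWave, here's a rough sketch of reading one in Python. The `parse_bvh` function and the tiny two-joint sample are my own illustration of the standard BVH layout (a HIERARCHY section followed by a MOTION section), not actual iPi Studio output.

```python
# Minimal sketch of reading joints and motion data from a .bvh file,
# assuming the standard layout: HIERARCHY section, then MOTION section.
# The sample skeleton below is hypothetical, not iPi Studio output.

def parse_bvh(text):
    """Return (joint_names, frame_count, frame_time, frames)."""
    lines = [ln.strip() for ln in text.splitlines() if ln.strip()]
    joints, frames = [], []
    frame_count, frame_time = 0, 0.0
    i = 0
    while i < len(lines):
        ln = lines[i]
        if ln.startswith(("ROOT", "JOINT")):
            joints.append(ln.split()[1])        # joint name follows keyword
        elif ln.startswith("Frames:"):
            frame_count = int(ln.split()[1])
        elif ln.startswith("Frame Time:"):
            frame_time = float(ln.split()[2])
            # every remaining line is one frame of channel values
            for row in lines[i + 1:]:
                frames.append([float(v) for v in row.split()])
            break
        i += 1
    return joints, frame_count, frame_time, frames

sample = """HIERARCHY
ROOT Hips
{
  OFFSET 0.0 0.0 0.0
  CHANNELS 6 Xposition Yposition Zposition Zrotation Xrotation Yrotation
  JOINT Spine
  {
    OFFSET 0.0 10.0 0.0
    CHANNELS 3 Zrotation Xrotation Yrotation
    End Site
    {
      OFFSET 0.0 10.0 0.0
    }
  }
}
MOTION
Frames: 2
Frame Time: 0.0333333
0 90 0 0 0 0 0 0 0
0 91 0 5 0 0 0 0 0
"""

joints, n, dt, frames = parse_bvh(sample)
print(joints)           # ['Hips', 'Spine']
print(n, len(frames))   # 2 2
```

Each frame row holds one value per declared channel, in hierarchy order, which is why a rig mismatch between the file and your LightWave skeleton scrambles the motion.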

If you're interested, I am eventually going to post detailed information and tutorials based on my own tests in the forums at my website.

There's a competing product from Natural Point (http://www.naturalpoint.com/) that uses six motion capture cameras, but it costs about $6000. I think it looks pretty good, but at six times the cost it was out of my price range.

Autodesk also sells something similar to iPi Studio called Movimento (http://usa.autodesk.com/adsk/servlet/item?siteID=123112&id=11868979). It's probably more expensive, and I don't know enough about it to comment further.

Hope this info is useful.


09-22-2009, 07:23 PM
There are really two issues when it comes to using mocap in LW:
1. Importing it
2. Editing it

The easy part is actually getting it into LW, whether through MotionBuilder, Jimmy|Rig, or other methods directly in LW. So far the best solution I've found for editing/tweaking the mocap has been IKBoost. Motion Mixer would be a close second.

I've also been looking at iPi Soft's solution, and I also use a lot of the motion libraries that are available. It's been generally pretty easy to get motions to loop using IKB, or to splice different motions together.
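To make the looping/splicing idea concrete, here's a rough sketch applied to a single motion channel (say, one rotation curve). The function names and sample numbers are my own illustration, not IKBoost's or Motion Mixer's API; real tools do this across the whole rig at once.

```python
# Sketch of two common mocap edits on one motion channel:
# 1) easing a clip back to its start pose so it loops seamlessly, and
# 2) crossfading the tail of one clip into the head of another (splicing).
# Names and sample values are hypothetical, not any tool's actual API.

def loop_clip(values, blend):
    """Ease the last `blend` samples back toward the first sample so the
    clip's final frame matches its first frame (a crude seamless loop)."""
    out = list(values)
    n = len(values)
    for i in range(1, blend + 1):
        t = i / blend                          # 0 -> 1 across the blend window
        idx = n - 1 - blend + i
        out[idx] = (1 - t) * values[idx] + t * values[0]
    return out

def splice_clips(a, b, blend):
    """Crossfade the last `blend` samples of clip `a` into the first
    `blend` samples of clip `b`, joining them into one clip."""
    mix = []
    for i in range(blend):
        t = (i + 1) / blend
        mix.append((1 - t) * a[len(a) - blend + i] + t * b[i])
    return a[:-blend] + mix + b[blend:]

walk = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
print(loop_clip(walk, 3))      # tail of the curve eased back toward 0.0
print(splice_clips([0.0, 0.0, 0.0, 0.0], [10.0, 10.0, 10.0, 10.0], 2))
```

The same linear-blend idea is what you're doing by hand when you drag keys around a loop seam; tools just hide the arithmetic.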

09-23-2009, 04:30 AM
I'd say the importing part is the easiest, mainly because LW loads .bvh files, .fbx, etc.
Once you have the bones created by the mocap import you can fit that to your character or vice versa. That aspect is generally pretty well documented or understood.
Once you have the bones setup into the character you have various options to get the actual motion data onto the new rig.
Because of the way IKB handles motion data, it's ideal for editing mocap.
We did a webinar recently on getting mocap into LightWave and editing it. The DVD should be available soon. :)
That DVD combined with the other IKB material should be pretty comprehensive.

09-23-2009, 07:49 PM
Contact me directly about that in the meantime. I know there are a bunch of webinars in line to be edited. :)