Mo capture in LW



Otterman
01-22-2010, 07:04 AM
Hey Wavers, I've been told that I might have to produce some CGI animation to work with some real-life footage. My understanding is that they will provide me with motion capture data (camera views, angles, etc.), and then I can use this data in LightWave.

Is this right, and what's the procedure? I just want to get the right info before I say, "Yeah, I can do that."

Any pointers or advice welcome :help:

cresshead
01-22-2010, 11:26 AM
Sounds more like you need to camera-match and track footage rather than motion capture as such.

>> SynthEyes for tracking

http://www.ssontech.com/press/press033108.htm

http://www.ssontech.com/img/scncapfd.png

halley
01-22-2010, 12:24 PM
I've worked with motion files that were exported from scientific modeling software and applied to objects. I believe the same motion data can be loaded onto a camera, or onto a null that the camera is attached to. If they are only going to give you the footage to match the 3D camera to, I would ask for as much documentation as possible, i.e. camera angle, how high off the ground the camera is, what the focal length of the lens is, etc.

It's been over eight years since I used any type of camera tracking software (in Maya), and it was tough back then, but you could get close. I imagine it has gotten better.

SplineGod
01-22-2010, 12:30 PM
You can also try a free app called Voodoo Tracker, which can help in that regard and does support LW. :)

Greenlaw
01-22-2010, 12:37 PM
I agree with Cresshead; it sounds more like a tracking job than mocap. There was sort of an overview of what's available in a recent post. I'll search for it in a minute.

Found it:

http://www.newtek.com/forums/showthread.php?t=101367&highlight=PF+hoe

The more affordable packages (not counting the free ones) are PF Hoe Professional and SynthEyes; PF Hoe is a good entry-level program, and SynthEyes is much more advanced.

At work we use SynthEyes, Boujou, or R+H's own proprietary system; SynthEyes usually does the trick for us. R+H's system always works, but it's very expensive because it requires a specialized team to operate. For us, the system is usually overkill, but it has also saved our butts on a few occasions.

Greenlaw

erikals
01-24-2010, 04:09 AM
Is there, by the way, any LW plugin that can track pixels inside LightWave itself? http://erikalstad.com/backup/anims.php_files/question.gif

Philbert
01-24-2010, 03:31 PM
I had always wondered if it would be possible to load footage into LW and place some nulls on points like these programs do, then go to the next frame and adjust the camera so that the nulls' positions are correct, go to the next frame, and repeat.

Greenlaw
01-24-2010, 08:02 PM
I had always wondered if it would be possible to load footage into LW and place some nulls on points like these programs do, then go to the next frame and adjust the camera so that the nulls' positions are correct, go to the next frame, and repeat.

Yes, this is basically what I did before tracking programs like SynthEyes and PF Hoe became affordable. It's a lot more work to do it manually, but if the camera motion isn't too complicated, you can get very good results.

What also works well is a trick David Ridlen showed me where you parent tracking nulls to your camera, parent your camera to a plane which has the image sequence planar-mapped to it, and then use the camera's zoom to improve the precision of your tracking nulls. (Or something like that... it's been many, many years since I needed to do that, so I might have this setup mixed up, and I think he had IK involved somewhere. Maybe DR will see this and set me straight on his technique.)

BTW, you can get very fine camera rotational control by placing a null very, very far away, having the camera target it in the Motion panel, and moving the null around to aim the camera.
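
For anyone who wants to see the idea in numbers, here's a rough Python sketch of the math behind this kind of manual matchmove. It is not LightWave or SynthEyes code, and the lens, film back, resolution, and point coordinates are all made-up values; the camera is also simplified to yaw-only rotation. It shows the two pieces involved: projecting a 3D null through a pinhole camera so you can compare it against a tracked 2D feature, and why a very distant target null gives such fine rotational control.

import math

# Assumed (made-up) camera settings: a 35 mm lens on a 36 mm-wide film back,
# rendering at 1920x1080. Only yaw (heading) is handled, to keep the sketch short.
FOCAL_MM, FILM_W_MM = 35.0, 36.0
IMG_W, IMG_H = 1920, 1080

def project(point, cam_pos, cam_yaw):
    """Pinhole-project a world-space point into pixel coordinates for a camera
    sitting at cam_pos and looking along +Z when cam_yaw (radians) is zero."""
    dx, dy, dz = (p - c for p, c in zip(point, cam_pos))
    # Rotate the offset into camera space (yaw-only world-to-camera rotation).
    xc = math.cos(cam_yaw) * dx - math.sin(cam_yaw) * dz
    yc = dy
    zc = math.sin(cam_yaw) * dx + math.cos(cam_yaw) * dz
    if zc <= 0:
        raise ValueError("point is behind the camera")
    mm_to_px = IMG_W / FILM_W_MM
    px = IMG_W / 2 + (FOCAL_MM * xc / zc) * mm_to_px
    py = IMG_H / 2 - (FOCAL_MM * yc / zc) * mm_to_px
    return px, py

def reprojection_error(projected, tracked):
    """Pixel distance between where the null lands and where the feature was tracked.
    Manual matchmoving is nudging the camera each frame until this number gets small."""
    return math.dist(projected, tracked)

def aim_change_deg(null_offset_m, null_distance_m):
    """The distant-target trick: moving a faraway target null sideways by
    null_offset_m only rotates the camera's aim by atan(offset / distance)."""
    return math.degrees(math.atan2(null_offset_m, null_distance_m))

# A target null 10 km away gives roughly 0.0057 degrees of aim change per metre it moves.
print(aim_change_deg(1.0, 10000.0))
# Made-up example: a null at (2, 1.5, 8) seen from a camera at (0, 1.6, 0), compared
# against an invented tracked feature position of (1210, 480) pixels.
print(reprojection_error(project((2.0, 1.5, 8.0), (0.0, 1.6, 0.0), 0.0), (1210.0, 480.0)))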

Depending on the shot, I find that I can do most of my tracking and matchmoving in 2D using Fusion's 2D tracker, which can track for scale and rotation, as well as position. For many FX shots where the camera isn't moving all crazy-like, 2D tracking is all you need.
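
As a rough illustration of what a two-point 2D track gives you (this is generic math, not Fusion's actual implementation, and the tracker coordinates below are invented), here's a short Python sketch that derives translation, rotation, and scale from two tracked features between a reference frame and the current frame.

import math

def two_point_transform(ref_a, ref_b, cur_a, cur_b):
    """Derive 2D translation, rotation (degrees), and uniform scale from two tracked
    points, which is roughly what a two-point 2D tracker keys onto a layer."""
    # Vector between the two trackers in the reference frame and in the current frame.
    ref_vx, ref_vy = ref_b[0] - ref_a[0], ref_b[1] - ref_a[1]
    cur_vx, cur_vy = cur_b[0] - cur_a[0], cur_b[1] - cur_a[1]
    scale = math.hypot(cur_vx, cur_vy) / math.hypot(ref_vx, ref_vy)
    rotation = math.degrees(math.atan2(cur_vy, cur_vx) - math.atan2(ref_vy, ref_vx))
    # Translation measured at the midpoint between the two trackers.
    ref_mid = ((ref_a[0] + ref_b[0]) / 2, (ref_a[1] + ref_b[1]) / 2)
    cur_mid = ((cur_a[0] + cur_b[0]) / 2, (cur_a[1] + cur_b[1]) / 2)
    translation = (cur_mid[0] - ref_mid[0], cur_mid[1] - ref_mid[1])
    return translation, rotation, scale

# Invented tracker positions (pixels): reference frame vs. a later frame.
print(two_point_transform((400, 300), (900, 320), (415, 310), (918, 345)))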

Greenlaw

Philbert
01-24-2010, 08:05 PM
Sounds like an interesting technique. We learned Boujou in school and it was amazingly easy. Unfortunately I'd need to save up more than a few bucks to afford my own copy.

cresshead
01-24-2010, 09:28 PM
Sounds like an interesting technique. We learned Boujou in school and it was amazingly easy. Unfortunately I'd need to save up more than a few bucks to afford my own copy.

Which is where SynthEyes wins out... such a low cost for such a great tracker. :thumbsup:

Mind you, if I have to track, I'd try Combustion as I already have it! :)
[2D tracking]

Greenlaw
01-24-2010, 11:03 PM
Yeah, we like SynthEyes at work (it's also what I use at home). SynthEyes is very powerful, just as accurate as Boujou, and a lot cheaper. But we also keep one Boujou license on hand just in case SynthEyes fails a really difficult track. Because each tracker uses a different algorithm, in these situations Boujou may give us a better track. (And if Boujou fails, then we turn to R+H's tracking department and its proprietary software; they always get it done perfectly.) :)

Greenlaw

Philbert
01-24-2010, 11:33 PM
I have used SynthEyes and I agree it's quite powerful. It didn't feel quite as quick or easy as Boujou, but I guess that's why Boujou costs the big money. I downloaded Voodoo Tracker but never got to play with it; it looked kind of cumbersome, though. Unfortunately, I don't ever need to do much motion tracking, so I never really get to use these on a production to see which I like best. Plus, at home I only have a mini-DV camera (not HD), which isn't very good for tracking.

erikals
01-25-2010, 02:14 AM
Boujou can also face-track, AFAIK.
I think HDRI magazine had a tutorial on it.

SynthEyes can too, but it looks a bit tricky:
http://www.youtube.com/watch?v=tagyQuOZiA4

Otterman
01-25-2010, 02:50 AM
Hey, many thanks for the info, guys; there are some interesting techniques here. I'm a little hesitant to set this up manually; heck, I know how difficult it is just to match a still image, let alone track animation. That's why I was hoping mocap data would do the trick.

This needs to be spot on; I'm having to match up CAD data with the real thing... see screenshots. The idea is to show off the internal workings whilst the 'real' camera pans around. A toughie: if the camera movement doesn't match up exactly, the whole effect will fall over.

Do you still think software to track the footage is the best route?

Philbert
01-25-2010, 03:08 AM
I'm not sure you're understanding what mocap (motion capture) is. That's where you have an actor wearing a suit with reflective balls, and multiple cameras shine light at those balls and record the actor's movements. Like this:
http://www.youtube.com/watch?v=3v4ITG2xyyk

At least that's the most common method.

What we've all been describing, and what it sounds like you want, is motion tracking or camera tracking (they're the same thing).

robyht
01-25-2010, 03:18 AM
Hey Otterman,

Definitely track the shot in 3D; use SynthEyes. If the real shoot has been done and you need to put a 3D model in there, then you need to spend a lot of time tracking the camera; watch out for camera zooms, etc. Mocap is something entirely different.

Once you have completed the track, you will have a LW scene (i.e. a camera move) into which you can then bring the 3D model of the inner workings.

Good luck!

Otterman
01-25-2010, 03:25 AM
No, I understand the differences. I was thinking that mocap data could be used to capture the motion of the camera (heights, angles, etc.) and then be interpreted by LightWave... parenting the camera to a tracking null, etc. I believe the company that will be doing the film work has a means of supplying me with this kind of data; I'm just not sure what it is yet or how I'm going to interpret it. Thanks for the heads up; I guess I need to find more background information.

goodrichm
01-25-2010, 06:29 AM
No, I understand the differences. I was thinking that mocap data could be used to capture the motion of the camera (heights, angles, etc.) and then be interpreted by LightWave... parenting the camera to a tracking null, etc. I believe the company that will be doing the film work has a means of supplying me with this kind of data; I'm just not sure what it is yet or how I'm going to interpret it. Thanks for the heads up; I guess I need to find more background information.

Maybe Al Street's transMotion utilities will help:

http://www.ats-3d.com/transmot/

Teruchan
01-25-2010, 12:10 PM
When I was working in the studios, we did use Boujou quite a bit and it can be deceptively easy. On the other hand, I have also tracked some pretty complex shots in films by eye with nulls. It always depends on the shot, but it can be done with pretty good results even with complex camera motion.

Greenlaw
01-25-2010, 12:10 PM
No, I understand the differences. I was thinking that mocap data could be used to capture the motion of the camera (heights, angles, etc.) and then be interpreted by LightWave... parenting the camera to a tracking null, etc. I believe the company that will be doing the film work has a means of supplying me with this kind of data; I'm just not sure what it is yet or how I'm going to interpret it. Thanks for the heads up; I guess I need to find more background information.

Sounds like they used a motion control system (or 'moco' for short), which is a computer-controlled camera, usually on a track or a powered rig. This is typically used to shoot a scene in multiple passes with perfectly matched camera moves, but the motion data used to control the camera can also be used in a 3D program. We use this sometimes, but with mixed results (usually because we hadn't supervised the shoot ourselves and we got unusable data). When it does work it's almost 'plug-and-play', though, and when it doesn't, we use a tracking program like SynthEyes.
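
The exact moco export format varies from rig to rig, so the following Python sketch is only hypothetical: it assumes a made-up CSV layout (frame, X/Y/Z in millimetres, heading/pitch/bank in degrees, focal length in millimetres) and simply converts each row into the metre-and-degree values you would key onto a camera, or onto a null the camera is parented to, by hand or by script.

import csv

MM_TO_M = 0.001  # assumed export unit; real moco rigs export in all sorts of units

def read_moco_csv(path):
    """Parse a hypothetical moco export with columns frame, x_mm, y_mm, z_mm,
    heading, pitch, bank, focal_mm into per-frame camera keys in metres/degrees."""
    keys = []
    with open(path, newline="") as fh:
        for row in csv.DictReader(fh):
            keys.append({
                "frame": int(row["frame"]),
                "position": (float(row["x_mm"]) * MM_TO_M,
                             float(row["y_mm"]) * MM_TO_M,
                             float(row["z_mm"]) * MM_TO_M),
                "rotation_deg": (float(row["heading"]),
                                 float(row["pitch"]),
                                 float(row["bank"])),
                "focal_mm": float(row["focal_mm"]),
            })
    return keys

# Each entry can then be keyed onto the camera (or its parent null), one key per frame.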

Greenlaw

Greenlaw
01-25-2010, 12:24 PM
Talking about motion control reminded me of some fun videos by Zbigniew 'Zbig' Rybczyński (of Ultimatte fame), in which he used a 'man-powered' motion control system. The most interesting one is a 90-minute program called 'The Orchestra', and the way he did it was by playing the score and following a trail of paper on the floor with the waveforms of the music printed on them, while pushing the camera on a dolly. He would literally walk many dozens (hundreds?) of passes this way to compose very elaborate music videos. Yes, the camera drifts a little occasionally, but most of the time it's actually spot on. (And because of the surreal subject matter, the drifting doesn't really hurt the 'dreaminess' of the program.)

There's a making-of featurette on the DVD that shows this system in action. You can get the DVD here: http://www.zbigvision.com/

Greenlaw

Philbert
01-25-2010, 12:44 PM
That sounds like it makes more sense, and it's very understandable that they would send this data to a 3D artist.

Otterman
01-26-2010, 02:59 AM
Greenlaw, I think you're right about the multiple-pass camera thang. I've looked into this a little more... http://en.wikipedia.org/wiki/Match_moving

...and to quote:

"Three-dimensional match moving tools make it possible to extrapolate three-dimensional information from two-dimensional photography. Programs capable of 3D match moving include:

Voodoo (freeware)
Scenespector VooCAT
Icarus (University of Manchester research project, now discontinued but still popular)
Maya Live
The Pixel Farm PFTrack
PFHoe (Cost-effective match mover based on PFTrack algorithms)
REALVIZ MatchMover (Autodesk bought and re-released as part of Maya 2010 bundle)
Science.D.Visions 3DEqualizer (which won an Academy Award for Technical Achievement)
Andersson Technologies SynthEyes
Boujou (which won an Emmy award in 2002)

These programs allow users to derive camera movement and other relative motion from arbitrary footage. The tracking information can be transferred to computer graphics software such as Blender, 3ds Max, Maya, or :hey: LIGHTWAVE :thumbsup: and used to animate virtual cameras and CGI objects."

So there is a route to getting this info into LightWave. Have you managed this, and if so, what apps did you use?