View Full Version : Lightwave 10.1 & Red One MX & Nikon Lens SETTINGS

pekka varis
04-25-2016, 07:10 AM

I'm doing a 3-minute commercial video, and I have free hands to visualize and direct it pretty much the way I want.
Here is an X-Ray technique I learned from Prometheus: http://forums.newtek.com/showthread.php?150521-Easy-and-fast-X-Ray-effect

Now, for this video about automated gear-delivery machines in an industrial environment, I go with the customer to a cable factory here in Finland and shoot footage of the product: a delivery machine about the size of a large soda machine. Nice!!!

I shoot the footage with my Red One MX, doing camera movements with a dolly on tracks, and then I am DREAMING of this: softly cross-dissolving, element by element, from the live image to the X-Ray version of the delivery machine. In other words, I am trying to do 3D motion tracking. I know how to use mocha with 2D plates, like here: http://www.imagineersystems.com/features/planar-motion-tracking/ You can see that kind of work in this anarchistic dance-musical trailer (warning: no Lightwave used on this movie!) that was in a cinema theater two weeks ago here in Helsinki: https://www.youtube.com/watch?v=N7yrBnir7o4&list=PL_pc5-TgTl3OXFOudAkoUl9r0KqATakP6

But that is not enough this time. So where do I find good tutorials about this kind of thing?

What if I measure every spec: the Red One MX sensor size, my Nikon lens settings, and the real-world distances (in cm) between the camera and the objects I shoot? The distance of the camera at the beginning and end of the dolly track movement, and also the focus ring change on the lens..
If I set up my LW Layout camera to match all of this, do I get even close? Can I then just reconstruct the dolly camera movement by matching the timing?
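For the manual route, the core relationship you would be matching is between sensor width, focal length, and horizontal field of view, which is simple trigonometry under a pinhole camera model. A minimal Python sketch (the 50mm lens and the sensor width here are illustrative values, not measured from any specific shoot):

```python
import math

def horizontal_fov_deg(sensor_width_mm: float, focal_length_mm: float) -> float:
    """Horizontal field of view of a pinhole camera model, in degrees."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Example: a 50mm prime on a roughly 20.7mm-wide sensor window
fov = horizontal_fov_deg(20.736, 50.0)
print(f"{fov:.2f} degrees")  # roughly 23.4 degrees
```

If the field of view of the LW camera matches the real lens and sensor window, the perspective of the render should line up with the plate, leaving only the camera path and timing to reconstruct.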

If this is a crazy idea - what tools are there for this kind of compositing? Do I have to use mocha or similar software?

Pekka Varis

04-27-2016, 11:59 AM
I haven't done much match-moving, but I've done both manual and automatic, and I definitely prefer using software. I use SynthEyes, which does a substantially better job than manual tracking, and MUCH faster. I'm definitely no expert, though.


pekka varis
04-28-2016, 02:31 AM
Thanks! I'll seek out tutorials for SynthEyes..

04-28-2016, 04:32 AM
Check this out first, by the dazzling Mr. Rid


(P.S. I hope we get a LWGO2Fusion update soon, that would help!)

04-28-2016, 09:24 AM
If you choose to go the automated SynthEyes route, there are a bunch of tutorials on the YouTube channel:


and here's one on Basic Automated Tracking:


From the sound of your shot, I think the Automated Tracker would handle it just fine. The Mr. Rid approach would probably work fine as well, although SynthEyes also matches the camera focal length, estimates 3D distances and the horizontal/vertical orientation of the scene, etc.


05-01-2016, 06:43 AM
With the RED One and SynthEyes, just remember that the film back size (sensor size) and the focal length estimation are intrinsically linked.

You can't use RED's full sensor specification as an input, since the RED One MX sensor will be running windowed and the full sensor won't be used (the pixels on the sides are only activated on the EPIC, and you're running 16x9, shooting 4K vs. quad HD, etc.).

To do it properly, take your horizontal resolution and multiply it by 5.4 microns, the size of an MX pixel.

So shooting quad HD: 3840 x 0.0054 mm = 20.736 mm.
Do the same with the height (2160 x 0.0054 mm = 11.664 mm) and enter those as your sensor size.

That way your focal lengths will jibe with your actual recorded shot. If you are using a 50mm prime, you can enter that data, which eliminates some tracking errors caused by the software improperly estimating the focal length, and that estimate is the basis of every 3D solve.
That said, I would enter the 50mm only as an estimate in the software, as a "50mm" Nikon lens may actually be anywhere from 48 to 51mm. That gives SynthEyes room to fudge the numbers a bit, since you know your sensor size is accurate to 1/1000 of a mm.

The same has to be done every time you change resolution on the RED, as you are changing the effective sensor size each time.
Just remember 5.4 microns for the MX (the Dragon sensor has smaller pixels, I believe 5.2 microns).
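The pixel-pitch arithmetic above can be wrapped in a small helper so you only have to remember the pitch, not the per-resolution numbers. A minimal Python sketch using the 5.4 micron MX figure quoted above (the function name and the default pitch are this sketch's own choices):

```python
def film_back_mm(width_px: int, height_px: int, pixel_pitch_um: float = 5.4):
    """Effective film back (sensor window) size in mm for a given
    recording resolution, computed as pixel count times pixel pitch."""
    return (width_px * pixel_pitch_um / 1000.0,
            height_px * pixel_pitch_um / 1000.0)

# Quad HD window on the MX sensor (5.4 micron pixels):
w, h = film_back_mm(3840, 2160)
print(f"{w:.3f} x {h:.3f} mm")  # 20.736 x 11.664 mm
```

Re-run it with whatever resolution you actually recorded at, and feed the result into SynthEyes as the back plate / sensor size.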