Antonio Boalís asked on Facebook - Is there a way to export the movement of the camera to be replicated by a drone?

RPSchmidt

Active member
So I saw this question over on the Lightwave 3d Support Group Facebook page, and my first thought was "Why would you want to?".

The reason I say this is because it seems to me that if you are trying to matchmove a 3d object to drone footage for an animation, it would be far easier (and more accurate) to:

  1. Shoot the drone footage
  2. Do a 3d camera track on the footage
  3. Export the camera to Lightwave
There are too many environmental variables that affect drone flight (wind being the first and foremost). I would think that would make the compositing process extremely frustrating.

Or am I wrong?

I also thought that I might be misunderstanding the question.

I don't use Facebook or I would just post my comment there.

Thoughts?
 

prometheus

REBORN
From the title of this post, I read it as being about creating the camera movement in LightWave (or in other 3D software with its camera) and then letting the drone fly the same path :D
Or perhaps exporting the movement of the drone camera and transferring it to another drone. That wouldn't work unless it's a pure recording of the drone's exact GPS positions, which is then duplicated onto a programmed path for the other drone, and in that case it really has nothing to do with the camera.

That would have no real purpose for matching cameras to a shoot, I suppose, but it would be interesting to see what kind of maneuvers you could program with a LightWave animated object or camera.

A better description would be: is there a way of exporting a drone camera's movement to a LightWave camera?
As for environmental variables, well... there must be a recording facility on the drone that logs its movement, or GPS, if it is accurate enough to track the tiny movements caused by the wind. Not sure what is out there.

Not exactly related, but with Google Earth Studio you can export the animated path to After Effects, and there's a tutorial on doing it for Blender, which I haven't checked out properly yet.
 

RPSchmidt

Active member
I mean, I understood what the OP was asking; I just can't puzzle out why.

The drone will never be able to fly the same path as the camera; it can get pretty close, but there are so many environmental variables to drone flight that you simply won't get the same flight path.

But perhaps it is as you said... the purpose isn't for camera matching. But that still leaves me wondering why?
 

prometheus

REBORN
I mean, I understood what the OP was asking; I just can't puzzle out why.

The drone will never be able to fly the same path as the camera; it can get pretty close, but there are so many environmental variables to drone flight that you simply won't get the same flight path.

But perhaps it is as you said... the purpose isn't for camera matching. But that still leaves me wondering why?

Who said anything about flying outdoors?
:p
Indoors, multiple drones syncing to perform tasks is fully doable, so why not with camera or object paths converted into a drone path? Then again, a camera doesn't behave like a drone, so those variables are different.
You could probably just take a drone simulator and extract the motion from there.

For outdoors, yes, wind may affect the flight, but that depends on the drone type, how heavy it is, and whatever other design features stabilize it.

If you were to use a 3D environment with buildings, accurate scaling and origin/rotation, then make a flight path in 3D so the drone can lift off, fly to one building, land and deliver, lift off and fly to another building, all set within that 3D environment... then I can see the reason why.
Perhaps there are simulators for this; the question is whether you want to add your own origin elements and targets, which you may not find or be able to add within a simulator.
 

MonroePoteet

Active member
Yes, I think even if the LW camera motion could be exported to fly the drone, the environmental variables would require an after-flight matchmoving pass anyway. I think the drone I have uses GPS coordinates (e.g. latitude / longitude or UTM coordinates) for its flight information, so I'm not sure exactly how the exported LW path could be matched. The single-precision floating point numbers used by LW may also cause a loss of precision at such large coordinate values.
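To put a rough number on that precision worry, here is a minimal sketch (assuming numpy is available, and using a made-up mid-latitude UTM northing) of how coarse float32 gets at GPS-scale values, and why subtracting a local origin first keeps small motions intact:

```python
import numpy as np

northing = np.float32(4_000_000.0)           # illustrative mid-latitude UTM northing, metres
print(np.spacing(northing))                  # ~0.25 m: the smallest step float32 can represent here
print(np.float32(4_000_000.1) == northing)   # True: a 10 cm wind correction is rounded away entirely

# Subtracting a local origin first keeps the values near zero,
# where float32 still resolves millimetres comfortably.
local = np.float32(4_000_000.1 - 4_000_000.0)
print(local)                                 # ~0.1 m survives
```

That is one argument for converting any GPS track into offsets from the first fix before it ever reaches LW.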

I think a more straightforward approach would be to storyboard the desired drone flight for the target presentation / production, fly the drone to the storyboard, and then match the camera motion with Syntheyes or other matchmover software.

I've done this successfully in LW2015 using Syntheyes to matchmove my DJI Inspire drone footage and insert LW objects and cast shadows of them onto shadowcatcher objects. Syntheyes directly exports a usable Lightwave scene file, and I usually parent my LW objects to the "tracker" Nulls it exports.

The matchmoving itself can be a relatively minor part of the job. Getting the lighting, gamma, fog / mist, motion blur, etc. to match up so the 3D assets don't look out of place can be more challenging than the camera motion matching. In my experience the 3D assets don't "skate" around at all on the background plate when I use the Syntheyes exported LW scene file and parent the 3D assets to the exported tracker nulls.

mTp
 

slacer

Active member
And hope that the camera in the original video doesn't zoom and doesn't cut between more than one camera.
If the drone operator also has control over the camera, you are lost, my friend.
 

MonroePoteet

Active member
Yes, matching a camera that doesn't zoom during the shot produces better results, but the SynthEyes doc has some recommendations for shots with zoom:


Without zoom, SynthEyes did an excellent matchmoving job when I did pan / tilt motions with the drone camera. If those are known to be fixed during the flight, you can Lock the various channels in SynthEyes (usually used for tripod shots), but especially in windy situations, even with the *superb* stabilization of the DJI Inspire drone, the camera may tilt / pan / roll slightly during flight.

mTp
 

Sensei

TrueArt Support
If you're able to record the GPS locations of a drone (e.g. put a smartphone on the drone and use some app that writes the GPS data to a file), and get them into a .CSV file, you can load it into LightWave using my TrueArt's Node Library CSV Reader.
CSV is an industry-standard format, imported/exported by e.g. Excel and OpenOffice; you can write a script that outputs it within minutes of work.
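As a rough sketch of the kind of script that could produce such a file: the file names and column names (lat, lon, alt) below are invented for illustration, and the output is simply X/Y/Z offsets in metres from the first fix (altitude mapped to Y), which also keeps the numbers small. Whatever column layout the CSV Reader actually expects would need to be checked against its documentation.

```python
import csv
import math

EARTH_RADIUS = 6_371_000.0  # metres, spherical approximation

def gps_log_to_local_csv(src_path, dst_path):
    """Convert a hypothetical lat/lon/alt GPS log into local X/Y/Z offsets in metres."""
    with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
        reader = csv.DictReader(src)            # assumes lat,lon,alt column headers
        writer = csv.writer(dst)
        writer.writerow(["x", "y", "z"])
        origin = None
        for row in reader:
            lat, lon, alt = (float(row[k]) for k in ("lat", "lon", "alt"))
            if origin is None:
                origin = (lat, lon, alt)        # first fix becomes the scene origin
            # Equirectangular approximation: fine over the few hundred metres
            # a typical drone shot covers; larger areas would want a UTM conversion.
            x = math.radians(lon - origin[1]) * EARTH_RADIUS * math.cos(math.radians(origin[0]))
            y = alt - origin[2]
            z = math.radians(lat - origin[0]) * EARTH_RADIUS
            writer.writerow([f"{x:.3f}", f"{y:.3f}", f"{z:.3f}"])

gps_log_to_local_csv("drone_gps_log.csv", "drone_path_local.csv")
```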
 

slacer

Active member
The non-military resolution of GPS is in meters, not centimeters. It gives only an approximation of your drone's position, not to mention the height info.
From a Google search:
As is well known, and by the standards of today, the resolution or accuracy of handheld GPS is +/- 3 meters.
 

COBRASoft

Active member
That's true for a single GPS measurement, but not when you take a 'weighted average' of multiple reads, with the quality of each measurement (part of a GPS read) taken into account. It won't be military-grade precision, but it's much better than one individual value.
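A minimal sketch of that idea, assuming each fix reports a position plus an accuracy estimate (the tuple format below is made up for illustration): weight each read by the inverse square of its reported accuracy, so tight fixes dominate the average and noisy ones barely count.

```python
def weighted_position(fixes):
    """fixes: iterable of (easting, northing, accuracy_m) tuples for the same point."""
    wsum = xsum = ysum = 0.0
    for easting, northing, accuracy_m in fixes:
        w = 1.0 / (accuracy_m ** 2)   # inverse-variance weight: better fixes count more
        wsum += w
        xsum += w * easting
        ysum += w * northing
    return xsum / wsum, ysum / wsum

# Three reads of the same spot: the 1 m fix pulls the answer far more than the 5 m one.
print(weighted_position([(500010.0, 4000020.0, 1.0),
                         (500012.0, 4000025.0, 5.0),
                         (500011.0, 4000022.0, 3.0)]))
```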
 

tischbein3

Active member
So I saw this question over on the Lightwave 3d Support Group Facebook page, and my first thought was "Why would you want to?".
At first I thought this too, but previz shot planning, and filming a smooth, automated camera path, come to mind.


The reason I say this is because it seems to me that if you are trying to matchmove a 3d object to drone footage for an animation, it would be far easier (and more accurate) to:

  1. Shoot the drone footage
  2. Do a 3d camera track on the footage
  3. Export the camera to Lightwave
You are right, for matchmoving alone it might not be accurate enough.
Commercial GPS with all the bells and whistles (multiple readouts etc.) can get to around 30 cm accuracy at best, which isn't enough at close distances.
But for high-altitude shots with a long distance to the features / not enough parallax, where trackers have a hard time, it might give you a good starting point (see the rough numbers sketched below).

But yes, it wouldn't be an alternative to normal tracking.
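To illustrate with back-of-the-envelope numbers (the lens and sensor figures below are illustrative guesses, not from any particular drone): the same 30 cm position error projects to wildly different on-screen errors depending on subject distance, which is why it is hopeless up close but a plausible seed for distant, low-parallax shots.

```python
# Pinhole projection: on-screen error (px) ~= focal_px * position_error / distance.
def pixel_error(position_error_m, distance_m, focal_mm=24.0,
                sensor_width_mm=13.2, image_width_px=3840):
    focal_px = focal_mm / sensor_width_mm * image_width_px   # focal length in pixels
    return focal_px * position_error_m / distance_m

print(round(pixel_error(0.3, 20)))    # ~105 px at 20 m  -> useless for close-range matchmoving
print(round(pixel_error(0.3, 300)))   # ~7 px at 300 m   -> a workable starting guess to refine
```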
 