View Full Version : Trouble with drone camera tracking for landscape pan matching dissolve

12-24-2018, 01:34 PM
I'm working on a documentary about my hometown (Manhattan, Kansas) and I thought it would be fun to show what the landscape of the town looked like in 1855 when the first settlers arrived.

So I commissioned an acquaintance to shoot a couple of 360° drone panning shots over two iconic locations here in town. His drone is a Phantom 4 Pro. The lens is 8.8 mm; the FOV is 73.7° by 45.7° (HD 16:9) with a 13.2 x 7.425 mm sensor area (after the vertical crop; the full sensor height is 8.8 mm).
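(Editor's aside: those FOV figures are consistent with the simple pinhole relation FOV = 2·atan(sensor dimension / (2·focal length)). A quick sanity check in Python, assuming the 8.8 mm figure is the lens focal length and the 13.2 x 7.425 mm numbers are the cropped sensor area:

```python
import math

def fov_deg(sensor_mm, focal_mm):
    """Angular field of view from the pinhole-camera relation."""
    return math.degrees(2 * math.atan(sensor_mm / (2 * focal_mm)))

hfov = fov_deg(13.2, 8.8)    # horizontal, 16:9 cropped width -> ~73.7
vfov = fov_deg(7.425, 8.8)   # vertical, cropped height       -> ~45.7
```

Both come out within a tenth of a degree of the quoted 73.7° x 45.7°, so the specs at least agree with each other.)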

The drone shots keep the horizon quite horizontal throughout the pan. Both the After Effects camera tracker and SynthEyes track the shot, apparently correctly, but then I discover that there's a single, huge bank/roll wave that spans 20° from its highest to its lowest point in the curve. What's that doing there? What am I doing wrong?

There's also a corresponding pitch/yaw curve that's also 20°. I assume one corruption is compensated for by the other. Is there some way to eliminate these corruptions so that most of the pan is heading only? I realize there may be a way of "locking off" the bank & pitch during the calculation in SynthEyes, but I assume I would want the accurate data, and there probably was a little bank & pitch movement during the shot.

Any ideas appreciated --Hal

12-24-2018, 02:15 PM
Not quite grasping what you're seeing, so just taking a stab here, but might it be a rolling shutter issue? Could you post a frame grab, or is it something more temporal throughout the shot?

You're using AE (not quite sure how you'd apply this with SynthEyes), but maybe this will give you some ideas anyway.

One of his script options gives you soft or hard locking of axes.


12-24-2018, 02:29 PM

Here are the curves

12-24-2018, 02:33 PM
Ma3rk: The shot itself is rock solid, doesn't need stabilization. But the soft locking of axis is worth looking into.

Every time I looked at the rolling shutter option in SynthEyes I ignored it, because my Ursa has a global shutter. But just now I remembered: oops, this isn't the Ursa, it's my friend's drone, silly! Okay, I'll give that a try first and see if it helps. Thanks!

12-24-2018, 02:40 PM
Didn't help. BTW, the drone manual says it's a mechanical shutter. Does that have the same effect as a digital rolling shutter?

12-24-2018, 08:16 PM
Can you post frames from the drone footage? Specifically, frame 800 (all curves about zero), frame 1200 (minimum non-zero bank), frame 1550 (maximum pitch), frame 1950 (pitch and bank are equal but non-zero) and frame 2300 (pitch and bank back to zero).

Or, if you can post the video footage from (say) frame 800 through 1600, I could try running it through SynthEyes and try to figure out why pitch & bank are off.

I've only used SynthEyes for match moving the camera, not AE. If the horizon looks stable in the drone footage, it may be too indistinct for Trackers to be assigned and accurately tracked there, so features closer to the drone are given priority. Perhaps there are features closer to the drone position which change in parallax as the (usually fish-eye lens) drone pans through the 360 degrees, and maybe that causes the software to *interpret* it as a change in pitch & bank.

In SynthEyes, I usually do an Automatic solve, and then pan through the footage deleting any Trackers which skate around, sometimes dozens of them. You can explicitly specify keyframes for individual Trackers to correct if they skate off their intended "spot". As well, SynthEyes allows you to specify an origin (arbitrary) and lines parallel to the X, Y and Z axes to help the solver. Might be worth a try.

Good luck!

12-24-2018, 11:34 PM
Didn't help. BTW, the drone manual says it's a mechanical shutter. Does that have the same effect as a digital rolling shutter?

I wouldn't think so, but ultimately it depends on how it gets encoded, I suppose, since it is digital. I bet Russ Anderson could tell you for certain.

12-25-2018, 11:50 AM
I've done quite a bit of Syntheyes tracking, and have tracked my own Phantom 3 drone extensively. Here's one project I shot, tracked and comped...


So here are some thoughts in no particular order...

1.) When you say 360 degrees, I'm assuming you mean the drone was stationary and did an in-place rotation? If so, did you use the Tripod solve in SynthEyes? That would be needed in that case.

2.) Is there tell-tale motion in the video footage? If not, another approach would be to grab a number of frames from the footage and do a still-image stitch of a panorama. Of course, if the drone operator used ND filters to bring the shutter speed back to 1/48th or 1/60th of a second, then there might be too much motion blur in each frame.

3.) Almost all solves I've done of the type you're describing with a drone have a good deal of tilt and bank in the final solve, even if it was visually almost a perfect pan. I've found this to be the case even when the camera is on a tripod with ONLY a pan... some amount of tilt and bank winds up in the solve. Bottom line: if it looks fine, it's fine. Unless you need higher accuracy, of course.

4.) For video, the shutter on the Phantom P4 is absolutely a rolling shutter. It's "mechanical" only in the context of the still images it can take. Generally, I've had pretty good luck without having to do any rolling shutter compensation. The drone gimbal takes most of the "curse" out of it; of course, fast motion will reveal it, along with fast pans, but you can see from the example above that even with objects filling the frame you can get by.

5.) You listed very precise specs for the sensor and FOV, but are they accurate? Every one of these cameras is minutely different due to manufacturing tolerances. If you're using specs from DJI, don't trust them; do your own calibrations. Use those numbers for guesstimation only and let SynthEyes tell you what the FOV actually is. I did a whole series of controlled tests with my P3 until I had the exact FOV and specs figured out. They were different from the published specs.

Best of luck!

12-26-2018, 05:50 AM
Thanks for the helpful replies. I was in family mode on Christmas Day. Sorry I didn't reply till now. Hope you are still willing to help out.

MonroePoteet: You and anyone else on this forum are welcome to download the shot in question. It is here: https://vimeo.com/308208257 and requires this password: drone_pan BTW, I did not include the first 70 frames when processing, which you might notice if you try to process it and find that your graph doesn't match mine.

I have spent a total of TWO DAYS processing this shot in SynthEyes and then analyzing it in LightWave, over and over again, with dozens of different settings, all to no avail. The problem is that the landscape in LW banks as the camera pans across it. It is supposed to match the live action, but of course it does not. I match one frame in the middle of the shot, or anywhere else, and the results are always the same.

The one thing I have not been able to do so far is use constraints in SynthEyes to stop it from allowing banking, because I don't understand how to do it. I've read the manual and it just goes right over my head. I understand constraints in LW, but not in SE, so I need to work on that. However, I'm skeptical that this will solve the problem. My guess is that the drone will just start translating all over the place instead.

In reality the drone did a straight climb in altitude of around 20 or 40 feet in order to reveal the sign "Manhattan" on the side of the hill. Then it stopped and mostly stayed still for the pan. That is why I didn't do a Tripod solve. However, I have tried cutting off the altitude climb and tracking only part of the pan, with similar results. But I don't think I remembered to switch to the Tripod solve for that test, so I'll try it again in a little while.

Imageshoppe:

1) Yes.

2) I don't think this will help, but it's an interesting idea.

3) This is fascinating. To repeat what I said in my first post, both the After Effects camera tracking solve and the SynthEyes solve added large tilt & bank curves that should not be there. If all I wanted to do was add an object, it would not be a problem. But since my purpose is to match a 3D camera to the real camera so that the real and 3D landscapes match, it's a huge problem. There should probably be a small amount of tilt and bank, but 99% of the turning should be in the heading. Since there doesn't seem to be any way of eliminating this weird artifact, the only solution I can see would be mathematical. I would have thought there should be some way of analyzing and correcting the data so that tilt and bank are reduced to nearly zero, with a small amount of noise here and there that accurately describes the small imperfections that may exist in the footage.

4) Yes, I eventually realized that rolling shutter was not an issue here.

5) Yes, I've already noticed that SynthEyes keeps telling me that the HFOV is 71.1 (if I remember correctly), not 73.7. The specs I have are from a forum post at DJI from a user who figured it out; others chimed in and eventually agreed that he'd got it right. Since I don't own the drone, I'm not in a position to test the rig to get more accurate data.
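(Editor's aside on point 5: under a simple pinhole model, the gap between the published 73.7° HFOV and the ~71.1° that SynthEyes keeps reporting corresponds to only about half a millimeter of effective focal length. A rough illustration, assuming the 13.2 mm cropped sensor width from the first post:

```python
import math

def focal_mm(hfov_deg, sensor_w_mm=13.2):
    """Pinhole-model focal length implied by a horizontal FOV."""
    return (sensor_w_mm / 2) / math.tan(math.radians(hfov_deg) / 2)

spec   = focal_mm(73.7)  # published spec       -> ~8.8 mm
solved = focal_mm(71.1)  # SynthEyes's estimate -> ~9.2 mm
```

That small a difference is well within manufacturing and calibration tolerance, which supports the earlier advice to trust the solver's number over the spec sheet.)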

Someone might suggest that perhaps there is a way to animate the 3D landscape instead of the camera. I'm not sure exactly how I would do that, but I suppose it's possible. The problem is that I'm using LightWave's built-in Sun, which places the sun according to location, date, and time of day to match the original footage. So if I rotated the landscape, it would look like the sun is spinning around the landscape in sync with the camera pan. I guess I could match the sun position with a second light and then parent it to the spinning landscape, so maybe that is a possible solution. But it really would be nice if I could get the camera to match.

If anyone here chooses to download the footage and tests it, please let me know! I will be very grateful. Thanks very much for your assistance and thoughts on this issue.

12-26-2018, 06:35 AM
I just did a quick test, frames 900-1930 only, in Tripod mode. I'm embarrassed that I forgot to do this a couple of days ago. Anyway, it's the same problem. Here's the graph.

12-26-2018, 10:11 AM
Okay, thanks for uploading that. This is a classic "two shots in one" problem. The first part of the shot is a standard track and works perfectly if tracked until the drone settles in. The second part of the shot is a perfect "tripod shot", and SynthEyes needs to approach this section a bit differently.

The reason the tripod section of the track fails is rolling shutter. There is a "skew" between the top of the frame and the bottom, and that ratio is applied over 1500+ frames of an image panning from right to left. That particular off-axis solve is simply SynthEyes making sense of the trackers, and SynthEyes doesn't know that some of the trackers from the START of the pan are the same features as at the END of the pan.
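(Editor's aside, to put a number on the skew being described: for a steady pan, the top-to-bottom rolling-shutter skew is roughly pan rate x readout time. A back-of-the-envelope sketch; the ~360° pan over frames 900-1930 comes from the thread, while the ~19 ms readout, 1920 px width, and 73.7° HFOV are illustrative assumptions:

```python
# Rough rolling-shutter skew for a steady pan (small-angle approximation).
# Assumed figures: ~360 deg pan over frames 900-1930 at 24 fps,
# ~19 ms sensor readout, 1920 px frame width, ~73.7 deg horizontal FOV.
pan_deg, frames, fps = 360.0, 1930 - 900, 24.0
readout_s, width_px, hfov_deg = 0.019, 1920, 73.7

pan_rate = pan_deg / (frames / fps)        # ~8.4 deg/s
skew_deg = pan_rate * readout_s            # angular skew, top row vs. bottom row
skew_px  = skew_deg * width_px / hfov_deg  # a few pixels of horizontal shear
```

A few pixels of constant shear is invisible when eyeballing the footage, but a solver fitting one rigid rotation per frame has to absorb it somewhere, which is consistent with a slow tilt/bank drift in the solved curves.)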

Let me take a good look at it later today and see what I can figure out for you.

Jim Arthurs

12-26-2018, 10:30 AM
Thanks so much. I didn't think rolling shutter was a problem but maybe you're right. I'll look into that some more too. Thanks again!

12-26-2018, 10:44 AM
Yes, fast or slow, the skew ratio is the same, and over 1500+ frames moving from right to left that becomes a factor. In fact, if you look at the 3D solve of the tripod section in the viewport, you'll see the slant of the track is probably an exact mimic of the skew ratio.

Also, my apologies: how to approach a multi-section track is covered in the manual on page 156, using "holds"... I didn't mention that in the last e-mail. I think there's a tutorial video on the process, but I can't locate it on YouTube right now for some reason...

12-26-2018, 11:19 AM
On page 107 there is an explanation of how to find the correct setting for the rolling shutter. If the P4P is at 30 ms, and the footage is 24 fps (which suffers the most from RS compared to 30 fps and higher), then the setting should be 1.25. But the highest setting is 1. So I'm not sure how to proceed.

As for multi-section, I'm not too concerned about the first part of the shot. I'm only really interested in the pan, frames 900-1930. If I can get a usable camera track of that section, then I'm all set to work on this in LightWave.

12-26-2018, 02:49 PM
Thanks, that knowledge helps!

Actually, the rolling shutter value for this camera (based on tracking the first section of motion) is about .4604. Think of it as the fraction of the frame duration: 1.0 would be a readout time equal to the full 24 fps frame duration (41.6 ms), and .5 would be half of that (20.8 ms). So this camera, at about 19 ms, is better than some (my GH5s in UHD is 11.9 ms, an Ursa Mini 4.6K is 15.2 ms, an A6300 in UHD 24 fps is a crappy 39 ms!!!), and worse than others.
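(Editor's aside: since the SynthEyes value is a fraction of the frame period, converting between the setting and a readout time in milliseconds is a one-liner each way. A minimal sketch, assuming 24 fps footage:

```python
def readout_ms(fraction, fps=24.0):
    """Sensor readout time (ms) implied by a frame-period rolling-shutter fraction."""
    return fraction * 1000.0 / fps

def rs_fraction(readout, fps=24.0):
    """Inverse: rolling-shutter fraction for a known readout time in ms."""
    return readout * fps / 1000.0

p4p_ms = readout_ms(0.4604)  # ~19.2 ms, matching the figure above
```

The same conversion also explains why a 30 ms spec figure won't fit a 0-1 slider at 24 fps only if the frame period is mistaken for something shorter than its actual 41.6 ms.)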

But who cares, I think we've got a fix! Check out this screenshot with a clue as to how it's done...

Simply pull up the Solver Locking panel after your first tripod solve and toggle the roll axis soft/hard lock under the rotation weights. Then go and do a tripod resolve. Boom!


Jim Arthurs

12-26-2018, 05:38 PM
Jim, it sounds great, but I must be missing something. I'm still getting 20° waves across the entire pan section. It's true that the waves change somewhat from the original wave pattern, but it'll still result in a banking camera when I load it into LightWave.

I did exactly what you said in the Lock Controls panel. I'm not sure exactly what you mean by "do a tripod resolve". In the Summary tab, "On Tripod" is selected, and in the Solver tab, "Tripod" is selected in the second drop-down menu. I'm not sure if you meant click "Go" there, or click "Auto" under Summary. I've tried both. I don't think Auto changes anything, but Go does. Did I do something wrong?

Thanks so much for helping me out on this. --Hal

12-26-2018, 05:41 PM
I've noticed that my Top & Front views come out the same as yours, but my Left view still has tilted trackers. For some reason, something I'm doing (or failing to do) is preventing the tilt from being corrected.

12-26-2018, 06:10 PM
My Solver panel looks very similar to yours, but my Left view is tilted

12-26-2018, 06:30 PM
Sorry, I meant "tripod refine", not "tripod resolve". This keeps the original solve parameters and just "refines" them with any new changes.

How about a zip with the SynthEyes file and the LW2015 scene file? You can just use the LW file and get to work on the next step of the job. My track and scene cover *almost* the full 360 degrees, and certainly the range you're interested in. I also put the camera at 0,0,0 in the LW scene, 'cause that's just tidy :)

One caveat... I converted the .mov into a .jpg sequence for ease of use in both LW and SynthEyes, and stripped out the letterboxing at the same time. There's a separate zip in the directory with this new .jpg image sequence.

I also included a 360-degree panorama image I created with every hundredth frame from the pan portion of your scene, to "prove" to myself it was just rolling shutter. As you can see, the pano image is straight as a rail, with just a little vertical misalignment on the final image used in the pano, which could be hand-adjusted if making a pano were the ultimate goal.

Both are in this folder (until tomorrow; the .jpgs are 2.5 gigs!)...



Jim Arthurs

12-26-2018, 06:35 PM
Yes, I eventually figured out you meant tripod refine, but I still couldn't get the same result you got. Well, thanks for the zip. I'm downloading it now, but I have to go, so I'll check it out tomorrow.

Thanks so much!

12-26-2018, 07:06 PM
I think the missing element was that I forgot to mention to toggle the "constrained" box under the "tripod refine" pulldown when doing the lock-and-then-refine step. And make sure the camera is selected...

12-26-2018, 07:31 PM
The hardest thing for me to imagine was that Russ didn't already have a script or option panel somewhere that addressed this sort of thing.

12-27-2018, 06:53 AM
I think the missing element was that I forgot to mention to toggle the "constrained" box under the "tripod refine" pulldown when doing the lock-and-then-refine step. And make sure the camera is selected...

This is what happens when I do that. I'd still love to figure out how you did it, but I'll go ahead and look at the stuff you uploaded for me now. Thanks! --Hal

12-27-2018, 07:07 AM
It took me a few minutes, but I finally realized it's constraining all three axes instead of just roll. Not sure why that's happening.

12-27-2018, 07:49 AM
I think we'd BOTH like to know how I did it. I tried to recreate it this morning and didn't hit the magic combination I found while redoing and undoing yesterday. Anyway, I'm not looking a gift solve in the mouth; just go ahead and use the LW scene, and replace my proxies with your original footage if desired.

... I did one extra step and dropped some geometry into SynthEyes to "catch" the projection of all those frames into a panorama that you can use on an object in LW (all included in a new zip called MANHATTAN_pano.zip in the directory).


For others interested, here's the resulting pano...


It shows off the good and the bad of the track. You can tell from the overlap area in the image, where I went past 360 degrees, that the FOV we got is slightly off, because the horizontal features don't align; and the actual drone pan wasn't perfectly level, as shown by the vertical offset when everything was forced flat on roll. But it's good enough to do any work you want on the range of frames you're interested in. You can use the pano image to paint up blending feathers from "real" to "fake", etc.

Anyway, good luck!

Jim Arthurs

12-27-2018, 07:56 AM
Amazing work! Thanks!