motion path becomes shaky

06-03-2003, 01:57 AM
In the attached image you can see what happens to the motion path of the camera. The camera itself distorts in the viewport as well, which causes it to shake on playback. I think the reason is that the camera has physically translated, in 3D, into a very small area; sorry, it's difficult to explain. The camera flies around and then into an area of a model that is extremely small relative to the rest of the scene. Hopefully someone has experienced this problem as well and has a solution. We tried parenting everything to a null in the scene and scaling that up, but it didn't help; it seemed to get worse.
There is no noise modifier on the channels in the Graph Editor either, or anything of that sort.
Thanks in advance for your help
chow 4 now

06-03-2003, 08:06 AM
The problem isn't in LightWave, it's in the floating point precision of your computer. The scale range in your scene is so great that the machine can't compute the motion path accurately. I had this problem once when I was doing an animation of a satellite in orbit, and I used a full-sized planet.
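The precision loss described above is easy to demonstrate. This is a hypothetical sketch, assuming the application stores coordinates as single-precision (32-bit) floats, which was common in 3D software of that era; the specific numbers come from the road-and-ant example later in the thread:

```python
import numpy as np

# A camera sitting ~3.1 million units from the origin (50 miles, in inches),
# stored as a single-precision float.
big = np.float32(3_168_000.0)

# Try to nudge it by 1/100 of a unit -- a tiny keyframe-to-keyframe move.
moved = big + np.float32(0.01)

# The move is lost entirely: at this magnitude, float32 cannot represent
# a change that small, so the position snaps back to the same value.
print(moved == big)            # True

# The smallest representable step at this magnitude is a full quarter unit,
# so motion paths can only move in 0.25-unit jumps -- hence the jitter.
print(np.spacing(big))         # 0.25
```

Positions effectively quantize to the grid of representable floats, which is why the path looks jagged and the camera shakes on playback.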

One solution is to change the scale of some objects (make the biggest ones smaller, or vice versa) and change the camera angles and positions so the perspective remains the same.

Another solution (the one I used) is to render the big background objects first, then use those frames as a background image sequence behind the animation of your smaller objects.

06-03-2003, 10:26 AM
Thanks Doug Graham for the reply. Floating-point precision, that's insane, hehe, what do I know about that. Does that mean that if I open the scene on a computer with a better chip it shouldn't be there? On my machine, a dual P2 350 MHz, the motion path is messed up, so I tried it on an AMD Athlon 2400 and it was still the same. Thanks for the advice; I was just trying to steer clear of extra work, being on a tight deadline and all.
Glen :p

06-03-2003, 05:39 PM
No, a more powerful machine won't help at all. That would just enable you to do the same inaccurate calculations a lot faster.

The problem is the number of bits the computer uses to represent numbers... and to fix that, you'll have to wait until Intel comes out with a 128-bit processor architecture.

See, it's like this: let's say you model a road that's fifty miles long. Now, you also model an ant that is 0.2 inches long. You want to animate the ant crawling around on the roadway. You're asking the software to draw a motion path with an error of maybe 1/100 of an inch in a scene that is fifty miles across (or 50 x 5,280 x 12 x 100 = 316,800,000 times as large). That's an accuracy of better than one part in 300 million, and the computer just can't compute that well.
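The road-and-ant arithmetic above can be checked in a couple of lines:

```python
# 50 miles of road, converted to inches.
road_inches = 50 * 5280 * 12      # 5,280 ft/mile, 12 in/ft -> 3,168,000 in

# Acceptable path error: 1/100 of an inch.
error_inches = 0.01

# Ratio of scene size to required precision.
ratio = road_inches / error_inches
print(f"{ratio:,.0f}")            # 316,800,000 -> about one part in 300 million
```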

Keep the dimensions of your scene and the motions in it within four orders of magnitude from least to greatest (i.e., a ratio of 10,000:1), and you should have no problems.
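A quick sanity check of that rule of thumb, again assuming single-precision floats: at the top of a 10,000-unit range, float32 can still resolve steps of roughly a thousandth of a unit, which leaves plenty of headroom for smooth sub-unit motion.

```python
import numpy as np

# Smallest representable step at a coordinate of 10,000 in float32:
# about 0.001 units (exactly 2**-10), so fine motion is still resolvable.
print(np.spacing(np.float32(10_000.0)))   # 0.0009765625
```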