
A simple curve to define motion blur sampling intervals



hesido
05-09-2005, 07:56 AM
Some time in the past I read that motion blur falls on photographs in a logarithmic fashion instead of a linear one, and it kind of made sense to me. I can't back that up with scientific data, but it may still be the reason we can spot CG motion blur, especially during a sequence that also involves real-world elements (and hence real motion blur), so you can somehow feel what is CG and what is not.

If we had a simple curve (similar to a gamma curve) to adjust the motion blur sample intervals, it would let us try strange effects as well as aim for more photorealistic motion blur in the end. The default would of course be linear. (A rough sketch of how such a remap could work is below.)
If this broke compatibility with plugins like vector blur, the user could simply be warned about it.
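
[Editor's sketch, not from the original post or any renderer's actual API: one way to read the suggestion is that uniformly spaced shutter sample times get pushed through a gamma-style curve so they bunch up toward one end of the shutter interval. The function name and parameters below are hypothetical, purely for illustration.]

# Minimal sketch: remap uniformly spaced shutter sample times through a
# gamma-style curve instead of leaving them evenly spaced.

def motion_blur_sample_times(num_samples, gamma=1.0):
    """Return sample times in [0, 1] across the shutter interval.

    gamma == 1.0 gives the usual linear (evenly spaced) sampling;
    gamma > 1.0 crowds samples toward the shutter-open end;
    gamma < 1.0 crowds them toward the shutter-close end.
    """
    if num_samples == 1:
        return [0.5]  # single sample at mid-shutter
    return [(i / (num_samples - 1)) ** gamma for i in range(num_samples)]

if __name__ == "__main__":
    print(motion_blur_sample_times(5))             # linear: 0.0, 0.25, 0.5, 0.75, 1.0
    print(motion_blur_sample_times(5, gamma=2.2))  # samples weighted toward shutter open

With gamma = 1.0 this reproduces the current linear behaviour, so a linear default would behave exactly as today.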