
View Full Version : Four Point tracking



tonyrizo2003
02-23-2008, 08:55 AM
Hi Guys,

Total newbie here with this question. FYI, I am a LightWave guy who also does video production; I have used the Toaster in the past and am really impressed with the TriCaster. However, the school that I work for is looking for something like the live VR set features found in the TriCaster. The caveat is that they want to be able to do simple camera moves.

Normal studio: a three-camera setup with a fourth on a small jib arm for overhead shots.

I've read about the four-point pixel tracking in VT5; if I am correct, this is a post-processing feature, not a realtime process?

And the TriCaster LiveSet stuff is done with locked-down cameras, with no camera moves, i.e. pans, tilts, zooms, etc.

Is this a fair assessment?

So my next question is: will four-point pixel tracking ever become a realtime application with LiveSet?

Right now I am going to be setting up a post-process workflow with SynthEyes and LightWave to do what they want. It's not realtime like they would like, but it's the best I have to offer right now.

thanks

Tony :)

SBowie
02-23-2008, 12:02 PM
I've read about the four point pixel tracking in the VT5, if I am correct this is a post processing feature not a realtime process?

Presuming we're referring to the same feature, it's a rendered effect (in Aura), and it's 2D.
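To give a sense of what a 2D four-point ("corner pin") effect does under the hood: once the four corners of a region have been tracked from one frame to the next, the warp between them is a projective transform (homography) that can be solved from exactly those four correspondences. This is an illustrative sketch only, not NewTek's implementation; the function names and the NumPy-based direct linear solve are my own assumptions.

```python
# Sketch of 2D four-point (corner-pin) warping via a homography.
# Assumptions: NumPy is available; this is NOT the Aura/VT5 code,
# just the standard math behind a four-point pin.
import numpy as np

def corner_pin_homography(src, dst):
    """Solve the 3x3 homography H mapping four (x, y) source corners
    onto four (u, v) destination corners (direct linear method)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        b.extend([u, v])
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)  # fix the last entry to 1

def warp_point(H, x, y):
    """Apply H to a point and de-homogenize."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w

# Pin the unit square onto a skewed quadrilateral (made-up coordinates).
src = [(0, 0), (1, 0), (1, 1), (0, 1)]
dst = [(10, 10), (110, 20), (100, 120), (5, 100)]
H = corner_pin_homography(src, dst)
```

Note this warp is purely 2D: it moves pixels so the pinned corners line up, but it recovers no camera position, which is why a realtime LiveSet-style move would need full 3D tracking on top of this.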


And the Tricaster LiveSet stuff is done with locked down cameras, no camera moves. i.e. pans, tilts, zooms, etc.

Correct.


So then my next question is, will four point pixel tracking ever become a realtime application with LiveSet?

I think we're just seeing the tip of the LiveSet iceberg, and "ever" is a long time. That said, what you're talking about would require not only realtime tracking but also a realtime VR environment. Combined, those items comprise a very tall order. I wouldn't think we're going to see anything like that for quite some time.

(I'd love to be wrong, but then I'd also love to see world peace by the third quarter of this year.)

tonyrizo2003
02-24-2008, 09:06 AM

Steve, thanks for confirming what I thought. I will be starting my tests with SynthEyes next week (a $400 program), which exports directly to LightWave. I was thinking, with my non-programmer mind: wouldn't it be cool to be able to have SynthEyes on the TriCaster! Especially since (not really sure about this process) the VR sets have realtime reflection and rendering.

Thanks for the response.

Tony :thumbsup: