
Lightwave integration with the Tricaster (or lack of)



robertoortiz
05-02-2015, 05:45 AM
It is very interesting to me how well respected the Tricaster is in the broadcast video market.
I have seen firsthand, here in DC, how a LOT of small studios and government facilities use it for their video broadcast needs.

But when I ask the users what they use for their 3D needs, or whether they use any of the virtual sets in the Tricaster, they usually tell me that they either use photographs or complement it with other 3D apps like C4D or Blender.

That fact blows my mind.

Mind you, this is just a small sample of users.
But it means that when these users need to do title sequences, 3D text, lower 3rds, and virtual sets, they prefer to have them done with 3rd-party products instead of NewTek's own 3D program.
Why is this?
What the heck is the perception problem with the app that CREATED and pioneered the 3D broadcast market?

I am looking forward to your comments.

-R

ernpchan
05-02-2015, 08:20 AM
Are they using other apps because they don't know of LightWave or because the other apps are easier for them to use?

If it's because they don't know of LightWave, then that's an area of improvement for NT.

I can understand the use of photos to make sets, as I'm guessing that most TC operators probably aren't versed in 3D.

CaptainMarlowe
05-02-2015, 02:22 PM
Honestly, I can understand that in the 90s you had to rely on 3D apps for titling, opening scenes, or lower 3rds. But when you see what a simple 2.5D app like Apple Motion can do in this area nowadays, I really don't see any benefit in using a 3D app (except for virtual sets, perhaps).
With an app like Motion, you have native 3D titling (OK, until a few weeks ago it was through plug-ins, but it is now native and very well done!), you can apply motion tracking to it, lots of filters and behaviors, real-time and completely editable lens flares and effects, very efficient texturing with real-time feedback, and render times measured in minutes or seconds depending on your rig. How could a standard 3D app with an offline renderer compete with that?
For more complex tasks, like unfolding papers and things like that, certainly, but even for some complex titling I just can't see the advantage. To me, 3D titling in 3D apps like LW (or Blender or whatever) is already outdated.
Just take a look at the motionVFX templates for Motion or Final Cut Pro. Try to mimic these kinds of effects faster with a 3D app; I'm pretty sure it would take you an awful lot of time. In Motion, it's almost real time. And once it's done, it is saved as a Final Cut Pro template that you can re-use instantly each time you need it...


http://vimeo.com/125431939

robertoortiz
05-05-2015, 09:44 AM
I mentioned this on another thread: I am surprised that for virtual sets there isn't a more direct bridge between LW and the TC.
It would be cool if there were a way to send 3D-space camera data to LW, and it would also be cool if LW could send renders/buffers back to the TC.
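
Just to make that concrete, here is a rough sketch of the kind of camera-pose packet such a bridge could stream (Swift, using Apple's Network framework). The JSON layout, field names, host, and port are all made up for illustration; no such protocol exists in either product today.

    import Foundation
    import Network

    // Hypothetical camera-pose packet a Tricaster -> LightWave bridge could send.
    // Field names, units, and the port number are assumptions, not a real NewTek API.
    struct CameraPose: Codable {
        var position: [Double]   // x, y, z in meters
        var rotation: [Double]   // heading, pitch, bank in degrees
        var focalLength: Double  // millimeters
        var timestamp: Double    // seconds, for matching returned renders to frames
    }

    // Fire-and-forget UDP sender; a real bridge would need handshaking and error handling.
    final class PoseSender {
        private let connection: NWConnection

        init(host: String, port: UInt16) {
            connection = NWConnection(host: NWEndpoint.Host(host),
                                      port: NWEndpoint.Port(rawValue: port)!,
                                      using: .udp)
            connection.start(queue: .global())
        }

        func send(_ pose: CameraPose) {
            guard let data = try? JSONEncoder().encode(pose) else { return }
            connection.send(content: data, completion: .contentProcessed { _ in })
        }
    }

    // Usage: stream the live studio camera's pose to a listener on the LW side.
    let sender = PoseSender(host: "192.168.1.50", port: 9010) // hypothetical address/port
    sender.send(CameraPose(position: [0.0, 1.6, -3.0], rotation: [0.0, 5.0, 0.0],
                           focalLength: 35.0, timestamp: Date().timeIntervalSince1970))

UDP keeps the per-packet latency low, which is what you'd want if LW renders are supposed to line up with live frames. The renders/buffers coming back the other way would be a video stream over the network, which is a much bigger job.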

lightscape
05-08-2015, 08:04 AM
I mentioned this on another thread: I am surprised that for virtual sets there isn't a more direct bridge between LW and the TC.
It would be cool if there were a way to send 3D-space camera data to LW, and it would also be cool if LW could send renders/buffers back to the TC.



Agree, there should be a seamless bridge between LightWave and Tricaster. It's a shame C4D was used on these virtual sets for Tricaster:

https://vimeo.com/80266970

http://www.virtualstudio.tv/studios/bespoke-tricaster-virtual-set-design-company

www.virtualset4tv.com/virtual-set-hd-learning-green/

robertoortiz
05-08-2015, 08:33 AM
Wow, that is great work.
And it is madness that NewTek is delivering this NEW market on a silver platter to C4D. A market they should have locked up.

My advice to NT:
develop an app for iOS that will capture camera data and send it via Bluetooth to the Tricaster.
And the Tricaster can send that data to LW.

Why a phone app?
Because you can use the phone's motion sensors to get spatial data when the phone is latched to the camera.
Look at this for a similar idea (https://fstoppers.com/gear/gear-new-iphone-app-dslr-remote-unlike-any-other-5959)
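
For what it's worth, the capture side of that app is not much code. Here is a minimal sketch using CoreMotion; to keep it short it streams over Wi-Fi/UDP rather than Bluetooth, and the host, port, and message layout are hypothetical since nothing on the Tricaster listens for this today.

    import CoreMotion
    import Network

    // Minimal sketch of the phone-on-camera idea: read the device's attitude with
    // CoreMotion and stream it as JSON over UDP. The host, port, and field names
    // are assumptions; the Tricaster has no such listener today.
    let motionManager = CMMotionManager()
    let connection = NWConnection(host: "192.168.1.40", port: 9011, using: .udp)
    connection.start(queue: .global())

    motionManager.deviceMotionUpdateInterval = 1.0 / 30.0  // match a 30 fps camera
    motionManager.startDeviceMotionUpdates(to: OperationQueue()) { motion, _ in
        guard let m = motion else { return }
        let packet: [String: Double] = [
            "roll": m.attitude.roll,   // radians
            "pitch": m.attitude.pitch,
            "yaw": m.attitude.yaw,
            "t": m.timestamp
        ]
        if let data = try? JSONSerialization.data(withJSONObject: packet) {
            connection.send(content: data, completion: .contentProcessed { _ in })
        }
    }

One caveat: the motion sensors alone only give you orientation, so this would track pans and tilts but not dolly moves; you'd need camera-based tracking or encoders on the pedestal to get position as well.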