
Newtek and Virtual Production -> I think we have a winner



Curly_01
05-10-2012, 11:21 AM
Hi,

I just saw a presentation by Zoic Studios at FMX about virtual production.
They are moving to a real-time, on-set rendering system driven by a game engine. I think Zoic Studios rock.
My humble opinion is that Newtek, through technologies like Tricaster and so on, holds better cards to win this game from 'the other big company'. They have far more experience in this. If they are clever, they will put R&D effort into integrating a game engine as a render engine in Lightwave. Because Newtek has so much experience integrating live footage with 3D, here's my message to Newtek.

Newtek, put all your R&D into integrating mocap in Lightwave and adding a game engine as a render option. If they do that -> Lightwave already supports Collada -> that 'other giant company' might lose this game. Game Over Aut*****.

Think about it,

Lightwave was one of the first 3D applications.

LET'S win this game.

Curly_01
05-10-2012, 11:46 AM
Integrate Tricaster Live in Lightwave with GPU rendering.

The competition doesn't have these sorts of toys. Use iPi Soft Kinect mocap to garbage matte out actors or to place virtual actors on the set.
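Just to sketch what I mean: assuming you can grab the Kinect depth frame as a NumPy array already lined up with the colour camera, a depth threshold gets you most of a garbage matte. This is only a rough Python sketch; the function names and the 500-2500 mm range are mine, not anything shipping in a Newtek or iPi Soft API.

import numpy as np

def garbage_matte(depth_mm, near_mm=500, far_mm=2500):
    # Keep pixels whose Kinect depth (millimetres) falls inside the
    # actor's working volume; everything else is matted out.
    matte = np.logical_and(depth_mm > near_mm, depth_mm < far_mm)
    return matte.astype(np.float32)

def composite(frame_rgb, cg_background, depth_mm):
    # frame_rgb and cg_background are H x W x 3 float arrays in [0, 1];
    # depth_mm is the H x W depth frame aligned to the colour camera.
    alpha = garbage_matte(depth_mm)[..., None]
    return alpha * frame_rgb + (1.0 - alpha) * cg_background

That is obviously the crude version; the point is that the depth camera gives you the matte almost for free.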

Curly_01
05-10-2012, 11:54 AM
Say you have designed a set in CAD or in Lightwave. Give Lightwave the ability to camera track set props and place them in the right spot. Go talk to the Syntheyes guys about integrating a camera tracker in Lightwave. I wouldn't mind spending 500 or 800 dollars more on a 3D application that has integrated camera tracking, a GPU game engine renderer and MS Kinect mocap virtual actor stuff.

You can win this game.
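To make the prop idea concrete, here is a rough Python sketch of the "snap the CG stand-in to its surveyed spot" step. It assumes the tracker (Syntheyes or whatever) gives you a 4x4 set-to-scene transform and that you have measured where each prop sits on the physical set; the names are mine, not Syntheyes' or Lightwave's API.

import numpy as np

def place_prop(set_to_scene, prop_pos_set):
    # set_to_scene: 4x4 homogeneous transform solved by the tracker
    # (physical set / survey coordinates -> CG scene coordinates).
    # prop_pos_set: (x, y, z) of the prop measured on the physical set.
    p = np.array([*prop_pos_set, 1.0])
    x, y, z, w = set_to_scene @ p
    return (float(x / w), float(y / w), float(z / w))

# An identity solve leaves the prop where it was surveyed:
print(place_prop(np.eye(4), (1.2, 0.0, 3.5)))   # (1.2, 0.0, 3.5)

Once the camera solve and the survey live in the same space, placing every CG stand-in is just this one transform per prop.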

Dexter2999
05-10-2012, 01:16 PM
Say you have designed a set in CAD or in Lightwave. Give Lightwave the ability to camera track set props and place them in the right spot. Go talk to the Syntheyes guys about integrating a camera tracker in Lightwave. I wouldn't mind spending 500 or 800 dollars more on a 3D application that has integrated camera tracking, a GPU game engine renderer and MS Kinect mocap virtual actor stuff.

You can win this game.

Why track this way? Why not use technology like the Wii? Props have UV markers that are tracked by a camera placed in a location comparable to the camera filming the talent.

Or not.
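That witness-camera idea is easy to prototype, at least the 2D half of it. Here is a rough sketch with OpenCV 4, assuming the markers read as bright blobs in a grayscale witness frame; the brightness threshold is a guess you would tune per rig, and none of this is an existing Newtek feature.

import cv2

def find_marker_centroids(gray_frame, min_brightness=200):
    # Threshold the witness-camera frame so only the bright markers
    # survive, then return each marker's (x, y) pixel centroid.
    _, mask = cv2.threshold(gray_frame, min_brightness, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 0:
            centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids

The hard part is turning those 2D centroids into 3D prop positions, which is where a proper solver like Syntheyes would still earn its keep.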

robertoortiz
05-10-2012, 01:57 PM
Integrate Tricaster Live in Lightwave with GPU rendering.

The competition doesn't have these sorts of toys. Use iPi Soft Kinect mocap to garbage matte out actors or to place virtual actors on the set.

Agreed!