View Full Version : TriCaster-like product for assisting in live action shoots containing CG elements



PabloMack
02-07-2013, 09:51 AM
Ever since watching a "Making of Avatar" documentary, I have been thinking about buying or building a system that will assist in live action shooting for shots that have CG in them. It is very difficult for actors to know where to look and how to orient their body parts in order to interact with the CG elements that are not present in the shoot but will be added in post. Being involved in film-making myself, I am seeing how important this is for making a successful project. While searching on the web, I see other people wanting the same sort of thing.

On the surface, TriCaster seemed to fit the bill. But upon closer inspection, its architecture seems to be wrong for this purpose. It is targeted at live newscasts. In film making with CG, the output monitor would not need to be recorded except for demonstration purposes and for instant replay for the Director's benefit in evaluating the "take". All that this system would be used for is to obtain "correct" original live footage that would then go into post. A titling system would most likely be used as a teleprompter, if anything at all. Its output would probably need to be on a separate monitor, though.

The system would also be layered as in TriCaster, but the main video would not be in the background; it would be the middle layer. The "Virtual Set" would be a combination of what you would put in a layer behind the main video and another layer you would put in front of the main video. Each would be one of 1) a still image, 2) stored video, or 3) a video stream taken from either a physical connection or a screen grab from a computer across the network. This system has no need for transitions and other such effects. The main video might be used for camera tracking, so a way to pan and zoom the foreground and background layers in coordination with the main video would be extremely useful for fixed tripod shots. This would make a good 2D system.
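The layering I'm describing can be sketched in a few lines of code. This is just a toy illustration of the idea, not any product's actual pipeline; the key color, threshold, and frame representation (nested lists of RGB tuples) are all simplifying assumptions on my part:

```python
# Minimal three-layer composite: background behind, chroma-keyed live
# video in the middle, foreground (with its own opacity mask) on top.

KEY_COLOR = (0, 255, 0)   # pure green screen (illustrative choice)
THRESHOLD = 100           # total per-channel distance counted as "key"

def is_key(pixel):
    """True if this pixel is close enough to the key color to be transparent."""
    return sum(abs(c - k) for c, k in zip(pixel, KEY_COLOR)) < THRESHOLD

def composite(background, live, foreground, fg_mask):
    """Combine the three layers, row by row, pixel by pixel.
    fg_mask is True wherever the foreground layer is opaque."""
    out = []
    for bg_row, live_row, fg_row, m_row in zip(background, live, foreground, fg_mask):
        row = []
        for bg, lv, fg, opaque in zip(bg_row, live_row, fg_row, m_row):
            mid = bg if is_key(lv) else lv   # key the live layer over the background
            row.append(fg if opaque else mid)  # opaque foreground wins on top
        out.append(row)
    return out
```

Where the live pixel matches the key, the background layer shows through; where the foreground mask is opaque, it covers everything, which is what lets a virtual set element pass in front of the actor.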

"Chroma Key Live" is a free software package that is targetted at this market. It provides both background and foreground layers between which you can sandwich your main chroma keyed video. But this software is pretty limited in that it doesn't come with a large set of choices for where your video streams come from or where they go to. I would want to be able to use a still (many formats), a video (different resolutions and encodings), feeds from cameras as well as streams coming from a network for all inputs. The ability to create a live webcast is nice but is not necessary in this product. The main difference between TriCaster and this system is that it is used to assist in making a film but, once the shots are "in the can", the images produced by the system are not used. Only the original video taken by the camcorders are used. Sadly, Chroma Key Live only runs on a Mac and I am a Windows man so I will probably never use it.

What I have described above is a 2D system. A 3D system would ideally be able to import a Lightwave Scene. A minimal 3D system would be able to superimpose the live video and the virtual set (Lightwave Scene) so that the director, cameraman and actors can see where they are in relation to the CG elements. Camera tracking would allow the CG Virtual Camera to follow the live camera and adjust the virtual set accordingly. Obviously, a good 3D system would take a great deal of thought and sophistication to approximate the image that is intended to be created in post.
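One small piece of such a 3D system, telling an actor where a CG element will land in frame, can be sketched with a plain pinhole projection. The yaw-only camera rotation, the function name, and the parameter choices here are my own simplifying assumptions, not how Lightwave or any real tracker does it:

```python
import math

def project(point, cam_pos, cam_yaw_deg, focal_px, center):
    """Project a 3D world point (x, y, z) into pixel coordinates of a
    camera at cam_pos, rotated cam_yaw_deg about the vertical axis,
    using a simple pinhole model. Returns None if behind the camera."""
    # translate the point into camera-relative coordinates
    x = point[0] - cam_pos[0]
    y = point[1] - cam_pos[1]
    z = point[2] - cam_pos[2]
    # rotate into camera space so the camera looks down +Z (yaw-only model)
    yaw = math.radians(cam_yaw_deg)
    cx = x * math.cos(yaw) - z * math.sin(yaw)
    cz = x * math.sin(yaw) + z * math.cos(yaw)
    if cz <= 0:
        return None  # the point is behind the camera, nothing to mark
    # perspective divide, then offset to the image center
    u = center[0] + focal_px * cx / cz
    v = center[1] - focal_px * y / cz
    return (u, v)
```

If the tracker feeds live camera position and yaw into something like this each frame, an on-set monitor could draw a marker where the CG element will be, which is exactly the "where do I look" problem the actors have.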

Certainly it would not be appropriate to call this system "TriCaster" because of its different focus. Perhaps something more like "CG Live" would better fit the bill. Please think about it. This might lead to a whole new product line! Certainly, a lot of the internals could use the same elements that have already been engineered for TriCaster.

SBowie
02-07-2013, 10:11 AM
Since the advent of 'VAD' (thanks, Rob!), this has been an interesting notion. You're right that TriCaster provides a lot of things that would be overkill for this purpose, though it does fit some of the fundamental requirements. I want to correct what appears to be one misunderstanding about TriCaster's LiveSet capabilities though:


The system would also be layered as in TriCaster, but the main video would not be in the background but would be the middle layer. The "Virtual Set" would be a combination of what you would put in a layer behind the main video and another layer you would put in front of the main video. Each would be one of 1) still image, 2) stored video, 3) video stream taken from either a physical connection or a screen grab from a computer across the network.

TriCaster's LiveSet technology can do this (and a great deal more). In fact, as I've indicated in another thread, TriCaster can achieve these limited goals without LiveSet.

Unlike LightWave's virtual studio tools, however, the LiveSet implementation is 2D; so while it can simulate pan, pedestal and zoom movement to a degree, these effects are really performed from a fixed viewpoint in the set (thus no parallax), nor can it perform realtime match moving. (There are a number of existing studio systems that can do this, of course, but the ones I've seen are quite expensive.)

A marriage of selected TriCaster, VPR and LW virtual set tech would be a thing of wonder indeed. That said, I suspect that with current-gen 'affordable' hardware, you'd be forced to endure a less than ideal framerate, among other things. This is not to say it isn't well worth keeping in mind, as tech advances rapidly and further LiveSet advancements appear.

jmmultex
02-07-2013, 07:06 PM
Since I've gotten to spend some time with it, I've definitely felt that the dynamic motion tracking found on the 8000 is just the start of something new in the virtual set area for NewTek - almost like a bare-bones version of something much more substantial that NewTek might be working on. Move the tracking from 2D to 3D - like a feed from an external source - and a whole lot more can start to happen with Virtual Sets (think VizRT's Viz Virtual Studio or Orad ProSets - but uniquely NewTek).

While a lot of this can be computationally demanding, it doesn't all have to happen on one box. Combine that with the increasing power of CPUs, GPUs, and custom ASICs, and the NewTek ecosystem can start to embrace a range of capabilities that today fall into the domain of exclusive, high-end, and EXPENSIVE systems providers.

I think what you are looking for could certainly be realized in the near term in a TriCaster-like platform - and NewTek seems perfectly positioned to make it happen.

Best,
John