
View Full Version : Tricaster Broadcast needs additional features



cezaryg
01-27-2009, 02:26 PM
Posted in the TriCaster forum under: TriCaster Broadcast unstable and freezes
Suggested to share ideas here.

----------------------------------------------------------------

OK, I just bought the TriCaster Broadcast, and after a week of preparation for work the system is still unstable, suddenly freezing, a few times even with the infamous "blue screen of death." Those freezes are always so severe that even the so-called three-finger salute (Ctrl-Alt-Del) won't reboot the machine. A hard boot is the only option available, with no guarantee of recovering the Broadcast interface after the first attempt.

I was astonished to learn that NewTek's top-of-the-line $12k system has only 1GB of RAM onboard (inexpensive 667MHz Kingston DDR2 ValueRAM, available for $17 on Newegg), and that the video card has only 256MB of memory. No wonder something had to happen when I went crazy "overloading" those resources by loading approximately 10 .AVI clips (around 60MB each) into each of DDR 1 and DDR 2, plugging in 3 cams, overlaying a few simple lower thirds, spicing it with iVGA, and, ouch, streaming it live.

NewTek Tech Support S. told me it is OK to add memory, but to this day I'm still waiting for the specs. Tech Support D. told me adding more memory would be violating specs. I suppose that's like saying riding a mountain bike on paved roads violates its specs.

Eventually D. provided me with a motherboard model number.

From people at Intel I learned that the mobo's correct model number is DG33BU, along with a few things, particularly regarding memory. I'm very disappointed with the NewTek support team: unhelpful and irritable toward the customer, even argumentative. I am refraining from details at this time.

The bottom line: regardless of the explanations I got from the support group, this expensive equipment is, according to the sales pitch, meant for at least quasi-professional work. I cannot imagine what is going to happen to me if the system keeps freezing during a live broadcast.

But it is obvious to me that if someone is asking $12k, it would be nice to toss in 4GB of RAM for free, even if a 32-bit OS would recognize only 3GB. Nowadays 2 x 2GB of RAM costs only $40 total with free shipping, you know where from. My HP Pavilion notebook has 4GB of RAM and 512MB of dedicated video RAM. And it does not freeze during live Internet broadcasts using 3 cams and $500 software!

Nowadays it is easy to get a good 512MB video card for $35, or one with 1GB of dedicated GDDR3 VRAM for $50. Don't tell me we couldn't spare $17, the street price of the TriCaster Broadcast's video card; that's a restaurant tip. What about utilizing 2 video cards in SLI for even more muscle?

A few other thoughts:

(1) When, during a live show, a guest's name is too long to fit the template's space, it would be nice to be able to make minor tweaks in the selected "Text" tab, like moving and repositioning, in addition to typing over. Currently, when you are on-air and hot, don't even try clicking the "Edit Text" tab: it will freeze DDR 1 and 2. When I suggested this feature, Support told me to go get LiveText.

(2) In order to get a high-quality signal distributed by a broadcasting server, the encoded video sent to the server should be at a higher bitrate than is currently available (573Kbps is the TriCaster's maximum). I am suggesting options for higher rates, even 1.2Mbps.

(3) It would be nice to have an additional resettable timer. Info like the NTSC 16:9 ISO mode and the clock is cool if your customer is looking over your shoulder, but it is redundant, even unneeded gadgetry. If you look at the interface you know which ISO you are in, and, guess what, you know exactly whether you shelled out $12k or $15k for a PAL/SECAM/NTSC unit. What a director or producer really needs to know during a live production or broadcast is how much time is left when a segment must be, say, 25 minutes long.

(4) It would be nice to have those little DDR play-head indicators in, say, RED. They are currently gray, and it is hard to see which clip you are on, and where, when doing live production and buried under a thousand things at once.

(5) It would be nice if, in iVGA mode, the cursor disappeared from the captured external computer screen after, say, 1 second and reappeared when the mouse moves.

(6) It would be nice to have HDMI interfaces in addition to SDI in a future HiDef TriCaster release. Am I being redundant here? Sorry. We need HiDef ASAP.

cezaryg
01-27-2009, 02:28 PM
Added post after comments:

Thank you all for the responses. In general my expectations of this great piece of equipment are high, and, I believe, the NewTek designers and integrators could make this a stellar product. To make this happen you do not need to restrict the product's use.

The TriCaster is a computer, just not one for word processing and similar tasks.

As it is currently built on a decent Intel DG33BU mobo, this computer, integrated as a powerful TV switcher, has inherent room for growth: the RAM can be populated up to 8GB of 1.8V DDR2 800/667MHz SPD memory (in case NewTek would consider a 64-bit OS for the TriCaster to fully utilize 8GB of RAM; that is another of my feature suggestions).

I respectfully disagree with each and every opinion that running such a resource hog on 1GB of RAM is sufficient. RAM quantity and video RAM quantity do matter, and video card quality matters too. For example, Nvidia's SLI (Scalable Link Interface) lets two or more PCIe Nvidia GPU video cards share the workload when rendering.

Realistically, as the system currently stands, it could be running 64-bit, with plenty of today's inexpensive RAM, a powerful video card or two in SLI, and a striped array of 2 SATA 3.0Gb/s 7200RPM hard drives; that is what this rig currently has or could be expanded to. If properly integrated, the on-the-fly tweaks to templates from the Edit Text tab would not stop the DDRs. In fact, there is room for even a 3rd HD, for RAID 1 (mirroring) in addition to RAID 0 (striping). This would add redundancy to the system: if one hard drive fails, the system keeps running. You know, the show must go on... :)

For live TV broadcasting, being able to at least move and reposition text is a must. Every production has inherent surprises, and, especially when the show is live and hot, flexibility in a system is a must. Otherwise it is not up to par.

In addition, LiveText requires at least one more piece of computer gear in the sometimes already limited space available on a remote shoot. For simple tweaking when time is of the essence, my suggestion has merit.

From a director's and producer's standpoint, it is important to know how much time is left when you are producing a live, hot, timed video segment. If cars have a user-resettable trip counter, why doesn't a professional TriCaster have a user-resettable timer?

Not to delve into tech details: when you make your content available for the Web, you send it to the broadcasting servers first. And this signal must be the best, since garbage in, garbage out. The TriCaster's 573Kbps limit is bad. The feed to the broadcasting server must be at a higher bitrate in order to give the server the desired quality to work with when rebroadcasting.

Lastly, VT[5] is a great piece of equipment, but again not of much use on a remote shoot because it calls for serious gear. Wasn't portability the critical component of the TriCaster's success?

Finally, I believe I am not being unreasonable, as part of the NewTek TriCaster family, in asking to max out the juice from the equipment. By the way, Pizazz, here is the link to Sony's very successful newest line of professional ENG camcorders equipped with HDMI interfaces; see the HVR-Z7U, PMW-EX1/EX3, and HVR-S270U: http://pro.sony.com/bbsc/ssr/cat-broadcastcameras/cat-hdv/

I do not endorse Sony, nor am I being paid to advertise them. :)

Thanks.

csandy
01-30-2009, 08:08 AM
You should give customer service a call and/or talk to upper management. You're a customer, not a pest.

Quiet1onTheSet
01-30-2009, 09:13 AM
Posted in the TriCaster forum under: TriCaster Broadcast unstable and freezes
Suggested to share ideas here.
I'm utterly amazed at what's going on here, cezaryg. Your approach to this matter seems devoid of understanding, and lacking in prudence relative to your stated need for help, causing me to stop reading the rant after a mere cursory glance.

When a NewTek staffer provided you some honest specs about your system, you would've served yourself well not to falsely assume the products NewTek delivers are under-powered: you're utterly mistaken.
:tsktsk:
TriCaster Portable Live Production systems are hardware/software production switchers and editing solutions which are extremely efficient at the fantastic multitasking they're designed for -- and to aid that process, TriCaster units have a uniquely-tuned version of Windows XP Professional operating behind the curtain, to give us the magic we're enjoying.

The fact that Windows PCs and Macs generally need hoards of RAM and gobs of CPU power to do video effectively :compbeati
in no way suggests that a NewTek-derived multimedia production and post system needs the same brute force.

'Fact is, the contrary is true.
:thumbsup:

Therefore, the flaming comments derived from an apparent ignorance about display RAM, CPU power and system memory requirements for TriCaster systems are woefully off-base.
:hey:

Right from the start, you haven't indicated to us what make/model of power conditioning device you're employing at the power cord end of your TriCaster BROADCAST(tm). Is it possible that you're not using a device with automatic voltage regulation circuitry at the very minimum, for helping you in supplying clean, stable power to your unit?

Secondly, is it possible that some cards or RAM chips aren't fully seated, due to some shake-up during transport?

Thirdly, to what degree has your dealer (NewTek-Authorized, I trust) been helpful?

Finally, rather than complain at this juncture -- about what you suspect the unit lacks, you'd be best off concentrating your efforts on getting the help you really need now; else, your seeming posturing in these forums is suspect, leading some to question whether or not you're masquerading as a disgruntled customer, while in fact, doing the dirty work as an agent for a NewTek competitor-wanna-be.

Assuming that's not the case however, we gladly welcome you to the NewTek user community :dance:
...and if you're bent on playing nice, you'll soon discover that NewTek and all of us here, are more than glad to be of effectual help.
:bowdown:

csandy
01-30-2009, 12:15 PM
Quiet one, huh? I suppose that name is sort of like "Little John" (the biggest of the Merry Men).

I don't think it is welcoming to berate 1) a new customer 2) a customer that has been on the forum for all of two minutes.

Also, before taking such a powerful and argumentative stance, it would be prudent to read ALL of someone's comment. There may be a lot of ranting due to warranted frustration, and some good suggestions to boot.

To say "Your approach to this matter seems to be void of understanding, and lacking in prudence relative to your stated need for help, causing me to stop reading the rant after a mere cursory glance" would only perpetuate a cycle of name calling, finger pointing, and misunderstanding.

It's not professional.

Presumably, someone who pays upwards of $10,000 for a single video device is at the very least a serious amateur. Likely, someone who buys a device with professional components like a serial digital interface works in some capacity as a professional. I believe the poster here is venting his frustration at what he deemed to be a professional device marketed to professionals.

I think the rant stems not from the fact that there are problems with the product, but from his treatment as a customer when he sought help for the problem.

As a manager, I know that many customer grievances can be allayed by providing good customer service, being courteous, actively listening, and offering reasonable solutions. This saves my company time and money and builds goodwill with the consumer.

Good customer service keeps complaints off the Internet and telephones and out of huddles at conferences. Satisfied customers enhance your product's marketing efforts by becoming mini-evangelists who will tout your wares. Remember the original Toaster and the excitement it generated? What was more powerful than a friend or colleague telling (or showing) you what an amazing and revolutionary (indeed, paradigm-shifting) device it was? There were no chat forums (okay, you had FidoNet, BBSes and the like, but nothing compared to today), Twitter, blogs, global e-mail, or the other bidirectional communication tools that are so ubiquitous today.

Reputations and goodwill have become more important, not less, in today's Web 2.0 world and beyond.

Finally, whether you believe in His word or not, the teachings in Proverbs ring true "A gentle answer turns anger away. But mean words stir up anger." (Prov. 15:1 NIV) What a powerful customer service statement.

It's not necessarily what is said, but how. Undoubtedly cezaryg had some issues and hopefully he can find some solutions. I believe this forum is to help users help themselves, provide feedback to NewTek staff, and receive information from NewTek itself. It seems to me that in a section that calls for discussion on "TriCaster - Feature Requests" one would expect to find some discussion on what a user feels the product is lacking.

I think too often in this thread people are verbally maligned for requesting some feature or other, only to be told they don't know enough about the product, really don't need the feature they request, or are simply out of touch. For a discussion area that is, at least on its face, an invitation to brainstorm and help the manufacturer enhance its product, such criticism is antithetical to the section's intended purpose.

More helpful would be a response like this: "hey - I see you're new and perhaps have not tried the following approach to solve your problem" or "there is a little known feature that does exactly that!" or "here's a NewTek product that's right up your alley, and it will do what you require with flying colors!"

All in all, it doesn't hurt to err on the side of civility.

rally1
01-30-2009, 01:45 PM
Typical Quiet1onTheSet stuff.

Quiet1onTheSet
01-30-2009, 01:52 PM
Typical Quiet1onTheSet stuff.

While his words are betimes mistaken for non-civility, it's easy to see how such a perception is derived.
(Much too passionate, with ill-chosen words and an overwhelming verbosity -- such can be that Quiet1onTheSet character!)

He in fact meant to convey that the erroneous suggestions about the product needed to be challenged, from the standpoint of directing the newbie away from traditional thinking, relative to RAM, CPU and other system demands, for video production and post. That's all.

As well, csandy's defense of cezaryg is understandable! On the other hand, cezaryg has made some gaffes along the way, relative to his formulation of perceptions.

That besides, please forgive Q1's gaffe in unwittingly conveying the misguided notion that cezaryg's frustration doesn't warrant empathy. In such frustration, there's a need to separate fact from feeling. That's the motivation of Q1's heart on this.

OK, gang. Ready for some fun?

Quiet1onTheSet
01-30-2009, 02:08 PM
Added post after comments:

Thank you all for the responses. In general my expectations of this great piece of equipment are high, and, I believe, the NewTek designers and integrators could make this a stellar product.
They "...could make this a stellar product"!?
Hmmmp! :D
[followed by an outburst of laughter]
:jester:
But even to the exclusion of TriCaster BROADCAST(tm), the TriCaster product line already is stellar!

To make this happen you do not need to restrict the product's use. The TriCaster is a computer, just not one for word processing and similar tasks.
Ex-*squeeze* me? Sure, it's based on a computer system, but make no mistake about it: TriCaster is a production, post-production, live-streaming and presentation tool; it's not designed to be a general-use "computer" any more than my wife's Honda Civic is designed to be a transport for 7 persons.

I'm wearing a wristwatch that's digitally based. It's a computer -- with additional elements in the mix, all designed for a certain function, that doesn't include spreadsheets or word processing.
:tcicon:


As it is currently built on a decent Intel DG33BU mobo, this computer, integrated as a powerful TV switcher, has inherent room for growth: the RAM can be populated up to 8GB of 1.8V DDR2 800/667MHz SPD memory...

Yeah, and your wristwatch could be made to run for years on a Sears DieHard capable of powering Big Ben. Say on --

(in case NewTek would consider a 64-bit OS for the TriCaster to fully utilize 8GB of RAM; that is another of my feature suggestions).
Please, NewTek! Don't!
:thumbsdow
Please don't go 64-bit until you've got a license to strip Windows 7 down to essentials -- or till some other 64-bit OS comes along that's solid, won't freak with disk drives, won't hog up RAM and is otherwise worthy of hosting a system that bears the "TriCaster(tm)" moniker!


I respectfully disagree with each and every opinion that running such a resource hog on 1GB of RAM is sufficient.

You disagree with *each and every* opinion on this? This isn't mere opinion, friend. It's fact. And you dare suggest TriCaster's a "resource hog"!!?? Would you not define, say, Windows Vista 64 as such -- or are you so much in bed with Microsoft that you can't bring yourself to admit your foul play on this one?

Your opinions are indeed deserving of a modicum of respect, notwithstanding -- it would be an eye-opener for you, perhaps, if you'd check the goings-on in Windows Task Manager while switching away, playing back a clip, streaming, and recording. Be prepared to be amazed at the efficiency of TriCaster BROADCAST.
:)


RAM quantity and video RAM quantity do matter.

And if your wristwatch draws only 2 milliamperes, you wouldn't want to assume that supplying a battery that can deliver 5 watts of juice would make your watch run faster (I know, such a ridiculous analogy).
;D


Video card quality matters too. For example, Nvidia's SLI (Scalable Link Interface) lets two or more PCIe Nvidia GPU video cards share the workload when rendering.

OK. Interested in an increase in functionality and power, so as to play highly-detailed, top-performance games on it, eh?

You might go for a PlayStation 3(tm) in your production workflow -- better yet, pull the trigger and invest in a multi-core Alienware laptop, and send that across your iVGA channel. While you're at it, it might be prudent to install NewTek's LiveText(tm) on that, and you'd be *tight*. You'll then enjoy slick, remotely generated, motion-laden text, manipulated via the virtual set system on the Effects bus if you like -- all during your live-produced show.

For anyone sharing your stated requirements, that arrangement would be a boon.
:newtek:


Realistically, as the system currently stands, it could be running 64-bit, with plenty of today's inexpensive RAM, a powerful video card or two in SLI, and a striped array of 2 SATA 3.0Gb/s 7200RPM hard drives; that is what this rig currently has or could be expanded to.

-- And my wife's Civic should've been developed for limousine service.

If properly integrated, the on-the-fly tweaks to templates from the Edit Text tab would not stop the DDRs.

Wait. Like me early on in my TriCaster exploits, you're missing the point that the Edit Text facility is a pre-production utility. Let's not make it out to be what it isn't.

It's not for Production use, so we ought not infer then, that it's somehow inadequate for that which it wasn't designed. That's going overboard.

In fact, there is room for even a 3rd HD, for RAID 1 (mirroring) in addition to RAID 0 (striping). This would add redundancy to the system.

Well, sure. But as I've alluded to before, there's room in my wife's Civic for 3 people in the trunk. That would almost double up on the capacity, now wouldn't it; plus, those car-poolers wouldn't have ta' spend a dime on gasoline, right? Only they'd experience certain death from carbon-monoxide inhalation, no? Please continue -- you've got my ear!
:screwy:


If one hard drive fails, the system keeps running. You know, the show must go on... :)

Neat suggestion; of course, we've got to keep the system cost in check. Until such expansion is offered, one could employ a backup device like, say, a DVD set-top recorder, a hard-disk recorder, a Sony DVDit, or even an S-VHS VCR. They're usable as backup devices.


For live TV broadcasting, being able to at least move and reposition text is a must. Every production has inherent surprises, and, especially when the show is live and hot, flexibility in a system is a must. Otherwise it is not up to par.

Every broadcast has surprises? Oh! You mean, like, if the graphics guy's spelling and/or grammar is a bit short of accurate, he'd really screw up the program, right? Well, yeah. That's a bummer. No amount of disk recording space is gonna' fix that.

Of course that's not what you meant...


In addition, LiveText requires at least one more piece of computer gear in the sometimes already limited space available on a remote shoot. For simple tweaking when time is of the essence, my suggestion has merit.

And for space limitations, my Alienware laptop suggestion has merit, too, no? Did I mention you could use it as an uber-fast gaming machine? Plus, you could have ginormous disk space on it -- and, depending on model, in a RAID configuration, too!


From a director's and producer's standpoint, it is important to know how much time is left when you are producing a live, hot, timed video segment.

Yeah. Good for the director/producer to have a timer, stopwatch, clock or wristwatch nearby.


If cars have a user-resettable trip counter, why doesn't a professional TriCaster have a user-resettable timer?

For the same reason that the Civic has PAR quartz lighting (i.e., for headlamps) and TriCaster BROADCAST does not; but it does have a resettable record button, and the disk content can be deleted, thereby resetting the mileage you've put on the drive.

Besides, producer/directors will have ensured that some entity outside of the switcher manufacturer has supplied the lighting.


Not to delve into tech details: when you make your content available for the Web, you send it to the broadcasting servers first.

In *every* web-delivery scenario? Hmmmmm.


And this signal must be the best, since garbage in, garbage out.

The *best*, eh? *Always*? Even when the resultant signal's going to be viewed on my BlackBerry STORM(tm)? Nah. I don't *think* so -- but you have the floor --

The TriCaster's 573Kbps limit is bad. The feed to the broadcasting server must be at a higher bitrate in order to give the server the desired quality to work with when rebroadcasting.

Naw. Check this:

You can use the Windows Media streaming application directly: go into your Admin panel, select All Programs, and navigate to the Windows Media Encoder app; set it up for the bitrate and other parameters you wish to employ, rather than relying on the presets resident in your TriCaster BROADCAST(tm) Record/Stream tab.
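
For the scripting-inclined, here is a rough sketch of the same idea driven from Python via the encoder's COM automation interface (requires pywin32 on the unit's Windows XP). The ProgID and object-model names below follow the WME 9 SDK as best I recall; treat every name here as an assumption to verify against the SDK docs before relying on it.

    # Rough sketch: list the Windows Media Encoder 9 system profiles from
    # Python over COM (pywin32), to find one with a higher total bitrate
    # than the TriCaster's 573 Kbps preset (say, around 1.2 Mbps).
    # Object and property names are assumed from the WME 9 SDK -- verify.
    import win32com.client

    enc = win32com.client.Dispatch("WMEncEng.WMEncoder")  # WME automation object

    profiles = enc.ProfileCollection
    for i in range(profiles.Count):
        # Print each installed profile name; pick a higher-bitrate one to
        # use in place of the TriCaster Record/Stream preset.
        print(profiles.Item(i).Name)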

Plus, have you fathomed just what kind of quality you might deliver to a hosting service if you were emboldened to utilize that nifty Serial Digital output and the AES/EBU audio outputs on your kuel machine, for processing and subsequent delivery to a server?
Heck!
*You'd* be kuel.
Real kuel.

In all seriousness, your enthusiasm for the product's expansion and improvement possibilities is welcome and worthy of serious consideration.

Moreover, you're to be envied, for having landed a TriCaster BROADCAST(tm) in your arsenal.
:thumbsup:

cezaryg
01-31-2009, 11:21 AM
Thank you all. One more feature request for the TriCaster family of products, in particular the Studio and the Broadcast: JIB functionality added to LiveSet. The virtual camera would move, similar to "Stereo Panning," but as a real-time feature for live production rather than post. The JIB would pan, truck, etc., following the live switched camera angles in the LiveSet.

Quiet1onTheSet
01-31-2009, 11:31 AM
Thank you all. One more feature request for the TriCaster family of products, in particular the Studio and the Broadcast: JIB functionality added to LiveSet. The virtual camera would move, similar to "Stereo Panning," but as a real-time feature for live production rather than post. The JIB would pan, truck, etc., following the live switched camera angles in the LiveSet.

You're quite welcome, and thanks for being a good sport, cezaryg.

You mean generation of faked "...pans, trucking and arcing motion"? :agree: -- provided there could be some way to accomplish this pseudo-realistically*, by realtime-warping the scene and talent, shadows, reflections, etc., in some psychologically convincing manner.

That, I'm sure, is what many neophytes to LiveSET(tm) have had a fancy for (I myself have).


*I know -- what an oxymoron, right?...

cezaryg
01-31-2009, 12:13 PM
The Virtual JIB, or Virtual Camera, would not be exactly like the user-defined 3-D sound manipulation in "Stereo Panning." I am envisioning this feature for live production, not for post-production.

The Virtual JIB would pan, tilt, truck, dolly, crane, etc., smoothly moving (tweening) between the switched camera angles selected in the LiveSet during live production. This could be done by software animation computations between the key-frames -- the LiveSet's currently available 4 camera angles. The whole feature should be accessible to the user as an additional button.
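
To make the tweening idea concrete, here is a minimal sketch in Python of what the interpolation math could look like: easing a virtual camera between two of the LiveSet's fixed angles over a one-second move. The pose format, units, and ease curve are purely my own illustrative assumptions, not anything NewTek exposes.

    # Minimal key-frame tweening sketch between two camera poses
    # (x, y, z position plus pan angle h). Hypothetical, not NewTek code.
    def ease(t):
        # Smoothstep ease-in/ease-out so the move starts and ends softly.
        return t * t * (3.0 - 2.0 * t)

    def tween(pose_a, pose_b, frames=30):
        """Yield one interpolated pose per frame, from pose_a to pose_b."""
        for f in range(frames + 1):
            t = ease(f / frames)
            yield tuple(a + (b - a) * t for a, b in zip(pose_a, pose_b))

    # Example: a 30-frame (1-second) move between two LiveSet angles.
    angle_1 = (0.0, 1.5, -4.0, 0.0)   # x, y, z, pan -- assumed units
    angle_2 = (1.0, 1.5, -3.0, 15.0)
    for pose in tween(angle_1, angle_2):
        pass  # each pose would drive one rendered frame of the virtual move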

ted
01-31-2009, 07:23 PM
I had a client ask me that very question Thursday. Can you zoom and pan the shot?
I told him... "it ain't that gig". :D
I told him to up the budget and I could do that. :thumbsup:

That would be great, but I wonder what it would add to TriCaster.
Maybe a higher end TriCaster type device???

DeanAU
02-01-2009, 01:29 AM
The Virtual JIB, or Virtual Camera, would not be exactly like the user-defined 3-D sound manipulation in "Stereo Panning." I am envisioning this feature for live production, not for post-production.

The Virtual JIB would pan, tilt, truck, dolly, crane, etc., smoothly moving (tweening) between the switched camera angles selected in the LiveSet during live production. This could be done by software animation computations between the key-frames -- the LiveSet's currently available 4 camera angles. The whole feature should be accessible to the user as an additional button.


That would be something special.

billmi
02-02-2009, 09:53 AM
Thank you all. One more feature request for the TriCaster family of products, in particular the Studio and the Broadcast: JIB functionality added to LiveSet. The virtual camera would move, similar to "Stereo Panning," but as a real-time feature for live production rather than post. The JIB would pan, truck, etc., following the live switched camera angles in the LiveSet.

In theory, this is only possible if the virtual camera makes no significant change of aspect on the subject; the moment it does, the subject will look like a cardboard cut-out -- or you'd need a slew of cameras to extrapolate from, a la the ring camera systems used to create virtual camera arcing in The Matrix.

As to the original post: it amazes me that, as such a relatively new user, cezaryg has developed such a thorough understanding of how the TriCaster software, and the custom-tuned version of Windows it runs on, allocate RAM and video RAM resources as to know what its requirements are. His ability to understand it better than the TriCaster design team is downright uncanny, if not perhaps unbelievable.

As for NewTek's reluctance to offer advice on upgrading the TriCaster internally: that does not surprise me at all. It is built and sold as a plug-and-play device, not a system with user-configurable hardware.

If someone is looking for a system that can support all sorts of hardware and software upgrades from a wide variety of vendors, then VT[5] would be the product to choose over the TriCaster; it utilizes the same core hardware, but is built to go into dealer- or user-configured computers. There is more to learn when putting VT[5] to use, and in selecting the proper hardware to support it, but the trade-off is more power and flexibility.

Quiet1onTheSet
02-02-2009, 11:28 AM
In theory, this is only possible if the virtual camera makes no significant change of aspect on the subject; the moment it does, the subject will look like a cardboard cut-out -- or you'd need a slew of cameras to extrapolate from, a la the ring camera systems used to create virtual camera arcing in The Matrix.

That fact, billmi, coupled with Ted's suggestion that adding such functionality might bear a significant cost penalty, has given me pause, so as not to insist on NewTek making that feature request a top priority. It's an awesome feat to pull off convincingly.

I'm not so sure neophytes to TriCaster, at least, adequately apprehend the enormity of such a proposed software development project; and as suggested earlier, the sheer presence of NewTek specialties, such as real-time synthesized reflections, soft shadows, warping and the like, would make an already-difficult pseudo trucking/panning/arcing implementation downright insurmountable, no?

(I'd pay Lead Programmer, Dr. Andrew Cross a huge "Thank You" if he could somehow find the time to comment on this.)

Let's face it: NewTek's industry-shattering LiveSet(tm) technology ain't no post-production-only jaunt limited to the 2D domain; as a 3D live feature with a number of cool twists included, it's far more sophisticated than that.

This distinction, say -- from the Adobe(R)-acquired "Ultra" product, is often overlooked by casual observers and neophyte students of LiveSet(tm), I believe.

I make this latter point because this writer was guilty of such faulty surmising, so as to believe that what cezaryg is asking for should've been feasible (dare I confess this?) and utterly included in the original TriCaster STUDIO(tm) release a couple of years back.

My drunken excitement over the sheer power and cost-savings potential of LiveSet(tm), over any other live, 3-dimensional virtual set system anywhere and at any price, notwithstanding, I sober-mindedly (such rarity for Q1!) dared not ask for this back then.

Quiet1onTheSet
02-02-2009, 01:17 PM
As to the original post: it amazes me that, as such a relatively new user, cezaryg has developed such a thorough understanding of how the TriCaster software, and the custom-tuned version of Windows it runs on, allocate RAM and video RAM resources as to know what its requirements are.

His ability to understand it better than the TriCaster design team is downright uncanny, if not perhaps unbelievable.

Ahhhh -- such witty sarcasm is refreshingly substantive and can be employed as a useful, albeit humorous, device for corroborating an earlier, acute observation made by Quiet1onTheSet, which our friends csandy and rally1 may have chosen to overlook.

Quiet1onTheSet
02-02-2009, 01:34 PM
As for NewTek's reluctance to offer advice on upgrading the TriCaster internally: that does not surprise me at all. It is built and sold as a plug-and-play device, not a system with user-configurable hardware.

How true.

Otherwise there would doubtless be additional obstacles and added complexities in NewTek's continued ability to provide TriCaster Portable Live Production System end-users that legendary Limited Lifetime Warranty service they're known for, with quick-turnaround aplomb.

ACross
02-02-2009, 03:24 PM
The Virtual JIB, or Virtual Camera, would not be exactly like the user-defined 3-D sound manipulation in "Stereo Panning." I am envisioning this feature for live production, not for post-production.

The Virtual JIB would pan, tilt, truck, dolly, crane, etc., smoothly moving (tweening) between the switched camera angles selected in the LiveSet during live production. This could be done by software animation computations between the key-frames -- the LiveSet's currently available 4 camera angles. The whole feature should be accessible to the user as an additional button.

This is nowhere near as easy as it sounds. All current computers and rendering systems tend to have limits which boil down to the fact that current computer hardware is simply not fast enough to perform real-time ray-tracing, radiosity (etc...) at 30 frames a second (or even anywhere close.) LiveSet makes almost no compromises in what it can render, but it does compromise in that it needs to be a single fixed view ... allowing us to do a whole bunch of clever pre-processing on the frame so that we can then re-render it incredibly fast. I should also highlight that in practice, LiveSet cannot use more than about 10% of the CPU to render and key, because we need the rest for DDRs, on-screen previews, streaming, recording, etc...

Unfortunately this does not mix well with the concept of allowing the camera to be moved freely around.

That said ... we are continually developing our technology to do radical new things, so who knows what we will see in the future. Just rest assured that this is a complex problem ... if it were not, it would have been solved by everyone else already!

Andrew
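
As a toy illustration of the "pre-process once, re-render fast" idea Andrew describes: because the virtual view is fixed, an expensive per-pixel mapping can be baked into a lookup table once, then replayed cheaply on every live frame. This Python/NumPy sketch shows only the general technique; NewTek's actual LiveSet pipeline is not public and is surely far more sophisticated.

    # Toy sketch of the fixed-view trick: bake an expensive pixel mapping
    # once, then re-apply it per frame as a cheap gather. Illustrative only.
    import numpy as np

    H, W = 480, 720  # SD frame size, for the sake of the example

    def bake_fixed_view_lut(h, w):
        # Pretend-expensive precompute: where each output pixel samples from.
        # A slight horizontal squeeze stands in for the real perspective warp.
        ys, xs = np.mgrid[0:h, 0:w]
        return ys, np.clip((xs * 0.9).astype(np.int32), 0, w - 1)

    LUT_Y, LUT_X = bake_fixed_view_lut(H, W)  # done once, at set-load time

    def render_live_frame(camera_frame):
        # Per-frame cost is just an indexed copy, leaving most of the CPU
        # free for DDRs, previews, streaming, recording, etc.
        return camera_frame[LUT_Y, LUT_X]

    out = render_live_frame(np.zeros((H, W, 3), dtype=np.uint8))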

Quiet1onTheSet
02-02-2009, 05:14 PM
This is nowhere near as easy as it sounds. All current computers and rendering systems tend to have limits which boil down to the fact that current computer hardware is simply not fast enough to perform real-time ray-tracing, radiosity (etc...) at 30 frames a second (or even anywhere close.) LiveSet makes almost no compromises in what it can render, but it does compromise in that it needs to be a single fixed view ... allowing us to do a whole bunch of clever pre-processing on the frame so that we can then re-render it incredibly fast... we are continually developing our technology to do radical new things, so who knows what we will see in the future. Just rest assured that this is a complex problem ... if it were not, it would have been solved by everyone else already!

Andrew
Yeah. Your affirmation of the level of difficulty of pulling that additional feat off is consistent with our speculation.

Dr. Cross, we greatly appreciate NewTek and your team's arduous efforts to get us what we currently have in LiveSET(tm), and what's next -- and your chiming in to enlighten us further here...
"Thank You."
:thumbsup:

Quiet1onTheSet
02-02-2009, 05:21 PM
My drunken excitement over the sheer power and cost-savings potential of LiveSet(tm), over any other live, 3-dimensional virtual set system anywhere and at any price, notwithstanding, I sober-mindedly (such rarity for Q1!) dared not ask for this back then.

p.s.:
Whew!!
'Glad I *didn't*!
:D

DiscreetFX
02-02-2009, 06:46 PM
Our TriCaster Pro FX works great and is a fantastic system.

cezaryg
02-02-2009, 10:20 PM
Thank you, billmi; I'm very flattered.

Regarding paying plenty for NewTek product accessories: I shelled out a hefty $2,000 for the LC-11 and had to change the EXT, DDR1, DDR2 and TXT positions on the switcher myself to match the TriCaster's interface layout. Be careful following NewTek's vague suggestions in the Q&A section. Thankfully I disregarded an additional NewTek Tech Support suggestion given over the phone, and so avoided damaging the LC-11.

By the way, "get LiveText and DataLink" was the NewTek response to my product suggestions when I asked for a minor feature and improvement like live resizing and repositioning of text from the "TEXT" tab instead of the "EDIT TEXT" tab.

The Virtual JIB I'm suggesting is not impossible to deploy. The 3-D image can be extrapolated and morphed from the input of 3 cameras at different angles. The Web is full of 360-degree view implementations. The LiveSet as it is now requires properly framed, preferably stationary cameras. By tying the Virtual JIB to the 3 stationary cameras, 3-D views could be accomplished. Andrew, the real cameras in the Virtual JIB are stationary. The effect is as if the camera moves on a jib, but it is a virtual move only.

Thanks.

joseburgos
02-03-2009, 05:11 PM
In theory, this is only possible if the virtual camera makes no significant change of aspect on the subject; the moment it does, the subject will look like a cardboard cut-out -- or you'd need a slew of cameras to extrapolate from, a la the ring camera systems used to create virtual camera arcing in The Matrix.

A motion control rig would give the camera control.
The motion would be fed into an expensive virtual set piece of equipment that is available.

Hypothetically:
You make your virtual set inside a 3D real-time engine and link the motion control to the engine's motion.
In this way, when the camera moves forward, the 3D engine moves.
When it pans left, the 3D engine pans left.
Once you have that accomplished, you feed the effects bus the 3D engine system's output, and the camera on the motion control device to the main bus.
The camera would have LiveMatte enabled, allowing the 3D engine to be the background.
Now you have a true real-time virtual set.
A 2nd 3D engine system could handle the foreground set pieces.

Mocap, and of a lesser quality, Wii controllers, could make for a less expensive motion control device.
Still, these systems do exist if you have the cash :)

joseburgos
02-03-2009, 05:23 PM
The Virtual JIB I'm suggesting is not impossible to deploy. The 3-D image can be extrapolated and morphed from the input of 3 cameras at different angles. The Web is full of 360-degree view implementations. The LiveSet as it is now requires properly framed, preferably stationary cameras. By tying the Virtual JIB to the 3 stationary cameras, 3-D views could be accomplished. Andrew, the real cameras in the Virtual JIB are stationary. The effect is as if the camera moves on a jib, but it is a virtual move only.

Thanks.

Anything is possible, but you oversimplify the terminology and the technology.

Quiet1onTheSet
02-03-2009, 08:53 PM
A motion control rig would give the camera control.
The motion would be fed into an expensive virtual set piece of equipment that is available.

Hypothetically:
You make your virtual set inside a 3D real-time engine and link the motion control to the engine's motion.
In this way, when the camera moves forward, the 3D engine moves.
When it pans left, the 3D engine pans left.

Hey, Jose! Long time, no hear-from (since I've been a bit busy).
Help me understand something here, if you please: Following your explanation, I would have thought the opposite is true. If I'm incorrect, would you kindly explain in greater detail, the goings-on, in the system you described?

Here's what I would've imagined:

"When the camera moves forward, the 3D engine moves the virtual set scene backward, with respect to the camera; when the camera pans left, the 3D engine moves the virtual scene right with respect to the camera..."

OK, am I correct?
:stumped:
(i.e., Does my grammar and syntax more accurately communicate what you intended?) If not, please detail for us, how it is, that I've got it backward?
Thanks in advance, guy.
:bowdown:

Quiet1onTheSet
02-03-2009, 09:08 PM
Anything is possible, but you oversimplify the terminology and the technology.

You are most definitely correct.

cezaryg certainly does have an overly simplistic, if not utterly distorted, view (no pun) of the matter, relative to a possible LiveSET(tm) implementation of faked motion, as he seems to misunderstand some of the terminology (I gather that from your expressed view); as well, he grossly understates the complexities involved, as you've also indicated.

Here, he might have us suspect he may know more than Dr. Andrew Cross -- and other companies, large and small, which traffic in things "green screen", virtual set, motion-tracking, and the like.

Even with your explanation, I'm not sure he'll "get" your idea of actually moving the cameras, and having that motion induce a motion path for the virtual scene, through esoteric motion tracking system(s) that work in realtime, within the live production domain.

I may be correct also in suggesting that he may be deliberately suppressing the matter of his idea wreaking havoc with the integrity of the live talent's appearance within the virtual scene (per billmi's caution).

Also, he misses the point that Dr. Cross made, regarding CPU allocation for LiveSET, vis a vis the other simultaneous tasks that TriCaster and VT[5] systems have to manage.

Worse still, it appears to me at least that he's totally ignoring the presence of synthetic soft shadows (i.e., generated by the LiveSET engine, which gets data from the talent in the scene), as well as the synthetic reflections and the like, which are hallmarks of NewTek's 3D live virtual set implementation.

What on earth would "his system" do with those elements, in his proposed idea? And what of the state of today's computer technology being swept under the rug, as it were, in his commentaries? (Refer to Dr. Cross's notes to see what cezaryg seems to ignore.)

joseburgos
02-03-2009, 09:36 PM
Hey, Jose! Long time, no hear-from (since I've been a bit busy).
Help me understand something here, if you please: Following your explanation, I would have thought the opposite is true. If I'm incorrect, would you kindly explain in greater detail, the goings-on, in the system you described?

Here's what I would've imagined:

"When the camera moves forward, the 3D engine moves the virtual set scene backward, with respect to the camera; when the camera pans left, the 3D engine moves the virtual scene right with respect to the camera..."

OK, am I correct?
:stumped:
(i.e., Does my grammar and syntax more accurately communicate what you intended?) If not, please detail for us, how it is, that I've got it backward?
Thanks in advance, guy.
:bowdown:

No, the virtual set does not move; it's the camera within it that moves.
Think game engine to better grasp the theory.
There are a number of companies that sell these systems at a broadcast budget price tag.
So your real-world camera would not zoom in or out (although I have seen this as well), but rather dolly in and out.
This way the live subject matter on green screen tracks the virtual 3D engine's parallax.
A real-time 3D engine using a workstation-class graphics card can do some pretty photo-real stuff using modern texturing (normal mapping, OpenGL effects, etc.).
In this manner, the geometry can be less, just like in a game.
A few of these systems have input for 3D Studio Max and Maya.
PIM has one for LW.
Without a foreground or things that occlude the live subject matter, one 3D engine and a motion rig would work just fine.
Again to a lesser expense, a Wii controller or Mocap as well.
The live texturing of geometry is where the magic comes in and the cost of a system.
Feeding a set monitor, real-time reflections, shadows, etc.
The good thing is most use the graphics card hardware to reproduce some of the effects (fog, smoke, lights, etc.) via OpenGL.
Again think game engine.
PIM's web page had a lot of this demo'd, but I don't know what happened to them.

Take care,
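
For what it's worth, the two descriptions above are the same math seen from different sides: the engine moves its virtual camera with the tracked camera, and the renderer applies the inverse of the camera's transform to the scene, which is exactly why the scene appears to slide backward when the camera dollies forward. A tiny one-axis numeric sketch in Python, illustrative numbers only:

    # Both views agree: the engine camera moves forward, and the view
    # transform (the inverse of the camera transform) moves the scene
    # backward relative to the camera.
    camera_z = 0.0    # engine camera position along the dolly axis
    point_z = 10.0    # a fixed point in the virtual set

    camera_z += 2.0   # real camera dollies forward 2 units; engine follows

    view_z = point_z - camera_z  # inverse camera transform applied to point
    print(view_z)     # 8.0 -- the set point is now 2 units closer, i.e. the
                      # scene moved backward with respect to the camera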

SBowie
02-04-2009, 06:56 AM
Of course, attempting this sort of thing along with live multi-cam switching, DVEs, and so on, while doing realtime compression for streaming and providing projector output and NTSC/PAL, becomes even more challenging. No one ever said "never" ... but it's a long way past non-trivial, and certainly not just around any corner.

billmi
02-04-2009, 08:49 AM
The 3-D image can be extrapolated and morphed from the input of 3 cameras at different angles.

The only folks I've ever seen with the software to pull that off are on CSI:Miami, but I think that show is *slightly* fictionalized. The only crime scene unit vehicles I've seen here in FL are ratty old Econoline vans, not Hummers.

PIZAZZ
02-04-2009, 09:41 AM
The only folks I've ever seen with the software to pull that off are on CSI:Miami, but I think that show is *slightly* fictionalized. The only crime scene unit vehicles I've seen here in FL are ratty old Econoline vans, not Hummers.

What are you talking about, Bill? I was told that CSI: Miami was a true-life documentary!

Quiet1onTheSet
02-04-2009, 09:49 AM
What are you talking about, Bill? I was told that CSI: Miami was a true-life documentary!

Interesting you should ask, as I've heard that those CSI television dramas are rife with technical errors and misleading ideas about crime-scene investigation techniques in general; thus, they're to be understood as pretty much fictionalized, as Bill indicated -- so I've been led to believe.

Therefore, any understanding on my part is not from having studied the matter. As such, I would be pleased to hear more from Billmi on this.
:hey:

Quiet1onTheSet
02-04-2009, 09:51 AM
No, the virtual set does not move; it's the camera within it that moves.
We both suggested that the camera moves. But it seems that your earlier post instructed also, that when the camera moves, something else also moves -- in the same direction.

That's what I'm trying to understand, from the motion-graphics guru (that'd be you, May'in!).
:hey:


Think game engine to better grasp the theory. There are a number of companies that sell these systems at a broadcast budget price tag.
Got a suggested price range?

So your real-world camera would not zoom in or out (although I have seen this as well), but rather dolly in and out.
Hmmm. "Dolly" (that is, physically pushing the camera toward the subject, or back away from same) is a term I don't recall having seen in cezaryg's request. I'm glad you introduced that term, since proper use of terms aids us students' understanding of the subject matter.

Plus, it gives yourself and other folks here, the opportunity to provide some needed explanation and discuss further, the techno-jargon involved in all this.

Hope you don't mind these sets of questions, designed for better understanding these concepts you've discussed here (and in other forums and mailing lists over the years).

I must confess now that I've been having difficulty grasping this; and since weaning myself from an addiction to Super Mario Bros. in the late '80s, I'm no longer a gaming enthusiast. So please bear with this live-motion, virtual set newbie as I ask...
:question:


...This way the live subject matter on green screen tracks the virtual 3D engine's parallax.
A real-time 3D engine using a workstation-class graphics card can do some pretty photo-real stuff using modern texturing (normal mapping, OpenGL effects, etc.).

3D engine is software and hardware combined, right?

Also, "Parallax": How might one define that, as it relates to motion tracking, in easy-to-grasp terms?

In this manner, the geometry can be less, just like in a game.
What's the intention of making it so that "the geometry can be less"? I don't follow -- unless the intent of the design is to cut down on the number of polygons to render??
:question:


A few of these systems have input for 3D Studio Max and Maya. PIM has one for LW.

(Hmmm. PIM might well be a company that produces computer games...)
:hey:



Without a foreground or things that occlude the live subject matter, one 3D engine and a motion rig would work just fine.
"Occlude". Whas'sat?

Also, I meant to ask earlier:
"Virtual 3D engine": is that essentially, software that's tied to some sort of physical instrumentation that's attached to the real-world camera's lens system, or tripod, or some such thing?


Again to a lesser expense, a Wii controller or Mocap as well.

OK, I'm imagining that "Mocap" is a motion-capture system, comprising motion detection device(s) and associated software. Advise if I'm off-base, please.


The live texturing of geometry is where the magic comes in and the cost of a system. Feeding a set monitor, real-time reflections, shadows, etc.

Uh-oh. That last statement amounts to an incomplete thought, if I'm reading it correctly. Seems like you might have forgotten to complete what you were thinking.

But wait -- were you about to comment on the idea I set forth earlier: that those broadcast-level motion systems (such as you've been discussing) are expensive as they are, but *adding* NewTek's proprietary twists in 3D live virtual set imagery, such as those realtime synthetic shadows, reflections, and the LiveSet monitor(s) you mention, would make such systems all the *more* cost-prohibitive for most of us?


The good thing is most use the graphics card hardware to reproduce some of the effects (fog, smoke, lights, etc.) via OpenGL.
Again, think game engine.

I suppose by "most" you mean "most broadcast-level 3D virtual set engines which are tied to motion systems." Kindly advise if I misunderstood, and thanks for all your patience and for sharing your wealth of knowledge in this arena, Jose
:thumbsup:

cezaryg
02-04-2009, 11:33 AM
Another feature request: I call it LiveEdit.

Imagine two sources (for example, two cameras that recorded an interview) synchronized and paused in DDR1 and DDR2 on the TriCaster Broadcast. Clicking a designated button would start DDR1 and DDR2 simultaneously, allowing A/B-roll editing in the TriCaster's Live Production section. This could be a useful live editing feature (streaming during live editing is one possible use), in addition to the Edit Media tool designed for post-production.

Thanks!

joseburgos
02-04-2009, 11:42 AM
The only folks I've ever seen with the software to pull that off are on CSI:Miami, but I think that show is *slightly* fictionalized. The only crime scene unit vehicles I've seen here in FL are ratty old Econoline vans, not Hummers.

Ah, for the record on the fictional facts of the show: even they don't do it in real time :)

cezaryg
02-04-2009, 11:59 AM
Dear Quiet1onTheSet, let me try to explain again: the Virtual JIB idea and feature request I'm envisioning is an effect (FX).

To clarify how it works: the cameras in your studio are STATIONARY. You click the AUTO button, having previously selected a Virtual JIB effect, and you get a JIB transition between the LiveSet's switched views (whether in the form of a pan, dolly, truck, crane, or other movement -- let's leave the semantics alone for the sake of clarity).

joseburgos
02-04-2009, 12:08 PM
Dear Quiet1onTheSet, let me try to explain again: the Virtual JIB idea and feature request I'm envisioning is an effect (FX).

To clarify how it works: the cameras in your studio are STATIONARY. You click the AUTO button, having previously selected a Virtual JIB effect, and you get a JIB transition between the LiveSet's switched views (whether in the form of a pan, dolly, truck, crane, or other movement -- let's leave the semantics alone for the sake of clarity).

Yes, and what you are asking for is for the virtual set to track the camera motion.
These systems exist for around $100,000.00 US and up.
Again, you are oversimplifying the amount of hardware and software you need to do this.
If you have this kind of budget, then write me privately and I will send you some links to other manufacturers.
Otherwise, I am starting to be offended by your responses, which throw out theories and feature requests without attaching a nominal value to them.
For its price tag, a TC or VT with stationary cameras rivals all other manufacturers.
Again, if you seriously need this feature, write me privately and I will gladly send you some links to other vendors.

I am starting to regret looking into this thread :(

joseburgos
02-04-2009, 12:20 PM
Peter,
Let's take a step back.
Dolly, truck, pan and jib are all relatively easy to map to a 3D real-time engine.
Zoom is not, and you really up the technology to do this, but I have seen it.

So why is it easy and what exactly am I talking about?
I will refer to LW and not a game engine, in the hope that you have used LW.
In LW, you can create a TV studio with geometry/polygons.
This would have textures to give the desired look of the geometry.
Once your TV studio is finished, imagine the ability to make this go full screen and to view all your textures mapped to the polygons in real time via OpenGL.
Now take your LW camera and map the X, Y, Z plus H, B, P to a motion control device that is connected to the camera.
This device will now move the camera inside LW.
A limit would have to be set so that the camera does not pass a certain position.
Now imagine a texture-mapping plug-in, like LiveSet, being applied to a polygon, like the screen of the TV studio's video monitor.
This plug-in would map live video in real time onto this geometry and take its feed from a video input board on the same system.
This plug-in, in LW terms, would have a master plug-in, much like LiveSet, to also add OpenGL real-time effects to be viewed on screen.
These types of OpenGL effects are vast, and I would not dare explain them all, but rather point you to NVIDIA to research all that can be done in real time using OpenGL.

But to try to continue: this needs a mean machine, separate from the switcher system.
4-plus processors and a Quadro-class graphics card with plenty of RAM.

And yes, less geometry so as to allow for real time rendering of the 3D environment.

Motion control devices are expensive; Motion Capture systems are less, at about $5,000.00.
A Wii-type system is extremely cheap but limited in the volume it can track.

Occlusion is when the view is blocked.
The risk of this would be very high for any optical motion tracking device; it is overcome by multiple Mocap cameras triangulating position (around 6 at minimum).
A Wii would be extremely prone to this, and its field of view would be the smallest.
A cinema motion tracking device is normally hardwired and would not be prone to occlusion, since it is not optical.

I hope I explained it, but it would be better if you researched game engines to better understand what they can do, as well as OpenGL's abilities.

Take care,
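
A bare-bones sketch of the mapping Jose describes: take a tracked pose from the motion control device, clamp it to set limits so the virtual camera cannot leave the set, and hand it to the engine camera's X, Y, Z plus H, B, P channels. Every name and number below is a made-up illustration in Python, not a real tracker or LW API.

    # Hypothetical per-frame update: tracker pose -> limits -> engine camera.
    LIMITS = {  # per-channel (min, max), assumed meters and degrees
        "x": (-3.0, 3.0), "y": (0.5, 2.5), "z": (-6.0, -1.0),
        "h": (-45.0, 45.0), "p": (-20.0, 20.0), "b": (-5.0, 5.0),
    }

    def clamp_pose(pose):
        """Clamp each tracked channel into the allowed envelope."""
        return {c: max(lo, min(hi, pose[c])) for c, (lo, hi) in LIMITS.items()}

    class FakeCamera:  # stand-in for the engine's camera object
        def update(self, pose):
            print("camera pose:", pose)

    cam = FakeCamera()
    cam.update(clamp_pose({"x": 0.2, "y": 1.6, "z": -9.0,  # z gets clamped
                           "h": 10.0, "p": -2.0, "b": 0.0}))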

Quiet1onTheSet
02-04-2009, 12:29 PM
Dear Quiet1onTheSet, let me try to explain again: the Virtual JIB idea and feature request I'm envisioning is an effect (FX). ... let's leave the semantics alone for the sake of clarity.
But there is little, if any, clarity in what you're suggesting here, cezaryg. It appears you have no idea of the magnitude of the facts you're *ignoring* in your request, relative to the current technology.

What's more, you seem to think that a real-time, computer-generated fake trucking/arcing/tonguing of the camera view should be possible with some software code, with total disregard for the resultant effect on the whole "look." You're even ignoring the need to reserve processing power for the other simultaneous functions of TriCaster (Steve Bowie's inference) -- and, by extension, you're doing the same with NewTek's more venerable VT[5] system.

And what do you mean, "...the feature request [you're] envisioning is an effect (FX)"?
For the record, no one is playing with semantics. Rather, you're not playing fair.

What do you fathom a system such as you're imagining would cost?

And again, while getting your requested fantasy realized, what would you want done with NewTek's realtime synthetic reflections, soft shadows, warping and LiveSET monitor displays within the scene, for each and every camera view and each of the multiple zoom angles within the Setup tab -- just toss them out altogether, thereby rendering your fake motion all the less convincing to the viewer?

You haven't mentioned a word on that. One can't just go willy-nilly and try to pull off a faked jib "effect" without *affecting* other things in the scene/image; and one cannot pull off anything remotely that awesome if there's no computational speed and power to support it within a given budget (as Jose admirably inferred).
:hey:

cezaryg
02-04-2009, 12:34 PM
This discussion is called Feature Requests, not the Work Order Bin. NewTek gets a free flow of ideas from users like me, sharing our experiences field-testing the equipment they manufacture.

Criticizing or bashing others contributes nothing to making the product better.

Is the current competitors' price tag an argument against progress? There was a time when 32MB of RAM cost around $500 and proprietary editing systems cost hundreds of thousands of dollars.

Sadly enough, all my ideas have become a platform for combativeness for some. I am starting to regret sharing my experience with the product and bringing my ideas to NewTek and to this Forum.

joseburgos
02-04-2009, 12:49 PM
This Discussion is called Feature Request, not Work Order Bin. NewTek has free ideas flow from users like me sharing experiences field testing their manufactured equipment.

Criticizing only, or bashing others, is not a contribution to making the product better.

Is the current competitors' price tag an argument against progress? There was a time when 32MB of RAM cost around $500 and proprietary editing systems cost hundreds of thousands of dollars.

Sadly enough, all my ideas become a platform for combativeness for some. I am starting to regret sharing my experience with the product and bringing my ideas to a Forum overpowered by its loudest members.

I hope you did not misunderstand me, as I for one would never bash a feature request; I am trying to tell you that what you're asking for -- a tracking virtual set system -- is far too costly for the end user at this time.
Your example is a great way to help you understand that, yeah, in a few years this may be possible at a reasonable cost.
So all I was asking was whether you needed this now and had the budget; otherwise, this is not a feature you can just ask for and have made available.

This is why I constantly say you are oversimplifying the technology.

And for the record, I would love it if the TC/VT could do this, but it truly is not practical at the price range it is sold for.
It would be better if NewTek developed a LightWave plug-in to do this and housed it on a separate computer system with its own video input card for live texturing.

Well, you guys enjoy, as I am divorcing myself from this thread.
Take care all,

Quiet1onTheSet
02-04-2009, 01:04 PM
Peter,
Let's take a step back.

I hope that explains it, but you would do better to research game engines to understand what they can do, as well as what OpenGL is capable of.

Take care,
Wow. That was all really helpful; thanks a million. You've studied this a great deal, I see -- and yes, I have used LightWave, at least for motion effects in title generation for video.

From those of us genuinely interested in learning and productive dialogue, we say,
"Thanks for all your valuable insights, Jose!"
-PeterG

Quiet1onTheSet
02-04-2009, 01:09 PM
Criticizing only or bashing others is not any contribution to make the product better.

Is the current competitors' price tag an argument against progress?
No one here is suggesting that at all, cezaryg. It seems you're misinterpreting our responses to your description of your idea.

Feature requests are subject to discussion here, not merely posted and then left unchallenged (such discussion may perhaps be helpful in refining the request).

For the record, relatively speaking, I've been "quiet" here for quite some time. It's in stark contrast to my forum involvement and decorum before NAB 2008, as objective observers here might easily attest.

:ohmy:

Quiet1onTheSet
02-04-2009, 01:24 PM
Sadly enough, all my ideas become a platform for combativeness for some. I am starting to regret sharing my experience with the product and bringing my ideas to NewTek and to this Forum.
I wish to empathize with you at least to some degree, but your use of hyperbole here is unfortunate. "All [your] ideas become such a thing"?

Naw. That's a huge stretch, not even close to reality.

None of your ideas are being fought against, nor is any party in combat with another. No objective observer or participant should buy that -- not for one cent, let alone $100,000. And I meant what I communicated, in the affirmative, that your ideas for improvement are welcome and worthy of serious consideration. But add to that the fact that they're also a platform for dialogue amongst the members here -- yea, even in the Feature Requests Forums.

It may only be that you're not, in truth, interacting with us on the points we've all raised. Doing so, in the interest of creating an environment for mutual learning relative to the comments and insights we've presented in response to yours, would perhaps be the best use of this fantastic "discussion forum", to be honest.

As for any emotional response on your part -- that you're "...starting to regret sharing [your] experience with the product and bringing [your] ideas for NewTek to give consideration to"?

But why? Because you feel NewTek should be penalized for our expressed technical concerns about your idea? Or is it really because you resist the fact that your expressed "FX" idea could use a bit of refinement, relative to budget and technology considerations, so that it's commensurate with current realities (let's not even mention the economy!)?

That sentiment might be unfair; you're entitled to feel you're being bashed, even though you're not. Your ideas ought to be given serious consideration, as indicated twice before; but remember the allusion made to CSandy and Rally1 about "feeling versus fact," in relation to your felt notion of supposed CPU, graphics, and system RAM deficiencies? Yeah, the emotional response to the honest scoop you received from a telephone rep ought to have been, "What?? NewTek's TriCaster products are *that* fine-tuned and fantastically efficient? Whoa!"

(Then you could've hung up the phone in anxious anticipation of saving time while learning how to get your product up and running, so you could soon be making more money!)
:phone_cal

billmi
02-04-2009, 07:46 PM
Another feature request: I call it LiveEdit.

Imagine two sources (for example, two cameras that recorded an interview) synchronized and paused in DDR 1 and DDR 2 on the TriCaster Broadcast.

You can do this already in VT, with as many cameras as your hard drives can handle simultaneously, plus a synchronization script in ToasterScript. Really, if you're after that kind of thing -- and custom memory, graphics card, and disk array configurations -- VT is the NewTek product to use, not TriCaster.

billmi
02-04-2009, 08:09 PM
Re: virtual jibs....

This is a subject that's been discussed in the LiveSet forum previously.

Keeping things 100% software and using fixed camera positions, as LiveSet does now, avoids the need for any camera tracking.

Simulating a move between two camera positions at different aspects on the subject (forget all the set issues -- I'm just talking about the talent) would, with today's technology, be a pipe dream. I'm talking about having one shot that sees the talent from the front and one shot from a 45-degree position to the side, and trying to simulate an arcing camera move between them. That's not going to happen in realtime for many years to come. Pulling it off today requires either detailed 2D morphing with manual point selection or, for a better look, creating a 3D model of the talent and mapping the video from the two camera views over it. That's not even close to being a realtime, or even fully automated (i.e., not needing a human modeler/point-picker), process.

However -- if we eliminate changes in subject aspect from the picture, and restrict ourselves to virtual camera moves where the angle of the subject doesn't change relative to the camera, I think it's a lot more do-able.

Remember Ultra, before Adobe bought it? They had a number of fly-in camera moves that worked pretty well, all based on original footage from a locked-down camera shooting the talent in a front shot. The virtual camera move never went from one real camera view to another; it stayed on one camera and flew that camera forward from a long shot to a closer shot, or trucked left or right at a bit of a distance to reveal the talent.

Assuming the LiveSet format could be animated, this sort of thing would be do-able.

Considering that the same sorts of things (scaling, reflections, warping, shadows) done in a LiveSet with multiple sources also work in a DVE with a single source overlaid on an unscaled background source, it would not surprise me if this sort of thing were within the realm of possibility.
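
To make that concrete, here is a rough sketch of an Ultra-style fly-in done as pure pan/scan: a virtual camera rectangle is eased between a wide framing and a tight framing on a locked-down shot, then scaled back to full raster. This is only my illustration of the idea -- Python, with OpenCV assumed for the resize; the frame size, rectangles, and easing are arbitrary choices, not anything from Ultra or LiveSet:

import numpy as np
import cv2  # assumed available; any image resampler would do

def ease(t):
    # Smoothstep ease-in/ease-out so the virtual move starts and ends gently.
    return t * t * (3 - 2 * t)

def virtual_dolly(frame, start_rect, end_rect, t):
    """Crop an eased interpolation of two framings, then scale to full raster.

    Rects are (x, y, w, h) in pixels; both should match the output aspect ratio.
    """
    k = ease(t)
    x, y, w, h = [int(round(a + (b - a) * k))
                  for a, b in zip(start_rect, end_rect)]
    crop = frame[y:y + h, x:x + w]
    return cv2.resize(crop, (frame.shape[1], frame.shape[0]),
                      interpolation=cv2.INTER_LINEAR)

# Hypothetical 720x480 source: a wide shot easing into a tighter "dolly-in."
frame = np.zeros((480, 720, 3), dtype=np.uint8)
for i in range(60):  # a two-second move at 30 fps
    out = virtual_dolly(frame, (0, 0, 720, 480), (180, 120, 360, 240), i / 59)

The catch is the one I described above: this only fakes moves where the subject's aspect to the camera doesn't change, and in a LiveSet the reflections, shadows, and monitors would all need the same transform to stay believable.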

csandy
02-04-2009, 09:41 PM
This thread has become quite fascinating.

Anyone interested in a little less passionate and more authoritative discussion of cutting-edge virtual set technology should take a read:
http://www.tvtechnology.com/article/70994

Quiet1onTheSet
02-04-2009, 10:54 PM
Another feature request: I call it LiveEdit.
Billmi is correct: the more venerable VT[4] and VT[5] system offerings from NewTek are certainly the ticket on that one, cezaryg.

These permit DDRs as well as SpeedEDIT output to be present and live-switched on the Toaster's software Switcher interface -- yes, in a Live production scenario.
:vticon:

:rolleyes:

billmi
02-05-2009, 08:51 AM
Great article on the CNN "hologram." It is extremely cool that they have done this in realtime, and it's not a position-matching camera like I had thought. That said, it still requires a full realtime 3D scanning/imaging system to shoot the talent and generate the position and image data for the realtime-rendered result. You can't get that out of 2 cameras.

It also brings up my annoyance with CNN's misuse of the term hologram. Holography is a film-based process. It would be like me calling a photo I took with my digital camera a daguerreotype. But hey, they're a news network; who would expect them to stick to the facts?

MMI
02-05-2009, 11:03 AM
Not to seem contrary, but holography really has nothing to do with film. It's an optical/physics technology developed for use with electron microscopes that just happened to use film as a recording medium when it was invented, because film was available and cheap. They couldn't do "live" in 1947, so they had to settle for delayed reconstruction. See below:

"Holography was invented in 1947 by Hungarian physicist Dennis Gabor (Hungarian name: Gábor Dénes) (1900–1979),[1] work for which he received the Nobel Prize in Physics in 1971. It was made possible by pioneering work in the field of physics by other scientists like Mieczysław Wolfke who resolved technical issues that previously made advancements impossible. The discovery was an unexpected result of research into improving electron microscopes at the British Thomson-Houston Company in Rugby, England. The British Thomson-Houston company filed a patent in December 1947 (patent GB685286), but the field did not really advance until the development of the laser in 1960.

Holography (from the Greek, ὅλος-hólos whole + γραφή-grafē writing, drawing) is a technique that allows the light scattered from an object to be recorded and later reconstructed so that it appears as if the object is in the same position relative to the recording medium as it was when recorded. The image changes as the position and orientation of the viewing system changes in exactly the same way as if the object was still present, thus making the recorded image (hologram) appear three dimensional. Holograms can also be made using other types of waves.
The technique of holography can also be used to optically store, retrieve, and process information."

Quiet1onTheSet
02-05-2009, 04:39 PM
Doubtless, this tasty thread has become quite fascinating...
...with progressive refinements and that, with a well-balanced diet of passion to boot.

It's a great thing that cezaryg posted here in Feature Requests. I'm certainly grateful he did.
:thumbsup:

billmi
02-08-2009, 08:57 AM
Re: Holograms -- even if one could create a hologram digitally, holographic technology reconstructs the image using light interference patterns. All of the light hitting the subject is phase-synchronized. The light bouncing off the subject is combined with a reference beam (split off from the same light used to light the subject), so the two waveforms added together create either increased or canceled-out amplitude. Comparing the recorded signal to a new reference signal re-creates the scattered light, and that creates a three-dimensional image. A hologram can be viewed from various angles simultaneously, as it is truly three-dimensional, not a 2D view of a three-dimensional object. You can look at or even photograph a holographic image from different angles and get the same image as if you were shooting the original subject from those angles.
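
For anyone who wants the arithmetic behind "increased or canceled-out amplitude": the recording medium captures the intensity of the object and reference waves added together. Writing the object wave as E_o = A_o e^{i\phi_o} and the reference wave as E_r = A_r e^{i\phi_r}, the standard textbook form (nothing here is specific to any product) is

I \;=\; \lvert E_o + E_r \rvert^2 \;=\; A_o^2 + A_r^2 + 2 A_o A_r \cos(\phi_o - \phi_r)

The cosine cross-term is the fringe pattern the film actually records; it encodes the phase difference between the two waves, and that phase difference is where the three-dimensional information lives.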

That is very unlike the CNN "hologram," which is built by constructing a digital 3D model and then using it to generate a two-dimensional image of what the original subject looks like from a single viewpoint, to be composited into a scene. There is no 3D image being created; if there were, the people in the studio could see the three-dimensional projection, and it could simply be shot by the studio cameras.

Instead, CNN's system uses camera-tracking hardware, and their "hologram system" generates output images to match the movement of a camera in the studio; the two images are composited together, so the end result to the viewer is something like what it would look like if the guest were holographically projected into the studio, even though there is no holographic projection going on.

Quiet1onTheSet
02-08-2009, 09:38 AM
...Instead, CNN's system uses camera-tracking hardware, and their "hologram system" generates output images to match the movement of a camera in the studio; the two images are composited together, so the end result to the viewer is something like what it would look like if the guest were holographically projected into the studio, even though there is no holographic projection going on.

Oh, *my*, Billmi -- you have utterly *nailed* that one! Huge thanks for that furtherance of your contributions here.

:thumbsup:

pro---studio
02-20-2009, 05:39 AM
Hi everyone,

after reading this interesting thread, I think we are talking about two different things:

Camera-tracking virtual studios

Stationary-camera, virtual-move virtual studios

I worked with an Orad virtual studio, and it is awesome but really complicated. It is possible to use a "pattern" on the green screen to track the camera movements, so you do not have to use expensive tracking systems with sensors and tracking-point cameras. But even if you use a pattern-based system, the price is far beyond a "webcast" company's reach. The systems render foreground and background simultaneously, using a special foreground matte channel that is forwarded to the Ultimatte keyer, which uses this channel as information which parts of the "non-greenscreen" parts should be keyed out. The graphics engine behind this system is either an SGI high-end platform or another high-end system with a different OS. I don't think there is a chance of implementing this function in an "all-in-one" solution like TriCaster.

The second approach is the virtual move.
The TriCaster does the calculations already but does not "pan and scan" between the different views. That is all.
Think of the virtual set as a high-resolution graphic where the presenter is keyed in and resized to the proper size. If the TriCaster could pan and scan into that "image" while resizing the keyed-in presenter, that would be a virtual camera move without any real move of the camera. I know that the reflections, shadows, and even the virtual monitors would have to be resized also. Maybe that is too CPU-intensive -- but I think the TriCaster programming crew should comment on that. I know that a pan-scan approach is not a real 3D virtual move, but maybe this is an option that can be implemented with low CPU usage.

Just my thoughts -- I don't even have a TriCaster yet, but I'm looking forward to getting one.

Regards

Pro

Quiet1onTheSet
02-20-2009, 09:43 AM
Thanks, pro---studio, for the insight relating to your knowledge of and experience with virtual studio systems -- namely, that amazing Orad.

Is Silicon Graphics, Inc. (SGI) still building venerable computer systems? I'm seeing some of their workstations show up in a used computer shop -- priced as if they were gracing the shelf of a Salvation Army thrift store.

We're delighted to have you interact with us here, while anticipating ownership of your very own TriCaster portable production studio.



...But even if you use a pattern-based system, the price is far beyond a "webcast" company's reach. The systems render foreground and background simultaneously, using a special foreground matte channel that is forwarded to the Ultimatte keyer, which uses this channel as information which parts of the "non-greenscreen" parts should be keyed out. The graphics engine behind this system is either an SGI high-end platform or another high-end system with a different OS...

If you've got the time, would you kindly offer a bit more clarity on the statement above -- specifically the phrase about which parts should be keyed out? You've done a fine job explaining your take on the matter, except I got lost in that one phrase; thanks in advance.


"...I know that the reflections, shadows and even the virtual monitors have to be resized also. Maybe that is to cpu intensive - but I think the tricaster programmer crew should comment on that. I know that a pan-scan approach is not a real 3d virtual move - but maybe this is a option that can be implemented with low cpu usage.

Just my thoughts - I don't even have a tricaster now but looking forward getting one.

Regards

Pro

Take a look at Post #18 in this thread, authored by Andrew Cross, PhD. At the very least, his lead programming activities at NewTek, Inc. have had him expending quite a bit of energy in getting us what we have in TriCaster's price-shattering LiveSET(tm) 3D live virtual set technology. In that post (#18), he comments along the lines you have accurately communicated: that is, we must keep in view the fact that TriCaster systems must be designed to reserve a decent amount of computational power beyond what the LiveSET(tm) functionality demands -- so we can enjoy having at our disposal the other simultaneous tasks TriCaster affords.

As well, it would be fascinating to hear from more programmers (yes, even from NewTek), on this intriguing subject.

pro---studio
02-20-2009, 10:21 AM
Hi,

yes, SGI is still producing high-end machines. Orad's RealSet and ProSet now use Linux-based render machines. I worked with an older one that used SGI machines to render the virtual studio background and foreground.
You still have to use extreme CPU power to produce realtime 3D images of a virtual studio, plus you have to have something like a tracking system that extracts the exact position and focal length of each camera. Even the high-end Orad systems have a delay of two frames in to out.

Sorry for my poor English -- I could explain it exactly in German but have some problems in English; I will try. To clarify what I meant with:

"which uses this channel as information which parts of the "non-greenscreen" parts should be keyed out."

The system I worked with produces two SDI video streams. One SDI stream contains the complete virtual studio environment (foreground & background). The other SDI stream contains alpha-channel information for the Ultimatte keyer. With these two streams, the system is able to tell the Ultimatte keyer that the virtual studio foreground objects have to be opaque; the Ultimatte units have external matte inputs for that. With that information, the presenter can walk behind objects in a virtual studio that is rendered by only one machine. More recent and sophisticated systems use two alpha-channel outputs and are able to render foreground objects that are half-opaque, like glass.
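
If it helps, here is a toy illustration of what those two streams let the keyer do, with NumPy arrays standing in for the fill and matte feeds. The names and the single-matte premise are my simplification for this post, not Orad's actual pipeline:

import numpy as np

def composite(virtual_fill, virtual_matte, talent, talent_key):
    """Layering: virtual background, then keyed talent, then virtual foreground.

    virtual_fill  : rendered studio, background and foreground objects, HxWx3
    virtual_matte : external matte from the second SDI stream; 1.0 where a
                    foreground object must stay opaque, HxW
    talent        : camera image of the presenter, HxWx3
    talent_key    : chroma-key alpha for the presenter, HxW
    """
    m = virtual_matte[..., None]
    k = talent_key[..., None]
    # Presenter keyed over the rendered set...
    out = talent * k + virtual_fill * (1.0 - k)
    # ...then the foreground objects punch back in over the presenter.
    return virtual_fill * m + out * (1.0 - m)

# Hypothetical 480x720 float frames with values in [0, 1].
h, w = 480, 720
frame = composite(np.zeros((h, w, 3)), np.zeros((h, w)),
                  np.ones((h, w, 3)), np.ones((h, w)))

A half-opaque object like glass is then just a fractional matte value instead of a hard 1.0.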

Hope you understand what I mean.

Regards

Pro

Quiet1onTheSet
02-20-2009, 10:56 AM
"which uses this channel as information which parts of the "non-greenscreen" parts should be keyed out."

Regards

Pro

Your English is terrific, as far as I can tell. I believe I tripped up on the redundancy ("parts"), and perhaps (as I often do) you may have left out a word or two.

Decoder ring:
ACTIVATED...

[PROCESSING. PLEASE WAIT!]
>BEEP!<

RESULT:
"...which uses this channel's information to determine which portions of the non-greenscreen elements should be keyed out..."


How'szat?
:rolleyes:

pro---studio
02-20-2009, 10:59 AM
Yes exactly.

:thumbsup:

csandy
03-01-2009, 08:22 PM
Silly G: http://www.sgi.com/products/servers/altix/ice/altix_ice_flash.swf

I think the computer I posted earlier in the thread would do the trick though. The Silly G machine would probably be overkill.

cezaryg
03-12-2009, 02:40 PM
Since NAB 2009 is due soon, let me bring back for your criticism the other feature requests I posted on the first page, in addition to the Virtual JIB so interestingly discussed here:

1) Unglue the iVGA mouse cursor! It would be nice to glue it with a left-button click and unglue it with, say, a second click or a double-click. As it is now, the screen movement strictly follows the mouse cursor's movements. I would like to be able to frame the VGA, set it firm, and do some interactive stuff on the screen without the screen movement duplicating the cursor's movement.

2) In Edit Media it is impossible to "marquee select" assets on the timeline beyond the visible screen area. In my view, moving the cursor beyond the timeline frame should scroll the view horizontally and vertically beyond that rectangular space, allowing more assets to be selected while working in a zoomed mode. The way it is now, to left-click-select more assets I have to double-click anywhere in the timeline field to go back to the default size and see everything. If a lot of assets are there, they are cramped, and detailed selective selection is impossible.

3) It would be great to be able to deselect Razor lines when they are no longer needed. These removed Razor lines could always be stored and recalled (Save Project) if needed. Right now it is possible to remove only the Clip Markers.

4) It could be useful to be able to do on-the-fly sound level adjustment. Example: in Edit Media, leveling the sound while the video plays by dragging the Project Level knob. To accomplish the same thing now, we have the time-consuming but wonderfully granular Keyframe feature in conjunction with the Master knob. My idea is to save time on rough adjustments. The Monitor Volume knob, or an additional knob, could drive the Project Level.
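
To illustrate how request (4) might behave, here is a rough sketch of recording live knob moves as keyframes during playback. The class and callback names are invented for illustration only; nothing here reflects TriCaster's internals:

from dataclasses import dataclass

@dataclass
class Keyframe:
    time: float   # seconds into the project
    gain: float   # 0.0 .. 2.0, where 1.0 is unity

class LevelRide:
    """Capture rough level rides in one real-time pass; refine them later."""
    def __init__(self):
        self.keyframes = []

    def on_knob_move(self, playhead_time, gain):
        # Called whenever the operator touches the Project Level knob.
        self.keyframes.append(Keyframe(playhead_time, gain))

    def gain_at(self, t):
        # Hold the last captured value; linear interpolation would also work.
        g = 1.0
        for kf in self.keyframes:
            if kf.time <= t:
                g = kf.gain
            else:
                break
        return g

ride = LevelRide()
ride.on_knob_move(12.5, 0.6)   # duck under the voice-over
ride.on_knob_move(31.0, 1.0)   # back to unity
print(ride.gain_at(20.0))      # 0.6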

rewire69
03-16-2009, 10:58 AM
I was doing a show with some hip-hop artists the other day. The names will not be mentioned, by the way. And I had totally forgotten how stupid some people can act on stage. In my hurry to get the online show running, when they asked if this was a clean show, I said yes. I didn't think about the possibility of someone wanting to be a jerk and be disrespectful to our underage viewers, dropping so many "F" bombs and other swear words that it made me almost blush. I have toured with most national acts and I know what goes on, but these were semi-nationals, and as soon as I asked them not to swear between songs, they got up with that twisted lil' monster look in their eyes and did a 5-minute rant of stupid stuff that made no point at all. So anyhow, I ended up turning the volume down, or off altogether, for over 10 minutes at one point, and finally I shut the broadcast down and called it quits. The performers thought they were still online broadcasting while I was lugging the TriCaster out the door. So anyhow, aside from the typical action of turning down the volume when you get idiots on air, a button to generate a solid beep would be cool for those just-in-case moments, ya know.

PIZAZZ
03-16-2009, 11:00 AM
rewire, just load the 1 kHz sample audio file in the DDR and double-click it when needed. Of course, you need to leave that DDR's volume up at all times.
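
And if you don't have a 1 kHz sample handy, it's easy to make one. Here is a sketch using only Python's standard library -- the filename, level, and length are arbitrary:

import math
import struct
import wave

RATE, SECONDS, FREQ, AMP = 48000, 5, 1000, 0.8

with wave.open("tone_1k.wav", "wb") as w:
    w.setnchannels(1)          # mono
    w.setsampwidth(2)          # 16-bit PCM
    w.setframerate(RATE)
    for n in range(RATE * SECONDS):
        sample = int(AMP * 32767 * math.sin(2 * math.pi * FREQ * n / RATE))
        w.writeframes(struct.pack("<h", sample))

Drop the file in a DDR and you've got your beep button, minus the button.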

cezaryg
04-18-2009, 08:36 AM
UPDATE: you were correct -- NewTek's focus on customers and product support is impeccable indeed. My defective TriCaster Broadcast was recently replaced. I am so glad.

Having the TC, I am keenly interested in the TriCaster rigs staying the most cutting-edge, the most competitive, and the best products available. Thus I'm glad I joined the Forum to contribute by sharing my ideas with you, like this Virtual JIB one. Thank you for your active participation. I hope the time when this idea is implemented is not so distant.

Quiet1onTheSet
04-18-2009, 10:14 AM
rewire, just load the 1 kHz sample audio file in the DDR and double-click it when needed. Of course, you need to leave that DDR's volume up at all times.

Jef -- what an ingenious idea.
:thumbsup:

DeanAU
04-20-2009, 09:39 PM
A 1K button would be a great idea tho

cezaryg
04-27-2009, 10:26 PM
Please check this link (XD300 NAB preview): http://www.awpdigital.com/index.php?option=com_content&view=article&id=234&Itemid=329

It is exciting to see that the XD300's redesigned virtual set is, in fact, going in the direction of the Virtual JIB concept I brought to this Forum in this thread. In addition, NewTek has deployed the dynamic modification and repositioning of titles and templates during live production that I requested in this thread. Another request of mine, for the DDR playhead to clearly indicate its position while playing, was addressed too. And finally, the XD300 TriCaster is now a muscular 64-bit application with powerful 32-bit video; I was right on target here as well.

I'm guessing now that NewTek is intentionally and wisely waiting for the Windows 7 release, since a stable, new, and lean 64-bit OS replacing XP would be a natural choice for the new HD TriCaster, in my humble opinion.

ted
04-27-2009, 11:15 PM
I guess NewTek took your wishes and made them reality. :)

cezaryg
04-28-2009, 11:50 AM
Thanks, Ted. :) You were right when you wrote that NewTek is very focused on its customers. I hope NewTek will listen to those folks who are now pointing out the limitations of the 3-input XD300 TriCaster.

Paul Lara
04-28-2009, 12:27 PM
Thanks, Ted. :) You were right when you wrote that NewTek is very focused on its customers. I hope NewTek will listen to those folks who are now pointing out the limitations of the 3-input XD300 TriCaster.

We are aware of the desire for more than 3 HD cameras, but we have to start somewhere.

Don't forget to join NewTek for a live demo of TriCaster XD 300 (http://www.newtek.com/demo/) next week!

ted
04-28-2009, 11:01 PM
We are aware of the desire for more than 3 HD cameras, but we have to start somewhere.
!

And as soon as you build a 6 input we'll be screaming for the 9 input model! :D

Quiet1onTheSet
04-29-2009, 07:56 AM
And as soon as you build a 6 input we'll be screaming for the 9 input model! :D
Naw! But as soon as you build a 6-input model, we'll be screaming at our bankers for another small business loan!

csandy
04-29-2009, 12:47 PM
Yup, I'm personally going to skip the 3-input model as well. It's a shame; we've already begun transitioning to HD. The Harry Jerome Awards in Toronto this past weekend went well, and the TriCaster BROADCAST performed flawlessly. We used the Edirol 440HD and an Extron something-or-other for switching, and the TriCaster for playback, special effects, and back-up recording. Everything was shot in HD. With 5 cameras, though, the proposed prototype XD300 wouldn't fit the bill. But as soon as NewTek sells enough of those bad boys, I'm sure there will be a version with more inputs. Given what I PURELY speculate is the design of the board, discrete previews and video paths will likely still be limited to 3 -- but that's a similar setup to current offerings. Maybe we can use NewTek again for 2011 awards show work.