PDA

View Full Version : Field Rendering: why different?



Serling
06-04-2008, 09:45 AM
Avid: Even/Lower-Odd/Upper
Adobe: Even/Lower-Odd/Upper
Autodesk: Even/Lower-Odd/Upper
Lightwave: Odd/Lower-Even/Upper???

Newtek: Could we please get with the rest of the industry on this in 9.5? I choose Odd/Lower to render fields in Lightwave, but it always feels like a crap-shoot as to whether these other apps will interpret your field rendering correctly on import.

For now, all my imports should be coming in as Even/Lower field first. Which of your two options best matches this setting for export?

Thanks in advance.

3DGFXStudios
06-04-2008, 09:46 AM
Who cares about the differences? Just don't use them. First check whether it's really different: some programs start counting at field 0 and some at field 1.

Serling
06-04-2008, 09:52 AM
I work in TV where this stuff does matter. Thanks for the reply!

FredyN
06-04-2008, 10:46 AM
For me the significant thing is upper vs. lower.
Upper field first - used by Avid (Betacam export)
Lower field first - most common for DV video systems
It depends on the hardware; you can test it best with crawling titles or horizontally moving objects on a CRT monitor.

UnCommonGrafx
06-04-2008, 11:44 AM
Your best bet is to keep a file around to immediately and easily test this.
Let me see if I can remember this...
Take a TV-sized image and color it half and half, top to bottom, with obvious colors; place it in the LW cam view; with fielding on, you should see it change as you scroll the LW timeline.

If that is all correct for you (too lazy to check atm), then using this as a 120-frame animation will be a great way to quickly make a file to gauge the needs of the system you are working with.

See, regardless of what LW is doing, it's the empirical evidence that matters. I've been told footage was one way but it would never read right that way with the program I was working with. Made it work and got it done. Complaints weren't received back, either.

Test files that render really fast are the key.
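The half-and-half test frame described above can be sketched in a few lines of Python; this is a hypothetical standalone script (the 720x480 DV frame size, the colors, and the filename are my own choices, not part of any app's workflow):

```python
# Half-and-half field-test frame, written as a binary PPM so most image
# tools can read it. 720x480 (NTSC DV) is an assumed frame size.

WIDTH, HEIGHT = 720, 480
TOP, BOTTOM = (255, 0, 0), (0, 0, 255)  # two obvious, contrasting colors

def make_test_frame(width=WIDTH, height=HEIGHT):
    """Return rows of RGB pixels: top half one color, bottom half the other."""
    return [[TOP if y < height // 2 else BOTTOM for _ in range(width)]
            for y in range(height)]

def write_ppm(rows, path):
    """Write the rows out as a binary (P6) PPM file."""
    height, width = len(rows), len(rows[0])
    with open(path, "wb") as f:
        f.write(b"P6\n%d %d\n255\n" % (width, height))
        for row in rows:
            for r, g, b in row:
                f.write(bytes((r, g, b)))

write_ppm(make_test_frame(), "field_test.ppm")
```

Animate or scroll the image over a short clip, as suggested above, and the seam between the two colors will tell you which field plays first on the system you are testing.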


clagman
06-04-2008, 03:34 PM
Ugghh, I never render in fields (interlaced). It seems best to render progressive frames, and if you require interlace, just interlace in AE (or whatever you have).

Surrealist.
06-04-2008, 08:40 PM
That would be my way of doing it too. Not sure if there is, or why there would be, a difference in how LW starts its fields.

I use Vegas and there I can just set the project to the desired properties. I guess it may depend on the NLE used.

And I am curious: for TV work, at what level, and is this on tape? And on that note, why even use interlaced at all? Would it not work to just let the broadcast equipment do that?

Here's the background to these questions:

I have been working with video for years - not directly in TV.

But I had fairly recently discovered that I could go progressive all the way through my pipeline and deliver in that format on tape. At the consumer level this viewed on a TV looks much better than something interlaced. And the consumer equipment - even production monitors - deal with this fine.

When I delivered my feature film to my distributor, I simply delivered in progressive. That covered all of the broadcast tapes I was required to deliver, domestic and international. And I had no problems.

So my question is, could you print to tape in progressive - most new decks have this capability - and then just simply run that through the system at the station?

Then my true ignorance of this comes through in not knowing how this is different for broadcast. I can see that an interlaced signal that was out of sync would cause problems for broadcast. But if you had a progressive source, what would happen on broadcast? And have you ever tried that? Can the station equipment just convert to interlaced on the fly? I am going to guess that progressive would not broadcast on SD equipment because it is too much information. Is that true?


So again I am fairly ignorant of the broadcast side. I am curious what goes on there.

Serling
06-04-2008, 10:29 PM
Standard definition NTSC (this includes DV, which is the format I work with in news right now) specifies that NTSC SD TVs receive an interlaced signal. Fields are produced as a result of that interlaced signal.

Each frame of 30 fps video in NTSC SD and DV is really made up of 2 fields of interlaced video flashing to the screen every 1/60th of a second. 30 frames per second = 60 fields per second. Because there are different variations of how the fields are drawn when they reach the screen (NTSC SD is scanned upper field first, NTSC DV is scanned lower field first) making sure you're importing and exporting with the correct field order (more appropriately called "dominance") is critical in broadcasting an interlaced signal. Import with the wrong field dominance and you're going to see flicker and motion judder. It's ugly as sin.
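The two-fields-per-frame mechanics described above can be sketched generically in Python; this is an illustration of the weave itself, not any particular app's behavior, and the function and the scanline representation are my own:

```python
def weave(first_field, second_field, first_is_upper=True):
    """Interleave two fields (lists of scanlines) into one frame.

    first_field is the temporally earlier field (captured 1/60 s before
    the second in NTSC). The field order decides whether it lands on the
    upper lines (0, 2, 4, ...) or the lower lines (1, 3, 5, ...).
    """
    frame = [None] * (len(first_field) + len(second_field))
    frame[0::2] = first_field if first_is_upper else second_field
    frame[1::2] = second_field if first_is_upper else first_field
    return frame

# An object moving right, captured one field (1/60 s) apart:
f1 = ["X..", ".X."]   # earlier field: two scanlines
f2 = [".X.", "..X"]   # later field: the object has advanced

upper_first = weave(f1, f2, first_is_upper=True)
lower_first = weave(f1, f2, first_is_upper=False)
# Decoding a lower-field-first frame as if it were upper-field-first plays
# the fields in the wrong temporal order: the flicker and judder above.
```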

Computer screens and progressive scan TVs are "field agnostic." The only real way to make sure you've exported then imported something properly is to check the work on an interlaced screen (TV set or monitor which - BTW - I do).

However, when you're looking at some of the render times associated with output from Lightwave, it sure would be nice to have field rendering options that followed the industry norm instead of being as they exist now.

The network I work for will eventually convert to 720p, at which point I'll be happy to render without fields. But until then, making sure my imports match my exports regarding field dominance (which is very easy to do in AE, C4 and other compositing apps) saves me time. And in news, time is always of the essence.

P.S. We don't output to tape: everything we air is sent to playback servers.

geo_n
06-05-2008, 01:14 AM
Had this problem recently. I made a field render in LightWave and the video guy couldn't use it. So he said to render double the frames, and he did the fixing in AE. That was the first time I encountered this problem. Not in my era of TV, I guess, because the video guy said this kind of render was used in old-style video. :D

Surrealist.
06-05-2008, 09:30 AM
Standard definition NTSC (this includes DV, which is the format I work with in news right now) specifies that NTSC SD TVs receive an interlaced signal. Fields are produced as a result of that interlaced signal.

Each frame of 30 fps video in NTSC SD and DV is really made up of 2 fields of interlaced video flashing to the screen every 1/60th of a second. 30 frames per second = 60 fields per second. Because there are different variations of how the fields are drawn when they reach the screen (NTSC SD is scanned upper field first, NTSC DV is scanned lower field first) making sure you're importing and exporting with the correct field order (more appropriately called "dominance") is critical in broadcasting an interlaced signal. Import with the wrong field dominance and you're going to see flicker and motion judder. It's ugly as sin.

Computer screens and progressive scan TVs are "field agnostic." The only real way to make sure you've exported then imported something properly is to check the work on an interlaced screen (TV set or monitor which - BTW - I do).

However, when you're looking at some of the render times associated with output from Lightwave, it sure would be nice to have field rendering options that followed the industry norm instead of being as they exist now.

The network I work for will eventually convert to 720p, at which point I'll be happy to render without fields. But until then, making sure my imports match my exports regarding field dominance (which is very easy to do in AE, C4 and other compositing apps) saves me time. And in news, time is always of the essence.

P.S. We don't output to tape: everything we air is sent to playback servers.

Ha ha, sorry to waste your time with the interlace 101 speech. I knew all that already. But very succinctly done, I must say. :)

I have never used nor owned a progressive scan TV or monitor. I should have been more specific.

I can play back progressive scan on an interlaced screen with no flicker. No problems of course, because it is progressive. I was curious what would happen if you fed the station equipment progressive even though it will output interlaced. I was wondering if it would accept it.

So do you yourself edit 3D and video source? Or do you just send them the 3D and they mix it?

Anyway sorry to annoy you with my curiosity. :)

Lightwolf
06-05-2008, 09:47 AM
I was curious what would happen if you fed the station equipment progressive even though it will output interlaced. I was wondering if it would accept it.
It would... but it wouldn't look as nice as proper interlaced video, which has the big advantage of (basically) doubling the frame rate without doubling the bandwidth. And animations do look a lot smoother...

@Serling: I'd say: do a quick test and then remember the setting ;)

Cheers,
Mike

Surrealist.
06-05-2008, 11:43 AM
Well, I guess my question is: would it accept it and convert it on the fly? I thought the advantage of interlace was in the delivery - that it had to be interlaced to fit in the bandwidth, or something like that. I am not well read on the broadcast end of things, but my thought was that the equipment that sends out the broadcast signal would have to convert it for it to be broadcast "legal," if you will.

oobievision
06-06-2008, 12:32 AM
Try this image out:
http://www.codesampler.com/d3dbook/chapter_05/chapter_05_files/image012.jpg

But really, I wouldn't bother field rendering, because once you export it in Premiere it will already be field rendered. If anything, export it as a Targa sequence, import it into After Effects, and render it out there - if you really want to make sure, that is.

oobievision
06-06-2008, 12:37 AM
Well I guess my question is would it accept it and convert it on the fly because I thought that the advantage of interlace was the delivery. I thought it had to be interlaced to fit in the bandwidth or something like that. I am not real well read on the broadcast end of things, but my thought was that it (the equipment that sends out the broadcast signal) would have to convert it for it to be broadcast "legal" - if you will.

Interlace is a technique of improving the picture quality of a video signal primarily on CRT devices without consuming extra bandwidth. Interlacing causes problems on certain display devices such as LCDs. It was invented by RCA engineer Randall C. Ballard in 1932, and first demonstrated in 1934, as cathode ray tube screens became brighter, increasing the level of flicker caused by progressive (sequential) scanning. It was ubiquitous in television until the 1970s, when the needs of computer monitors resulted in the reintroduction of progressive scan. Interlace is still used for most standard definition TVs, and the 1080i HDTV broadcast standard, but not for LCD, micromirror (DLP), or plasma displays; these displays do not use a raster scan to create an image, and so cannot benefit from interlacing: in practice, they have to be driven with a progressive scan signal. The deinterlacing circuitry to get progressive scan from a normal interlaced broadcast television signal can add to the cost of a television set using such displays. Currently, progressive displays dominate the HDTV market.

Interlaced scan refers to one of two common methods for "painting" a video image on an electronic display screen (the second is progressive scan) by scanning or displaying each line or row of pixels. This technique uses two fields to create a frame. One field contains all the odd lines in the image, the other contains all the even lines of the image. A PAL based television display, for example, scans 50 fields every second (25 odd and 25 even). The two sets of 25 fields work together to create a full frame every 1/25th of a second, resulting in a display of 25 frames per second.

Reference from Wikipedia.com
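The odd/even split described in the quote can be shown concretely. A minimal sketch (the function and names here are my own; lines are counted 1-based as in the quote):

```python
def split_fields(frame):
    """Split a frame (a list of scanlines) into its two fields.

    Counting lines 1-based as the quote does: frame[0::2] holds lines
    1, 3, 5, ... (the odd field) and frame[1::2] holds lines 2, 4, 6, ...
    (the even field).
    """
    return frame[0::2], frame[1::2]

# PAL timing from the quote: 50 fields per second pair up into frames.
PAL_FIELDS_PER_SECOND = 50
PAL_FRAMES_PER_SECOND = PAL_FIELDS_PER_SECOND // 2  # 25 full frames

frame = ["line1", "line2", "line3", "line4"]
odd, even = split_fields(frame)
# odd  -> ["line1", "line3"]
# even -> ["line2", "line4"]
```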

oobievision
06-06-2008, 12:48 AM
Or take NTSC, which is what we use here in the USA: typically a 4:3 aspect ratio (with 48 kHz or 32 kHz audio on DV) and a frame rate of 29.97 frames per second, with each even/odd field pair read in 1/29.97th of a second. HD requires higher refresh frequencies, such as 50 or 60 Hz, to avoid flicker.

Progressive scanning is the right way to broadcast a signal, but back in 1934 it was cheaper to transmit the signal interlaced and make televisions more affordable. Oh, and it was cheaper to transmit interlaced because you don't use as much bandwidth.

Surrealist.
06-06-2008, 04:25 AM
OK, thanks for the history lesson. Pretty cool stuff.

And I would just export out of my NLE as well.

So I guess from what you are saying, the station could handle the progressive signal. But would it output it that way or convert it to interlaced?

Serling
06-06-2008, 08:48 AM
All of this is interesting and all, but I'd still kinda like someone from Newtek to jump in and explain why their field rendering options are 180 degrees out of phase with the rest of the broadcast world and which of their two interlacing options best matches an Even/Lower (DV) workflow.

UnCommonGrafx
06-06-2008, 09:43 AM
Serling,
If you hold your breath long enough, in the afterlife you may get that answer. ;)

Until that time, with the Serenity prayer in mind, know that empirical tests get you closer than anything someone might tell you that later turns out not to work.

The answer is, and not trying to be a smart aleck, the one that works with the footage at hand.

ivanze
06-06-2008, 11:45 AM
I've used Odd/Lower always for DV and used Even/Upper once for DPS perception.

ivanze
06-06-2008, 11:54 AM
I've used Odd/Lower always for DV and used Even/Upper once for DPS perception. But like some say I prefer to render without fields in Lightwave, import the frames in After Effects and turn on Layer > Frame Blending > Frame Mix to add fields to my renders. It looks even better than field rendering from Lightwave.
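The render-progressive-then-interlace approach described above can be sketched generically: render at double the target frame rate and fold each pair of progressive frames into one interlaced frame. This is my own illustration of the idea, not After Effects' actual Frame Mix algorithm:

```python
def interlace_double_rate(frames, lower_field_first=True):
    """Fold a double-rate progressive sequence (e.g. 60p) into interlaced
    frames (e.g. 30i): each output frame takes one field from each of two
    consecutive progressive frames.

    Frames are lists of scanlines. With lower_field_first=True (the DV
    convention), the earlier frame supplies the lower field (lines
    1, 3, 5, ... counted 0-based) and the later frame the upper field.
    """
    out = []
    for earlier, later in zip(frames[0::2], frames[1::2]):
        frame = list(earlier)              # start from the earlier frame
        if lower_field_first:
            frame[0::2] = later[0::2]      # later frame overwrites the upper field
        else:
            frame[1::2] = later[1::2]      # later frame overwrites the lower field
        out.append(frame)
    return out
```

Rendering 60 progressive frames this way yields 30 interlaced ones, and the lower_field_first flag is exactly where the Even/Lower (DV) choice from the start of the thread comes back in.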

Serling
06-06-2008, 11:38 PM
Thank you, Ivanze...just what I was looking for. :)