View Full Version : NTSC standard making me go insane!

08-19-2005, 09:54 PM
Ok. I know there have been plenty of threads on this, but none have really answered certain things clearly, and I'm at my wit's end trying to figure it out.

Recently, on another forum, I replied to someone's post to help them with an NTSC resolution problem. Then another member replied with a different resolution, but explained why, and it made sense... so I figured I was wrong and should check some things.

But lo and behold, when researching this topic again I came across many websites that agree with what I said, and others that disagree. I also went to several encyclopedia sites and others that explain it from a technical point of view, but almost all of them explained it in terms of the analog signal rather than in straight digital terms.

Yes, I know that's what the standard is really made of, but when everything went digital, it all got confusing.

The specs I have always used, and which have always worked, are 720x480, 29.97 fps, pixel aspect ratio of .888 repeating, odd field first (when rendering from Lightwave). What confuses me is that everywhere I look, in program help files and here on these forums, most people say 720x486. The ratio of .888 comes from dividing 480 by 540, and the .9 that I see in every program for NTSC comes from 486. That member explained that the person should just strip six lines from the 486 to get 480, implying that even 480 material is really .9, just cropped.
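Just so you can see where my numbers come from, here's the arithmetic (my own sketch: the 540 divisor falls out of assuming a 4:3 display and a 720-pixel-wide frame, since 720 x 3/4 = 540):

```python
# Pixel aspect ratio needed so a width x height image fills a 4:3 display.
# PAR = (4/3) * height / width; for width 720 that is height / 540.
from fractions import Fraction

def par(width: int, height: int, display_aspect=Fraction(4, 3)) -> Fraction:
    """Pixel aspect ratio for the given storage resolution and display aspect."""
    return display_aspect * height / width

assert par(720, 486) == Fraction(9, 10)  # the .9 most programs show for NTSC
assert par(720, 480) == Fraction(8, 9)   # the .888 repeating Lightwave uses
```

So both numbers are self-consistent; they just assume different active-line counts.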

From the technical point of view, the websites I researched say the NTSC standard actually has 525 lines, but the extra lines carry non-displayed data such as closed captioning and vertical sync. OK, that makes sense, but what's the deal with the difference between 480 and 486?
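Here's my understanding of the line/field timing laid out as arithmetic (numbers from the analog spec as I've read it, so treat this as a sketch, not gospel):

```python
# NTSC scan timing as I understand it.
lines_per_frame = 525               # total scan lines, blanking included
lines_per_field = lines_per_frame / 2   # 262.5; the half line is what
                                        # offsets the two interlaced fields
field_rate = 60 / 1.001             # ~59.94 fields per second
frame_rate = field_rate / 2         # ~29.97 fps
blanking_lines = 525 - 486          # 39 lines for vertical sync, captions, etc.

assert lines_per_field == 262.5
assert abs(frame_rate - 29.97) < 0.001
assert blanking_lines == 39
```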

That member said NTSC D1 is the 486 format, while DV and all DVDs are 480. Then I see these other formats with a resolution of 704x480, where the reduced horizontal size is supposedly because some cameras don't record anything within 8 pixels of each edge.
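To spell out what I mean by that 704 figure (my own illustration of the claim, not something I pulled from a spec):

```python
# If a camera leaves an 8-pixel margin blank on each side of a 720-wide
# frame, the active picture that remains is 704 pixels wide.
full_width = 720
trim_per_side = 8
active_width = full_width - 2 * trim_per_side

assert active_width == 704
```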

Can anyone give a definitive answer on all these formats? Not just the specs, but why they are the way they are?

I also had two other small questions.

Since the camera is capturing 60 fields per second, is the exposure time half of what it would be for a regular full frame, or is it whatever you set?

When you record something with a camera, or render from Lightwave, are there really 480 lines of resolution, or only 240 discrete lines alternating between fields?
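To make that second question concrete, here's how I picture the interlaced split (just my own sketch; which field is scanned first depends on the format and the renderer's field-order setting):

```python
# An interlaced 480-line frame split into two 240-line fields.
frame_lines = list(range(480))
field_1 = frame_lines[0::2]   # every other line, starting at line 0
field_2 = frame_lines[1::2]   # the lines in between, starting at line 1

assert len(field_1) == len(field_2) == 240
# For moving video, each field is captured at a different instant,
# so the "frame" is really two 240-line snapshots interleaved.
```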