Banding in 16-bit render in FCP



Johnny
04-19-2005, 02:15 PM
I have a scene in which I am using 8-bit images for bump maps only; no 8-bit image maps are used for color in this scene.

I rendered the scene as a 16-bit image, but some of my surfaces using 8-bit bump maps show noticeable banding and chunkiness when the image is viewed on a TV monitor via FCP 4.

Are 8-bit bump maps a no-no when going to video, or is there something else involved in making LW-produced output look smooth and unbanded in video?

My FCP test sequence is NTSC DV 720 x 480, Fields off (as is the rendered still), and I get banding whether I use RGB or YUV color space. And tho this is just one frame, I have the frame rate at 29.97.

Thanks!

J

drclare
04-19-2005, 04:11 PM
DV is compressed, so no matter what you import into FCP, the sequence re-renders it as DV. That is why you are getting the banding.

Johnny
04-19-2005, 04:13 PM
are you saying that I'm stuck with banding?
you mean, all animations done with LW will have this banding? or is it my method that could be introducing it?

J

Meshbuilder
04-20-2005, 03:12 AM
are you saying that I'm stuck with banding?
you mean, all animations done with LW will have this banding? or is it my method that could be introducing it?


He is saying that as long as you are using the DV codec in Final Cut, you will get the banding.

No, LW isn't the problem here..

You should edit your video with another codec like "Uncompressed 10-bit 4:2:2".

allanBook
04-20-2005, 03:30 AM
For animation, create FCP sequences using a codec other than DV if you want to keep the high quality of your renders. Many use the Animation codec instead.

drclare is correct in stating that DV is compressed and will apply lossy compression to your images.

If your renders out of LW are already 720x480, PAR 0.9 (or 1.2 for anamorphic widescreen), then all you need to do is set FCP to respect the color fidelity of your renders:

1. In FCP, go to "Audio/Video Settings".
2. Click on the "Sequence Presets" tab.
3. Click on a sequence type you prefer using (e.g., DV NTSC 48 kHz Superwhite) and click "Duplicate".
4. Select the duplicate entry and then click "Edit".
5. Under the "QuickTime Video Settings" section (lower left), pick a lossless codec like "Animation".
6. Rename the preset and click "OK" to save it.
7. Back in the "Sequence Presets" tab, click the empty space next to your new preset in the left-most column to make it the new default sequence.
8. Click "OK" to get out of "Audio/Video Settings".

Using this preset for your animation work in FCP should keep your images looking as they did when you rendered them from LW (barring any color shifts or other filters from FCP's library that you decide to apply, that is).

Please forgive me if you know all of this stuff already and I just mouthed off for nothing; I wanted to be specific in case it would be helpful.

-Allan

Johnny
04-20-2005, 07:39 PM
For animation, create FCP sequences using a codec other than DV if you want to keep the high quality of your renders. Many use the Animation codec instead.
drclare is correct in stating that DV is compressed and will apply lossy compression to your images.
If your renders out of LW are already 720x480, PAR 0.9 (or 1.2 for anamorphic widescreen), then all you need to do is set FCP to respect the color fidelity of your renders:
-Allan

Hi, Allan;

thanks for taking the time to share those pointers and advice..I followed all of them, but I am still getting nasty, chunky, posterized, banded video when I play it on a video monitor..Looks great on my Mac, tho...

I tried *several* codecs, including the ones suggested here (Animation and Uncompressed) and nothing makes a difference.

Not quite at the hair-pulling stage yet, but I'm baffled why nobody's suggestions are fixing this problem.

maybe this is silly, but could something be wrong with my scene? I know I'm outputting 16-bit renders (I can verify this by opening the images in PS).

I'd post a screen capture if I could get it off the tv...

allanBook
04-20-2005, 09:29 PM
hmm... well, NTSC TV displays usually can't show subtle color gradations, since the NTSC video stream used to send information from the computer to the TV isn't capable of reproducing the millions of colors and subtle nuances that computer monitors can.

I guess the problem (since you're able to see the images in pristine quality on your computer screen, but not on your TV/video monitor) may be that the luminance and chrominance ranges of your animation renders are exceeding video broadcast limits.

To bring the luminance and chrominance of your renders into a broadcast-safe range, you could either apply a broadcast-safe post-process filter in LW or use the Broadcast Safe filter inside FCP.

If you want to check whether your images are broadcast safe, you can go back into FCP and apply one of the settings of the Broadcast Safe filter; FCP should show you areas of the image that exceed video broadcast limits.
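
To make "exceeds broadcast limits" concrete, here's a minimal sketch of that check (not FCP's actual filter, just the arithmetic it implies; assumes NumPy and full-range 8-bit RGB input):

```python
import numpy as np

def unsafe_mask(rgb8):
    """Flag pixels whose Rec. 601 luma would land outside the legal
    16-235 studio range if dropped straight into an 8-bit video signal."""
    rgb = rgb8.astype(np.float32)
    # Rec. 601 luma weights for SD video, applied to full-range 0-255 RGB
    y = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    return (y < 16.0) | (y > 235.0)
```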

Yikes, my responses are always so darn wordy! I'm really sorry about that.

It's all about trying to fit as much of your high-quality renders as possible into the somewhat limiting confines of broadcast video. Certainly, HDTVs are helping in this regard, but most households still use SDTVs as their primary TV monitors.

Hopefully, a more experienced (and not-as-wordy-as-me) lightwaver will pop by and give some more advice.

Good luck.

-Allan

PS: Do not use DV (unless you are forced to because of iMovie, but you have FCP so use something better like Animation).

dsol
04-21-2005, 10:31 AM
Just a thought - you are saving out your images as 16 bits per channel, and not 16-bit total (i.e., 5-bit R & B, 6-bit G)?
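
For scale, the gap between those two readings is enormous; a rough tally in plain Python (illustrative numbers only):

```python
# "16 bits per channel" vs "16-bit total" precision:
levels_16bpc = 2 ** 16   # 65,536 shades per channel
levels_8bpc  = 2 ** 8    # 256 shades per channel
levels_r5    = 2 ** 5    # 32 shades of red in an R5 G6 B5 "16-bit total" image
levels_g6    = 2 ** 6    # 64 shades of green
print(levels_16bpc, levels_8bpc, levels_r5, levels_g6)
```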

It could be a problem with your DV playback settings too - the DV codec has a low-quality draft playback mode. You can force QuickTime movies encoded with it to use the best-quality mode by opening the movie in QT Pro and going to movie properties:video:quality.

Johnny
04-21-2005, 10:35 AM
Just a thought - you are saving out your images as 16 bits per channel, and not 16-bit total (i.e., 5-bit R & B, 6-bit G)?

It could be a problem with your DV playback settings too - the DV codec has a low-quality draft playback mode. You can force QuickTime movies encoded with it to use the best-quality mode by opening the movie in QT Pro and going to movie properties:video:quality.

when I open one of the images in PS it says 16 bits per channel, RGB.

I'll take a look at that QT movie quality angle..thanks!

**EDIT: just tried that..DV codec, quality "High"..no difference

J

dsol
04-21-2005, 11:40 AM
Perhaps it's a quantisation issue going from 16bpc RGB to 8 or 10 bit YUV in FCP. Have you tried rendering out a segment as 8bpc RGB and using that?

Johnny
04-21-2005, 12:03 PM
I had been doing 8 bit/channel renders and got even worse banding..then I read this tutorial, got religion, and changed my workflow to 16 bit..

here's the link: http://vbulletin.newtek.com/showthread.php?t=16608

yet on the video monitor, I get chunkiness and banding similar to his example of what a gradient looks like in 8-bit.

J

ArneK
04-21-2005, 07:17 PM
As long as you are previewing on a monitor using DV out through a DV camera or similar, you will get banding. It is still the DV codec that renders the footage on your TV monitor...

You need to use a pro video card like a DeckLink or similar to view uncompressed footage. It looks a million times better: better colors, much sharper, etc. You also spot aliasing problems much more easily. Get a cheap uncompressed video card. You won't regret it, and you'll also wonder how you ever managed without it.

(Remember, though, that if you want to view your uncompressed footage in real time, you will probably also need a RAID hard disk setup to handle the data rate...)

archiea
04-21-2005, 08:48 PM
Just a quick, semi-unrelated suggestion.... If you plan to use DV for output for whatever reason, add a little bit of grain to your shot. This will replace the banding with a dither pattern. Shoot a blue sky with DV and you may get some banding, but there is usually enough noise from the CCD to "break up" the banding.
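
A quick sketch of why the grain trick works, assuming NumPy (the ramp and noise amounts are illustrative, not taken from the thread):

```python
import numpy as np

# A dim, smooth ramp of the kind that bands badly at 8 bits
ramp = np.linspace(0.0, 0.25, 720, dtype=np.float32)

# Hard quantize to 8 bits: only ~64 levels span the ramp, showing
# up as clean, wide, clearly visible bands.
banded = np.round(ramp * 255).astype(np.uint8)

# Add roughly one least-significant-bit of noise first: same number
# of levels, but the step edges land at randomized positions, so the
# eye averages them into an unobtrusive dither pattern.
noise = np.random.uniform(-0.5, 0.5, ramp.shape)
dithered = np.clip(np.round(ramp * 255 + noise), 0, 255).astype(np.uint8)
```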

DV can't really be considered even 8-bit with its 4:1:1 chroma subsampling.
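
Roughly what 4:1:1 means for color detail, sketched in NumPy (an illustration of the sampling pattern only, not the actual DV codec):

```python
import numpy as np

# Eight pixels' worth of one chroma channel before DV encoding
cb = np.array([10, 20, 30, 40, 50, 60, 70, 80], dtype=np.float32)

cb_stored   = cb[::4]                  # 4:1:1 keeps 1 chroma sample per 4 pixels
cb_playback = np.repeat(cb_stored, 4)  # stretched back out on playback

print(cb_playback)  # [10. 10. 10. 10. 50. 50. 50. 50.] -- color detail quartered
```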

You would need a high-quality component display card to accurately encode 16-bit RGB to video. Even then, you can still get a little banding with 16-bit on some really shallow gradations.

Good luck

Johnny
04-21-2005, 10:03 PM
Just a quick, semi-unrelated suggestion.... If you plan to use DV for output for whatever reason, add a little bit of grain to your shot. This will replace the banding with a dither pattern. Shoot a blue sky with DV and you may get some banding, but there is usually enough noise from the CCD to "break up" the banding.

DV can't really be considered even 8-bit with its 4:1:1 chroma subsampling.

You would need a high-quality component display card to accurately encode 16-bit RGB to video. Even then, you can still get a little banding with 16-bit on some really shallow gradations.

Good luck

OK..so, are you saying that attempting 16-bit is a waste of time without more hardware?

Not that I am even near Pixar quality, but how do they get their stuff to look so great when finally played on one of their DVDs on a $200 TV?


J

Johnny
04-21-2005, 10:09 PM
Just a quick, semi-unrelated suggestion.... If you plan to use DV for output for whatever reason, add a little bit of grain to your shot. This will replace the banding with a dither pattern. Shoot a blue sky with DV and you may get some banding, but there is usually enough noise from the CCD to "break up" the banding.

DV can't really be considered even 8-bit with its 4:1:1 chroma subsampling.

Would you mind explaining why that is? You're saying that some bit depth is lost to compression?

Thanks,

J

ArneK
04-22-2005, 12:32 AM
Would you mind explaining why that is? You're saying that some bit depth is lost to compression?

Thanks,

J

Have a look here (http://www.larryjordan.biz/articles/lj_compress.html) and here. (http://www.tvtechnology.com/features/Tech-Corner/f-RH-4.2.2-07.10.02.shtml) A little info on compression and SD/HD. :)

Arne

Johnny
04-22-2005, 11:02 AM
OK..those links are very informative...

Just to throw a kink into things, I've been experimenting, first with a simple scene with a few lights and primitives (no banding in FCP or on TV).

Then, I took the scene that's been giving me fits, saved it as another file, got rid of the objects but kept the lights, and imported the experimental scene: no banding in FCP or on the TV, regardless of codec used. Tho not all codecs will play through in real time to the TV monitor..

To me, this suggests that the lights are OK in the problem scene, but that a surface might be causing the banding? Or the interaction between a surface and one or more lights.

Am I describing something people have tangled with before?

thanks!

J

monovich
04-26-2005, 04:21 PM
In my opinion, unless you are doing a lot of color correction or post work with your renders, 16-bit is a waste of resources. 16-bit has much more image information, and is really great, but if you are going straight from the render to the TV, you aren't using that extra data much, if at all.
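
A sketch of the trade-off being described here, assuming NumPy; the "grade" is just a crude 4x brightness gain:

```python
import numpy as np

ramp = np.linspace(0.0, 0.1, 10000)   # a very dim gradient

# Stored at 8 bits per channel, then brightened 4x in post:
stored8 = np.round(ramp * 255) / 255
graded8 = np.round(np.clip(stored8 * 4, 0, 1) * 255)

# Stored at 16 bits per channel, same grade, 8-bit output at the end:
stored16 = np.round(ramp * 65535) / 65535
graded16 = np.round(np.clip(stored16 * 4, 0, 1) * 255)

print(len(np.unique(graded8)))   # ~27 output levels -> gaps, amplified banding
print(len(np.unique(graded16)))  # ~103 output levels -> smooth
```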

Also, you may try converting your 16 bit images to DV in After Effects instead of FCP. We get different results sometimes depending on which we use.

-s

Johnny
04-27-2005, 05:52 AM
In my opinion, unless you are doing a lot of color correction or post work with your renders, 16-bit is a waste of resources. 16-bit has much more image information, and is really great, but if you are going straight from the render to the TV, you aren't using that extra data much, if at all.

Also, you may try converting your 16 bit images to DV in After Effects instead of FCP. We get different results sometimes depending on which we use.

-s

Can you give an example of "a lot"? What I envision is making adjustments in FCP, say if a clip is too dark, but not doing chroma keys or any kind of heavy processing. I am aiming to achieve all the "effects" within LW itself, using FCP to edit primarily.

I have noticed that on these clips, when I make tonal adjustments (e.g., with the 3-way color corrector), the banding is amplified..looks just plain nasty, almost like the gradations are made from varying shades of construction paper layered one on top of the other to achieve the transition from light to dark.

It's minimized somewhat during play, but the bands vibrate or squirm. Basically it looks like beginner footage; even the 16-bit renders.

I should add that the scene in question is a dimly-lit room; maybe it's a challenge to take a low-light scene from 3D to video?
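
Rough back-of-envelope arithmetic for why dim scenes band more (the 20% figure is an assumption, not measured from the scene):

```python
max_level   = 0.20                   # brightest pixel in a dim room, ~20% gray
codes_8bit  = int(max_level * 255)   # ~51 distinct shades to span the whole scene
codes_10bit = int(max_level * 1023)  # ~204 shades in a 10-bit pipeline
print(codes_8bit, codes_10bit)       # 51 204
```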

J

dsol
04-27-2005, 07:42 AM
the DV codec generally doesn't seem to handle low-light scenes well (when filming) - so it may be a fundamental problem for the kind of stuff you're doing. Can your system handle an uncompressed 10-bit or 8-bit video stream? I'd recommend trying to change your sequence settings in FCP to uncompressed (using the "None" codec or Apple's FCP 8-bit uncompressed codec).

How are you importing the anims into FCP - are they image sequences or QuickTime movies? Try playing around with the formats. Some of the image savers in LW (like the standard PSD one) seem to be incompatible with Mac apps - possibly due to endian issues. QuickTime Pro is very good for converting files and seems to be very compatible with "weird" files that After Effects and FCP barf at...

avkills
04-30-2005, 08:38 AM
8-bit video is going to have banding no matter what you do.

10-bit uncompressed is the way to go. See if you can either pick up an AJA Io LA FireWire interface or a DeckLink card. That way you can bring your 16-bit renders into After Effects and then render them out as 10-bit uncompressed video. That should fix most of your banding issues. Since the 10-bit codec is YUV, it works a bit differently than RGB; 10-bit is plenty of depth for YUV-based video.
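
For scale, the standard legal luma ranges work out like this (plain Python):

```python
# 8-bit Y' uses code values 16-235; 10-bit Y' uses 64-940.
codes_8bit  = 235 - 16           # 219 usable luma steps
codes_10bit = 940 - 64           # 876 usable luma steps
print(codes_10bit / codes_8bit)  # 4.0 -- each 10-bit banding step is 4x finer
```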

You could even try just rendering out a 10-bit clip from LightWave. I usually do not, since you have to make a separate alpha-channel clip to composite with it. I generally use the Animation codec; it seems to work well. Then I re-render from After Effects to 10-bit.

Also, if your final output is going to be videotape, then you should field-render your footage, since that is the way NTSC video works. If you are burning straight to DVD, then you could render progressive and author the disc accordingly.

-mark