Light intensity and video



Johnny
06-14-2005, 08:48 AM
I'd read or heard that since Lightwave's lights can be turned higher than 100% intensity, those lights can also exceed the RGB values that video can represent. Can this lead to problems in animation intended to be viewed on televisions?

J

kjl
06-14-2005, 10:27 AM
Actually, you can easily get intensities and colors over 1 simply by adding multiple lights or, as everybody is fond of doing now, by using HDR, so setting an intensity > 100% on a light doesn't introduce anything weirder than usual. Regardless of that, there are quite a few colors even within the (0->1, 0->1, 0->1) RGB color space that are not displayable with NTSC.
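
To make that concrete, here's a quick numpy sketch (my own illustration, nothing Lightwave-specific, and the numbers are made up): even with every light at or below 100%, the contributions add, so a pixel can pass 1.0 in the floating-point buffer.

import numpy as np

# Diffuse white surface lit by three lights, each at or below 100% intensity.
# Their contributions simply add in the renderer's floating-point buffer.
albedo = 1.0
light_intensities = np.array([0.6, 0.5, 0.3])  # 60%, 50%, 30%
pixel = albedo * light_intensities.sum()
print(pixel)  # 1.4: over 1.0 without any single light above 100%

# An 8-bit output has to do something with that; the usual default is a clamp.
print(round(min(pixel, 1.0) * 255))  # 255: everything above 1.0 lands on pure white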

I'm not sure exactly what the video pipeline is, but whatever color-timing/compositing software you are using to make your final output should have functions to make your images NTSC safe (...and in the process suck all the saturation and contrast away :) ).

Johnny
06-14-2005, 10:30 AM
…there are quite a few colors even within the (0->1, 0->1, 0->1) RGB color space that are not displayable with NTSC.

Is there a way for me to learn what those colors are? Do they fall outside of 16-235, and is it a simple matter of making sure colors stay within that range?


I'm not sure exactly what the video pipeline is, but whatever color-timing/compositing software you are using to make your final output should have functions to make your images NTSC safe.

Pipeline is LW to FCP. My reasons for asking relate to banding and splotchiness I sometimes get, and to a scene I'm working on that's a dimly lit room. On the computer monitor, the light looks just right; on the TV, the room isn't exactly brightly lit, but it's enough brighter that you'd swear it wasn't the same scene file. I sense this is a difference between computer monitors and TVs, but I'm not sure where my solution might be.

thanks!

Johnny

kjl
06-14-2005, 12:24 PM
I'm sorry I can't be all that helpful - my stuff goes to film, not video, and the parts where it actually gets transferred from the monitor to other media happen in some other magic room in the building ;)

But as I understand it, the black level of NTSC is pretty bright, and many saturated colors (I think mostly in the green-blues, but I could be mistaken) are not valid. Unfortunately, I think that means it is not as simple as making sure things lie within 16->235 (e.g. perhaps (235,235,235) light grey is NTSC-safe but (16,235,235) bright cyan is not).
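
Here's a back-of-the-envelope way to check it (a sketch only: I'm assuming the standard RGB->YIQ weights and a composite ceiling of roughly 110 IRE, and I'm ignoring the 7.5 IRE black setup):

import numpy as np

def composite_excursion(r, g, b):
    """Approximate peak NTSC composite amplitude for an 8-bit RGB triple.
    1.0 corresponds to 100 IRE; a common rule of thumb caps this near 1.1."""
    r, g, b = r / 255.0, g / 255.0, b / 255.0
    y = 0.299 * r + 0.587 * g + 0.114 * b  # luma
    i = 0.596 * r - 0.274 * g - 0.322 * b  # chroma axes
    q = 0.211 * r - 0.523 * g + 0.312 * b
    return y + np.hypot(i, q)  # luma plus the chroma amplitude riding on top of it

# Both colors sit inside 16-235, but only one stays under the ceiling:
print(composite_excursion(235, 235, 235))  # ~0.92: fine
print(composite_excursion(16, 235, 235))   # ~1.21: too hot for composite NTSC

So the light grey passes while the saturated cyan overmodulates, even though both triples sit inside 16->235.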

I think your banding issue is a problem where you don't have much color resolution in your darks, but your monitor squashes them all to nearly black so you don't notice, and your TV brightens all of the darks up so you can.

Things you could try would be rendering at a higher bit depth to get more resolution in the darks, or adding dithering with your compositor, but those are probably newb solutions (I bet the guys on a Final Cut Pro forum could tell you the "right" answer).
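
If it helps, the dithering idea looks something like this in numpy (hypothetical numbers, nobody's actual pipeline):

import numpy as np

rng = np.random.default_rng(0)

# A dark gradient: float values from 0 to 0.05, the kind of range a dim room lives in.
gradient = np.linspace(0.0, 0.05, 1920)

# Straight 8-bit quantization: only 14 distinct codes across the whole gradient,
# so each code covers a wide, clearly visible band.
banded = np.round(gradient * 255).astype(np.uint8)
print(len(np.unique(banded)))  # 14

# Dither: add noise on the order of one quantization step before rounding.
# Same average levels, but the hard band edges break up into fine grain.
noise = rng.uniform(-0.5, 0.5, gradient.shape) / 255.0
dithered = np.round(np.clip(gradient + noise, 0, 1) * 255).astype(np.uint8)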

I think Final Cut Pro has a Broadcast Safe filter - maybe it has a bunch of options and presets that just do the right thing?

Richard B
06-14-2005, 12:36 PM
Hi

Color banding results from your video hardware.
What are you using to display your images on video?

Richie

Johnny
06-14-2005, 01:04 PM
Hi

Color banding results from your video hardware.
What are you using to display your images on video?

Richie


Just a regular television. Yes, I know, not ideal, but until things change, an NTSC monitor is just out of reach for me.

Maybe this is a bad comparison, but on that same TV, I can watch Toy Story or The Incredibles and not see the kind of banding I'm seeing in my own work, so I wonder: what do they do that I'm not doing, aside from having bathtubs full of money and buildings full of gear?!

J

Integrity
06-14-2005, 03:31 PM
This is probably going to be a useless reply on my part, since everyone else here is way above me in the food chain. But I'll try anyway.

There is a Dither option in Lightwave somewhere in the image compositing/volumetric panels (I don't have Lightwave installed here right now). You might have it turned off; turning it on might fix the color banding.

There is also a setting you can try in the Render Settings, where you can change from Full Precision to regular RGB. From what I've played around with, it will cut all higher values off at 100%, making it somewhat legal for video (until you do the NTSC filter thing) at the full 24-bit range (you can also add the filter for adjusting the dynamic range into 24 bit to your preference). I don't know if your pipeline uses HDR or 16 bit the whole way through, but my guess (and I'm probably wrong) is that you're trying to display the whole high dynamic range on a lower-range monitor, which is why it is dark (if that's what you said).
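
Here's roughly what that cutoff does, as a numpy sketch (an illustration of the clipping, not Lightwave's actual code, with made-up pixel values):

import numpy as np

# Floating-point ("Full Precision") pixel values from a render; 1.0 means 100%.
hdr_pixels = np.array([0.02, 0.25, 0.9, 1.7, 5.75])

# Switching to regular RGB effectively clips everything above 100%...
clipped = np.clip(hdr_pixels, 0.0, 1.0)

# ...and stores the survivors in 8 bits per channel (24-bit RGB).
rgb8 = np.round(clipped * 255).astype(np.uint8)
print(rgb8)  # [  5  64 230 255 255]: the 170% and 575% values both land on 255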

I also don't know why working without an NTSC monitor is supposed to be so bad, considering everyone except the people with money views things on regular TVs anyway.

Johnny
06-14-2005, 03:36 PM
Good ideas, Integrity. I have played with the dither settings and they do diminish the banding greatly, though I still see a sort of splotchiness. I haven't tried the other ideas you mention, but I will.

I have been clobbered, in a sense, by people on various boards for having the audacity to do video without the 'proper' NTSC monitor, but the more I looked into what was involved in buying one, calibrating it, and keeping it calibrated, not to mention keeping tabs on the near-infinite variables along the way, the more I rolled my eyes and decided to fuggedaboudit.

I have my monitor (TV) set up as well as I can and am hoping for the best. I really don't care about broadcast TV anyway, so it's all for the best.

J

Integrity
06-14-2005, 04:12 PM
Now that I think about it, I feel the weird need to answer your original post. Nothing will ever exceed RGB values if the conversion is done correctly...or rather, to your preference. I mean, you can have a light intensity (or even color) way over 100%, but that does not mean it exceeds RGB values, in Lightwave anyway. As far as I know, everything is rendered in floating point (at least the core features of Lightwave itself; some plugins won't follow). You can have a contrast of values from, let's say, 25% to 575%, all stored in HDR / floating point...but in the end, YOU have the choice of how it gets converted to 24-bit / regular RGB / NTSC. There is a plugin where you can select the white point and all that for HDR, so you can fit your dynamic range inside RGB, but there are trade-offs in all of this. I don't know how to explain it, and somebody will probably come back and flame me because I'm probably wrong, but think of it as real photography. You can photograph something where there are no strong differences in light levels, and you can also photograph something with, say, a 575% light and a 25% ambient.

Though with Lightwave, you can choose the top displayed level, like 100%, and every value over that (575% along with everything else) will be white, or what I think photographers call "overexposed". So your dimly lit room might need a very low value to be visible at all, yet have a light "lighting" it up at a value of 575%.

And let's say you set your top level (I think it's called the white point) to 1000% when it only needs to be 100% for proper contrast (or you're pulling it up so you can see your room): now you're stretching the range past the resolution, and this is where banding will occur, like kjl said. Then your compositing program or TV shoots the brightness up, showing it off when you don't want it to.
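
You can see that in numbers with a quick sketch (hypothetical values; the "white point" here is just a divisor, which is the gist of it):

import numpy as np

# A dim room: floating-point pixel values between 0% and 8% intensity.
room = np.linspace(0.0, 0.08, 10000)

def to_8bit(pixels, white_point):
    """Map floats to 8 bits, treating white_point (1.0 means 100%) as full white."""
    return np.round(np.clip(pixels / white_point, 0, 1) * 255).astype(np.uint8)

# White point at 100%: the room spans codes 0-20, so 21 distinct levels.
print(len(np.unique(to_8bit(room, 1.0))))   # 21

# White point at 1000%: the same room is crushed into codes 0-2. Banding city.
print(len(np.unique(to_8bit(room, 10.0))))  # 3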

All I'm saying is that Lightwave's automatic HDR -> RGB thing might be messing things up...I don't know.

kjl
06-14-2005, 06:36 PM
If dithering helped your banding problem, I would say that you don't have enough resolution in your darks. Try outputting in an image format with >8 bits per channel and see if that helps, and then if you need to you can dither and color correct in FCP to your heart's content to get the levels looking right on your TV.
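
To put a number on the "resolution in the darks" point (a quick sketch, assuming a simple linear encoding):

# How many distinct code values land in the bottom 5% of the range?
for bits in (8, 16):
    levels = int(round(0.05 * (2**bits - 1))) + 1
    print(f"{bits}-bit: {levels} levels below 5%")  # 14 levels vs 3278 levels

Fourteen steps spread across an entire dark wall is where the bands come from; a few thousand is effectively smooth, and you can dither back down to 8 bits at the very end in FCP.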