View Full Version : Gamma correction

09-11-2005, 12:57 AM
I'm a little confused, but I can sum it up in a short question:

When Lightwave renders, and when displaying the output (whether in the FP viewer itself or afterwards from a file), does it display it in linear gamma (as it was computed) or non-linear with a power of 2.2?

I just read a tutorial about a plugin which explained that Lightwave's color picker is not color-corrected. I had always wondered why, when using radiosity, I had to boost the intensity all the time; after reading the tutorial, it seemed that adding the FP Gamma plugin to radiosity renders might fix it. Simple tests now suggest this was indeed the problem.

Or was I right that boosting things up was the correct approach, and Lightwave displays output at the PC standard of 2.2?

09-11-2005, 11:55 AM
Lightwave's renderer outputs linear images, and all the internal calculations use a linear color space. If you want accurate results (especially for radiosity), you must use the FP Gamma filter. But there is a problem: usually, all PC images (textures, backdrops, reflection maps...) are already gamma corrected, so these images will be corrected twice, washing out your render.
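To illustrate the double-correction problem, here is a minimal Python sketch (not Lightwave code, just the math): a gamma-corrected texture pixel that also passes through a 2.2 FP Gamma pass at the end gets corrected twice and drifts toward white.

```python
# Sketch of the "washed out" effect: applying a 2.2 gamma
# correction twice to a value that only needed it once.
def gamma_encode(v, gamma=2.2):
    """Linear -> gamma-encoded (display) value."""
    return v ** (1.0 / gamma)

mid_linear = 0.5                    # a linear mid-grey pixel
once = gamma_encode(mid_linear)     # correct single correction, ~0.73
twice = gamma_encode(once)          # accidental double correction, ~0.87
print(round(once, 2), round(twice, 2))
```

Every extra gamma pass pushes midtones further up, which is exactly the washed-out look described above.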
One solution is to do all your work in linear space, but you will lose color detail if your textures are not 16-bit from the beginning.
Gamma-correct your monitor to 2.2 so that its overall response is linear, make all your textures linear (apply an inverse 2.2 if they come from external sources and are already corrected), light and shade your scene as usual, and you will get an accurate linear render. When the work is finished, apply a 2.2 FP Gamma to the render and set your monitor back to its standard 2.2 gamma.
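The texture side of that workflow can be sketched with simple power functions (again just the math, assuming a pure 2.2 power curve rather than the exact sRGB transfer function): linearize the already-corrected source, do all the lighting math in linear space, then apply the single final 2.2 pass.

```python
def to_linear(v, gamma=2.2):
    """Undo an existing 2.2 gamma correction on a source texture."""
    return v ** gamma

def to_display(v, gamma=2.2):
    """Final FP Gamma pass: linear render -> displayable image."""
    return v ** (1.0 / gamma)

tex = 0.73                 # a pixel from an already gamma-corrected texture
lin = to_linear(tex)       # ~0.5: work with this value in linear space
# ... lighting / radiosity calculations happen here, in linear space ...
out = to_display(lin)      # ~0.73 again: corrected exactly once for display
```

Because the inverse correction and the final pass cancel, each pixel ends up gamma-corrected exactly once, no matter how much linear math happens in between.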

Also, don't forget that HDR images are always linear. Lighting a scene with radiosity using a linear image and viewing it on a non-linear monitor will never give realistic results: the resulting image is usually too contrasty, with burned shadows and blown-out highlights...
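A rough way to see why a linear render looks wrong on an uncorrected monitor (modeling the display as a simple 2.2 power curve): the monitor effectively raises each value to the 2.2 power, which crushes shadow values toward black while leaving highlights relatively intact, exaggerating contrast.

```python
# Sketch: what a ~2.2-gamma monitor does to linear values
# displayed without any gamma correction.
monitor_gamma = 2.2
for v in (0.1, 0.2, 0.5, 0.9):
    displayed = v ** monitor_gamma
    print(f"linear {v:.1f} -> displayed {displayed:.3f}")
```

A linear 0.1 comes out near 0.006 on screen while 0.9 only drops to about 0.79, which matches the "burned shadows" look described above.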