
Display color space file?



Axis3d
01-22-2013, 08:24 AM
I am trying to load in a Display Color Space LUT file so I can properly see the correct colors while working in Lightwave. I understand that Lightwave opens the 3D LUT file format. The LUT file I am trying to open has a .3dl extension, but when I try to load it into Lightwave I get an error stating "failed to load colorspace table from file:"

Anybody know what is going on? Any help would be greatly appreciated.

gerardstrada
01-24-2013, 04:16 AM
It's the LUT format. LW is compatible with the Academy/ASC 3D LUT format, I think, besides its own LW color table format. LUTs from Inferno and AD compositing packages use the .3dl extension, and LW is able to load them depending on the header of the LUT file (it opens at least the ones generated by LUTBuddy here). But once opened, you'll get overexposed results and banding, even in 32-bit LUTs. It's not the format LW expects. Maybe you could export the LUT in the ASC 3D LUT format from your compositing package (re-generating it from a color pattern), or get something like LightIllusion CMS for converting between different LUT formats. Just be aware there are LUTs that can be converted from one format to another but not in reverse. This is because some formats address functions that others don't, or some formats are ambiguous in certain functions or are just black boxes, so results might change from format to format when converting.

Check also if you can get an ICC profile instead, either the real color space or the LUT converted to ICC. Then you could use the SG_CC Filter or Node (http://www2.informatik.hu-berlin.de/~goetsch/CCTools/) and the results will be much more reliable.



Gerardo

Axis3d
01-25-2013, 04:50 PM
Thanks, Gerardo, I'll give that a shot and let you know if it works.

gerardstrada
01-28-2013, 01:59 PM
Consider that if the LUT is from a third-party shop (post-production house / color suite), it would be a good idea to verify you are displaying in the same standard: sRGB/aRGB range, Rec. 709, DCI-P3...

Consider also that the LUT alone won't match exactly across different applications and monitors, unless you are using the same specifications and/or the same color management system handling the same LUT format. This is because applications might apply LUTs differently (or incorrectly), and some LUT formats just can't accurately represent certain color transformations, shaper LUT concatenations (not the case of Lustre 3dl, I think), node interpolations, table resolutions, etc.

A way to verify whether both parties are seeing the same thing is to use output LUTs instead of display LUTs, so they can bake the LUT into a test image and save that in a standard colorspace (common to both parties). Then you can check that test image side-by-side with the display LUT applied in LW / your compositing package.

In case they don't have the output LUT yet, and if they are using the LUT at application level only (this won't work at OS level), they could also screen-capture a test image with the display LUT applied (in the application they are using), paste that into Photoshop, assign their display profile, and then convert to a common standard colorspace (Relative Colorimetric intent). Embedding the standard ICC profile for this purpose ensures both parties share the same standard colorspace. Then you could check it in PS side-by-side with your compositing package or LW.

In case you have problems with the LUT route, I could try the LUT-to-ICC conversion here. If so, you could try it with SG_CCFilter. You won't gain anything more than what the LUT may offer, but at least you will be able to preview exactly that within LW.



Gerardo

Axis3d
01-30-2013, 08:06 AM
So far no luck. We have tried about 8 variations of the LUT file, and we also tried converting the LUT to ICC; still no luck. I am using the Mac version of LW and it looks like SG_CCFilter is PC-only. Would you be able to post a .3dl file that has worked for you in LW? We could look at the file in text form to see if there are any differences in headers or other formatting.

Kevin

gerardstrada
01-30-2013, 01:42 PM
As I commented in my first post, no .3dl LUT has worked for me properly within LW. They can load in LW if we change the header, but they don't display correctly (over-exposed with weird posterization). The trick to get them to load is to replace the header, e.g.:

Let's suppose the header is:

3D Mesh
Mesh 5 12

Delete it and replace it with this:

# of samples (table resolution) and name of the LUT (optional)

e.g.:

# 16x16x16 myLUT

How do you know the table resolution?

After the header (3D Mesh blah, blah, blah) you'll see a row of numbers that commonly goes from 0 to the maximum value of the LUT's bit depth, say from 0 to 1023 in a 10-bit LUT. Something like this:

0 68 136 205 273 341 409 477 546 614 682 750 818 887 955 1023

In that case, just count how many numbers are in that row (16 in the example above) and that's your table resolution.

That's all. I'm attaching a .3dl version that opens in LW - but as commented, it doesn't display correctly. I have the impression the issue is in the order of the samples in the table, but frankly, it's easier to use a LUT embedded in an ICC profile with SG_CCTools. Or maybe you could convert to the ASC LUT format.
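If you'd rather script that header swap than edit it by hand, here is a minimal Python sketch of the same idea (the filenames are placeholders, and it assumes the original header is exactly the two lines shown above - adjust to your file):

# Strip the "3D Mesh / Mesh ..." header from a .3dl file and write the
# "# NxNxN name" comment header described above. Assumes the first data row
# is the list of input samples, which is kept and used to get the resolution.
def rewrite_3dl_header(src_path, dst_path, lut_name="myLUT"):
    with open(src_path) as f:
        lines = [ln for ln in f.read().splitlines() if ln.strip()]
    body = lines[2:]                     # drop the two original header lines
    n = len(body[0].split())             # count the input samples -> table resolution
    with open(dst_path, "w") as f:
        f.write("# %dx%dx%d %s\n" % (n, n, n, lut_name))
        f.write("\n".join(body) + "\n")

rewrite_3dl_header("original.3dl", "forLW.3dl")   # example call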

Let me know if you are willing to build a LW color table; it's not easy to build a LUT from scratch, but if it's just a 1D LUT, it's not so laborious.

As for the ICC compatibility, it will only work in SG_CCTools (yes, PC-only, but perhaps you might try a PC emulator?). Anyway, these kinds of ICC profiles won't load in LW, because LW expects display profiles (and even then they don't display accurately), and the kind of ICC profiles you are converting from LUTs are Device Link profiles. Again, they work nicely in SG_CCTools.



Gerardo

[attachment 110824]

faulknermano
10-24-2013, 04:07 PM
I would like to ask a question here (instead of opening up another thread) about the way colour management is applied to the render view. I need a better understanding of what's going on inside LW (and in general!)

Let me plot out my situation:
-I have a Spyder-calibrated monitor profile in ICC (monrgb.icm)
-I have a Rec709-encoded image; there is no embedded profile, but the colourist tells me that the colour numbers are Rec709.

I want to view the Rec709 image in my monitor space using Photoshop.
-In PShop, my working space is monrgb.icc (monitor space).
-Open up Rec709 image. I assign Rec709 onto the image. Image slightly changes hue (less red) and contrast (less contrast). I think this is correct: it is translating Rec709 numbers to my monitor space so I can view it similar to how the colourist sees it.

Now I want to do the same thing in LW:
-Using the same Rec709 image (no embedded profile -- though it doesn't seem LW can understand embedded profiles anyway?), I load it into Image Editor; double-click on the image to view.
-When colourspace RGB is set to sRGB it is what you expect; 1-to-1 with PS when the image is not colour-managed
-Then I switch colourspace in LW to 'rec709'; the image in the render view becomes slightly brighter, less contrasty, less red. Almost the same as PS, but not quite. And this is where I get confused: why is it different? Is LW's 'rec709' profile different from Photoshop's? Or is the colour transform done differently?
-I've tried loading VideoHD16-235/VideoHD16.icc as input colourspace profiles into LW, but LW doesn't grok them (although it groks some of the other ones)
-I've experimented using the monrgb.icm as render view Display colourspace (with rec709 colourspace for the image) but that doesn't get it any closer.

I then try CC_Filter in LW:
-colourspace is set to default (sRGB)
-apply CC_Filter on image:
-input profile VideoHD.icc
-input intent Relative Colorimetric
-output profile monrgb.icm (monitor profile)
-output intent Relative Colorimetric
-resulting image is not quite there as well; much brighter and much less contrast.

One other question is colourspace setting of 'sRGB': to me this actually means 'monitor space' or 'no colour management'. Am I right?

Any enlightenment on this issue would be great. Until then, I will have to find out if Maya MR/VRay is also different.

faulknermano
10-24-2013, 04:52 PM
Consider also that the LUT alone won't match exactly across different applications and monitors, unless you are using the same specifications and/or the same color management system handling the same LUT format. This is because applications might apply LUTs differently (or incorrectly), and some LUT formats just can't accurately represent certain color transformations, shaper LUT concatenations (not the case of Lustre 3dl, I think), node interpolations, table resolutions, etc.



Hi Gerardo, is this the reason why I get different results between LW and Photoshop?

gerardstrada
10-25-2013, 04:41 AM
I would like to ask a question here (instead of opening up another thread) about the way colour management is applied to the render view. I need a better understanding of what's going on inside LW (and in general!)

Let me plot out my situation:
-I have a Spyder-calibrated monitor profile in ICC (monrgb.icm)
-I have a Rec709-encoded image; there is no embedded profile, but the colourist tells me that the colour numbers are Rec709.

I want to view the Rec709 image in my monitor space using Photoshop.
-In PShop, my working space is monrgb.icc (monitor space).

Hello Lernie, I wouldn't use a device-dependent profile (specific monitor) as working space. A standardized device-independent working space is always better.



-Open up Rec709 image. I assign Rec709 onto the image. Image slightly changes hue (less red) and contrast (less contrast). I think this is correct: it is translating Rec709 numbers to my monitor space so I can view it similar to how the colourist sees it.

Now I want to do the same thing in LW:
-Using the same Rec709 image (no embedded profile -- though it doesn't seem LW can understand embedded profiles anyway?), I load it into Image Editor; double-click on the image to view.
It seems LW can understand colorspace EXIF metadata (sRGB) from TIFs. Theoretically it understands colorspace tags from JPGs as well, but in practice some JPG files still embed the old JFIF metadata (first), and it seems LW doesn't read the EXIF metadata if it comes after JFIF.



-When colourspace RGB is set to sRGB it is what you expect; 1-to-1 with PS when the image is not colour-managed
-Then I switch colourspace in LW to 'rec709'; the image in the render view becomes slightly brighter, less contrasty, less red. Almost the same as PS, but not quite. And this is where I get confused: why is it different? Is LW's 'rec709' profile different from Photoshop's? Or is the colour transform done differently?
The color transform is done differently. PS is proofing colors to your monitor, while LW isn't. In order to get exactly the same results, you need to proof to your monitor within LW too. You can do this in two ways: by using SG_CCTools (we'll see that in a moment) or by loading your monitor profile in the CS panel.

With SG_CCTools the results will be exactly like PS, and the CS panel solution should give you a very similar appearance to PS. So that LW doesn't posterize the very dark areas of your image with the CS panel solution, you need to set up your monitor profile as both the display and output profiles. If we only use it as the display profile we'll get quantization in very dark tones.



-I've tried loading VideoHD16-235/VideoHD16.icc as input colourspace profiles into LW, but LW doesn't grok them (although it groks some of the other ones)
This is because LW recognizes display profiles, but the VideoHD profiles you are using are considered as input profiles (well, in practice they are not input profiles anyway, but that's another topic).


-I've experimented using the monrgb.icm as render view Display colourspace (with rec709 colourspace for the image) but that doesn't get it any closer.
Colors are not translated quite well from the ICC profile into the CS panel. I have the idea it could be because all ICC profiles are adapted to D50, and this adaptation should be handled in both aspects - white point and primaries - not only in one of them. Hope ICC compatibility gets improved in future versions.


I then try CC_Filter in LW:
-colourspace is set to default (sRGB)
-apply CC_Filter on image:
-input profile VideoHD.icc
-input intent Relative Colorimetric
-output profile monrgb.icm (monitor profile)
-output intent Relative Colorimetric
-resulting image is not quite there as well; much brighter and much less contrast.
This is because you are using the default sRGB preset. You need to set the display color space to Linear in the CS panel (otherwise the sRGB gamma will be added twice). Then in SG_CCTools assume linear_sRGB.icc as the input profile and your monitor as the output profile. Use the same rendering intent as in PS and the results will match exactly.



One other question is colourspace setting of 'sRGB': to me this actually means 'monitor space' or 'no colour management'. Am I right?
Nope. This means that you are color-compensating for working in the sRGB colorspace, by converting from sRGB to linear and previewing in sRGB. The 'Disabled' preset is no color management.


is this the reason why I get different results between LW and Photoshop?
I was referring to LUT formats where color conversions and concatenations are made between colorspaces with very different characteristics. This is not the case for the sRGB to Rec709 conversion (and vice versa), since both colorspaces share the same white point and primaries. In this case the conversion is as simple as a gamma compensation. On an sRGB-range monitor this should not be an issue. If you are not getting very similar results, it might be because you are using an aRGB-range monitor? In that case the issue might be how the CS panel is interpreting the ICC color data of your display profile when proofing colors to the monitor. But at least SG_CCTools should give you exact PS results.



Gerardo

faulknermano
10-28-2013, 04:40 PM
Hi Gerardo, thanks for your reply. I still have to go and apply what you're saying, though (got moved on to another task at the moment). I think I understand what's going on now; one of the major things I mistook was assuming the working space to be the monitor space insofar as PS was concerned: this is now obviously wrong. Will write back, possibly with more questions, but hopefully with discoveries. :)

Mr Rid
12-03-2014, 12:47 AM
... The trick to get them to load is to replace the header, e.g.:

Let's suppose the header is:

3D Mesh
Mesh 5 12

Delete and replace it by this:

# of samples (table resolution) and name of the LUT (optional)

e.g.:

# 16x16x16 myLUT

How do you know the table resolution?

After the header (3D Mesh blah, blah, blah) you'll see a row of numbers that commonly goes from 0 to the maximum value of the LUT's bit depth, say from 0 to 1023 in a 10-bit LUT. Something like this:

0 68 136 205 273 341 409 477 546 614 682 750 818 887 955 1023

In that case, just count how many numbers are in that row (16 in the example above) and that's your table resolution.

I am not getting this to work. What app are you using to edit the header? Is it ok to save a plain text doc from Wordpad (changing the file extension to "3dl")?

I am changing my header to:

# 33x33x33
0 32 64 96 128 160 192 224 256 288 320 352 384 416 448 480 512 544 576 608 640 672 704 736 768 800 832 864 896 928 960 992 1023

But this results in the same 'blown out' exposure as the original 3dl.

Btw, this illustrates the problem with LW's interpretation of a 3dl LUT, compared to Fusion-
[attachment 125826]

spherical
12-03-2014, 05:09 PM
Plain text from Wordpad might work but, to be certain, use an ASCII editor like Notepad or Notepad++ (better). Wordpad can read Word doc files and the formatting they contain. An ASCII editor knows none of that stuff, so it can't screw anything up in the code.

Mr Rid
12-03-2014, 10:36 PM
Plain text from Wordpad might work but, to be certain, use an ascii editor like Notepad...
Notepad displayed the code as one continuous line, which was confusing, and I wasn't sure if the spaces were in the correct place. Notepad++ is fine, but I see now that I misunderstood what Gerardo was saying. His trick is only good for getting 3dl files to load in LW, but they still won't display correctly, so there is no point in bothering. I will report it as a bug.

Mr Rid
12-03-2014, 10:54 PM
Oops, now I am getting the 3dl to display correctly! I had to first set the Color Space - Float Files to "rec709", then the 3DL LUT displays my DPXs correctly. I have never needed to apply a display LUT in LW before.

Mr Rid
12-04-2014, 12:06 AM
Nope, take it back. LW is still not quite reading the 3DL correctly. A cube file works better, although the "color correct OpenGL" does not quite appear correct, but F9/VPR look OK.

gerardstrada
12-10-2014, 08:00 PM
Sorry for the late reply. Yes, that trick was just for loading the LUT, but it's not necessary since LW 11.6.

Sadly, not all 3dl LUTs are correctly mapped within LW, but several types of .cube LUTs work perfectly. Some others, like FilmLight or 1D LUTs (.cube), don't work, and we need to force 3D LUT generation so they can be recognized by LW. Not sure if this is a bug or if it's just that there are so many LUT standards that even with the same name and extension we have different specs and ways of handling the data. That's why for LW I find things easier with ICC LUTs, and support for that format would make things easier from a common user's point of view, I think - but that's off-topic.

I think your setup can be improved a bit more within Fusion and LW. I noticed you are using an ARRI LUT. The website recommendation says to just load the footage and run the LUT:


http://s28.postimg.org/pkwo6g3p9/ARRIrecmdtn.gif

and that's OK for creative color corrections, but not very useful for CG integration or for replicating real light effects, where working in linear light offers more realistic results. There are several ways to do this in Fusion and LW; this one is specifically for what I interpret you are needing, which is not necessarily the real case, but I'm sure it can serve as a reference to go from there to wherever anyone needs.

Correctly linearizing this data is not so straightforward in this case, because someone had the "great" idea to encode a LogC curve in a Cineon-curve format (DPX). This is like placing an oval shape in a perfect circle mold: something won't match if we assume a circle, because what's really in there is an oval. The same thing happens with the Cineon/DPX format and other non-Cineon logarithmic curves. Some color suites are not aware that the fact these curves are all called logarithmic doesn't mean they're the same. To avoid issues, the ideal is to have the original camera raw data, or at least correctly linearized images in EXR sequences (someone should adopt the MOX file format already).

So in this case, assuming a Cineon curve won't linearize the images correctly/completely. We need to assume the original LogC curve. If one doesn't have the original curve, assuming Cineon is better than nothing, but the idea is undoing the original curve, which can be done in all compositing packages.


http://s27.postimg.org/l7vyxr4b7/Lo_CLin_Fsn.gif

As we can see linearization is different because a LogC curve is different than a Cineon curve:

http://s4.postimg.org/ges31hklp/Cin_Log_C.png

Commonly, enabling a display LUT is enough to have this set up. So every operation after this node (glares, DOF, motion blur, etc.) will be performed in linear light and we'll be previewing in screen space:


http://s30.postimg.org/4g32dg35d/display_LUT.gif

Ideally we would be taking into account not only curves but also gamut, and we would be proofing colors to the monitor, but for now let's just see how to set this up for a log2video LUT. In that case we need a lin2log node (specifically from linear to LogC). So every operation after this node (color looks, film grain, etc.) will be performed in non-linear space:


http://s13.postimg.org/jl55b1l6v/Lin_Log_Wrkflw.gif

That's essentially the setup for Fusion. If we want to see the difference between working in linear light or not, let's take a look with a lens blur:


http://s27.postimg.org/jx71rmj6b/Fo_Cu_S.gif

Now we need to replicate the same thing in LW. The first thought would be to just linearize with LW's Cineon curve, but of course that wouldn't be done completely or correctly. Again, if there's no other option, an incomplete linearization is better than no linearization. But what if we want to do this correctly without having the same tool in LW? The obvious solution is a LUT, but they don't all work the same. Besides, OCIO exports display LUTs, so they will be clipped, and really, LUT compatibility in 3D packages is not the best. Even if LW could do it well, a LogC2Lin LUT from the ARRI LUT Generator won't work in this case because it generates normalized sensor data instead of relative linear exposure. Also, a 3dl LUT from Fusion doesn't work as expected in LW, sadly.

Frankly, my first option here is ICC LUTs (v4.3 performs floating-point conversions more consistently than any LUT format out there). But you need a beta version of SG_CCTools to get unclipped results, which I hope will be released soon. So how do we get unclipped results with the tools available right now?

A simple solution for this LogC2lin part is just linearizing in the compositing package and exporting from there to LW in OpenEXR format which is recognized as linear automatically by LW:


http://s8.postimg.org/n6lsj0mmt/LWOpen_EXR.jpg

In case we don't want to duplicate the sequence and waste disk space, we need a parametric curve, which is basically a curve defined by parameters or variables, like a formula. I'm terrible at maths, but in this case the formula is not so difficult:

log2lin(t)

(t > e * cut + f) ? (pow(10, (t - d) / c) - b) / a : (t - f) / e

replacing the variables by the values they provide for Alexa SUP 3.x we have:

(t > 0.149658) ? (pow(10, (t-0.385537) / 0.247190) - 0.052272) / 5.555556 : (t - 0.092809) / 5.367655

This is like a sentence that in plain English essentially says:

If your input RGB color is greater than 0.149658, then apply this operation:

take 10 to the power of ((the input RGB minus 0.385537) divided by 0.247190), then subtract 0.052272 and divide by 5.555556.

And if your input RGB is not greater than 0.149658, then apply this operation:

take the input RGB minus 0.092809, divided by 5.367655.
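For anyone who prefers code to node diagrams, the same piecewise formula reads like this in Python (just a sketch; the constants are the Alexa SUP 3.x / EI 800 values quoted above, and the function name is mine):

# ARRI Alexa LogC (SUP 3.x, EI 800) parameters, as quoted above
A, B, C, D = 5.555556, 0.052272, 0.247190, 0.385537
E, F, CUT = 5.367655, 0.092809, 0.010591

def logc_to_lin(t):
    # t: LogC-encoded value (0..1) -> relative scene-linear exposure
    if t > E * CUT + F:                      # threshold, about 0.149658 here
        return (10 ** ((t - D) / C) - B) / A
    return (t - F) / E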

"If.. then" is like a logic operation and LW has a Logic node and division and subtraction nodes. Ok let's implement this thing by using the so useful DP Image Filter Node Editor (http://dpont.pagesperso-orange.fr/plugins/nodes/DP_Filter.html) within Image Editor. Yes, we can use it in preprocess too / Thanks Denis!


http://s11.postimg.org/bjqqqbzab/Log2_Lin_LW.gif

Exactly like the compositing package. If we check this in Fusion, AFX or Nuke all look exactly the same.

From here, we might think that an ARRI lin2vid LUT would match the Log2Vid LUT, but that's not the case. For the Lin to/from ARRI LUT versions, the LUT generator assumes the log versions come from QuickTime ProRes 4444 and DNxHD 444 files, and its "linear" is normalized linear, not the linear exposure we need for VFX/CG work. So linear2Vid or linear2Log (or vice versa) LUTs from the ARRI generator won't work here. According to ARRI, we need to apply the inverse of the formula mentioned before:

lin2log(x)

(x > cut) ? c * log10(a * x + b) + d: e * x + f

Same as before, replacing the variables by the values they provide for Alexa SUP 3.x we have:

(x > 0.010591) ? 0.247190 * log10(5.555556 * x + 0.052272) + 0.385537 : 5.367655 * x + 0.092809
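In code, the inverse is just as short (again only a sketch, with the same EI 800 constants as in the previous snippet):

import math

# Same ARRI Alexa LogC (SUP 3.x, EI 800) parameters as before
A, B, C, D = 5.555556, 0.052272, 0.247190, 0.385537
E, F, CUT = 5.367655, 0.092809, 0.010591

def lin_to_logc(x):
    # x: relative scene-linear exposure -> LogC-encoded value (0..1)
    if x > CUT:
        return C * math.log10(A * x + B) + D
    return E * x + F

print(lin_to_logc(0.18))   # mid-grey lands at about 0.391 with these values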

It's pretty much the same as the previous case: a Logic node, Multiply and Add nodes, and something LW doesn't have but should: a Log function node. Fortunately there are VERY useful Log function nodes developed by Akira Asagi in his AS Math Nodes (http://www.neko.ne.jp/~asagi/P2P/Plugins/lay.html). Check out his other useful plugins there / Thanks Akira!

So we apply this node tree in DP Image Filter Node Editor (http://dpont.pagesperso-orange.fr/plugins/nodes/DP_Filter.html) as post-processing:


http://s12.postimg.org/bcht1ffyl/lin2_Log_CLW.gif

Again, exactly like the compositing package. If you like to use Viper, you can use the trick described here (http://forums.newtek.com/showthread.php?136662-LightWave-and-Wide-Gamut-monitors&p=1337325&viewfull=1#post1337325) to make DP IFNE work with Viper (this sounds like a feature request to me).

If there's no documentation about the curve (like some of the RED camera curves, for example), or if you suck at maths like me, you could use a curve made with DP Curve as described previously in this thread (http://forums.newtek.com/showthread.php?119485-Why-is-the-quot-Adaptive-Sampling-quot-pass-so-slow&p=1315684&viewfull=1#post1315684) (except that in this case I haven't copied the values into a LW color table, but am using the DP Curve directly):


http://s29.postimg.org/lc5bdnkpz/Log_C_DPCurve.jpg

It's the same result ..and math-free.



Gerardo

gerardstrada
12-10-2014, 08:03 PM
From here we just load the ARRI LUT (Log2Vid) in the CS panel (an AFX LUT from the ARRI LUT Generator works perfectly in LW) and we get exactly the same result as Fusion:


http://s11.postimg.org/ahpqz9v8j/lin_Log_Cvid.gif

If you want to match a .3dl LUT from Fusion, we need an additional step. It seems the main difference between .3dl and .cube LUTs is that one doesn't assume computer-monitor color proofing while the other does, so .3dl gives slightly more contrasted images than .cube LUTs. A way to match the .3dl result with a .cube LUT within LW is to convert from gamma 2.2 to sRGB gamma with the SG_CCNode in DP IFNE. We don't need to configure the ICC profiles part of the node since this is just a simple gamma conversion. If you want to match .cube LUTs with .3dl, just do the opposite: from sRGB gamma to 2.2 gamma.


http://s14.postimg.org/4e5h0c4yp/cube3dl.gif

Same results. Correct Linear setup in both packages. Same workflow is also adaptable to AFX or Nuke.
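For reference, the gamma match that the SG_CCNode is doing there amounts to nothing more than this (a rough sketch; the function names are mine, and it ignores the ICC/gamut part just as the node is configured above):

def srgb_encode(v):
    # linear value -> sRGB-encoded value (piecewise sRGB curve)
    return 12.92 * v if v <= 0.0031308 else 1.055 * v ** (1 / 2.4) - 0.055

def srgb_decode(v):
    # sRGB-encoded value -> linear value
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def gamma22_to_srgb(v):
    # match a .3dl-style result to a .cube-style one: undo 2.2 gamma, re-encode as sRGB
    return srgb_encode(v ** 2.2)

def srgb_to_gamma22(v):
    # the opposite direction, to match .cube results to .3dl
    return srgb_decode(v) ** (1 / 2.2)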

Btw, I'm attaching the node files in this post.


Gerardo

P.S. If you have upgraded to LW 2015 you might want to try the new DP FNEs (http://dpont.pagesperso-orange.fr/plugins/nodes/DP_Filter.html). We're still beta-testing them.

gerardstrada
12-11-2014, 10:45 AM
If you like to use Viper, you can use the trick described here (http://forums.newtek.com/showthread.php?136662-LightWave-and-Wide-Gamut-monitors&p=1337325&viewfull=1#post1337325) to make DP IFNE work with Viper (this sounds like a feature request to me).
The NT forums don't allow me to edit my previous post, but just for clarification: the mentioned trick to preview DP IFNE results is for VPR, not Viper. DP IFNE already works with Viper by default.



Gerardo

spherical
12-11-2014, 01:07 PM
Holy Crap! I gotta get another cup of espresso and re-read this... at least three times. :D

gerardstrada
12-12-2014, 11:10 AM
Sorry. English is not my first language, but if there's something I can explain better, feel free to ask.

If you set aside all the blah, blah, blah (which explains the WHYs), we're left with the structure (the WHATs), which is quite simple:

Linear setup in Fusion

LogC to Linear (for processing in linear light) => Linear to LogC (for processing in non-linear light) => LogC to Rec 709 (for previewing)


Linear setup in LW

LogC to Linear (for rendering) => Linear to LogC (for alternative processing in non-linear light) => LogC to Rec 709 (for previewing)

Probably I should have started with that :)



Gerardo

Oedo 808
12-12-2014, 11:46 AM
Sorry. English is not my first language, but if there's something I can explain better, feel free to ask.

I'd need to get a pot of tea and to read it through five times and that has nothing to do with your English which is perfectly fine, it's just that these things can be tricky to grasp for some of us. But your explanations are very helpful and appreciated.

Waves of light
12-12-2014, 12:35 PM
I'll make the tea. Holy cow.

spherical
12-12-2014, 09:49 PM
Sorry. English is not my first language, but if there's something I can explain better, feel free to ask.

Not your English, by any means! It was very clear. I was referring to the extent and detail covered that expanded horizons in many directions.

gerardstrada
12-13-2014, 10:14 PM
Agree, not an easy topic.

The most difficult part to understand (at least with the ALEXA LUTs) was probably why the Log2Lin conversions from the ARRI LUT Generator didn't match their own linear version from the linear .ari raw files. What happens is that what they call "Linear" in the LUT generator is not what we call linear. Also, what they call "Video" is not really the Rec 709 standard.

They have two types of "linear": one related to the light values of the original scene, and the other related to the values obtained from the camera sensor.

What we know as linear corresponds to values linearly proportional to the light exposure changes in the scene, i.e. half the measured light in the real scene corresponds to half the luminance value in the registered image of that same scene. That's linear. The other linear term they refer to (and this is the linear their LUT Generator uses, which confuses everyone) is the one corresponding to the response of the camera sensor. The normalized scale of the sensor signals is different from the linear real-scene exposure scale, and of course the values from the sensor are weaker; i.e. there are no totally black values (0.0) in sensor data when registering a real image, there's also a noise floor, and we need A LOT of light to really saturate a sensor pixel.

This is the reason why they recommend (e.g. for an exposure index of 800) applying an offset of about 0.0039 and multiplying the RGB values by 36, to match the linear normalized sensor data (created by the LUT generator) with the linear scene exposure that we need for CG/VFX work. This can easily be done in the compositing package, but not in the LW CS panel (time for a CS Node Editor?).

The other curious thing is that their Video standard is not really Rec 709, it's a tonemapped version that looks more "film-ish" as we can see when we plot the curves:


http://s28.postimg.org/hgmzzdstp/plottedcurves.png

and this is one of the reasons why applying the Rec 709 View LUT in Fusion doesn't match the ARRI LUT video version. But taking gamut into account we could match the same "look" too, from any camera.



Gerardo

Mr Rid
01-27-2015, 04:18 AM
Thanks so much for the detailed explanation!! I drank my Tejava, and read it three times. However, I could read this ten more times, but my simple, non-tech brain will never understand it. I just need to know which brightly colored buttons I push to make it go.

One suggestion when posting gifs, is to put a black frame at the end so it is clear when the order of frames starts and ends.

But to me, the simplest process is to bake the LUT into the plate in Fusion (File LUT tool), and load that in LW. Then my CG integrates with the LUT-baked plate in Fusion, with very little tweaking. Looks great. Done.

The problem is that the client wants the final comp without the LUT baked into it, and I don't see a way to simply 'un-LUT' the LUT at the end of the flow.

But in trying to figure out which buttons I push to make it go...


- Apply the display LUT.

- Set Loader 'log type' to Arri LogC.

- Add OCIOcolor tool, and set Source to linear, and Output to AlexaV3LogC.

My Fusion 7 does not seem to have the same LUT dropdown options as in your example, like "rec709(?)."

[attachment 126694]


... I'm terrible on maths"

You must be joking. You are Hawking compared to me, who failed 'Introduction to Algebra' 4 years in a row. You lose me once anyone starts typing formulas. I won't be able to deal with it until someone makes it a single button. I similarly gave up on the LCS workflow in LW9.6, until LW10 made it one button.


...If you take aside all the bla,bla,bla (which explains the WHYs), we're left with the structure (the Whats) which is quite simple:
Uhhh...


...

LogC to Linear (for processings in linear light) => Linear to LogC (for processings in non-linear light) => LogC to Rec 709 (for previewing)...

:stumped::cry::sleeping:

I am instead doing my usual, right-brained workaround.

Btw, I am noticing that EXRs do not save out of LW11.6 the same way they did out of LW10.1. Changing the Output Color Space to sRGB does not work in LW11.6. Anyone else notice this?

gerardstrada
01-29-2015, 02:32 AM
Thanks so much for the detailed explanation!!
Hope it helps!


I drank my Tejava, and read it three times. However, I could read this ten more times, but my simple, non-tech brain will never understand it. I just need to know which brightly colored buttons I push to make it go.
Just in case, the previous explanations and the following ones are not only for you, but also for anyone that might find them useful.


One suggestion when posting gifs, is to put a black frame at the end so it is clear when the order of frames starts and ends.
Thanks for the suggestion. A hint: the last step always stays a bit longer. Additionally, using numbered screen shots would be better, I guess.


But to me, the simplest process is to bake the LUT into the plate in Fusion (File LUT tool), and load that in LW. Then my CG integrates with the LUT-baked plate in Fusion, with very little tweaking. Looks great. Done.

The problem is that the client wants the final comp without the LUT baked into it, and I don't see a way to simply 'un-LUT' the LUT at the end of the flow.
The issue with baking the LogC2Video LUT (besides the fact that you don't know how to 'un-LUT' the result yet) is that the LUT you are baking is a display LUT, which clips values to the 0-1 range. In other words, they give you 10/12/14-bit data and you give back 8-bit data (no matter if you save to EXR, the data is already clamped). So when you deliver the takes to your client, they have less room for color grading later, or even for un-LUT-ing your result (your result with your workaround is un-LUT-able, by the way - but with clipped results, sadly). So don't waste your time trying to 'un-LUT' it.

The other issue is that when we bake a display LUT, we are baking a color appearance that has been adjusted (contrast curve/saturation/etc.) for a particular output medium (HD video in this case) - it's what we call output-referred. And what your client needs is a linear image that represents the color appearance of the actual scene, without any adjustment that makes it look pretty (scene-referred), so that they can go from there to any other color appearance they need for outputting. That's why your client is asking for a final comp without the LUT baked into it.


But in trying to figure out which buttons I push to make it go...


- Apply the display LUT.

- Set Loader 'log type' to Arri LogC.

- Add OCIOcolor tool, and set Source to linear, and Output to AlexaV3LogC.
If you ask me, a better solution for your case would be the one described here:


A simple solution for this LogC2lin part is just linearizing in the compositing package and exporting from there to LW in OpenEXR format which is recognized as linear automatically by LW:


http://s8.postimg.org/n6lsj0mmt/LWOpen_EXR.jpg


This way, you'll already have the linear data you need within LW, and you can apply the one-button solution )* you always use with LW.

)* Notice the Rec 709 preset in the LW CS panel assumes your working monitor is HD video. You need to make another preset if you are simulating an HD video display on a computer monitor.


The advantage of this simpler solution is that your client will get what they're asking for, provided you are able to match results by previewing just in Rec 709. The downside of this simpler solution is that you will be previewing in HD video (Rec 709), not in the ARRI Video (the log-ish curve) your client is previewing:


http://s4.postimg.org/rzy29ll1p/vid.gif

You would be using the yellow curve while your client is using the red curve:

http://s28.postimg.org/hgmzzdstp/plottedcurves.png

Not so different, but not the same. The simplest solution, if you really want a more similar preview, is to generate a .cube LUT (works better than .3dl within LW) that goes from Lin to LogC to Video. Again, this will be a display LUT which will clip your values, but hey! It's a one-button solution!


http://s13.postimg.org/o31flu9hz/clipped_prvw.jpg

...and it's just for previewing, so you probably don't need anything more.

http://s16.postimg.org/lkxucy5v9/un_clip.gif

But in case you really want to have exactly the same unclipped preview as your client, just go with the DP IFNE solution as shown in my previous post. It's pretty easy:


http://s14.postimg.org/87dah529t/procedure.gif

- Add Filter Node Editor
- Connect Color output from DP Render Buffer node to the Lin2LogC node I shared previously.
- That's all. Preview with the LUT generated in ARRI LUT Generator and you are ready to go without any visible clipped results.


You must be joking. You are Hawking compared to me, who failed 'Introduction to Algebra' 4 years in a row. You lose me once anyone starts typing formulas. I won't be able to deal with it until someone makes it a single button. I similarly gave up on the LCS workflow in LW9.6, until LW10 made it one button.
No, no. I'm serious. I always failed maths in school too - I spent my time drawing small animations in the corners of my math book pages or sculpting miniatures from chalk stubs with a safety pin :D The maths I've had to learn recently is because CG work demands it, but it's very difficult for someone like me who never had the proper background. Because I know that, I'm sharing the formula in the nodes, ready to use. It's the nearest to a one-button solution you'll get from someone who sucks at maths and has no programming knowledge or background. Sorry.


Uhhh...
:stumped::cry::sleeping:
AlexaV3LogC to Linear (for DOF, mBlur, glares, etc.) => Linear to AlexaV3LogC (for color grading, film grain, etc.) => AlexaV3LogC to Video (for previewing).


I am instead doing my usual, right-brained workaround.
We are not only-right-brained or only-left-brained people, you know? If you ask me, that left-brain/right-brain theory is nonsense. Do you really think I do these things only analytically? I do them intuitively first!!! Only after doing them intuitively do I analyze why they work (or don't). People able to think intuitively, like you, are able to understand (without realizing it) the real structure of things. And you need a good understanding of the nature of what you are doing to display intuitive thinking. Analytical thinking is just making that understanding explicit or conscious through reasoning. We all have understanding capabilities; just unlearn what others have told you about what you can or can't do, and learn to train intuitive and analytical thinking - together - progressively. We can develop both to perform better.

Hope some of the things shared here help you or give you some ideas to improve the parts your workaround isn't covering.


Btw, I am noticing that EXRs do not save out of LW11.6 the same way they did out of LW10.1. Changing the Output Color Space to sRGB does not work in LW11.6. Anyone else notice this?
DP Global Buffers is able to bake the gamma values into EXR. But just in case, the purpose of the EXR format is to store linear data. What LW's EXR should do is include the intended colorspace in the metadata. The thing is that in LW 11.6x the chromaticities in the metadata are always aRGB, so in the compositing package we need to assume the correct colorspace by hand.




Gerardo

Mr Rid
02-02-2015, 12:53 AM
I have more to say about all this. But I want to try to keep it simple, and step thru an example process.


A simple solution for this LogC2lin part is just linearizing in the compositing package...

I am still unsure what you mean exactly. I am provided various plates, shot with different cameras, that each "look pretty" under different LUTs (some don't require LUTs). I am also occasionally provided HDRIs. I am just concerned with a computer display, and not HD video (I think).

So, I load a particular plate in Fusion... which buttons are you saying I push in order to linearize it?

gerardstrada
02-02-2015, 02:55 AM
I have more to say about all this. But I want to try to keep it simple, and step thru an example process.

I am still unsure what you mean exactly. I am provided various plates, shot with different cameras, that each "look pretty" under different LUTs (some don't require LUTs). I am also occasionally provided HDRIs. I am just concerned with a computer display, and not HD video (I think).

So, I load a particular plate in Fusion... which buttons are you saying I push in order to linearize it?
Any camera is linearizable, especially if you have the RAW files. Forget about display LUTs for linearization; those are just for preview.

For the specific case you have shown (ARRI Alexa V3 LogC stored logarithmically in a DPX file with Rec709 chromaticities), you might want to try the CS settings from my previous post if your computer monitor is in the sRGB standard (not completely accurate, but the nearest one-button solution). For linearization you can go this way:

- Load your footage (DPX) and go to Import tab and switch to float32
- Go to the Format tab and interpret as ARRI V3 LogC instead of Cineon (bypass conversion) - you could ask for the actual ISO/ASA, but most of the time the default of 800 is just OK.
- Add a Saver node and choose the name and pick .exr extension.
- Go to Format (in the saver) and pick 32bit float and your compression type (for 32-bpc cases use here PXR24).
- Render. That's all. You will now have a linear version of your LogC footage.

When you load that sequence into LW, it will recognize it automatically as linear.

If you want to export a linear version from LW, you might go this way:

- Load your footage (DPX)
- Ctrl+F8 and add DP Node Image Filter.
- Connect Color output from DP Render Buffer node to the LogC2Lin node I shared previously.
- That's all. Choose RGBFP exr as output and render.

As you may notice, if you don't want to generate a linear version of the footage, you can just linearize it directly in LW by doing the previous process in the Image Editor instead of the (post-)Processing editor.



Gerardo

Mr Rid
02-03-2015, 02:50 AM
Thanks Gerardo! I don't know how anyone knows all these formats.

After carefully following your instruction, I just wind up with a jumble of different looking images.

Here is my scene and comp- https://app.box.com/shared/static/518uz0f3anvac9f1xwamyj2xsh557nxz.rar (the Right view in Fusion has the LUT)

The first problem is when I load the linearized image into LW it does not resemble your example, or the plate as it appears in any form in Fusion(?).

https://app.box.com/shared/static/i5p7tzqizmkukkre8313q2qlvnpz4r6v.jpg


The next problem is the EXR saved out of LW appears too dark in Fusion.

https://app.box.com/shared/static/n7hflasgixkta825q4jt1kn66wls77sr.jpg


But I think this is due to what I mentioned earlier - that the sRGB Output Color Space seems to be ignored with EXRs in LW11. LW10 renders the same EXR as normally expected. Rendering a DPX_RGBA16 or RLA out of LW 11 also looks correct. Btw, a DPX-RGBFP rendered out of LW loads into Fusion scrambled(?).

When I render with the DP Filter node setup, the frames appear very dark...

EDIT: I see that I missed a step, and had applied the DP Filter using the exr out of Fusion, instead of the original DPX, which looks ok.

gerardstrada
02-03-2015, 10:36 PM
Thanks Gerardo! I don't know how anyone knows all these formats.

After carefully following your instruction, I just wind up with a jumble of different looking images.

Here is my scene and comp- https://app.box.com/shared/static/518uz0f3anvac9f1xwamyj2xsh557nxz.rar (the Right view in Fusion has the LUT).
Before going to LW, let's see first the linearization setup in Fusion:


http://s23.postimg.org/bin06i2ih/Fusion_Lin_Ok.png

Looks perfect! Just remember to promote the bit depth to float32 when linearizing the plate. This will avoid possible posterization and banding later.


http://s22.postimg.org/ewehr05qp/float32.png


The first problem is when I load the linearized image into LW it does not resemble your example, or the plate as it appears in any form in Fusion(?).
Ok. Let's see this part step-by-step:

If we compare with the setup shown in post #26 (http://forums.newtek.com/showthread.php?132928-Display-color-space-file&p=1419588&viewfull=1#post1419588) we can see it's not exactly the same.

- EXR is correctly loaded and recognized as linear automatically. That's Ok.

- Your scene is using the sRGB preset, so with that preset we won't see the same results in LW as in Fusion:


http://s13.postimg.org/oi4w9kg2d/s_RGBpreset.png

We need to use either of the two setups shared in post #26 (http://forums.newtek.com/showthread.php?132928-Display-color-space-file&p=1419588&viewfull=1#post1419588). That is to say, use the lin2ARRIVid LUT I shared before (just save that as a preset for a one-button solution). In this specific case it would be:


http://s13.postimg.org/np62m6e6t/linto_ARRIclip.png

Or even better, if you don't want a clamped preview, try the Lin2LogC node in DP Node Image Filter and the LogC2Vid LUT explained in post #16 (http://forums.newtek.com/showthread.php?132928-Display-color-space-file&p=1413496&viewfull=1#post1413496) and shared in post #17 (http://forums.newtek.com/showthread.php?132928-Display-color-space-file&p=1413497&viewfull=1#post1413497). With your sample scene it would be:


http://s2.postimg.org/vh8ak82hz/DPsetup.png

- The LogC2Lin setup shared in post #28 (http://forums.newtek.com/showthread.php?132928-Display-color-space-file&p=1420088&viewfull=1#post1420088) with DP Image Filter Node Editor is only for linearization within LW instead of Fusion; it is to be used on the DPX file, not on the already linearized EXR file that comes from Fusion. If we use it on an image that's already linearized, we would be applying the linearization twice, and that's the reason why your image was darker. In fact, you could skip this whole 'Fusion linearizing process' by just using DP NIF on the DPX file in Image Editor, as explained in post #16 (http://forums.newtek.com/showthread.php?132928-Display-color-space-file&p=1413496&viewfull=1#post1413496). In this case it's the same within Image Editor:


http://s15.postimg.org/7rcc16z8r/lin_DPXsetup.gif

At least we save disk space.

Btw, I think the output is better set to Cineon in the CS panel in this case, because your client's colorspace is logarithmic (LogC) and because LW uses the output colorspace when applying the Adaptive Sampling threshold (an elegant solution from the LW Development Group, if you ask me). As we have seen, the Cineon curve is not exactly the LogC curve, but it will give you nice AA (from AS) for a wide range of logarithmic curves.


The next problem is the EXR saved out of LW appears too dark in Fusion.
Yes. The issue is that the Fusion setup in this case can not be the same setup we used for linearizing the plate. We need the compositing setup shared in post #16 (http://forums.newtek.com/showthread.php?132928-Display-color-space-file&p=1413496&viewfull=1#post1413496) for the compositing task. Adapted to your sample it would be:


http://s21.postimg.org/xux47mrl3/Comp_Setup.gif

There we perform color blending, integration, post motion blur, DOF, etc. in linear light, add film grain, color corrections, etc. in logarithmic space after the OCIOColorSpace or CineonLog node (both are able to perform the Lin2Log operation in this case), and preview with the LUT generated in the ARRI LUT Generator.


http://s12.postimg.org/4nahhliwb/previews_LW_Fsn.png


But I think this is due to what I mentioned earlier- that sRGB Output Color Space seems ignored with EXRS in LW11. LW10 renders the same EXR as normally expected. Rendering a DPX_RGBA16 or RLA out LW 11 also looks correct. Btw, rendering a DPX-RGBFP out of LW is loading into Fusion scrambled(?).
Output from LW should be linear when using a linear format like EXR, and LW 11.6 is doing that part correctly. But if you want to store data in non-linear space within an EXR file, you can save with the DP Get Global Buffer node in DP NIF. You could save as DPX too, so a Cineon curve will be applied to the render, but you would be losing dynamic range and you would need to interpret the CG element as Cineon later in Fusion for correct linearization. Since integration is done in linear light in post, it's easier to export a linear EXR, which loads as linear automatically in the compositing package.



Gerardo

spherical
02-03-2015, 11:19 PM
YIKES! Ok, I need more espresso....

gerardstrada
02-04-2015, 03:19 AM
Sorry, I forgot to attach the scenes and comp - https://www.sendspace.com/file/r2srg5

In the file you'll find both proposed solutions for LW (Lin2ARRIVid.lws and Lin2LogC2ARRIVid.lws), the resulting image (Linear_DPX_0001.exr), the Fusion comp setup (lin2log2vidcomp_test.comp) and the already shared 3D LUTs (AlexaV3_K1S1_LogC2Video_Rec709_EE_fusion3d.3dl, Lin2ARRIVideo.cube, AlexaV3_K1S1_LogC2Video_Rec709_EE_aftereffects3d.cube).

There are several (and better) ways to do this; hope the proposed ones give you some ideas.



Gerardo

Mr Rid
02-18-2015, 03:06 AM
Some of the example pics are not showing in your last reply to me (?). But I still don't get it. Does anyone get it?





Output from LW should be linear when using a linear format as EXR is. LW 11.6 is doing correctly that part. Gerardo

If I save EXRs with Output Color Space set to "sRGB," or if I save EXRs with Output Space set to "linear," LW 11.6 saves the exact same thing, either way. There is no difference. Shouldn't there be a difference? But when I save EXRs with different Output space from LW10.1, there is a difference between sRGB and linear. So why is there a difference?

gerardstrada
02-19-2015, 04:26 AM
Some of the example pics are not showing in your last reply to me (?). But I still don't get it. Does anyone get it?
There's a link in my last reply to you with the LW scenes and Fusion comp from the previous explanation - sorry about the images; they are working here, but I guess the scenes and comp will help more. Do let me know if the link is not working for you (it's working here). Everything is already set up there.


If I save EXRs with Output Color Space set to "sRGB," or if I save EXRs with Output Space set to "linear," LW 11.6 saves the exact same thing, either way. There is no difference. Shouldn't there be a difference?
The short answer is no. LW 11.6 is doing this correctly. By convention, OpenEXR was made for storing linear images.
There's a more extensive answer where things are not so simple, since the expression 'there's no difference' doesn't apply only to the storing aspect. But this doesn't change the short answer.



But when I save EXRs with different Output space from LW10.1, there is a difference between sRGB and linear. So why is there a difference?
Because before LW 11, the output colorspace was only relevant when saving, and trivial for every other aspect of rendering. But since LW 11, the output colorspace affects not only the output image when it is saved but also the Adaptive Sampling threshold when applying AA. In this new context, the output space is not trivial, and EXR files need to be saved as they are intended to be used - linear images for a linear workflow.

But if you want to use EXR files in ways outside of their original framework, you can save gamma-corrected EXR files (or linear too) with DP Global Buffer in DP NIF:


http://s22.postimg.org/r6952m581/GB_s_RGB.png

Or use a gamma conversion from Linear to sRGB before saving in LW as you always do:


http://s13.postimg.org/65h617gxj/lin2s_RGB.png
using SG_CCTools in this sample

Or use FullPrecisionGamma filter:


http://s24.postimg.org/kyo1ybf0l/gamma22.png
this won't give you sRGB gamma but 2.2 gamma

Or simpler perhaps, just use another FP format.

Of course, gamma-correcting the output won't give you pixel values proportional to the amount of light in the represented scene, not within Fusion or any other package (not even within LW). Packages using EXR files expect linear values from them, and even simple setups built around linear EXR files will then show washed-out images.



Gerardo