PDA

View Full Version : Making Lightwave into a pathtrace renderer.



Dan_Ritchie
09-05-2014, 06:36 PM
Below is a video explaining how Lightwave can be turned into a path tracer, and several example images. A path tracer is based on a physically accurate rendering equation. Note that attributes like light reflections (specular), soft shadows, radiance, and caustics are naturally part of the equation. *See the "Some more nodes" thread for the nodes mentioned in the video.

[Example render attachments: 124053, 124054, 124055]


https://vimeo.com/105399601

In the old days of computer 3D rendering, limited processor power forced coders to use approximations of various rendering equations, such as Lambert shading and Phong highlights. These approximations were based more on mathematical observations than on natural phenomena.

Over the last several decades, many additional approximations have been added to further simulate the appearance of natural phenomena. Raytraced and shadow-mapped shadows, ambient occlusion, radiosity, caustics, bent normals, and any of a thousand other techniques have been added to the original approximation, complicating the rendering equation beyond reasonable measure.

The point of Path Tracing is to return the rendering equation to its simplest form, while eliminating approximations in favor of a real world lighting solution.

Path tracing assumes that a surface spot's color (its shading) is the sum of all the light that reaches that spot. No other techniques are required to gain a host of extra lighting effects, such as soft shadows, ambient occlusion, caustics, global illumination, or radiated light. It's all part of one equation.
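For reference, that "one equation" is the standard rendering equation (written here in conventional notation, not taken from the video): the light leaving a spot is its own emission plus the integral of all incoming light, weighted by the surface's reflectance and the angle of incidence.

L_o(x, \omega_o) = L_e(x, \omega_o) + \int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\, (n \cdot \omega_i)\, d\omega_i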

Path tracing eliminates the idea of separate object types for geometry and lights. Only geometry is used in path tracing. All illumination comes from geometry.

Path tracing uses brute force methods, and accumulates light by sampling from random directions. The simplicity of the equation keeps the rendering time at reasonable levels. By convention, a grainy intermediate (and continually refined) image is created and averaged until a desired level of smoothness is reached. A few refinement passes will produce a discernible image, while several hundred passes may be needed to reach an acceptable level of smoothness.
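To make that loop concrete, here is a minimal Python sketch of the idea. This is only an illustration of the algorithm, not the node setup from the video; Scene, Camera, sample_hemisphere and the rest are hypothetical stand-ins.

```python
import random
import numpy as np

MAX_BOUNCES = 4

def trace(origin, direction, scene, depth=0):
    """Follow one random light path and return the light gathered along it."""
    hit = scene.intersect(origin, direction)        # hypothetical intersection query
    if hit is None or depth > MAX_BOUNCES:
        return np.zeros(3)                          # nothing hit: no light arrives
    bounce = sample_hemisphere(hit.normal)          # random direction above the surface
    incoming = trace(hit.point, bounce, scene, depth + 1)
    # Light leaving the spot = what the surface emits + what it reflects of the incoming light.
    return hit.emission + hit.albedo * incoming

def render(scene, camera, width, height, passes=200):
    """Accumulate many grainy passes; the running average refines toward a smooth image."""
    accum = np.zeros((height, width, 3))
    for n in range(1, passes + 1):
        for y in range(height):
            for x in range(width):
                o, d = camera.ray(x + random.random(), y + random.random())
                accum[y, x] += trace(o, d, scene)
        yield accum / n                             # the image after n averaged passes
```

Each yielded image is the running average after n passes; the longer it runs, the less grainy it gets.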

The benefits of path tracing over other methods are its physical accuracy, its ability to produce discernible results in a short amount of time, and its simplicity of implementation, since it eliminates many of the extra steps otherwise needed to produce realistic results. Physical accuracy is just the nature of the algorithm itself.

lertola2
09-05-2014, 07:03 PM
Wow. Really interesting. In your video you did not talk about path tracing and transparency although I see a transparent object in one of your samples. Can path tracing work with transparency?

-Joe

OFF
09-05-2014, 11:55 PM
Hi. Great method!
I'm pretty new to compound node construction, so here is my question: how do you make a Random Normal node?

djwaterman
09-06-2014, 02:07 AM
I'm going to have to look into this when I have the time. Thanks for creating it.

sukardi
09-06-2014, 02:07 AM
Really cool. Wonder if this is a viable alternative to brute force Monte Carlo in terms of render speed...

erikals
09-06-2014, 05:02 AM
Wow! Inspiring! http://erikalstad.com/backup/misc.php_files/king.gif

Dan_Ritchie
09-06-2014, 11:35 AM
Wow. Really interesting. In your video you did not talk about path tracing and transparency although I see a transparent object in one of your samples. Can path tracing work with transparency?

-Joe

Yes it can. Although the node I presented didn't have a transparency option, it can be done with the method I presented at the end of the video, without nodes. Just make your surface transparent as usual, using some refraction, and control the amount of light scattering with your fine bump texture. Highly reflective surfaces don't usually scatter light, so rendering will be very similar to regular raytracing. Remember that path tracing does not calculate Lambert or any other shading method like that (all surface lighting is just a reflection or sub-reflection of something luminous), so diffuse is always set to zero, effectively disabling it. This makes reflective objects physically accurate, because the less light is scattered, the darker the surface appears. (This has been a known issue with reflective objects in LW for years.)
You can use the Color Highlights and Color Filter attributes, however, to get some color into your transparent objects.

Sensei
09-06-2014, 12:24 PM
I don't think the Turbulence node is returning a truly random vector. Otherwise, each time you pressed F9 you would get a slightly different result, which would cause flickering when rendering an animation.

But you can find true Random Vector (and Scalar and Color) nodes in TrueArt's Node Library (http://www.trueart.eu).
They use an internal function that the Node SDK provides to C/C++ nodes.

I have also noticed that you used Math > Vector > Multiply a couple of times when the second parameter was a scalar. You could/should use Math > Vector > Scale there, avoiding an unnecessary conversion from scalar to vector.

Another thing is that you can check whether the normal vector is pointing in the same direction as another vector using the Math > Vector > Dot node.
If the value is positive, they're pointing in the same direction.
If the value is negative, they're pointing in opposite directions.
So you can use that knowledge when calculating the normal for the RayTrace node's Ray Direction.
If the dot output is negative, negate the normal vector (e.g. multiply/scale the vector by -1). After that you always have a vector pointing in the right direction.
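In plain code that flip amounts to something like this (a tiny Python illustration of the math, not the actual node graph):

```python
def face_normal_toward(normal, direction):
    """Flip 'normal' so it points into the same hemisphere as 'direction'."""
    dot = sum(n * d for n, d in zip(normal, direction))
    if dot < 0:                          # negative dot product: the vectors oppose each other
        return tuple(-n for n in normal)
    return normal
```

For example, face_normal_toward((0, 0, 1), (0, 0, -1)) returns (0, 0, -1).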

Sensei
09-06-2014, 12:32 PM
Physical accuracy is just the nature of the algorithm itself.

Nope.
A physically accurate algorithm would send photons with various wavelengths (from 380 nm to 700 nm at least), then analyze how they travel further, reflect from surfaces, pass through transparent materials, etc., how photons are absorbed by materials, and how new photons with lower energy are emitted (fluorescence, luminescence).
And materials would have reflectivity and transparency that depend on wavelength;
e.g. glass is transparent in the visible spectrum (380-700 nm) but opaque to infrared photons.
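As a rough illustration in Python (the Cauchy-style coefficients below are made up, just to show wavelength-dependent behavior):

```python
import random

def glass_ior(wavelength_nm):
    # Wavelength-dependent index of refraction, Cauchy-style formula.
    # Coefficients are illustrative only, not measured glass data.
    wl_um = wavelength_nm / 1000.0
    return 1.50 + 0.004 / (wl_um ** 2)

def spawn_spectral_ray():
    # A spectral renderer gives each ray a single wavelength sampled from the
    # visible range; refraction, absorption, emission are then evaluated at it.
    wavelength = random.uniform(380.0, 700.0)
    return wavelength, glass_ior(wavelength)
```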

Dan_Ritchie
09-06-2014, 12:43 PM
I don't think the Turbulence node is returning a truly random vector. Otherwise, each time you pressed F9 you would get a slightly different result, which would cause flickering when rendering an animation.

I couldn't get nodal to do a true random number; that's why I used Turbulence, not because it was truly random, but because it was simpler.

RebelHill
09-06-2014, 12:59 PM
I couldn't get nodal to do a true random number; that's why I used Turbulence, not because it was truly random, but because it was simpler.

There're random integer and scalar nodes available in the tools section.

Dan_Ritchie
09-06-2014, 01:45 PM
There're random integer and scalar nodes available in the tools section.

but no timer node, which would be the traditional way to seed a random number generator. That would be a nice addition, as well as access to frame and time of the animation.

Sensei
09-06-2014, 01:47 PM
but no timer node, which would be the traditional way to seed a random number generator. That would be a nice addition, as well as access to frame and time of the animation.

TrueArt's Node Library > Render Info

Dan_Ritchie
09-06-2014, 01:49 PM
Nope.
A physically accurate algorithm would send photons with various wavelengths ...

The first thing they taught us in physics class is to ignore reality.

RebelHill
09-06-2014, 02:03 PM
but no timer node, which would be the traditional way to seed a random number generator.

You can add an envelope to a scalar constant, set it to increase by a set amount over 1 frame or second, set the curve's pre and post behavior to linear... you've got the same thing in essence.


The first thing they taught us in physics class is to ignore reality.

Sounds like you had a really bad physics teacher.

Dan_Ritchie
09-06-2014, 02:54 PM
You can add an envelope to a scalar constant, set it to increase by a set amount over 1 frame or second, set the curve's pre and post behavior to linear... you've got the same thing in essence.


Good idea, but I think it would be hard to encapsulate in a node. Envelopes don't appear to be saved with nodes, in my limited experience.

jwiede
09-06-2014, 08:32 PM
It's a cool idea, and interesting insight into Path Tracing implementation. Still, I think I'll stick with Kray for practical Path Tracing rendering in LW. ;)

creacon
09-07-2014, 03:05 PM
I have only come across one piece of software that comes close to physically accurate and that's this one:

http://www.winosi.onlinehome.de/Gallery.htm

This probably works the way Sensei described it.

It's open source, it's old and it renders forever.

creacon

jwiede
09-07-2014, 10:45 PM
Well, both Maxwell and modo handle spectral dispersion the way Sensei described, by sending out rays at different light frequencies and thereby capturing the differing responses.

Maxwell's never failed me in terms of making optical setups and having them render out the exact same way the lenses and prisms work in real life, but I'm sure there's a level of verisimilitude beyond which it fails (also true of WinOSI I expect).

creacon
09-08-2014, 02:49 AM
So you're saying that in Modo you could make the prism setup like in WinOSI? Are you sure about that?
WinOSI shoots rays from the light sources into the scene, some kind of photon mapping.
As far as I know Modo is an inverse raytracer (like LW), and wouldn't create this effect.

creacon




Well, both Maxwell and modo handle spectral dispersion the way Sensei described, by sending out rays at different light frequencies and thereby capturing the differing responses.

Maxwell's never failed me in terms of making optical setups and having them render out the exact same way the lenses and prisms work in real life, but I'm sure there's a level of verisimilitude beyond which it fails (also true of WinOSI I expect).

jwiede
09-08-2014, 11:52 AM
So you're saying that in Modo you could make the prism setup like in WinOSI? Are you sure about that?
WinOSI shoots rays from the light sources into the scene, some kind of photon mapping.
As far as I know Modo is an inverse raytracer (like LW), and wouldn't create this effect.

No, I'm saying that modo handles spectral dispersion the way I described. I have no clue whether it supports the kind of caustics/path tracing/etc. needed to replicate such optical setups. I know some have had success using it that way (there are examples posted on the Foundology forums), but nothing too complex.

I believe Maxwell could deal with similar prism/mirror/lens setups as WinOSI, based on my own experience replicating a few such configs in Maxwell 3.

OFF
09-08-2014, 07:23 PM
but no timer node, which would be the traditional way to seed a random number generator. That would be a nice addition, as well as access to frame and time of the animation.
You can use the Smokey texture, which contains a constant bias pattern value.

inakito
09-09-2014, 04:43 AM
WOW, it looks really interesting!
Does it mean we have an Arnold-ish rendering engine inside Lightwave?

zapper1998
09-09-2014, 08:50 AM
wow so cool

thanks

chikega
09-13-2014, 07:25 AM
As I mentioned on Youtube,
Thank you for making these tutorials, Dan. Very insightful! I can't imagine the things you could accomplish if you had the latest version of Lightwave for more than 30 days. PMG, please gift Dan a license of LW, we need his 50 pound brain to create more tutorials! The other 50 pound brains that have chimed in here can add to them, as they have. :)

robertoortiz
02-22-2015, 10:50 AM
Any updates on this?
Should the NT team add an EASY NATIVE implementation of this?
I am asking this since it seems that native LW can do this, but you have to jump through hoops.

creacon
02-23-2015, 02:51 AM
Should the NT team add an EASY NATIVE implementation of this?


Short answer: No.
They should spend their time on more important stuff.
This is a very nice "proof of concept" but it is miles from a workable solution; most production renderers have at least 10 man-years invested in them. Do you really want Newtek to do this?

creacon

zapper1998
02-23-2015, 10:10 AM
can you share the node setup please

Mike

MSherak
02-23-2015, 11:46 AM
Below is a video explaining how Lightwave can be turned into a path tracer, and several example images. A path tracer is based on a physically accurate rendering equation. Note that attributes like light reflections (specular), soft shadows, radiance, and caustics are naturally part of the equation. *See the "Some more nodes" thread for the nodes mentioned in the video.

....

The benefits of path tracing over other methods are its physical accuracy, its ability to produce discernible results in a short amount of time, and its simplicity of implementation, since it eliminates many of the extra steps otherwise needed to produce realistic results. Physical accuracy is just the nature of the algorithm itself.

Brilliant Dan!




Any updates on this?
Should the NT team add an EASY NATIVE implementation of this?
I am asking this since it seems that native LW can do this, but you have to jump through hoops.

Mmm, this is not jumping through hoops; it's just an easy and pretty ingenious way to use native nodes to create a sample path-tracer function. Unless you look at this as some crazy setup. If so, take a second to listen and learn what Dan is saying. It seems most artists don't care what is happening under the hood; they just want an easy button. As for NTDev making that button, why? Dan is showing you can do it now.

-M

Sensei
02-23-2015, 11:57 AM
You already have a "path tracer" built into LW, but nobody is using it because it's so slow.
It's called Monte Carlo non-interpolated, non-cached GI.
It sends rays in all directions randomly.
It's just a matter of combining the RGB color received from all directions (one can use a BRDF, another something else).
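Per shading spot that is the usual Monte Carlo estimate, something like this (a hypothetical Python sketch, not LW's actual GI code; sample_hemisphere, brdf, and scene.radiance_from are stand-ins):

```python
import numpy as np

def gather_gi(point, normal, scene, samples=64):
    """Average the light arriving from random directions, weighted by a BRDF."""
    total = np.zeros(3)
    for _ in range(samples):
        direction = sample_hemisphere(normal)              # random incoming direction
        incoming = scene.radiance_from(point, direction)   # hypothetical ray query
        total += incoming * brdf(normal, direction)        # weight by surface response
    return total / samples
```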

pixym
02-23-2015, 07:12 PM
You already have a "path tracer" built into LW, but nobody is using it because it's so slow.
It's called Monte Carlo non-interpolated, non-cached GI.
It sends rays in all directions randomly.
It's just a matter of combining the RGB color received from all directions (one can use a BRDF, another something else).

Hi Sensei,

I keep using the "Monte Carlo" mode for my work because it keeps me from wasting my time tweaking settings in the interpolated GI modes… The time spent is only my computer's, while rendering.

3dworks
02-24-2015, 02:34 PM
...and a totally different story is to use Octane with its path tracing kernel! ;-)

Dan Ritchie
05-05-2015, 10:56 AM
Hi Sensei,

I keep using the "Monte Carlo" mode for my work because it keeps me from wasting my time tweaking settings in the interpolated GI modes… The time spent is only my computer's, while rendering.

LightWave's Monte Carlo is for completely diffuse surfaces. You don't get all the extras like realistic specular and caustics. (Although path tracing doesn't produce hard-edged caustics.)