thickness node?



erikals
04-17-2010, 07:47 AM
yo-yo,
does this exist?

http://www.newtek.com/forums/attachment.php?attachmentid=84072&d=1271421786

(tried a gradient using the Surface Thickness input, but it doesn't work...)

Captain Obvious
04-17-2010, 08:24 AM
Nope. I'd recommend doing it with weight maps, as it looks like you've done there. Surface thickness is basically the distance *through* the surface along the refraction vector. It doesn't check the local size of the object, and I can't think of a good way of doing it.
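
For reference, a rough Python sketch of what that measurement amounts to (illustrative only; the intersect_mesh ray caster is hypothetical, not LightWave's API):

import numpy as np

def thickness_along_refraction(hit_point, refract_dir, intersect_mesh, eps=1e-5):
    """Distance travelled inside the object along the refraction ray.

    intersect_mesh(origin, direction) is a hypothetical ray caster that
    returns the distance to the next surface hit, or None on a miss.
    """
    # Nudge the origin just inside the surface so the ray doesn't
    # immediately re-hit the entry point.
    origin = np.asarray(hit_point, dtype=float) + eps * np.asarray(refract_dir, dtype=float)
    # Whatever distance the ray covers before exiting is the "thickness";
    # the local size of the mesh never enters into it.
    return intersect_mesh(origin, refract_dir)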

Matt
04-17-2010, 08:34 AM
There have been a few threads trying to get the thickness node to do what you want there; none found a solution. It basically doesn't work the way a few of us would love it to.

erikals
04-17-2010, 09:16 AM
i guess i could use sss,
turn off lights, and turn up ambient light,
then bake it, and fix it in photoshop,

it's a bit cumbersome though, and it would also affect thin areas of the mesh. hmm... agh.

Captain Obvious
04-17-2010, 11:25 AM
If the SSS is based on the ambient light, wouldn't the shading be the same everywhere?

You could also use an ambient occlusion shader, firing *into* the mesh. But that won't give very good results either...
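
One way to read the "AO fired into the mesh" idea is as an average inward ray distance; a minimal sketch of that estimate (again illustrative, using the same hypothetical intersect_mesh ray caster, not any actual LightWave shader):

import numpy as np

def inward_ao_thickness(point, normal, intersect_mesh, samples=32, eps=1e-5):
    """Average distance that rays scattered into the mesh travel before
    exiting; thicker regions give larger values."""
    normal = np.asarray(normal, dtype=float)
    point = np.asarray(point, dtype=float)
    distances = []
    for _ in range(samples):
        # Random direction, flipped into the hemisphere opposite the normal
        # so the ray fires into the mesh rather than away from it.
        d = np.random.normal(size=3)
        d /= np.linalg.norm(d)
        if np.dot(d, normal) > 0.0:
            d = -d
        hit = intersect_mesh(point + eps * d, d)
        if hit is not None:
            distances.append(hit)
    return float(np.mean(distances)) if distances else 0.0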

zarti
04-17-2010, 05:58 PM
you can get the correct surface thickness (at least as a buffer) by doing the following (rough sketch of the principle below the screenshots):
- use a Dielectric node (play with the Absorption value; Refraction should be 1 and Raytrace Refraction must be enabled!)
- use the Split Material node (free from TrueArt) to extract "the thickness" from the Refraction Shading output
- the Absorption value can be driven by a weight map (third screenshot)

(attachments 84109, 84110, 84111: screenshots)
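
The principle behind that setup, roughly (an assumption about how the absorption behaves, not the actual Dielectric node internals): Beer-Lambert absorption darkens the refracted ray in proportion to the distance it travels inside the object, so the refraction shading effectively encodes thickness.

import math

def absorbed_value(path_length, absorption, background=1.0):
    """Brightness of the refracted background after travelling
    path_length through an absorbing medium (Beer-Lambert falloff)."""
    return background * math.exp(-absorption * path_length)

# Thicker parts of the mesh -> longer path -> darker output, which is
# the "thickness" that Split Material pulls out of Refraction Shading.
print(absorbed_value(0.1, 2.0))   # thin area, stays bright (~0.82)
print(absorbed_value(2.0, 2.0))   # thick area, goes nearly black (~0.018)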



p.s.: since it is refraction and gets merged with the background, it is a bit tricky to apply it in the same nodal network and make things like color, shadow, etc. work right.
theoretically (maybe), saving the extracted buffer as a bitmap, reloading it, and projecting it in Front Projection mode before applying it to any component of the nodal network could make it more usable...

maybe ....

erikals
04-17-2010, 07:06 PM
no solutions so far; tried Sigma2, but like Captain Obvious says, it doesn't really work with ambient lighting...
and using a regular light setup doesn't work either... hm...

"two microbes on a date"

erikals
04-17-2010, 07:16 PM
?... could this be something?...
http://www.sidefx.com/index.php?option=com_forum&Itemid=172&page=viewtopic&t=1206&view=previous&sid=88bc7290e47d05494f94b7003304cb36

zarti
04-18-2010, 01:37 PM
it is " the thing ", but gives problems on edges ( silhouette ) ...

zarti
04-20-2010, 06:02 PM
maybe someone can make one?

wellll, if it has been there for some time, why should he make it again (?!) ;D

just discovered it:
load the Refraction node from DPKit and adjust the Decay value.
don't forget to have Raytrace Refraction active (!)
see the screenshot below this one ...


http://www.boydchan.com/wp-content/uploads/2009/11/this-is-it-poster.jpg

(attachment 84225: screenshot)
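
Assuming the Decay control behaves like an exponential falloff with the distance the refracted ray travels (an assumption, not documented behaviour of the DPKit node), the resulting buffer could in principle be turned back into an approximate thickness, or remapped into a 0-1 mask:

import math

def thickness_from_buffer(value, decay):
    """Invert an assumed exp(-distance / decay) falloff."""
    return -decay * math.log(max(value, 1e-6))

def mask_from_buffer(value):
    """1.0 in thick (dark) areas, 0.0 in thin (bright) areas."""
    return 1.0 - min(max(value, 0.0), 1.0)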

probiner
04-21-2010, 12:36 AM
what material or reference are you trying to achieve, erikals?

Zarti's solution looks nice, but it's too much on the transparency side. When I read your post I got the sense it was more about opaque thickness.

The existing Thickness node can be baked to a UV texture. The issue is that it is calculated from "wall" to "wall", so you can see in the first image (red = thin, blue = thick) that some areas get the thick color because the opposite "wall" has a bevel.

(attachment 84228: thickness bake images)

I did another try, but this time with a structure in the middle (center of mass, maybe?) so the ray could hit there.
I didn't do such a thing for the first object because that would be crazy.

So I guess if the thickness could be calculated from "wall" to center of mass, it would work.
But how to do it? And on top of that, in a nodal way? No clue...

I guess these are volumetric calculations, and I don't know if LW has them.
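
A minimal sketch of the "wall to center of mass" idea, done as a per-vertex value that could be baked into a weight map rather than a render-time node (plain illustrative Python, not LightWave's API):

import numpy as np

def wall_to_center_thickness(vertices):
    """Per-vertex distance from the surface to the mesh's center of mass,
    normalized to 0..1 so it can be stored as a weight map."""
    verts = np.asarray(vertices, dtype=float)
    center = verts.mean(axis=0)                 # crude center of mass (vertex average)
    dist = np.linalg.norm(verts - center, axis=1)
    rng = dist.max() - dist.min()
    return (dist - dist.min()) / (rng if rng > 0 else 1.0)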

Cheers

dnch
04-21-2010, 03:25 AM
zarti: where is that "Refraction" node? I only have the "Refractions" node, and there are no Decay or backdrop map settings.

zarti
04-21-2010, 03:34 AM
follow this path ... (attachment 84231)

Nangleator
04-21-2010, 09:56 AM
Just a poke in the dark, but doesn't the Advanced Camera let you use the geometry itself as a camera? So, the surface thickness would be relative to each poly's normal?

Then, it would have to be baked.

probiner
04-21-2010, 10:08 AM
Just a poke in the dark, but doesn't the Advanced Camera let you use the geometry itself as a camera? So, the surface thickness would be relative to each poly's normal?

Then, it would have to be baked.

That's what I think I did with the Surface Baking Camera in the last post. Thickness baked according to each poly normal (I froze and tripled the model at bake time). But you're still going to have "wall" to "wall" thickness, like in the first example.

I guess/wonder if something like the second example would work better?

http://i153.photobucket.com/albums/s202/animatics/Lightwave/thickness_examples.png

Cheers

erikals
04-21-2010, 10:21 AM
i'm trying to avoid baking though, as the original idea is to use it as a RealFlow hack
(by adding displacement only to the red area).

(RealFlow uses an object sequence, so it wouldn't be possible to bake unless you baked each individual frame.)

zarti
04-21-2010, 10:36 AM
baking doesn't make sense to me.
we cannot reproduce volumetric shading; everything is applied over the surface.
we can only 'produce' its appearance as it is seen from the camera.

...maybe baking could be useful only for relatively thin volumes. (imo-oc)