Subdivision Order



Chris Jones
10-03-2013, 06:50 AM
This is my node setup for applying displacement maps. It only works when Subdivision Order is set to "After Motion". Any way of getting it to work when set to "Last"?

(attachments 117450, 117451: node setup screenshots)

Thx.

RebelHill
10-03-2013, 07:04 AM
It DOES work set to last, just not in the same way.

With bones, you're usually taking a lower(ish) poly mesh, deforming it, and then subdividing (after bones/last) so that the bone deformations "smooth out". With displacement... you want a high resolution mesh to capture all the little details of the map... hence you either need a higher poly object from the start, or you want to subdivide BEFORE the displacement. By setting subD order to Last, your displacement is applied to the lower res mesh (which fails to capture the details) and then the mesh gets subdivided (last), smoothing out the result of the previous deformation.
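(A minimal plain-Python sketch of why the order matters, nothing LightWave-specific: sampling a fine displacement "map" on a coarse cage and then interpolating, as with SubD order Last, flattens the detail compared with displacing the already-subdivided points.)

import math

def detail(x):                          # stand-in for a fine wrinkle/displacement map
    return 0.1 * math.sin(40 * x)

coarse = [i / 10 for i in range(11)]    # 11-point cage
fine = [i / 100 for i in range(101)]    # same span after subdivision

# SubD order "Last": displace the cage, then interpolate (subdivide) the result.
displaced_coarse = [detail(x) for x in coarse]

def subd_last(x):
    i = min(int(x * 10), 9)
    t = x * 10 - i
    return displaced_coarse[i] * (1 - t) + displaced_coarse[i + 1] * t

# Subdividing first means displacing the already-fine points directly: detail(x).
loss = max(abs(subd_last(x) - detail(x)) for x in fine)
print("detail lost when the cage is displaced and subdivision comes last:", round(loss, 3))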

Chris Jones
10-03-2013, 07:31 AM
That's what I was afraid of... In practice, when applied to a certain face rig my new morph node setup below doesn't work properly unless SO is set to Last. :\ Seems I can have displacement maps, or I can have non linear morphs - but I can't have both.

(attachment 117452: morph node setup screenshot)

RebelHill
10-03-2013, 08:07 AM
Yeah... if you're applying morphs in the disp nodes then of course they get the same treatment, and you really want to subdivide after the morphs but before the image based disp... which of course you can't do. Two things you can do...

First... ditch the whole non-linear morph thing... it's pretty much unnecessary. I presume you're after it for eyelids, where it's totally not needed, especially for human faces. Model your eyelids in a half closed/sleepy pose for the base, have one morph to close and one to widen; you'll get no penetration into the eyeball itself, and anyway human eyelids, especially when closing, exhibit very little "rotational" movement anyhow.

Otherwise... set up your image based displacement not in the disp nodes but in the surface editor, connecting to the displacement input. It will then be applied onto the subdivided mesh rather than the cage, allowing you to have the disp node setup the way you need it.

All this stuff is in the nodal tutorials I sent you, ya know...

Chris Jones
10-03-2013, 08:37 AM
Actually I already have to have more of a gap than I'd like with 4 morphs and the base pose, and I can't afford even the slightest penetration of the eyeball since that causes the eyelids to light up (as per that other thread about SSS glow). Besides, the more morphs the merrier when it comes to the unfolding of the epicanthus and such. Bones + morphs is a pain, so NL morphing seems to be my best bet for now.

I remember initially trying to do the wrinkle displacements in surface editor as you suggest, but it wouldn't work for whatever reason. I think somebody persuaded me over to the displacement editor after that (probably Denis). I'll have another look and see if the passing of time has rendered it somehow possible...

Thanx,

(c:

RebelHill
10-03-2013, 08:51 AM
The SSS penetration problem is solved by either reducing or eliminating backscattering on the eyelid areas, or by mixing in a different material in those areas that only does a frontal scatter. As for articulating epicanthal folds and the like... you can always separate that action out to a totally separate morph with its own action sequence, independent of the rest of the eyelid, again allowing you to get away from strings of multiple morphs (since of course there's actually no such thing as non linear morphs, just collections of smaller linear displacements).
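(A minimal plain-Python sketch of that last point, purely illustrative and nothing to do with LightWave's own morph evaluation: a rotation-like eyelid path approximated first by one linear morph, then by a chain of two linear morphs through an extra half-closed target. The extra target roughly halves how far the lid point drifts off the eyeball surface.)

import math

def on_arc(t):                      # the true rotational path, eyeball radius 1
    a = math.radians(90 * t)
    return (math.cos(a), math.sin(a))

open_pos, closed_pos = on_arc(0.0), on_arc(1.0)
half_pos = on_arc(0.5)              # extra "half-closed" morph target

def lerp(p, q, t):
    return (p[0] + (q[0] - p[0]) * t, p[1] + (q[1] - p[1]) * t)

def single_morph(t):                # one linear morph, open -> closed
    return lerp(open_pos, closed_pos, t)

def chained_morphs(t):              # two linear morphs blended in sequence
    return lerp(open_pos, half_pos, t * 2) if t < 0.5 else lerp(half_pos, closed_pos, t * 2 - 1)

def surface_error(p):               # how far the lid point is off the eyeball surface
    return abs(1.0 - math.hypot(p[0], p[1]))

t = 0.25
print(surface_error(single_morph(t)), surface_error(chained_morphs(t)))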

As for the surface editor... ehhh... I'm not entirely sure... I'd have thought it ought to work. I'd have to trial it to be sure.

Chris Jones
10-03-2013, 09:49 AM
Are you saying keep the bones for the lid rotation, and morph the folds separately?

As for the morphs not being non linear, there's definitely an arc between them since they're being blended via a gradient, so while perhaps not truly non linear, it's an awful lot closer to it than the last time I tried to blend morphs with a single control, which was producing very linear results. I think I was trying to do it with expressions on that occasion.

Backscattering - frontal scatter - whazzat then? I'm using Simple Skin, haven't delved too deeply into SSS as yet.

I'll have to find that discussion about displacements in the surface editor, see what the issue was.

(c:

RebelHill
10-03-2013, 09:59 AM
No, don't bone the eyelids, that never works (well, not without a hell of a lot of work which is easily avoided by not using bones)... What I'm saying is that rather than messing around with the other problems of non linear morphing, you instead separate actions/areas out onto different morphs. The way of least hassle is this... model the eyelids half closed, have one morph to open wide, another to close, and then additional morphs that work on top to move just the skin folds/wrinkles/whatever.

Chris Jones
10-03-2013, 10:05 AM
Yeah, but I'm still stuck with the problem of hooking those up to a control and having the morphs blend into one another, aren't I? Or are you suggesting manually doing it in Morph Mixer?

RebelHill
10-03-2013, 10:24 AM
Not at all.

You ensure that the morphs blend correctly in Modeler... you basically model them into one another correctly. As for the control hookup, you use the exact same thing as you have for the other "component" morphs, just operating off a different input range.
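(A hedged plain-Python sketch of that kind of hookup; the morph names and ranges are made up for illustration: one eyelid slider split across two component morphs, each active over a different part of the input range, working from a half-closed base pose.)

def eyelid_weights(slider):
    """slider in [-1, 1]: -1 = fully closed, 0 = half-closed base pose, +1 = wide open."""
    close_amount = max(0.0, -slider)      # active only on the negative half of the range
    wide_amount = max(0.0, slider)        # active only on the positive half of the range
    return {"Lid_Close": close_amount, "Lid_Wide": wide_amount}

for s in (-1.0, -0.5, 0.0, 0.5, 1.0):
    print(s, eyelid_weights(s))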

Chris Jones
10-03-2013, 10:38 AM
You mean as in the face demo you referred me to a while ago, shaping the expressions with combinations of simple morphs?

RebelHill
10-03-2013, 10:42 AM
Pretty much the same thing, yup.

Chris Jones
10-03-2013, 11:24 AM
Mmm. Nifty as that technique is, from what I was able to glean it ultimately didn't look like it was going to be suitable for the kind of realism I'm going for. The number of morphs that would be required for all the subtle multi-directional motion going on in any one area would be pretty overwhelming, let alone the logistics of getting them all to work together. I found that incorporating all those vectors into single morphs and having them blend in an irregular fashion gives me something more manageable and organic.

Not to say that you couldn't do it with your method - just to say that I couldn't. ;) Hence my current preoccupation with NL morphs.

Chris Jones
10-03-2013, 11:33 AM
Found that bit about displacement in surface editor by the way, down the bottom here http://forums.newtek.com/showthread.php?131694-Using-Morphs-to-Drive-Textures&p=1282817&viewfull=1#post1282817

Tranimatronic
10-03-2013, 11:52 AM
Chris,
Try the python eye bulge script (the one in one of your other threads), just on the eyeball itself.
It makes everything very slow, but means you can use just one morph, and let the script worry about the intersection. I did this out of laziness, but it worked ok in some tests I did
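(Not the actual script being referred to, just a hedged plain-Python sketch of the general idea it describes: after the morph runs, any eyelid vertex that ends up inside the eyeball sphere is pushed back out to the surface, so one morph plus a correction pass can stand in for several hand-tuned intermediate morphs.)

import math

def resolve_intersections(lid_points, eye_center, eye_radius, padding=0.001):
    cx, cy, cz = eye_center
    fixed = []
    for x, y, z in lid_points:
        dx, dy, dz = x - cx, y - cy, z - cz
        d = math.sqrt(dx * dx + dy * dy + dz * dz)
        if 0.0 < d < eye_radius + padding:
            s = (eye_radius + padding) / d       # push the point radially back out to the surface
            x, y, z = cx + dx * s, cy + dy * s, cz + dz * s
        fixed.append((x, y, z))
    return fixed

print(resolve_intersections([(0.0, 0.0, 0.9), (0.0, 0.0, 1.2)], (0.0, 0.0, 0.0), 1.0))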

RebelHill
10-03-2013, 12:20 PM
The number of morphs that would be required for all the subtle multi-directional motion going on in any one area would be pretty overwhelming, let alone the logistics of getting them all to work together.

It's actually a FAR lighter method than the one you're pursuing, and it frees things up to more easily get all the disparate parts working together rather than getting painted into a corner... trust me, I've tried them all in the past.

dpont
10-03-2013, 01:05 PM
Found that bit about displacement in surface editor by the way, down the bottom here http://forums.newtek.com/showthread.php?131694-Using-Morphs-to-Drive-Textures&p=1282817&viewfull=1#post1282817

I wouldn't want to see my words carved in marble...

It was a recommendation, but you may of course skip it, as long as your experiments give you a correct result. I can't predict how things will go with the interaction of all your incredible but fantastic stuff...

Just remember that every Tension node iterates over the geometry many times; you have one main displacement setup per object, but you may have more than one surface setup per object.

Denis.

Tranimatronic
10-03-2013, 01:57 PM
Actually Denis, I have a question (I don't mean to hijack the thread, but it's kind of related).
Is it possible in the Tension node to specify the subdivision level? For example, get it to sample the display subdivision level instead of the render one?
I use the Tension node as a general mask area for applying a painted displacement, and don't need it to give me such detailed (and long-rendertime) results.
Is this possible?

dpont
10-03-2013, 02:20 PM
...Is it possible in the Tension node to specify the subdivision level? For example, get it to sample the display subdivision level instead of the render one?...

No, because it depends on the MeshInfo data from the LW SDK, which has only two states: the original mesh/cage object, and the subD/transformed mesh including everything. The Tension node uses this second state, and the level of subD is whatever the user has selected for render.

Denis.

Tranimatronic
10-03-2013, 02:47 PM
Oh OK. I should have known that if it were that easy you'd already have thought of it.
Thanks anyway ;)

Chris Jones
10-03-2013, 07:39 PM
Chris,
Try the python eye bulge script (the one in one of your other threads), just on the eyeball itself.
It makes everything very slow, but means you can use just one morph, and let the script worry about the intersection. I did this out of laziness, but it worked ok in some tests I did

That script had a couple of issues, might be worth revisiting for this application though. I was trying something similar with an effector actually, but it needs more testing.


It's actually a FAR lighter method than the one you're pursuing, and it frees things up to more easily get all the disparate parts working together rather than getting painted into a corner... trust me, I've tried them all in the past.

Really? I don't see how that would be possible... Take the jaw down motion for example. I have the base position and 2 morphs: down and back. However, within those morphs I can also incorporate the radial neck stretching, skin slide round from under the jaw up to the cheek, sideways stretching of the lips, earlobes pushing outwards, temples pushing inwards etc etc. Wouldn't you need a whole stack of extra morphs to cover all that using linear combination shapes? Essentially what I'm doing is very similar, but I get all that secondary motion built in for free - plus it's much more WYSIWYG when I'm making the shapes. So far it's been working pretty well. :)

The eyelids are a similar beast, the difference being that I need a higher degree of accuracy for the arc than in the jaw, since it has to slide snugly around the eyeball. From what I can tell, I'm not going to gain any better accuracy using the method you're advocating (not that I'm discounting it - to the contrary I'm still mulling it over). I'd be glad to hear otherwise though.


I wouldn't want to see my words carved in marble...

It was a recommendation, but you may of course skip it, as long as your experiments give you a correct result. I can't predict how things will go with the interaction of all your incredible but fantastic stuff...

Just remember that every Tension node iterates over the geometry many times; you have one main displacement setup per object, but you may have more than one surface setup per object.

Denis.

No worries, I just saw in that thread that I was having troubles before I took it to the displacement editor, and that the troubles went away once I did. I intend to have another crack at it in the surface editor.

Just to confirm though, by doing the displacements in the surface editor am I still going to get detailed displacements even if SubD order is set to Last? If not then I'll still have the same problem.

(c:

RebelHill
10-04-2013, 04:59 AM
Take the jaw down motion for example... within those morphs I can also incorporate the radial neck stretching, skin slide round from under the jaw up to the cheek, sideways stretching of the lips, earlobes pushing outwards, temples pushing inwards etc etc. Wouldn't you need a whole stack of extra morphs to cover all that using linear combination shapes? Essentially what I'm doing is very similar, but I get all that secondary motion built in for free...

The eyelids are a similar beast... From what I can tell, I'm not going to gain any better accuracy using the method you're advocating

The jaw is a different case though... of course if you can have all the added stuff in a single morph, you do; it'd be a waste of time splitting a single morph into many for no reason. With the eyelids (or any other applicable area), the point was that when you have things like wrinkles/folds that need to be somewhat time-offset from the major motion... that is when you can put the secondary motion out onto a second morph, as it gives you the ability to create the actions subtly (or grossly) separate from one another.

It's not about getting better accuracy, it's about an approach that involves less work, is more efficient and is easier to manage.


Just to confirm though, by doing the displacements in the surface editor am I still going to get detailed displacements even if SubD order is set to Last?

It is so.

Chris Jones
10-04-2013, 06:28 AM
With the eyelids (or any other applicable area), the point was that when you have things like wrinkles/folds that need to be somewhat time-offset from the major motion... that is when you can put the secondary motion out onto a second morph, as it gives you the ability to create the actions subtly (or grossly) separate from one another.

Oh ok, you have a point about separating out the timings.


It is so.

Good to know, thanks. I must say I am a bit mystified as to why there's a separate displacement editor if the displacements can be done in the surface editor. To allow for these kinds of subdivision order sidesteps mayhaps..?

RebelHill
10-04-2013, 06:53 AM
Why have both... well, on the one hand it's nice to have options... on the other, the surface editor allows ONLY for 1D displacement (outwards along the poly normal)... whereas the displacement node editor allows for full 3D displacement, i.e. vector displacement.
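(A quick plain-Python sketch of the difference, illustrative only: a scalar displacement can only push a point along its normal, while a vector displacement can also move it sideways.)

def displace_along_normal(p, n, amount):          # surface-editor style (1D, scalar along the normal)
    return tuple(pi + ni * amount for pi, ni in zip(p, n))

def displace_by_vector(p, v):                     # displacement-node style (full 3D vector)
    return tuple(pi + vi for pi, vi in zip(p, v))

p, n = (0.0, 1.0, 0.0), (0.0, 1.0, 0.0)
print(displace_along_normal(p, n, 0.05))          # can only push straight out
print(displace_by_vector(p, (0.02, 0.01, -0.03))) # can also slide the point sideways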

Chris Jones
10-05-2013, 01:32 AM
I couldn't get any results in the surface editor, so I tried this (http://forums.newtek.com/showthread.php?131694-Using-Morphs-to-Drive-Textures/page3) old test which already has displacement set up in the surface editor, and found that there's still pre-subdivision displacement unless it's set to After Motion. Seems that I'm at a dead end.

I wonder if it's possible for there to be a node that sets the displacement order independently of the setting in Object Properties?

RebelHill
10-05-2013, 03:39 AM
Sorry... not subD last... subD before bones... I'm getting mixed up by the whole morph-in-Mixer / morph-in-nodal thing... conflating different setups together in my head.

It does all work together by using the surfacing for your image based displacement and the disp nodes for morphs though...

Set Object subD to AFTER BONES
Set Bump displacement (which is the surface ed disp channel) to BEFORE LOCAL DISP
Set disp nodes (which contain your morph setups) to BEFORE BONES

Bone and morph based displacements will be done on the cage and smoothed out by subdivision, then the detailed image based displacement will go on after subdivision.
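(Sketched as plain Python with stand-in functions, not LightWave SDK calls, the per-vertex evaluation order this recipe gives you looks roughly like this.)

def apply_displacement_nodes(pts):    # morph setup in the disp nodes, runs "Before Bones" on the cage
    return pts                        # placeholder

def apply_bones(pts):                 # bone deformation, still on the cage
    return pts                        # placeholder

def subdivide(pts):                   # SubD order "After Bones": smooths the deformed cage
    return pts                        # placeholder

def apply_surface_displacement(pts):  # bump / surface-editor displacement, added on the subdivided mesh
    return pts                        # placeholder

def evaluate(cage_points):
    pts = apply_displacement_nodes(cage_points)
    pts = apply_bones(pts)
    pts = subdivide(pts)
    return apply_surface_displacement(pts)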

Chris Jones
10-06-2013, 08:08 PM
Ok that works, but when applied to my human it either slows it down to an absolute crawl or crashes it every time I turn the bump distance up from 0. I'll have to strip it back to basics again and try to locate the source of this new devilry.

Chris Jones
10-08-2013, 12:28 AM
It's no good; I'm reliving all the crashing and bad deformation I was getting before I moved the displacement nodes to the displacement editor. I've passed my threshold for understanding now, and am just randomly plugging things together and entering random values.

If anyone feels like a challenge, the two node networks that need merging are here (http://forums.newtek.com/attachment.php?attachmentid=117132&d=1379377290).