
New Blog Post: The Modifier Stack




lino.grandi
12-18-2015, 11:27 AM
Please let's discuss the Modifier Stack here!

Link to the New post! (https://blog.lightwave3d.com/2015/12/the-modifier-stack/)

Ryan Roye
12-18-2015, 11:29 AM
Link here:

https://blog.lightwave3d.com/

jeric_synergy
12-18-2015, 11:36 AM
Very exciting! Now we'll have the possibility of finding just the correct order, a search which has plagued me in the past, only to find it was not even possible.

I do worry about the NUMBER of possible combinations.

probiner
12-18-2015, 11:39 AM
Good job.
And all being a modifier means you can add multiple instances of subdivision, etc.
I just don't get where you set the displacement to be after animation and before, for example. Must be in the modifier UI (?)

VermilionCat
12-18-2015, 11:49 AM
Thanks, Lino.
So those vertices are animated by new deformer or existing FX tools?

kfinla
12-18-2015, 11:50 AM
Very exciting stuff! Were you editing the corrective morph on the delt in Layout or Modeler? Watched it again; looks like Layout :)

JohnMarchant
12-18-2015, 11:53 AM
Oh, I'm so glad to see you, little modifier stack; where have you been all my life? Well, since I stopped using AD products, anyway.

Looks good Lino

lino.grandi
12-18-2015, 11:55 AM
Good job.
And all being a modifier means you can add multiple instances of subdivision, etc.

No. Morphing (where you can assign an object as a morph target), Bone, and Subdivision modifiers can only be applied once in the same stack. You can have multiple instances of all the others.


I just don't get where you set the displacement to be after animation and before, for example. Must be in the modifier UI (?)

Displacement can happen before or after any deformation or subdivision. The "After Motion" that was present in the Subdivision Order in 2015 makes no sense anymore.
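For anyone trying to picture how that ordering works in practice, here is a minimal sketch in plain Python (illustration only, not the LightWave SDK); the single-instance rule for Morphing, Bones and Subdivision comes from the description above, while the class and function names are invented for the example.

    # Hypothetical sketch of an ordered deformation stack (not LightWave code).
    SINGLE_INSTANCE = {"Morphing", "Bones", "Subdivision"}

    class ModifierStack:
        def __init__(self):
            self.modifiers = []  # evaluated top to bottom

        def add(self, name, func):
            # Morphing, Bones and Subdivision may appear only once per stack.
            if name in SINGLE_INSTANCE and any(n == name for n, _ in self.modifiers):
                raise ValueError(name + " can only be applied once in the same stack")
            self.modifiers.append((name, func))

        def evaluate(self, points):
            # Each modifier receives the points already processed by the ones
            # above it, so placing a Displacement entry before or after Bones
            # or Subdivision is purely a matter of list order.
            for _, func in self.modifiers:
                points = func(points)
            return points

Adding a Displacement entry above or below the Bones entry in such a list is all it takes to get "before deformation" or "after deformation" behaviour, which is why a separate "After Motion" switch is no longer needed.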

- - - Updated - - -


Very exciting! Now we'll have the possibility of finding just the correct order, a search which has plagued me in the past, only to find it was not even possible.

I do worry about the NUMBER of possible combinations.

Oh, don't be too worried. It's all very logical (please take the time to read the post!) and will be explained thoroughly in the video tutorials that will be released with the final product.

lino.grandi
12-18-2015, 11:58 AM
Thanks, Lino.
So those vertices are animated by new deformer or existing FX tools?

What you can see in the video is EditFX, yes.

GregMalick
12-18-2015, 11:59 AM
Is this Lightwave2016?

lino.grandi
12-18-2015, 12:00 PM
Very exciting stuff! Were you editing the corrective morph on the delt in Layout or Modeler? Watched it again; looks like Layout :)

Yes, the corrective morphs were created in Layout, so I could see the effect on the deformed object in real time.

probiner
12-18-2015, 12:00 PM
Thanks for the details, Lino! Shame there's no multiple subdivision, but I understand it's not a pure "deformer".

Can you screenshot what you used to copy point positions? :) Just wanted to know if it's something we can have multiple instances of and link to several meshes rather than just one, since multiple delta linking is something I use in other software.

Cheers

prometheus
12-18-2015, 12:01 PM
Great stuff, Lino... great to see enhancements to the mesh editing tools in Layout. As a side reflection, I like the UI tab changes; one could of course change that oneself, but I think a tab called Model is more appropriate than Modeler Tools. Whatever is hidden in that tab remains to be seen :)

lino.grandi
12-18-2015, 12:01 PM
Oh im so glad to see you little modifier stack, where have you been all my life. Well since i left using AD products.

Looks good Lino

Thanks!

lino.grandi
12-18-2015, 12:05 PM
Lino, can you screenshot what you used to copy point positions? :) Just wanted to know if it's something we can have multiple instances of and link to several meshes rather than just one, since multiple delta linking is something I use in other software.

Cheers

Not home at the moment. But I can tell you we now have a new dedicated node able to do this and more. Our Jarno van der Linden did an amazing job with both the Modifier Stack and some awesome new nodes.

DrStrik9
12-18-2015, 12:05 PM
Beautiful.

Chris S. (Fez)
12-18-2015, 12:17 PM
Our Jarno van der Linden did an amazing job with both the Modifier Stack and some awesome new nodes.

Awesome!

50one
12-18-2015, 12:19 PM
Epic.

Tranimatronic
12-18-2015, 12:19 PM
VERY excited about this. Looks like you guys hit the ball out of the park. Can't wait to get my hands on it.
When did you say it was going to be released again? ;)

ernesttx
12-18-2015, 12:34 PM
So, I won't be able to just modify, tweak or deform the mesh directly? That is, use a magnet tool to grab some mesh verts and drag them left or right. In the video (at 1:00), I have to deform another mesh and that deforms the mesh I want deformed?? That needs some explanation of the workflow. Or better still, a narrator in the video explaining what was happening.

erikals
12-18-2015, 12:40 PM
So, I won't be able to just modify, tweak or deform the mesh directly?
i too was expecting this, but then again, 2016 is not out just yet... let's see...

still nice stuff though, it's getting there.

WilliamVaughan
12-18-2015, 12:45 PM
So, I won't be able to just modify, tweak or deform the mesh directly? That is, use a magnet tool to grab some mesh verts and drag them left or right. In the video (at 1:00), I have to deform another mesh and that deforms the mesh I want deformed?? That needs some explanation of the workflow. Or better still, a narrator in the video explaining what was happening.


Actually... you can... It's a simple process. Use Edit FX to manipulate the mesh. I have an old video that shows it, but I'm not sure what the link is... but I did find this one that was created around the same time and shows the same process... just not on a character:

ftp://ftp.newtek.com/pub/LightWave/Tutorials/Vidz/MatchBackground_web.mov

If I can find the video showing the process on a character that is rigged and animated walking, I will share the link... I just don't remember where it is saved, as it was over 10 years ago. While it's not the magnet tool, Edit FX can give you identical results. Awesome feature in LW.

erikals
12-18-2015, 12:50 PM
yes, but i think he knew about that though, like i do.

i too use editfx in troublesome times... like here >


https://www.youtube.com/watch?v=Zyk6pV15nSI

lino.grandi
12-18-2015, 12:57 PM
So, I won't be able to just modify, tweak or deform the mesh directly? That is, use a magnet tool to grab some mesh verts and drag them left or right. In the video (at 1:00), I have to deform another mesh and that deforms the mesh I want deformed?? That needs some explanation of the workflow. Or better still, a narrator in the video explaining what was happening.

We need to distinguish the corrective morph creation process from the actual morphs applied to the rigged mesh.
At the moment you need a second non-rigged mesh to create the corrective morphs. And, believe me, that makes applying them and getting the exact same shape a very simple process.
Of course the corrective shapes are saved on the main objects as well, and you don't need anything but that in the final scene.

ernesttx
12-18-2015, 12:58 PM
Thanks William, that video would be nice to watch; I hope you can find it. And thanks, Erikals.

I guess what is confusing is that the video states "A second mesh can be used to edit the corrective shape ..." Why would I want to use a second mesh? And if I can edit a second mesh, then why can't I just edit the first mesh??

WilliamVaughan
12-18-2015, 12:59 PM
yes, but i think he knew about that though, like i do.

i too use editfx in troublesome times... like here >


https://www.youtube.com/watch?v=Zyk6pV15nSI


Gotcha... wasn't sure if everyone was aware of it.

ernesttx
12-18-2015, 01:04 PM
Thanks for trying to explain it Lino, but I'm not seeing the advantage of altering a second mesh. That's more mesh/object management than necessary me thinks.

If I'm animating a character:
1. do I have to stop,
2. then load second mesh,
3. make corrective morph,
4. then unload second mesh?? or make it invisible to get it out of my way,
5. then go back to animating, trying to keep the flow of the action and inspiration for how i'm moving the character and keep the pace, then DOH! I have to stop and repeat steps 1 through 5 again.

DrStrik9
12-18-2015, 01:10 PM
Gotcha... wasn't sure if everyone was aware of it.

Not everyone. :D Thanks.

Kuzey
12-18-2015, 01:11 PM
Great stuff

Is that object panel going to stay like that..free floating etc?

I'd love to have that panel pinned to the right of the main window so you get easier & more efficient access...Matt...Matt..LW has tooooo many panels :P

lino.grandi
12-18-2015, 01:15 PM
Thanks for trying to explain it Lino, but I'm not seeing the advantage of altering a second mesh. That's more mesh/object management than necessary me thinks.

If I'm animating a character:
1. do I have to stop,
2. then load second mesh,
3. make corrective morph,
4. then unload second mesh?? or make it invisible to get it out of my way,
5. then go back to animating, trying to keep the flow of the action and inspiration for how i'm moving the character and keep the pace, then DOH! I have to stop and repeat steps 1 through 5 again.

You're confusing point/sculpt animation with actually creating corrective morphs linked to the bone rotations of a rigged character.

You can do point animation in 2015 already, using EditFX or ChronoSculpt.

wyattharris
12-18-2015, 01:15 PM
Thanks Lino. Not intimately familiar with modifier stacks but seems every time I see one in different programs they look totally different than the last implementation.
Lol, no worries, I'll learn it.

ernesttx
12-18-2015, 01:22 PM
Lino, I feel the two are not that different, and besides, in the video you are point sculpting/animating his shoulder to create a corrective morph. I can understand creating corrective morphs (or any morph for that matter) and having those available. However, with the ability to point sculpt a mesh depending on how the mesh is being animated (exaggerated, cartoony run cycle vs real world man run cycle), an animator would want to be able to create/sculpt that 'morph' and hopefully have the ability to save the just-created morph. Or, if the shape or sculpt was only a one-off, at least the animator could create that on the fly and not have to stop and go make a corrective morph.

For example, in messiah, I can animate using IK/FK bones and then on top of that, I can point animate. Now during animation, if I like a pose or sculpt, I can save that out as a morph, and then re-import it as a morph back onto the mesh, and then use it again and again.

erikals
12-18-2015, 01:30 PM
an alternative method is doing it in both Modeler / Layout at the same time,
this way the deformed area can always be in focus,
perfect deformation, but the technique itself is a bit limited...


https://www.youtube.com/watch?v=313H-xNh8y8


------------------

2nd alternative is to use the LW2016 technique with a CCTV in VPR to always focus on the deformed area
( haven't made a video of that... yet... )
but this might be over-doing it, might be better just to zoom in/out in LW2016... need to test...

------------------

the best would be to have, as ernesttx says, the first and the second mesh in the same place,
with the correct math, it should be possible.

challenge > might be a lot of code though.
it would require some reverse engineering.

------------------

but again, it's Very good that we're getting this Modifier Stack

http://erikalstad.com/emoti/homer-96.jpg

MAUROCOR
12-18-2015, 01:38 PM
Our Jarno van der Linden did an amazing job with both the Modifier Stack and some awesome new nodes.

Amen for that!:)

Really nice addition.

Satera
12-18-2015, 02:46 PM
Ohh I see where all this is going! XD Amazing!!

So, a quick question. Where is all this information stored?

So if I create a movement like the one shown, where a character raises his arm and lowers it again, then create corrective morphs, where does that information live? Is it on the model as a different type of morph, so that when I import the character into another scene the corrective morphs come with it, or is it dependent on the scene to hold the information, with the corrective morphs saved there? I'm just thinking workflow-wise. If I do several characters with corrective morphs on them, someone animates one, then another animator is on another, we would then want to bring these together. Or something as simple as bringing an animated character into a scene: where is this information held, and how simple is it to transfer it from one scene to the other for simple management?

Hope I make sense, I tend to handle objects and animation separately and combine it all at the end for better memory management.
I'm loving this!

DogBoy
12-18-2015, 03:10 PM
Ohh I see where all this is going! XD Amazing!!

So, a quick question. Where is all this information stored?

It's a morph, so it will be stored in the model.

Cageman
12-18-2015, 03:13 PM
For example, in messiah, I can animate using IK/FK bones and then on top of that, I can point animate. Now during animation, if I like a pose or sculpt, I can save that out as a morph, and then re-import it as a morph back onto the mesh, and then use it again and again.

I am pretty sure that the workflow will improve over the course of development (not sure if those improvements will make it to the release though).
EDIT: With this I am referring to your suggestions... which are good ones! :)

But... hand on your heart; do you think that this is a step (a single step that is) in the right direction, or, do you feel that it is a step back?

pinkmouse
12-18-2015, 03:19 PM
Can we have a fracture tool as part of the modifier stack? Or a Boolean?

allabulle
12-18-2015, 06:20 PM
Thanks for the blog post and video! It's really appreciated.

It must have been hard, working for years on the new architecture and, as it was maturing, not being able to show it or talk about it. It looks really good! I'm as eager to play with the new LightWave as I ever was.

dsol
12-18-2015, 07:29 PM
A really nicely written and informative blog post. Like others, I'm a little disappointed there's no option for multiple subdivisions, but it's not the end of the world - and the new pipeline will still provide a huge boost over the existing setup. Can't wait to play with this in the final product!

lightscape
12-18-2015, 07:52 PM
Thanks Lino.
Why do we need a second mesh to make morph adjustments to the character? That's not very lightwavy.
Always hated that in max and maya.

Snosrap
12-18-2015, 10:16 PM
Wow did I see quads in Layout! :)

m.d.
12-18-2015, 10:55 PM
Can we have a fracture tool as part of the modifier stack? Or a Boolean?

That's probably a bigger change than you might think.
This is deformation-order stacking, the same as in Maya and the rest of the 3D world... what you are talking about would be modifier stacking, including changes to the topology... a totally different ball game. The model would have to stay live and cook through the whole stack at every frame (Houdini does this while procedural modelling), but at some point the memory gets out of hand.

m.d.
12-18-2015, 10:58 PM
Great job Lino and Co.

Slowly checking off the list of everyone's hopes and complaints....

If you guys keep this up, we won't have anything to whine about :)

lino.grandi
12-18-2015, 11:54 PM
Lino, I feel the two are not that different, and besides, in the video you are point sculpting/animating his shoulder to create a corrective morph. I can understand creating corrective morphs (or any morph for that matter) and having those available. However, with the ability to point sculpt a mesh depending on how the mesh is being animated (exaggerated, cartoony run cycle vs real world man run cycle), an animator would want to be able to create/sculpt that 'morph' and hopefully have the ability to save the just-created morph. Or, if the shape or sculpt was only a one-off, at least the animator could create that on the fly and not have to stop and go make a corrective morph.

For example, in messiah, I can animate using IK/FK bones and then on top of that, I can point animate. Now during animation, if I like a pose or sculpt, I can save that out as a morph, and then re-import it as a morph back onto the mesh, and then use it again and again.

Yes, Messiah Point Animation is absolutely amazing. I've been playing with that, and yes, you can sculpt the mesh and create "mesh" keyframes that stay consistent with the bone rotations.

So you can actually sculpt in pose.

A question...how do you save the morph in Messiah?

Something important about the video: you can already make edits on the same mesh you've rigged in this "next" version of LightWave.
The fact you can use a different one (again, that happens through a new dedicated node) should be considered a plus, and opens for some interesting new possibilities.

Anyway, just remember we only show what is currently available in the current development state of the software, and that we're constantly aiming for improvements all over the place. ;)

Thank you for feedback about this!

lino.grandi
12-19-2015, 12:11 AM
Ohh I see where all this is going! XD Amazing!!

So, a quick question. Where is all this information stored?

So if I create a movement like the one shown, where a character raises his arm and lowers it again, then create corrective morphs, where does that information live? Is it on the model as a different type of morph, so that when I import the character into another scene the corrective morphs come with it, or is it dependent on the scene to hold the information, with the corrective morphs saved there? I'm just thinking workflow-wise. If I do several characters with corrective morphs on them, someone animates one, then another animator is on another, we would then want to bring these together. Or something as simple as bringing an animated character into a scene: where is this information held, and how simple is it to transfer it from one scene to the other for simple management?

Hope I make sense, I tend to handle objects and animation separately and combine it all at the end for better memory management.
I'm loving this!

Morphs are stored in the main model, so that's the only mesh you need in the scene while animating.

lino.grandi
12-19-2015, 12:36 AM
Thanks Lino.
Why do we need a second mesh to make morph adjustments to the character? That's not very lightwavy.
Always hated that in max and maya.

I agree! Again, you can do the same just using one mesh. The fact you can "connect" the mesh shape to another is just a nice new feature.

DogBoy
12-19-2015, 01:40 AM
Thanks Lino.
Why do we need a second mesh to make morph adjustments to the character? That's not very lightwavy.
Always hated that in max and maya.

My guess would be so you can access points that might be inaccessible in the rigged/posed version.

Surrealist.
12-19-2015, 01:49 AM
Yes, Messiah Point Animation is absolutely amazing. I've been playing with that, and yes, you can sculpt the mesh and create "mesh" keyframes that stay consistent with the bone rotations.

So you can actually sculpt in pose.

A question...how do you save the morph in Messiah?

Something important about the video: you can already make edits on the same mesh you've rigged in this "next" version of LightWave.
The fact you can use a different one (again, that happens through a new dedicated node) should be considered a plus, and opens for some interesting new possibilities.

Anyway, just remember we only show what is currently available in the current development state of the software, and that we're constantly aiming for improvements all over the place. ;)

Thank you for feedback about this!

Yeah, I had quite a time with point animation in Messiah. I think it is an amazing feature. This video and blog post were very cool. Nice to see this coming to LW.

I don't think you intend this modifier stack to be the same as in Blender, with Booleans, arrays and so on, in the future, do you? Is this more of just a deformation stack? Do you intend to have other mesh-changing or modeling stacks? Or will you build all of that into the same stack as in Blender?

pinkmouse
12-19-2015, 02:35 AM
That's probably a bigger change than you might think..

Quite possibly. But needed, I think.

creacon
12-19-2015, 02:38 AM
I asked a question on the forum, and only 9 years later it is already done? Amazing :-)

http://forums.newtek.com/showthread.php?59152-displacement-trouble&p=460675&viewfull=1#post460675

But the fact that there is only one subdivision and one bone deformer makes this more a deformation stack combined with a "fixed steps order arranger" (for lack of better words).
This is a good intermediate solution, and better than anything we had before, but it's not what you expect from a modifier stack.
Is this a temporary solution? In other words, can we expect these limitations to be taken care of in the (near) future?

creacon

lino.grandi
12-19-2015, 02:49 AM
Both subdivision and bone deformer are relative to the currently implemented subpatch and bone systems. So, this is a real modifier stack. Basically, anyone can write their own deformer so it can be used in the stack.

erikals
12-19-2015, 02:52 AM
I agree! Again, you can do the same just using one mesh. The fact you can "connect" the mesh shape to another is just a nice new feature.

Awesome!! http://erikalstad.com/backup/misc.php_files/king.gif http://erikalstad.com/backup/misc.php_files/bowdown.gif http://erikalstad.com/backup/misc.php_files/smile.gif

bazsa73
12-19-2015, 02:57 AM
I just saw the new video, looks promising.

Skywatcher_NT
12-19-2015, 03:56 AM
Did anyone notice the smoothness of the deforming mesh during playback ? Looks pretty fast ...

CaptainMarlowe
12-19-2015, 04:43 AM
Maybe a benefit from the UGE ?

erikals
12-19-2015, 05:02 AM
Did anyone notice the smoothness of the deforming mesh during playback ? Looks pretty fast ...
looks damn fast... http://erikalstad.com/backup/misc.php_files/smile.gif

http://erikalstad.com/backup/misc.php_files/goodluck.gif crossing fingers

jwiede
12-19-2015, 05:05 AM
Lino, the modifier stack as currently implemented looks quite nice, thanks for showing us! I agree with the others that it'd be nice in future to allow multiple distinct subdivision and bone deformation modifier instances to be applied, but initially the single-instance limitation for them is acceptable (provided the design allows multiples later).

Since we're discussing subdivision, has anything been added to LW's subdivision surface support (f.e. OpenSubDiv, or Pixar Psubs) for LW2016?

creacon
12-19-2015, 05:08 AM
In the strict sense of the word, yes, it modifies stuff, so it can be called a modifier stack. But it is too limited to be compared to other implementations.
What I mean by that is that I could do that with the SDK already. A complete modifier stack would allow creation and destruction of geometry, and would allow any number of instances of any modifier.
What my question really means is: is this a temporary solution? In other words, will the "semi hard-coded bone deformer" be rewritten as a separate deformer using a skeleton that consists of other scene elements?
Will the subdivision become a "node" that can be duplicated/repeated?

for example:

- Animated a big bulge
- Subdivide
- Animate smaller bulges/deformations
- Subdivide
- Animate even smaller bulges/deformations


creacon


Both subdivision and bone deformer are relative to the currently implemented subpatch and bone systems. So, this is a real modifier stack. Basically, anyone can write their own deformer so it can be used in the stack.

3DGFXStudios
12-19-2015, 05:33 AM
I totally agree with creacon here. I think the bone and subpatch systems should be separate items that could be added multiple times. Bones should be deformers too, in the way that displacement, bend or twist modifiers are.
You could for example create a complex folding system with bones.

bobakabob
12-19-2015, 07:37 AM
This is looking very exciting, thanks for posting Lino.
Sculpting in pose and point manipulation and animation directly in Layout would take LW to the next level.

lino.grandi
12-19-2015, 08:14 AM
I totally agree with creacon here. I think the bone and subpatch systems should be separate items that could be added multiple times. Bones should be deformers too, in the way that displacement, bend or twist modifiers are.
You could for example create a complex folding system with bones.

No doubt that's the way to go. We should consider what we have now more as a deformation stack than a fully implemented modifier stack (where geometry can normally be subdivided multiple times). As a first step, it opens up some very nice workflows in LightWave, especially considering you can add multiple node editor modifiers.

- - - Updated - - -


This is looking very exciting, thanks for posting Lino.
Sculpting in pose and point manipulation and animation directly in Layout would take LW to the next level.

I totally agree.

RebelHill
12-19-2015, 08:22 AM
I know that this blog post and thread are mainly about the modifier stack... tbh, not THAT exciting... we've been able to re-order deformers and displacements for a long time now... the new stack makes those operations clearer and cleaner, so that's nice. A much-needed and appreciated change, but hardly revolutionary.

However... the bits that are almost tacked on as an aside to this... new nodes for evaluating meshes, their vertices, etc., and thus using one mesh as a deformer for another (displacement mesh referencing)... now that IS a huge step forward for LW, and a WAY bigger deal than the stack alone imo. That should allow for some VERY cool stuff to be done that at the moment you wouldn't even think of attempting in LW.

prometheus
12-19-2015, 08:44 AM
A bit off topic perhaps... Any hints on whether or not weightmaps can be shown in Layout OpenGL at the first release of LightWave 2016, or simply whether you guys are working on it... or have it on the agenda?
I like what I see though so far.

lino.grandi
12-19-2015, 08:49 AM
However... the bits that are almost tacked on as an aside to this... new nodes for evaluating meshes, their vertices, etc., and thus using one mesh as a deformer for another (displacement mesh referencing)... now that IS a huge step forward for LW, and a WAY bigger deal than the stack alone imo. That should allow for some VERY cool stuff to be done that at the moment you wouldn't even think of attempting in LW.

The new nodes are of course extremely important, especially now that we can have multiple deformers of the same kind (Nodal Displacement above all), and they will be shown in one of the next blog posts.

lino.grandi
12-19-2015, 08:54 AM
A bit off topic perhaps... Any hints on whether or not weightmaps can be shown in Layout OpenGL at the first release of LightWave 2016, or simply whether you guys are working on it... or have it on the agenda?
I like what I see though so far.

That's something we consider extremely important, but probably out of scope for the next release.

- - - Updated - - -


Can we have a fracture tool as part of the modifier stack? Or a Boolean?

Out of scope for the next version, but for sure something on our agenda.

lino.grandi
12-19-2015, 08:57 AM
Did anyone notice the smoothness of the deforming mesh during playback ? Looks pretty fast ...

Yes it is! Especially if we consider the detail you see comes from a displacement map.

erikals
12-19-2015, 10:09 AM
i'm curious as to whether importing large meshes and scrubbing them will be fast.

thinking of for example scrubbing huge Houdini / RealFlow water fluids meshes here...

jasonwestmas
12-19-2015, 10:18 AM
So nice to finally see this kind of stacking deformation technology in Lightwave! I know it was kinda in there but it was weak. Definitely looking forward to giving it a go, especially with the nice performance enhancements... so important.

hrgiger
12-19-2015, 11:34 AM
Thanks for the blog post Lino. It's really shaping up to be a very important release for LW.

Surrealist.
12-19-2015, 11:44 AM
^ Yes. :)

Dexter2999
12-19-2015, 11:44 AM
Thanks for the post Lino, and for taking the time to follow up on the forums. Especially since ALL of this is done on your time.

Oh, and Merry Christmas, Sir!

lino.grandi
12-19-2015, 12:33 PM
Thanks! Merry Christmas to you all!

probiner
12-19-2015, 12:38 PM
The nodes are important as long as they don't drag the application down. I've done my own point deformers with nodes, but it kills the application. Let's see how the geometry engine tackles that; hence more curiosity about it.

ernesttx
12-19-2015, 03:24 PM
Lino, I owe you a bit of an apology. When the blog post came out, my mind was excited but my perception was clouded. I was thinking it was more UGE than modifier stack, and after reading and watching more, I see how the modifier stack is utilized. I was blinded by being able to do what I do in messiah. When I saw the manual deformation and timeline scroll, I was thinking of manipulating the mesh directly when animating. So, with this just being modifier stack related, I'm pleased with what I see. I'm hoping for the ability to animate the mesh in the future.

To answer your question about saving morphs in messiah with Point Animation, here is a short description: pose your character and fix the geometry with Point Animation, then go to Setup and move Point Animation before Bone Deform. Go to Animate and see that it's probably borked; now you need to eyeball and experiment: turn off Bone Deform and adjust the geometry with Point Animation, check how it looks with Bone Deform on, and repeat this until you're happy. When you're done, turn off Bone Deform and save your model with Point Animation applied. Load this saved model and use it as a regular morph. Here is the thread in their forum: http://setuptab.com/index.php?PHPSESSID=b6c4fc7c4faf774e9b7fda1746675937&topic=1891.0

lightscape
12-19-2015, 07:16 PM
Especially since ALL of this is done on your time.



What does THAT mean? He works part time only?

Dexter2999
12-19-2015, 09:25 PM
It means, he has stated blog posts are only done when they have "spare" time but he has also said they are super busy. And, responding to questions on the forums probably isn't part of his job description at all.

You think chatting on the forum is what they pay him for?

Dexter2999
12-19-2015, 09:26 PM
Duplicate

Surrealist.
12-19-2015, 09:57 PM
Oh, boy. Let's just not even get started again. Shall we?

lightscape
12-20-2015, 12:25 AM
It means, he has stated blog posts are only done when they have "spare" time but he has also said they are super busy. And, responding to questions on the forums probably isn't part of his job description at all.

You think chatting on the forum is what they pay him for?

Did I say chatting is what they pay him for?

It's a product video. Unless I'm mistaken, Lino's primary duties are workflow development, QA and product demos.
If he were a programmer on LightWave itself, then making a video would indeed come out of his spare time. Btw, people multitask these days.

Some young guy, whose name I keep forgetting, was supposed to create tutorial videos for LightWave but produced fewer than 10 afaik. Maybe he has no spare time either.

erikals
12-20-2015, 12:29 AM
guys, seriously...

lino.grandi
12-20-2015, 02:09 AM
Please let's stop this and get back on topic! ;)

3DGFXStudios
12-20-2015, 03:25 AM
Are the bullet dynamics also part of the deformer stack?

Ztreem
12-20-2015, 03:33 AM
Thanks Lino for the blog post and all answers here. Looking great and a good start for future updates.

Surrealist.
12-20-2015, 03:42 AM
It does sound, thematically, like the UGE itself - whatever that is (joking :D) - is laying the groundwork for future updates. I was also taking note of "there is a specific node for that", so we are talking nodes then, Lino? Is this new system basically going to be based on nodes?

Do you care to elaborate on that ? As much as you can of course. :)

VermilionCat
12-20-2015, 03:52 AM
What kind of benefits do we get from the referencing node? Any hint would be nice.

lino.grandi
12-20-2015, 06:34 AM
Are the bullet dynamics also part of the deformer stack?

The deformation-based ones absolutely are (Deforming Bodies).

nez
12-20-2015, 06:37 AM
The deformation-based ones absolutely are (Deforming Bodies).


That makes the horizon something nice to look at.

lino.grandi
12-20-2015, 06:56 AM
It does sound, thematically, like the UGE itself - whatever that is (joking :D) - is laying the groundwork for future updates. I was also taking note of "there is a specific node for that", so we are talking nodes then, Lino? Is this new system basically going to be based on nodes?

Do you care to elaborate on that ? As much as you can of course. :)

It appears to me that nodes are growing faster into the system. I can see the node system becoming something very important for the future releases of LightWave (I can't stress enough the final "S" in "releases"! :D ).

lino.grandi
12-20-2015, 06:58 AM
What kind of benefits do we get from the referencing node? Any hint would be nice.

I'll make a video showing some nice stuff about using the new nodes. Just think about cutting off the head of a character, creating some facial setup and connecting its deformation to another mesh (the whole character)... that's just the tip of the iceberg. ;)

- - - Updated - - -


That makes the horizon something nice to look at.

Yes it does!

hrgiger
12-20-2015, 07:05 AM
Lino, was just thinking you don't have to show new stuff every blog post. I would like to see more examples of the new hair shader, skin and maybe some more volumetric examples

lino.grandi
12-20-2015, 07:31 AM
Lino, was just thinking you don't have to show new stuff every blog post. I would like to see more examples of the new hair shader, skin and maybe some more volumetric examples

I agree.

Marander
12-20-2015, 08:00 AM
I agree.

Hi Lino

Thanks for all the work and information you provide!

A bit off topic: Also interesting would be simple PBR rendering examples (simple primitives) of metal and glass materials. And maybe a hint what engine you use (e.g. metallic / roughness vs specular / gloss) and if there's a Substance integration.

Thanks!

OFF
12-20-2015, 08:04 AM
I agree.

And rendering! )

Surrealist.
12-20-2015, 08:28 AM
It appears to me that nodes are growing faster into the system. I can see the node system becoming something very important for the future releases of LightWave (I can't stress enough the final "S" in "releases"! :D ).

lol yeah. Understood. Thanks for the answer.

ernesttx
12-20-2015, 12:04 PM
"Just think about cutting the head of a character, create some facial setup and connect its deformation to another mesh (the whole character)"

Oh, now I'm very intrigued. :) Thanks again, Lino.

Kaptive
12-20-2015, 12:43 PM
Excellent stuff, thanks Lino. It looks like we'll be having a lot of fun with the new version, while having many gripes addressed and fixed. Love it!

bobakabob
12-20-2015, 01:33 PM
Must say there is a genuine feeling of momentum and it's great to see the dev team directly and openly address known issues such as lack of point manipulation in Layout.

I've used LW since 5.5, and Autodesk products at work, but have always rooted for NewTek because they democratised 3D back when you had to shell out the equivalent of a car or even a house to do 3D. A lot of people here would not have careers or a creative life in 3D otherwise. I still see LW as amazing value despite the advent of Blender, as it has an excellent UI, a great renderer and is solid in production. Today I use Maya a lot (mainly for animation, rendering in LW) and believe me, despite its strengths, there are issues aplenty. If in doubt just check out the forums. And of course Maya is way overpriced and out of reach for many. You can't even 'own' it anymore. The combination of RHiggit and the LW renderer is already very powerful. However, the rendering, animation and Unified Geometry developments in LW are very exciting because they will close the character animation gap with Maya considerably. Thanks for all the communication so far, Lino.

VermilionCat
12-20-2015, 05:07 PM
I'll make a video showing some nice stuff about using the new nodes. Just think about cutting off the head of a character, creating some facial setup and connecting its deformation to another mesh (the whole character)... that's just the tip of the iceberg. ;)

Thanks, Lino. Looking forward to it!
It sounds like the mesh would be evaluated based on the local point coordinates like Cage Deformer does. Now what I wish is decent editing capabilities, point manipulation, to fully unleash this beast!

lino.grandi
12-21-2015, 03:31 AM
For example, in messiah, I can animate using IK/FK bones and then on top of that, I can point animate. Now during animation, if I like a pose or sculpt, I can save that out as a morph, and then re-import it as a morph back onto the mesh, and then use it again and again.

The Point Animation tool in Messiah is absolutely awesome. What I'd like to see is the process of saving the morph and then using it on a rigged character as a corrective shape. Something like:

- load your character in messiah and rig it, even just an arm
- bend the arm using bones
- use Point Animation to correct the deformation
- Export the Morph
- Get rid of the Point Deformation modifier
- Import the morph in the main rigged mesh and apply it as a corrective morph

Can you please show me those steps in Messiah when you have some time?

pinkmouse
12-21-2015, 03:33 AM
I know CA's your thing Lino, but could we have some examples of the deformation stack for other workflows please?

lino.grandi
12-21-2015, 03:51 AM
I know CA's your thing Lino, but could we have some examples of the deformation stack for other workflows please?

Motion Graphics is for sure another area that can greatly benefit from the deformation stack. Just think about what you can do already using Denis's awesome DPKit...

But yes, there'll be examples not necessarily related to CA.

Niko3D
12-21-2015, 05:04 AM
I'm very happy to see a lot of improvements in LW!;)...
But...new Render Passes system?
Delete the old Scene Editor?
and...to have the Enable/Disable Collision in the Instances?

...Any chance?

OFF
12-21-2015, 05:11 AM
I realize that my question is a bit off topic, but I don't want to forget it again: will there be a tool for mirror-transferring vertex maps (weight maps, morph maps, etc.), say, from the left half of the character's body to the right?
This would help many people save a lot of time on this routine work.

Niko3D
12-21-2015, 05:16 AM
yes, sorry... I think my question is a bit off topic too...

Anti-Distinctly
12-21-2015, 05:42 AM
I know CA's your thing Lino, but could we have some examples of the deformation stack for other workflows please?

Here's an example of why this is immensely useful to me and no doubt some others...
Below is an image of a node network I'm currently using. It's not the most complex node network I've created, but it is large, difficult to read and difficult to debug. Everything in a red box can, and possibly should, be in its own node editor instance. Sort of like a function in programming: it takes some input points, does one thing, produces outputs.
There are also some blue boxes; these are places where I've had to manually compensate for existing deformations in order to properly calculate another deformation.
For example, if I have a displacement... something like an effector where a mesh is pushed away from a null or something, but there is also a noise deform calculated before that, I have to tell the effector displacement where the new point position will be after my noise deformation. Currently in LightWave everything is calculated from the undeformed base position of the mesh.
With this new system, I can create a node network for my noise, then have a second node network perform my effector type displacement on the new point positions.
[attachment 131559: node network screenshot]
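To make the difference concrete, here is a tiny Python sketch of the two behaviours described above (illustration only; noise() and effector() are placeholder functions standing in for the two displacement networks, not LightWave nodes or SDK calls):

    # Illustrative only. noise() and effector() each take a point and
    # return an offset vector.
    def add(a, b):
        return tuple(x + y for x, y in zip(a, b))

    def old_style(base_points, noise, effector):
        # 2015-style: both displacements read the undeformed base positions,
        # so the effector never "sees" the noise unless you rebuild the noise
        # inside the effector network yourself (the manual compensation above).
        return [add(p, add(noise(p), effector(p))) for p in base_points]

    def stacked(base_points, noise, effector):
        # New stack: the second Nodal Displacement receives points already
        # moved by the first, so it naturally acts on the updated positions.
        deformed = [add(p, noise(p)) for p in base_points]
        return [add(q, effector(q)) for q in deformed]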

lino.grandi
12-21-2015, 05:52 AM
I'm very happy to see a lot of improvements in LW!;)...
But...new Render Passes system?
...Any chance?

Not in this thread for sure! ;)

ernesttx
12-21-2015, 06:47 AM
Lino, I'll see if I can work up something this evening for ya on the messiah process.

lino.grandi
12-21-2015, 07:05 AM
Here's an example of why this is immensely useful to me and no doubt some others...
Below is an image of a node network I'm currently using. It's not the most complex node network I've created, but it is large, difficult to read and difficult to debug. Everything in a red box can, and possibly should, be in its own node editor instance. Sort of like a function in programming: it takes some input points, does one thing, produces outputs.
There are also some blue boxes; these are places where I've had to manually compensate for existing deformations in order to properly calculate another deformation.
For example, if I have a displacement... something like an effector where a mesh is pushed away from a null or something, but there is also a noise deform calculated before that, I have to tell the effector displacement where the new point position will be after my noise deformation. Currently in LightWave everything is calculated from the undeformed base position of the mesh.
With this new system, I can create a node network for my noise, then have a second node network perform my effector type displacement on the new point positions.
[attachment 131559: node network screenshot]


Thank you so much Luke!

Yes... with the new system you can actually add a new Nodal Displacement modifier and use the updated vertex positions and normals for the new transformations.

- - - Updated - - -


Lino, I'll see if I can work up something this evening for ya on the messiah process.

Thank you so much!

jasonwestmas
12-21-2015, 08:42 AM
Anything deforming can be considered to have character and thus be an expressive character. Therefore any info on the deformers and how they stack together can be useful for many animations. Facial expressions are nothing more than deforming shapes, after all.

lino.grandi
12-21-2015, 08:59 AM
Anything deforming can be considered to have character and thus be an expressive character. Therefore any info on the deformers and how they stack together can be useful for many animations. Facial expressions are nothing more than deforming shapes, after all.

I totally agree.

pinkmouse
12-21-2015, 09:08 AM
So do I, to a point, but that's a bit like saying a 3D graphics artist should be equally as skilled with a hammer and chisel and block of marble. It's all a bit too generalist. ;)

jasonwestmas
12-21-2015, 09:10 AM
Double post

jasonwestmas
12-21-2015, 09:14 AM
So do I, to a point, but that's a bit like saying a 3D graphics artist should be equally as skilled with a hammer and chisel and block of marble. It's all a bit too generalist. ;)

I am a generalist, like many here are, although I do like to specialize in creating expression using deformations. I'm quite fascinated with that aspect of animation. Motion graphics can be just as involved as any organic or folding, collapsing, sliding, bulging, pinching thing. ;)

pinkmouse
12-21-2015, 09:25 AM
I like to think of myself as a generalist too, just one who can't be bothered with all the fuss involved in CA. Now if I could do it with nodes, without all that tedious keyframing, curve tweaking, morphing, parenting, etc. etc. then I'd be all over it. :D

jasonwestmas
12-21-2015, 09:33 AM
I like to think of myself as a generalist too, just one who can't be bothered with all the fuss involved in CA. Now if I could do it with nodes, without all that tedious keyframing, curve tweaking, morphing, parenting, etc. etc. then I'd be all over it. :D
Well, I must argue that outstanding animation tools will benefit lower-budget performances as well as the higher-grade ones.

hdace
12-21-2015, 02:26 PM
Been very busy so quite late to the party. Just wanted to say that the new corrective morph system is the holy grail I've been waiting for. LW3DG is really doing the business! Thanks very much! Can't wait to get my hands on it.

wyattharris
12-21-2015, 03:56 PM
Okay, I have a better idea of what I'm looking at now. "Modifier" stack was throwing me off compared to what I've seen in other packages. Deformation stack makes more sense.
Bullet integration, now that's my sweet spot. Glad to hear all the pieces are working together.

Nicolas Jordan
12-21-2015, 09:46 PM
The new modifier stack is looking good! :thumbsup:

jwiede
12-22-2015, 01:28 AM
The deformation-based ones absolutely are (Deforming Bodies).

Lino, does that "Deforming Bodies" designation specifically represent Bullet SBDs?

Also, this may already have been answered, but are there entries for when Instances, Particles, Hair, and volumetrics are processed* relative to deformation stack entries? If so, do they all have their own, discrete entries, or if not, how are they combined into entries?

*: To clarify, in this case I mean "processing" as when those subsystems "see"/access geometry relative to deformation stack modifiers.

lino.grandi
12-22-2015, 01:37 AM
Lino, does that "Deforming Bodies" designation specifically represent Bullet SBDs?

Yes.


Also, this may already have been answered, but are there entries for when Instances, Particles, Hair, and volumetrics are processed* relative to deformation stack entries? If so, do they all have their own, discrete entries, or if not, how are they combined into entries?

*: To clarify, in this case I mean "processing" as when those subsystems "see"/access geometry relative to deformation stack modifiers.

Instances, Hair and volumetrics are always processed at the end of the relative object deformation (it would make no sense to have an object deformed by dynamics and have FiberFX or Instances applied before that deformation, so to mention one simple case).

pinkmouse
12-22-2015, 02:57 AM
Instances, Hair and volumetrics are always processed at the end of the relative object deformation (it would make no sense to have an object deformed by dynamics and have FiberFX or Instances applied before that deformation, so to mention one simple case).

So no use of Instances as collision objects to deform a surface, then?

mav3rick
12-22-2015, 03:54 AM
Yes.



Instances, Hair and volumetrics are always processed at the end of the relative object deformation (it would make no sense to have an object deformed by dynamics and have FiberFX or Instances applied before that deformation, so to mention one simple case).

What if I want a lattice on top of that?

Putting a furry character into a glass bowl? It's all I've wanted for a long time... on top of all deformations, instances, fur etc., wrap the whole thing into a lattice deformer... or deform an object that has a Bullet soft FX simulation applied, with bones or a lattice after the Bullet sim?

lino.grandi
12-22-2015, 04:14 AM
What if I want a lattice on top of that?

Putting a furry character into a glass bowl? It's all I've wanted for a long time... on top of all deformations, instances, fur etc., wrap the whole thing into a lattice deformer... or deform an object that has a Bullet soft FX simulation applied, with bones or a lattice after the Bullet sim?

Thank you for suggesting some great demos for the next version Berislav! You just gave me one more reason to love you!

- - - Updated - - -


So no use of Instances as collision objects to deform a surface, then?

No, at the moment that's not possible.

pinkmouse
12-22-2015, 04:18 AM
But it might be? :D

I now have visions of lots of cute furry animals being launched at glass objects...

lino.grandi
12-22-2015, 04:19 AM
But it might be? :D

I now have visions of lots of cute furry animals being launched at glass objects...

Very unlikely in the next version. But on our agenda for sure.

lightscape
12-22-2015, 04:39 AM
I like to think of myself as a generalist too, just one who can't be bothered with all the fuss involved in CA. Now if I could do it with nodes, without all that tedious keyframing, curve tweaking, morphing, parenting, etc. etc. then I'd be all over it. :D

A generalist excels at modelling, rigging, animation, effects, lighting, render management, storyboarding, camera direction and compositing.

Here's a real generalist :D
http://getwrightonit.com/smash-bros-trailer-making-of/


https://www.youtube.com/watch?v=2hiJVYAS_Ng

GraphXs
12-22-2015, 07:38 AM
Thanks Lino, that looks really sweet! One thing, when you say you can use one mesh (using softfx in the video) you can do it directly on the deforming mesh with the bones? (pose for pose??) Or do you need the cloned mesh to do it like your video? Does it then add a morph for every edit? What is the current workflow?

Really looking forward to it! Merry Christmas, Happy Holiday!!!!

wyattharris
12-22-2015, 08:37 AM
One thing, when you say you can use one mesh (using softfx in the video) you can do it directly on the deforming mesh with the bones? (pose for pose??) Or do you need the cloned mesh to do it like your video?
I think I misread that at first too. The bit in the video says "A second mesh CAN be used to edit the corrective shape using any available deformer." You could conclude that you CAN use a second mesh but it's not necessary.

Lino clarifies it here but his clarification threw me a bit.

We need to distinguish the corrective morph creation process from the actual morphs applied to the rigged mesh.
At the moment you need a second non-rigged mesh to create the corrective morphs. And, believe me, that makes applying them and getting the exact same shape a very simple process.
Of course the corrective shapes are saved on the main objects as well, and you don't need anything but that in the final scene.

lino.grandi
12-22-2015, 09:21 AM
You can create the corrective shape using only one mesh as well.

jeric_synergy
12-22-2015, 07:15 PM
No, at the moment that's not possible.
Lino, one thing I've learned from watching everybody else is that if there's the possibility of something that "doesn't make sense", such as, ohhhh, negative brightness in a light, LightWave users will come up with a use, no matter how little sense it makes.

With that in mind, I urge you to remind the developers NOT to exclude any possible functionality just because "it doesn't make sense". If it can be represented mathematically, they should err on the side of including it ANYWAY. Someone, somewhere will find some amazing use for it.

Ztreem
12-22-2015, 07:52 PM
Lino, one thing I've learned from watching everybody else is that if there's the possibility of something that "doesn't make sense", such as, ohhhh, negative brightness in a light, LightWave users will come up with a use, no matter how little sense it makes.

With that in mind, I urge you to remind the developers NOT to exclude any possible functionality just because "it doesn't make sense". If it can be represented mathematically, they should err on the side of including it ANYWAY. Someone, somewhere will find some amazing use for it.

I agree!

lino.grandi
12-23-2015, 12:50 AM
Lino, one thing I've learned from watching everybody else is that if there's the possibility of something that "doesn't make sense", such as, ohhhh, negative brightness in a light, LightWave users will come up with a use, no matter how little sense it makes.

With that in mind, I urge you to remind the developers NOT to exclude any possible functionality just because "it doesn't make sense". If it can be represented mathematically, they should err on the side of including it ANYWAY. Someone, somewhere will find some amazing use for it.


Yes...and no.

I agree it is nice to be open to any combinations and possibilities. But too much freedom sometimes means giving a user the possibility to do things that really don't make any sense and make an object or deformation look just wrong. Negative lights are something different and clearer to the user, so that's a good thing. But we're talking about something else here, and we can't generalize too much.

Just think about a subpatch object deformed by bones. Apply the subdivision after the bone deformation and it will look absolutely better than if you apply it before the bones. And there's no reason why you should have your model look worse, right?

I've seen a lot of users just setting the Subdivision Order to Last, always, then wondering why they can't see the detailed effect of a displacement... freedom is a good thing for sure, as long as some basic concepts are well understood.

Now....think about a deformed object with instances applied, and give me one possible situation where you would not want the instances to stay consistent with the deformed object. Just one single situation, then I'll shut up. ;)

pinkmouse
12-23-2015, 02:37 AM
Now....think about a deformed object with instances applied, and give me one possible situation where you would not want the instances to stay consistent with the deformed object. Just one single situation, then I'll shut up. ;)

Rice or confetti on a drumskin. Trees or buildings on a landscape affected by an earthquake/liquefaction.

Or how about a CA example, a blob monster. You might want to edit the movements as it advances relentlessly on our heroes as per normal CA, but then afterwards add deformers to show bullet hits and the resulting ripples spreading through the body.

By all means have a standard "preset" arrangement that makes things usable for newbs, but allow those who want to dive in and swap things around to do so. Please! :)

lino.grandi
12-23-2015, 02:54 AM
Rice or confetti on a drumskin. Trees or buildings on a landscape affected by an earthquake/liquefaction.

Or how about a CA example, a blob monster. You might want to edit the movements as it advances relentlessly on our heroes as per normal CA, but then afterwards add deformers to show bullet hits and the resulting ripples spreading through the body.

By all means have a standard "preset" arrangement that makes things usable for newbs, but allow those who want to dive in and swap things around to do so. Please! :)

No, none of these examples fits what I've been asking for. And of course everything you're mentioning is logical and perfectly doable already.

I'm asking for a real example where you're applying instances to a deformed object and still (for some unknown reason!) want the instances to be applied to the mesh in its undeformed state.

pinkmouse
12-23-2015, 04:13 AM
Okay, off the top of my head:

I'm commissioned to do a rush infographic for a news channel after a mudslide disaster. I fire up LW2016, load in a DEM of the area with the new LW file interchange tools, ( :) ), apply it to a mesh, and use weightmaps and instancing to add generic buildings to the terrain in the appropriate areas. Then I create an animated mudslide using a nodal network to raise the ground surface in affected areas and apply a new texture. I'm adding another displacement to the existing deformation, yet I want the instances to stay fixed to the original displaced mesh so they become buried.

Does that fit your criteria? :D

lino.grandi
12-23-2015, 04:45 AM
Okay, off the top of my head:

I'm commissioned to do a rush infographic for a news channel after a mudslide disaster. I fire up LW2016, load in a DEM of the area with the new LW file interchange tools, ( :) ), apply it to a mesh, and use weightmaps and instancing to add generic buildings to the terrain in the appropriate areas. Then I create an animated mudslide using a nodal network to raise the ground surface in affected areas and apply a new texture. I'm adding another displacement to the existing deformation, yet I want the instances to stay fixed to the original displaced mesh so they become buried.

Does that fit your criteria? :D

Sure. An undeformed copy of your mesh used to instance the buildings, or a simple "Use Undeformed Mesh State" option in the instances, would work perfectly in that case. :D

jeric_synergy
12-23-2015, 09:18 AM
Is this a technically difficult feature to include? The principle is valid: weird nonsensical procedures often lead to useful techniques. It might take a while and a LW genius to come up with the use, but it happens repeatedly.

pinkmouse
12-23-2015, 09:21 AM
... and a LW genius to come up with the use...

I already did! :D

Jarno
12-23-2015, 04:32 PM
Is this a technically difficult feature to include? The principle is valid: weird nonsensical procedures often lead to useful techniques.

The counterforce to implementing weird things is that it can make future improvements difficult or impossible. And then we get questions and complaints about why this thing can't be made faster, or that thing supported, or the other thing be more memory efficient. The answer, depressingly often, is because we have to support some weird esoteric thing which is fundamentally incompatible.

For example, one of the things I silently curse about when dealing with LW geometry are "holagons". Polygons which have a hole in them made by using the same vertex or edge multiple times in the same polygon. This falls outside the definition of a polygon in many textbooks, and numerous standard mesh algorithms will fail on them. Similarly with meshes where an edge is shared by more than two polygons.
So we have to spend processing time trying to detect such situations, and come up with ways of filtering them out or handling them in some graceful way. That makes the code slower, more complicated, more prone to bugs, and longer to develop.
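
For illustration only (this is not LightWave SDK code, and the function names and sample data are made up): a minimal sketch of the kind of checks being described, flagging polygons that reuse a vertex and edges shared by more than two polygons.

from collections import defaultdict

def find_degenerate_polys(polys):
    """Return indices of polygons that reuse a vertex (the 'holagon' case)."""
    bad = []
    for i, poly in enumerate(polys):            # poly is a tuple of vertex indices
        if len(set(poly)) != len(poly):         # a repeated vertex means a hole or degenerate loop
            bad.append(i)
    return bad

def find_nonmanifold_edges(polys):
    """Return edges (as sorted vertex-index pairs) used by more than two polygons."""
    edge_count = defaultdict(int)
    for poly in polys:
        for a, b in zip(poly, poly[1:] + poly[:1]):   # walk the polygon's edge loop
            edge_count[tuple(sorted((a, b)))] += 1
    return [edge for edge, count in edge_count.items() if count > 2]

polys = [(0, 1, 2), (2, 1, 3), (1, 2, 4), (0, 1, 3, 0)]
print(find_degenerate_polys(polys))     # -> [3]  (the last polygon reuses vertex 0)
print(find_nonmanifold_edges(polys))    # -> [(1, 2)]  (an edge shared by three polygons)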

Another example is with using the undeformed versus deformed meshes. Plugins can currently ask for mesh information in four different states:

Undeformed base mesh
Deformed base mesh
Undeformed subdivided mesh
Deformed subdivided mesh

The mesh evaluation system has to generate and keep in memory all four of them. And some of them are rather tricky. If subdivision occurs after some deformation has been done, how do we get an undeformed subdivided mesh? By having to do the subdivision twice: once with the deformed base mesh, and once with the undeformed base mesh. That negatively impacts speed and memory usage.
And that is now causing some headaches as we move towards arbitrary mesh editing in the modifier stack. What is the undeformed position of a vertex that is produced by a procedural mesh edit done after some deformations? (Subdivision only gets away with it by making some assumptions, such as that it is based on interpolation and that it produces the same number of vertices and polygons independent of the vertex positions).
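
Purely as a sketch of the bookkeeping described above (not the actual LightWave internals; the function and its arguments are hypothetical), this shows why supporting an undeformed subdivided state forces the subdivision pass to run twice:

def evaluate_mesh_states(base_points, deformers, subdivide):
    """Toy evaluation of the four mesh states a plugin can currently ask for.
    base_points: list of (x, y, z) tuples; deformers: list of point -> point functions;
    subdivide: function mapping a point list to a denser point list."""
    undeformed_base = list(base_points)

    deformed_base = list(base_points)
    for deform in deformers:
        deformed_base = [deform(p) for p in deformed_base]

    # The awkward case: an "undeformed subdivided" mesh needs its own
    # subdivision pass over the undeformed cage...
    undeformed_subdivided = subdivide(undeformed_base)
    # ...while the final mesh subdivides the deformed cage, so the
    # subdivision work (and memory) is paid twice.
    deformed_subdivided = subdivide(deformed_base)

    return undeformed_base, deformed_base, undeformed_subdivided, deformed_subdivided

# A stand-in "subdivision" that just duplicates points, only to show the double pass:
states = evaluate_mesh_states([(0, 0, 0), (1, 0, 0)],
                              [lambda p: (p[0], p[1] + 1.0, p[2])],
                              lambda pts: [q for p in pts for q in (p, p)])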

What makes it more frustrating is that of those four mesh states, only two are actually visible to the user: the undeformed base mesh (the initial state of the mesh), and the deformed subdivided mesh (the final evaluated state of the mesh). The other two states are never seen by the user, so it is actually quite confusing for the user to create dependencies on them (e.g. placing instances on the undeformed subdivided mesh).

The mesh system would be a lot better, both for developers and almost all users, if plugins only had access to the undeformed base mesh, and the deformed final mesh (or whatever the state of the mesh is when the plugin gets the information, if it is part of the mesh's evaluation).

So that is something that we are currently trying to get a grip on. What would we lose by removing some of the esoteric stuff? Can it be replaced by something more sane?

---JvdL---

jeric_synergy
12-23-2015, 05:37 PM
Jarno, thank you for the informative answer! "Holagons" was especially interesting. Happy Holidays!

jasonwestmas
12-23-2015, 06:51 PM
Yes, please remove and replace some of the esoteric systems that are clogging the arteries of LightWave development. Thanks!

Surrealist.
12-23-2015, 09:53 PM
Am I being too presumptuous to assume that by replacing the mesh system with the UGE that all previous methods of handling mesh and deformation are history? Aside from Modeler, I would assume that once anything passes into that system it is converted and thus playing by new rules.

Not saying that instantly all of the things we want to see happen will be there. But I think it would be safe to say - especially development wise - that all that was before is gone.

mattc
12-24-2015, 12:04 AM
But I think it would be safe to say - especially development wise - that all that was before is gone.

That would not be altogether unwelcome given the issues that have been all too apparent over the years. As long as users/developers are well informed (as per Jarno's excellent and well detailed post above) then it shouldn't be an issue. If anything, this sort of thing can be used to educate people clearly so that feature requests/enhancements can be well articulated to development staff at NT and then implemented in a sensible way. Systems Engineering 101 really.

jeric_synergy
12-24-2015, 12:06 AM
Am I being too presumptuous to assume that by replacing the mesh system with the UGE that all previous methods of handling mesh and deformation are history?
While that may be true for coders, that hardly stands for users, those working through the user interface. I'd expect some plugins to become non-functional, but the other native stuff could still apply.

Surrealist.
12-24-2015, 12:31 AM
I meant merely from an architectural standpoint. There would be, I would think, a new structure in place, making moot any discussion about previous methods of handling data. Just pure logic there.

As far as users go, I think we will benefit from speed and ability to handle mesh data right out of the gate along with a new rendering paradigm.

The user interface and any tools written from there on out I would say would be fairly malleable, within certain limits. But the rest of it, I'd say we are stuck with, for better or for worse, based on the work that they have done on the UGE up to now.

I'd say mostly for the better. :)

Jarno
12-24-2015, 12:38 AM
Am I being too presumptuous to assume that by replacing the mesh system with the UGE that all previous methods of handling mesh and deformation are history?

Existing methods are mostly still available. For example, existing deformer plugins are automatically wrapped to work in the new mesh modifier stack. Some may do weird things due to the greater flexibility, putting them in situations that weren't anticipated by the plugin writer.
For plugin writers the familiar mesh interfaces are still in place. The main problem that we have encountered with plugins that we ship is getting information about one mesh through an interface obtained for another mesh. That used to work because of how it was tied to the implementation details of the old mesh system.

A lot of time and effort has gone into ripping out the old mesh system, and replacing it with something that is not only better, but can also be easily replaced in the future without having to spend another 6 months making changes all over Layout. There were a lot of internal dependencies on the exact implementation details of the old mesh system.
(E.g. code which assumes that a vertex identifier is actually a memory address to a block of data about that vertex, which in turn includes the memory address of the mesh that vertex is in; which all had to be rewritten so that the vertex identifier could be anything, such as an index, and take a mesh identifier alongside it because requiring the mesh system to be able to look up a mesh based on only the vertex identifier is very limiting on the possible mesh implementations).
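
A rough sketch of the difference being described, with made-up names rather than the real SDK types: the old style baked a memory layout into the identifier, while the new style keeps the handle opaque and pairs it with a mesh identifier.

from dataclasses import dataclass

# Old style (simplified): the "identifier" is the data record itself, so every
# piece of code that touched it depended on this exact layout, including the
# back-reference to the owning mesh.
class OldVertexRecord:
    def __init__(self, mesh, position):
        self.mesh = mesh              # reference to the owning mesh
        self.position = position

# New style (simplified): an opaque handle. It could wrap an index, a hash key,
# or anything else, and it always travels together with a mesh identifier.
@dataclass(frozen=True)
class MeshID:
    value: int

@dataclass(frozen=True)
class VertexID:
    value: int                        # meaning is private to the mesh implementation

def vertex_position(mesh_system, mesh_id: MeshID, vertex_id: VertexID):
    # The lookup always receives both identifiers, so the mesh system never has
    # to reverse-engineer which mesh a vertex belongs to.
    return mesh_system.position(mesh_id, vertex_id)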

The new mesh system is now at a stage where the old mesh system isn't involved with Layout at all anymore. There is no conversion on loading. I would not be surprised if there are some particularly weirdly constructed meshes which it doesn't like, but so far they haven't come up in testing. And if there are, you can always fix them up in Modeler.

---JvdL---

CaptainMarlowe
12-24-2015, 01:44 AM
Thanks Jarno for your input. I'm not sure I understand everything, but it seems really promising.

pinkmouse
12-24-2015, 02:16 AM
Fascinating stuff Jarno. Personally, I'd much prefer it if the blog posts covered this sort of topic, rather than the "Ooh, that's pretty" feel they have at the moment. ;)

creacon
12-24-2015, 02:58 AM
Jarno,

What would make the most sense to me is to have a nodal system that has a mesh as input/output.
Mesh data comes in, the node does something with it (subdivide, displace, remove parts, add parts, ...), mesh data goes out. The final output is fed into the renderer.

If you want access to the mesh at another time, you can "fork" the output (which means that you will be making a copy of that data).

A "normal" use case would just be a list of sequential nodes, an advanced use case could be a complex nodal network, eg resize geometry, displace, subtract from base mesh, output something completely different at the end.

The "normal" use case can be represented as the stack you already have, that's all you need to see for day to day usage. If you need deep access press the magic button and get access to "the guts" ie the nodes.

creacon




The mesh system would be a lot better, both for developers and almost all users, if plugins only had access to the undeformed base mesh, and the deformed final mesh (or whatever the state of the mesh is when the plugin gets the information, if it is part of the mesh's evaluation).

So that is something that we are currently trying to get a grip on. What would we lose by removing some of the esoteric stuff? Can it be replaced by something more sane?

---JvdL---

tischbein3
12-24-2015, 03:25 AM
The mesh system would be a lot better, both for developers and almost all users, if plugins only had access to the undeformed base mesh, and the deformed final mesh (or whatever the state of the mesh is when the plugin gets the information, if it is part of the mesh's evaluation).
Wouldn't you need the deformed base mesh for weight painting?

Edit:
That's one usage scenario which comes to mind; of course, the solution for this would be if Layout allocated the needed data upon request.

lino.grandi
12-24-2015, 03:28 AM
Jarno,

What would make the most sense to me is to have a nodal system that has a mesh as input/output.
Mesh data comes in, the node does something with it (subdivide, displace, remove parts, add parts, ...), mesh data goes out. The final output is fed into the renderer.

If you want access to the mesh at another time, you can "fork" the output (which means that you will be making a copy of that data).

A "normal" use case would just be a list of sequential nodes, an advanced use case could be a complex nodal network, eg resize geometry, displace, subtract from base mesh, output something completely different at the end.

The "normal" use case can be represented as the stack you already have, that's all you need to see for day to day usage. If you need deep access press the magic button and get access to "the guts" ie the nodes.

creacon

I'm sure Jarno will have a lot to say about this.

My point of view is that you're describing modeling operations (with mesh topology changes!) more than just deformation changes. Something like Houdini does. And something that raises the complexity needed to perform even basic operations.

As always, I would like to read about some real-world examples describing situations that such a system would let you handle and that are impossible to achieve in LightWave for now.

creacon
12-24-2015, 03:44 AM
That was the whole point of one of my previous posts. You were the one who called this a modifier stack, I was the one who called it a deformation stack ;-)

You're talking about Houdini, but Maya has this ability too. Our TD even uses a node to cut away geometry that he doesn't want at a later stage. Giving you use case examples would give away too much of our tricks. But believe me, there are plenty.

And you're wrong in saying that this would add complexity to the simplest tasks; as I described in my previous post, the "normal" use wouldn't be any different to the user than the stack you have now.
What it would add is memory overhead, and it would have consequences for speed. But that's less of a problem on today's systems.


I'm sure Jarno will have a lot to say about this.

My point of view is that you're describing modeling operations (with mesh topology changes!) more than just deformation changes. Something like Houdini does. And something raising up the complexity needed to perform even some basic operations.

As always, I would like to read some real world examples describing situations that such a system would let you do and that are impossible to achieve in LightWave for now.

lino.grandi
12-24-2015, 04:03 AM
That was the whole point of one of my previous posts. You were the one who called this a modifier stack, I was the one who called it a deformation stack ;-)

I already stated it is a deformation stack multiple times. And I think what I wrote in the blog post clarifies that.



You're talking about Houdini, but Maya has this ability too.

Yes, but on a different level than Houdini.


Our TD even uses a node to cut away geometry that he doesn't want at a later stage. Giving you use case examples would give away too much of our tricks. But believe me, there are plenty.

I'm talking about describing practical 3D final animation production examples, not about the process involved to reach the final result. I've been working with Maya and often took advantage of its architecture. But I learned there's always a cost for that complexity.
There are different ways to reach the same target. And different kind of software to do that. Like Cinema4D for example. Now that's a good compromise between ease of use and the visual complexity of the final result.



And you're wrong in saying that this would add complexity to the simplest tasks, as I described in my previous post the "normal" use wouldn't be any different to the user than the stack you have now.

Mesh evaluation system would be totally different, and more complex for sure. Of course I'm talking about what needs to exist in the system once you've created just a simple box.
In Maya you can (and you have to) "purge unused nodes" often, so you can free the system from dependencies you don't need anymore.



What it would add is memory overhead and it will have consequences for speed. But that's less of a problem in today's systems.

Are we talking about consumer or professional level? Not every user can afford a super fast PC. LightWave works great on a Surface Pro. Maya and Houdini suffer (well, Houdini doesn't even start!).

RebelHill
12-24-2015, 04:07 AM
Fascinating stuff Jarno. Personally, I'd much prefer it if the blog posts covered this sort of topic, rather than the "Ooh, that's pretty" feel they have at the moment. ;)

Gah... on the one hand, totes, +1 jarno blog... on the other... no way, -1 jarno blog, he's got enough to be dealing with already. So conflicted.

pinkmouse
12-24-2015, 04:23 AM
...So conflicted.

:D

pinkmouse
12-24-2015, 04:37 AM
Just a thought...

If the way plugins work is being altered, is it worth trying to create a more "platform neutral" way of writing them? Is something like Python fit for purpose, so that we no longer have to have different versions and compiles for Windoze and OSX? Just thinking of the amount of extra work those that support both have to go through, let alone the number of plugins that just aren't available for Mac.

creacon
12-24-2015, 04:47 AM
I already stated it is a deformation stack multiple times. And I think what I wrote in the blog post clarifies that.

>> yes you did, after I pointed you to it ;-)

Yes, but on a different level than Houdini.

>> Indeed, and even LW has it to a certain degree, I did a test where I created a custom node input/output passing particle buffers from one node to another and it all works fine, so I don't see why the same couldn't be done for meshes (and btw I am not saying it's easy or trivial)

I'm talking about describing practical 3D final animation production examples, not about the process involved to reach the final result. I've been working with Maya and often took advantage of its architecture. But I learned there's always a cost for that complexity.

>> Yes, and in Maya's case that cost is almost always speed.

There are different ways to reach the same target. And different kind of software to do that. Like Cinema4D for example. Now that's a good compromise between ease of use and the visual complexity of the final result.

>> I agree

Mesh evaluation system would be totally different, and more complex for sure. Of course I'm talking about what needs to exist in the system once you've created just a simple box.

>> Sorry, but I can't follow, and I mean this language-wise: what are you saying?

In Maya you can (and you have to) "purge unused nodes" often, so you can free the system from dependencies you don't need anymore.

>> Yes, that's the memory and speed overhead I was talking about, in a program like
Fusion this is solved by implementing some kind of smart caching, but I guess it would be a nightmare to get that translated to a full 3D application.

Are we talking about consumer or professional level? Not every user can afford a super fast PC. LightWave works great on a Surface Pro. Maya and Houdini suffer (well, Houdini doesn't even start!).

>>Any of those, for the price of a surface Pro you can buy a pretty powerful PC.

creacon

lino.grandi
12-24-2015, 05:09 AM
I already stated it is a deformation stack multiple times. And I think what I wrote in the blog post clarifies that.

>> yes you did, after I pointed you to it ;-)

The blog post was out well before your comment. :)

https://blog.lightwave3d.com/2015/12/the-modifier-stack/



Yes, but on a different level than Houdini.

>> Indeed, and even LW has it to a certain degree, I did a test where I created a custom node input/output passing particle buffers from one node to another and it all works fine, so I don't see why the same couldn't be done for meshes (and btw I am not saying it's easy or trivial)

It is exactly that "certain degree" that can make a huge difference in performance.



I'm talking about describing practical 3D final animation production examples, not about the process involved to reach the final result. I've been working with Maya and often took advantage of its architecture. But I learned there's always a cost for that complexity.

>> Yes, and in Maya's case that cost is almost always speed.

Exactly.




There are different ways to reach the same target. And different kind of software to do that. Like Cinema4D for example. Now that's a good compromise between ease of use and the visual complexity of the final result.

>> I agree

Mesh evaluation system would be totally different, and more complex for sure. Of course I'm talking about what needs to exist in the system once you've created just a simple box.

>> Sorry, but I can't follow, and I mean this language-wise: what are you saying?

Open Houdini. Create a box. Take a look at what's there node-wise.



In Maya you can (and you have to) "purge unused nodes" often, so you can free the system from dependencies you don't need anymore.

>> Yes, that's the memory and speed overhead I was talking about, in a program like
Fusion this is solved by implementing some kind of smart caching, but I guess it would be a nightmare to get that translated to a full 3D application.

Are we talking about consumer or professional level? Not every user can afford a super fast PC. LightWave works great on a Surface Pro. Maya and Houdini suffer (well, Houdini doesn't even start!).

>>Any of those, for the price of a surface Pro you can buy a pretty powerful PC.

creacon

Probably. But I'm talking about the Surface's hardware here, not about its price (when we talk about the Surface we're talking about a tablet more than a notebook; it is way more portable, and that of course makes it not so cheap at all).

Not everyone can spend more than $1000 on a PC.

Every4thPixel
12-24-2015, 05:36 AM
If the poly count of a mesh can't change in a stack it's a deformer stack simply because the only thing it does is deform mesh.

lino.grandi
12-24-2015, 06:06 AM
If the poly count of a mesh can't change in a stack it's a deformer stack simply because the only thing it does is deform mesh.

Sure. Although in this case the poly count can change, since you can subdivide the mesh and then displace the subdivided vertices. But of course you can't do topology changes at the moment. The Subdivision modifier (I would not call it a deformer in this case) is the only one able to provide access to new geometry, based on the cage mesh, that can actually be manipulated.

Let's not focus on the Modifier/Deformation Stack term too much. I think we've clarified that enough (and, again, my initial blog post is pretty clear about that... I never mentioned the creation of new geometry). A deformer modifies the mesh shape (without creating new geometry)... so it is somehow a modifier, isn't it?

Renaming it Deformation Stack could be an option.

Marander
12-24-2015, 06:33 AM
LightWave works great on a Surface Pro. Maya and Houdini suffer (well, Houdini doesn't even start!).

Houdini works ok on the Surface Pro (for basic things at least).

But I agree LW works very well on almost every device due to its small footprint and architecture. For example houdini crashes a whole VMware session when trying to start while LW runs well within VMware.


lino.grandi
12-24-2015, 07:02 AM
Houdini works ok on the Surface Pro (for basic things at least).

Not on my Surface Pro 2. :(

jasonwestmas
12-24-2015, 08:39 AM
Renaming it Deformation Stack could be an option.

I assumed you guys were going to add more modifiers to this "Modify" tab/stack that weren't necessarily all about deformation, especially since modifying geometry is such a general term. Calling it a deformation stack would be dramatically more specific. I'm also assuming, however, that LightWave isn't going the 3ds Max route where every modifier is applied to a single modifier stack.

I also found that primitive tab to be an interesting name. Again, a very generic term.

jasonwestmas
12-24-2015, 08:49 AM
Houdini works ok on the Surface Pro (for basic things at least).

Not on my Surface Pro 2. :(

yeah those "other" programs out there tap into the GPU of your hardware much heavier than lightwave does. Integrated graphics just don't work well with CG software in general I find. Unless we are talking about something like zbrush which doesn't really need much GPU power at all.

jasonwestmas
12-24-2015, 09:05 AM
If the poly count of a mesh can't change in a stack it's a deformer stack simply because the only thing it does is deform mesh.

Not necessarily, why would you assume that this stack "that is in the modify tab" is only going to do deformations? Because you saw one video?

Surrealist.
12-24-2015, 09:53 AM
In Blender all changes to the mesh, even dynamics and particles, live in the modifier stack. So everything from Booleans and arrays to multires for sculpting lives in the stack. It is a handy way of doing things. I assumed that they have something more or less like this in mind. Add to that the ability to handle such large amounts of data, and it could open the door to interesting possibilities, should they go that route.

Kuzey
12-24-2015, 10:37 AM
Existing methods are mostly still available. For example, existing deformer plugins are automatically wrapped to work in the new mesh modifier stack. Some may do weird things due to the greater flexibility, putting them in situations that weren't anticipated by the plugin writer.
For plugin writers the familiar mesh interfaces are still in place. The main problem that we have encountered with plugins that we ship is getting information about one mesh through an interface obtained for another mesh. That used to work because of how it was tied to the implementation details of the old mesh system.

A lot of time and effort has gone into ripping out the old mesh system, and replacing it with something that is not only better, but can also be easily replaced in the future without having to spend another 6 months making changes all over Layout. There were a lot of internal dependencies on the exact implementation details of the old mesh system.
(E.g. code which assumes that a vertex identifier is actually a memory address to a block of data about that vertex, which in turn includes the memory address of the mesh that vertex is in; which all had to be rewritten so that the vertex identifier could be anything, such as an index, and take a mesh identifier alongside it because requiring the mesh system to be able to look up a mesh based on only the vertex identifier is very limiting on the possible mesh implementations).

The new mesh system is now at a stage where the old mesh system isn't involved with Layout at all anymore. There is no conversion on loading. I would not be surprised if there are some particularly weirdly constructed meshes which it doesn't like, but so far they haven't come up in testing. And if there are, you can always fix them up in Modeler.

---JvdL---

Does this mean we will finally get past LW's internal point order system that produces crazy results in Modeler with tools like bridge tool?


https://www.youtube.com/watch?v=HxnjiQjaZw4

That would be major awesome in my book :)

lino.grandi
12-24-2015, 11:05 AM
Does this mean we will finally get past LW's internal point order system that produces crazy results in Modeler with tools like bridge tool?

The new mesh system has nothing to do with Modeler.

Kuzey
12-24-2015, 11:08 AM
The new mesh system has nothing to do with Modeler.

Wasn't it going to be added to modeler as it went along until Layout/Modeler unification?

EDIT:

Oops.. I got excited when Jarno mentioned vertices.... Still, modelling tools will have to use the new mesh system sooner or later.. right?

So to recap:

No Modeler engine/core work (starting from scratch rebuilding etc.)

New Modeler tools will be added only to existing framework but will be easier to migrate to Unified app later on.

Is that right?

hrgiger
12-24-2015, 11:28 AM
New mesh engine is only for Layout and does not affect modeler. Layout now has the ability to recognize and create geometry. But that's just the system itself, modeling tools will likely come later. But there will never be a new geometry engine inside of Modeler.

Kuzey
12-24-2015, 11:35 AM
New mesh engine is only for Layout and does not affect modeler. Layout now has the ability to recognize and create geometry. But that's just the system itself, modeling tools will likely come later. But there will never be a new geometry engine inside of Modeler.

Yes..I understood that.

My concern is how long will it take to get a modern modelling system..no matter if LW is going to be a two app program or one.

Layout has been rewritten two or three times already.. Modeler is pretty much the same as it ever was.

hrgiger
12-24-2015, 11:41 AM
You'll likely not get any realistic answer for how long it will take. My thought is, LW2016 is about laying the foundation for modeling within Layout, future releases will be about creating tools to operate in the new environment.

If Layout has ever been rewritten before, that would be news to me. Never have both the geometry engine and the rendering engine been replaced before. Such deep changes take time, and it's a realistic approach to address the foundation before new tools and workflows are added.

Kuzey
12-24-2015, 11:47 AM
You'll likely not get any realistic answer for how long it will take. My thought is, LW2016 is about laying the foundation for modeling within Layout, future releases will be about creating tools to operate in the new environment.

If Layout has ever been rewritten before, that would be news to me. Never has both the geometry engine and rendering engine been replaced before. Such deep changes take time and its a realistic approach to address the foundation before new tools and workflows are added.

I thought the rendering engine got replaced..anyway, it doesn't matter.

Layout has had constant development like forever & Modeler is the same from say 7.5 but with a few new tools.

lino.grandi
12-24-2015, 12:26 PM
I thought the rendering engine got replaced..anyway, it doesn't matter.

Layout has had constant development like forever & Modeler is the same from say 7.5 but with a few new tools.

The render is totally new, yes.

jasonwestmas
12-24-2015, 12:28 PM
In Blender all changes to the mesh, even dynamics and particles, live in the modifier stack. So everything from Booleans and arrays to multires for sculpting lives in the stack. It is a handy way of doing things. I assumed that they have something more or less like this in mind. Add to that the ability to handle such large amounts of data, and it could open the door to interesting possibilities, should they go that route.

Right, I find the "one stack to control them all" approach to be a pretty good system ime. Just guessing of course but it looks like future lightwave may have multiple modifier stacks categorized via the old tabbing categorization system. That's just what it looks like atm.

khan973
12-24-2015, 12:29 PM
While you guys are dealing with deformations, is there a chance that we'll see, in the near future, nice bend, taper or even cage deformers?

Surrealist.
12-24-2015, 12:46 PM
Right, I find the "one stack to control them all" approach to be a pretty good system ime. Just guessing of course but it looks like future lightwave may have multiple modifier stacks categorized via the old tabbing categorization system. That's just what it looks like atm.

If that is truly the case then I am not really in favor of that decision. Bad idea in my opinion. I certainly hope that is not the case.

hrgiger
12-24-2015, 01:37 PM
The render is totally new, yes.

Lino, I think Kuzey meant before this new version, as in it was replaced in previous versions. It has changed quite a bit, but I don't know that it has ever been replaced completely like it has been for the next version.




Layout has had constant development like forever & Modeler is the same from say 7.5 but with a few new tools.

I doubt many people would argue with you on that point. A lot of us have been very unhappy with how little has been done to improve modeling over the years in LightWave. You have to consider, though, that LW has had some management shifts in the time since version 7, with differing strategies on how best to develop (and fix) LightWave. We know of course about the exodus of what would become Modo developers, their decision to rewrite LW as LW CORE, and now Rob's decision to slowly revamp the LW architecture. One never went anywhere, one got off the launch pad and exploded before orbit, and the third is in progress... But if you're thinking longer term, replacing the mesh engine (as well as making it modular and easy to replace, like Jarno mentioned) is a huge step towards making a more modern modeling environment for LightWave. But if you are looking for that modern modeling environment, it won't be in the next version.

Kuzey
12-24-2015, 03:18 PM
What I'm looking for is an indication that the LW3D Group can actually develop/update/innovate modelling tools, and it seems the big problem is this "internal point order" system that Modeler uses, that and the requirement that objects need to be at the world center for some/many tools to work. If they fix these two issues, then Modeler will already look and feel modern.. just saying.

We get a few new tools added once in a while and that's it for the next few years or so.

I'd also like to see LW2016 retina compatible..it's simple stuff..do it...DOOOOO IT :)

EDIT:

Yes I know that's a simplification and I'm sure some tools will need to be updated but some might not.

Jarno
12-24-2015, 03:48 PM
It is called a modifier stack because right now it does include the subdivision step (or more accurately, the mesh freezing step which converts the various polytypes into basic polygons), which creates geometry.
It can do so because of all sorts of special handling and restrictions that are in place. It doesn't yet allow us to extend it to arbitrary mesh editing operations, in a way that provides a good user experience.
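
As a loose illustration of the freezing idea (not NewTek's actual code), a sketch that reduces one n-gon to the basic triangles a renderer can consume; the real step also evaluates subpatches and other polytypes first.

def freeze_ngon(vertex_indices):
    """Fan-triangulate one convex n-gon into plain triangles.
    A real freeze step would also evaluate subpatch surfaces first; this only
    shows the 'reduce everything to basic polygons' end of the job."""
    v = list(vertex_indices)
    return [(v[0], v[i], v[i + 1]) for i in range(1, len(v) - 1)]

print(freeze_ngon([0, 1, 2, 3, 4]))   # -> [(0, 1, 2), (0, 2, 3), (0, 3, 4)]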

We are working towards implementing mesh edits on the modifier stack, but it has to be usable. It would be useless if it drops the OpenGL viewport drawing to 1 frame per second for example, or if there are no tools available to make use of it, or if it makes Layout an unstable mess at release because we didn't have the time to find all the unexpected consequences in testing.

---JvdL---

Chris S. (Fez)
12-24-2015, 04:45 PM
If LightWave just works far faster exactly as it works now, then that is certainly enough for 2016. Having said that, it seems to me it will be a much more daunting task to implement mesh editing later rather than to account and plan for it now.

Surrealist.
12-24-2015, 05:41 PM
It is called a modifier stack because right now it does include the subdivision step (or more accurately, the mesh freezing step which converts the various polytypes into basic polygons), which creates geometry.
It can do so because of all sorts of special handling and restrictions that are in place. It doesn't yet allow us to extend it to arbitrary mesh editing operations, in a way that provides a good user experience.

We are working towards implementing mesh edits on the modifier stack, but it has to be usable. It would be useless if it drops the OpenGL viewport drawing to 1 frame per second for example, or if there are no tools available to make use of it, or if it makes Layout an unstable mess at release because we didn't have the time to find all the unexpected consequences in testing.

---JvdL---

Did I hear a subliminal call for beta testing? lol :D

Seriously, if the development team is going in this direction, opening up that is, I wonder if you guys have considered this: at some appropriate time in the future, is it possible we might see an open LightWave beta for 2016 and/or beyond?

Jarno
12-24-2015, 09:21 PM
When we started all this work on the new mesh system, we did have to make a choice on whether to do it for Layout, Modeler, or both. Keep in mind that such a drastic large-scale change hadn't really been done to LW within the working memories of pretty much everyone on the development team.

When looking at what would be affected and what would have to be rewritten, it quickly became clear that Modeler would be by far the biggest challenge. It has many many tools which work with the mesh (obviously), and are written in such a way that they only work with the old mesh system.
Additionally, the old mesh system was made for Modeler, and then used in Layout by bolting on a lot of stuff to satisfy its quite different requirements.

So Layout had the greater need with the lesser amount of work needed, and we tackled it first. Then with the experience gained, lots of new code in place, and the release clock reset, we can look at the modelling tools including the possibility of making them work in Layout.

Unless you all promise to give us money now and wait until 2017, the work had to be split up with release milestones.
I would have been happy if all we managed this cycle was to replace the mesh system in Layout with equivalent functionality as the old system. That we also already got a modifier stack and some significant speed improvements is just Christmas pudding on top.

---JvdL---

Surrealist.
12-24-2015, 09:34 PM
Cheers to that. And well done. And Merry Xmas!

allabulle
12-24-2015, 11:57 PM
Thanks for all the explanations so far, Jarno. And as Chris S. (Fez) said above, if LightWave stayed the same but with much faster deformations, that alone would make me want to upgrade. And we're getting a lot more than that! Hopefully by the time it's published it will be stable enough; changes that deep can be tricky to polish, I guess. By the looks of it you've done a really good job indeed. I can't wait to try this next version of LightWave.

Kuzey
12-25-2015, 05:51 AM
All sounds good Jarno,

So how far is Layout, will it be complete by end of the 2016 cycle or will it take another year or two to get there? (I'm not asking for exact dates before someone else jumps in :)

Just wondering, have you guys played around with creating actual modeling tools in Layout yet? I wouldn't mind if you create a proper bridge or untangle(creating circles on existing mesh) tool in layout

lino.grandi
12-25-2015, 08:23 AM
All sounds good Jarno,

So how far is Layout, will it be complete by end of the 2016 cycle or will it take another year or two to get there? (I'm not asking for exact dates before someone else jumps in :)

Just wondering, have you guys played around with creating actual modeling tools in Layout yet? I wouldn't mind if you create a proper bridge or untangle(creating circles on existing mesh) tool in layout

Can you please clarify what you mean by "complete"?

If with that you mean having modeling tools in Layout, then no, that's not going to happen in the next release.

JohnMarchant
12-25-2015, 08:37 AM
When we started all this work on the new mesh system, we did have to make a choice on whether to do it for Layout, Modeler, or both. Keep in mind that such a drastic large-scale change hadn't really been done to LW within the working memories of pretty much everyone on the development team.

When looking at what would be affected and what would have to be rewritten, it quickly became clear that Modeler would be by far the biggest challenge. It has many many tools which work with the mesh (obviously), and are written in such a way that they only work with the old mesh system.
Additionally, the old mesh system was made for Modeler, and then used in Layout by bolting on a lot of stuff to satisfy its quite different requirements.

So Layout had the greater need with the lesser amount of work needed, and we tackled it first. Then with the experience gained, lots of new code in place, and the release clock reset, we can look at the modelling tools including the possibility of making them work in Layout.

Unless you all promise to give us money now and wait until 2017, the work had to be split up with release milestones.
I would have been happy if all we managed this cycle was to replace the mesh system in Layout with equivalent functionality as the old system. That we also already got a modifier stack and some significant speed improvements is just Christmas pudding on top.

---JvdL---

You can take my money already. From what we have seen so far and the roadmap, 2016 is well worth the upgrade.

Kryslin
12-25-2015, 09:15 AM
A question: does the new deformation stack deal with Bug LDB-13738 (interactive updates of nodally driven morphs not happening, requiring a nasty hack or two to get them working without enabling Studio Live)? I figured since everything has been smashed to bits and re-assembled, this might have been taken care of...

And yes, where do I send my money? :)

Kuzey
12-25-2015, 09:30 AM
Can you please clarify what you mean by "complete"?

If with that you mean having modeling tools in Layout, then no, that's not going to happen in the next release.

No, I meant complete as in how Layout 2015 is complete today.

What I'm asking is when the modelling side of things will be tackled... either in Modeler or Layout or both.

Is it going to take 4 years before modelling issues start to get addressed?

jasonwestmas
12-25-2015, 11:10 AM
From an animation perspective, the modifier stack makes Layout look faaar more interesting to me. I'll pay just to playtest. Thanks Jarno!

Surrealist.
12-25-2015, 04:35 PM
I'll chime in on the rendering part. +1 that what is revealed so far is worth the upgrade. That along with the ability to handle large mesh data well. I think the rigging animation side of LightWave is still not where I'd like it to be, but it is good to see improvement in that area. But as a final destination for rendering, it is looking even better. :)

lightscape
12-26-2015, 01:08 AM
When we started all this work on the new mesh system, we did have to make a choice on whether to do it for Layout, Modeler, or both. Keep in mind that such a drastic large-scale change hadn't really been done to LW within the working memories of pretty much everyone on the development team.

When looking at what would be affected and what would have to be rewritten, it quickly became clear that Modeler would be by far the biggest challenge. It has many many tools which work with the mesh (obviously), and are written in such a way that they only work with the old mesh system.
Additionally, the old mesh system was made for Modeler, and then used in Layout by bolting on a lot of stuff to satisfy its quite different requirements.

So Layout had the greater need with the lesser amount of work needed, and we tackled it first. Then with the experience gained, lots of new code in place, and the release clock reset, we can look at the modelling tools including the possibility of making them work in Layout.

Unless you all promise to give us money now and wait until 2017, the work had to be split up with release milestones.
I would have been happy if all we managed this cycle was to replace the mesh system in Layout with equivalent functionality as the old system. That we also already got a modifier stack and some significant speed improvements is just Christmas pudding on top.

---JvdL---

Is it already possible? Or is it still just being looked at?
If it's proven possible then I'm pretty sure people, including me, will give NT money now and for the next releases, because being split in two is the biggest flaw in LightWave. I'm locked in with the new LW 2015 deal anyway, so that's at least a couple more releases I'm willing to put faith in NT.

lino.grandi
12-26-2015, 02:09 AM
Is it already possible? Or is it still just being looked at?
If it's proven possible then I'm pretty sure people, including me, will give NT money now and for the next releases, because being split in two is the biggest flaw in LightWave. I'm locked in with the new LW 2015 deal anyway, so that's at least a couple more releases I'm willing to put faith in NT.

While that's not going to happen in the next release, for sure that's one of our targets. Having amazing people such as Jarno in our team gives us and users the guarantee that this process is developing in the best possible way, system/features/time wise.

hrgiger
12-26-2015, 03:50 AM
Based on Jarno's comments, it's hard to assess where things are headed. From the initial blog post and subsequent discussions, it seemed clear that modeling moving to Layout was an eventuality, but going from Jarno's comments here, it seems like they're still trying to determine if that is even possible. Everything I've seen from 2016 looks good; it would just be nice to get some clarification on what (and where) the future of modeling is for LightWave.

VermilionCat
12-26-2015, 03:56 AM
Ditto. A bit confusing.

nez
12-26-2015, 04:49 AM
I think that the point made is quite clear: Layout is the part being completely overhauled, i.e. with the UGE, which means that it can recognize vertices, points and whatnot, and THAT is the BASE for the modelling tools YET to come, which means that Modeler will most likely be slowly ported into Layout (or completely new tools written). So Modeler as a standalone will, I think, have some love, but not as much as Layout.

hrgiger
12-26-2015, 05:00 AM
I think that the point made is quite clear, being Layout the part being completely overhauled i.e with the UGE wich means that it can recognize vertex, points and whatnot, and THAT is the BASE for the modelling tools YET to come, wich means that modeler will be slowly ported into layout most likely ( or completely new tools ), so modeler as a standalone I think that will have some love, but not as much as Layout.

That's the impression I got initially, but now Jarno is merely talking about the 'possibility' of modeling tools working in Layout, which is why I am unsure.

12-26-2015, 05:10 AM
Seems they have these tools but the tools are horrendous in their operation/how the user experiences it.
Sounds like they are having mesh explosions and the like.

Ya know, I miss the beta programs, being able to play with the new stuff and watch it mature, and then having the next incarnation come with the many point updates afterward.
Hope the LW3D group has studios and other bold users stressing the system.

It all looks interesting.

nez
12-26-2015, 05:17 AM
When looking at what would be affected and what would have to be rewritten, it quickly became clear that Modeler would be by far the biggest challenge. It has many many tools which work with the mesh (obviously), and are written in such a way that they only work with the old mesh system.
Additionally, the old mesh system was made for Modeler, and then used in Layout by bolting on a lot of stuff to satisfy its quite different requirements.

So Layout had the greater need with the lesser amount of work needed, and we tackled it first. Then with the experience gained, lots of new code in place, and the release clock reset, we can look at the modelling tools including the possibility of making them work in Layout.


---JvdL---
The "existing" tools being ported into layout is a possibility, but I assume it just depends on specific tools, because some of them if are written for layout are probably going to be a LOT different than the ones written for modeler with legacy code and the likes. Just imagine a tool for modelling IDK, just say a box tool, but has complete nodal connection, with the deformation stack or whatever, or just the node editor, and you can go all the way procedurally, being the box still a primitive so it doesn't limit to the cuts in X,Y,Z that has now in modeler but a wide range of options that you define. This a pretty sweet case scenario IMO but for the matter of explaining what it means having the actual tools compared with what can become in layout, is maybe overexaggerated but gets to the point.

lino.grandi
12-26-2015, 05:24 AM
Nothing confusing. Jarno said "modeling" tools, not "Modeler" tools.

lightscape
12-26-2015, 05:29 AM
While that's not going to happen in the next release, for sure that's one of our targets. Having amazing people such as Jarno in our team gives us and users the guarantee that this process is developing in the best possible way, system/features/time wise.

Thanks Lino. Hopefully Jarno can at least clarify what the current status is. It really doesn't have to be a full-blown modeller. But we need to hear what is already possible.



Based on Jarnos comments, it's hard to assess where things are headed. From the initial blog post and subsequent discussions, it seemed clear that modeling moving to layout seemed like an eventuality, but going from Jarnos comments here, it seems like they're still trying to determine if that is even possible. Everything I've seen from 2016 looks good, it would just be nice to get some clarification on what (and where) the future of modeling is for LightWave.

Exactly. It's a bit confusing what the current status is.
People who are going to invest time and money (talking about non-LWers) are more willing to do so if they know that Layout is now aware of vertex, edge and poly at this point. That's where it starts. Tools and optimizations come later.



Nothing confusing. Jarno said "modeling" tools, not "Modeler" tools.
Waahaatt? :D
There's a difference between modelling tools and modeller tools? :D

Kuzey
12-26-2015, 05:33 AM
I think that the point made is quite clear, being Layout the part being completely overhauled i.e with the UGE wich means that it can recognize vertex, points and whatnot, and THAT is the BASE for the modelling tools YET to come, wich means that modeler will be slowly ported into layout most likely ( or completely new tools ), so modeler as a standalone I think that will have some love, but not as much as Layout.

It can recognize vertices etc. but that does not mean full modelling tools will come into Layout...they are hoping it will but they aren't sure YET.

We have two mesh systems, one in Modeler & one in Layout, they tackled Layout first, they learned a lot in the process (which is wonderful & exciting) and are hoping to either tackle the Modeler mesh system (still two apps) or expand the new mesh system(one unified app).

When that process begins is a different story. Will it start in 2017, once Layout 2016 functions like Layout 2015 (but with the stack & is faster)... or way down the road?

robertoortiz
12-26-2015, 06:48 AM
This is a great development. Would it be possible in future developments to see on the modifier stack a mini representation of the project timeline? That way you could KEY certain parameters of the items of the stack over time, similar to the timeline in After Effects.

Surrealist.
12-26-2015, 08:08 AM
Just don't confuse what Jarno said about the Modifier Stack in post 181...

http://forums.newtek.com/showthread.php?149194-New-Blog-Post-The-Modifier-Stack&p=1460349&viewfull=1#post1460349

with talk about modeling in Layout. And in this post he was merely making it clear that they need more time to solve issues before rushing into making mesh edits in the stack. He was not referring to specific modeling tools at all. Even though, yes, we could use a modifier stack to model with, Booleans and so forth. It is clear that things are not ready to go to that level just yet.

I am good with that. Makes sense. When they can implement things, they will.

And I'd say there is a vast difference between modeling tools (by today's standards) and Modeler tools. Not to knock the greatness that is in Modeler. But if those ever got ported over, I hope they'll be more consolidated and brought up to date, with proper snapping, a real manipulator and the rest of it. :)

jasonwestmas
12-26-2015, 08:11 AM
Thanks Lino. Hopefully Jarno can atleast clarify what's the current status. It really doesn't have to be a full blown modeller. But we need to hear what is already possible.




Exactly. Its a bit confusing what is the current status.
People who are going to invest time and money(talking about non-lwvers) are more willing to do so if they know that layout is now aware of vertex, edge, poly at this point. That's where it starts. Tools and optimizations come later.



Waahaatt? :D
There's a difference between modelling tools and modeller tools? :D

Perhaps this is obvious but:

Modeling tools won't necessarily be brought into Layout directly from Modeler, but they might be. My interpretation is that they are still determining the possibility of converting the old system of developing modeling tools into the new system that they have recently built.

A subsequent reason could be that LW3DG haven't attempted yet (or haven't shown us) how they will bring LightWave Modeler's tools over into Layout, because they still need to design and build the systematic workflow for how the modeling tools will be used inside the new Layout environment from a user perspective. This would be a framework or system of modeling laid on top of the new geometry engine; the engine lays the foundation that allows Layout to simply, but more effectively, identify and receive commands to manipulate component parts of geometry.

Snosrap
12-26-2015, 08:25 AM
it seemed clear that modeling moving to layout seemed like an eventuality, but going from Jarnos comments here, it seems like they're still trying to determine if that is even possible.

I would think they've done preliminary mockups of a few tools to prove out the possibility of modeling functions inside of Layout and I wouldn't be surprised if we had some basic interactive primitive construction in the next release.

lightscape
12-26-2015, 08:42 AM
how they will bring Lightwave modeler's tools over into layout because they still need to design and build the systematic workflow for how the modeling tools will be used inside of the new layout environment from a user perspective.

They have the other major packages to derive ideas from. And they basically work the same way:
Object/Item mode and Subobject/Component mode, with (mesh) local space and (scene) world space and a switcher between the modes.
Everyone will have to get used to going down a subgroup or clicking a switcher to do mesh editing. It won't be like Modeler. Pretty sure it will be like modo.

The basic editing tools are generic across apps with minor differences. I hope they are not overthinking any of this.

jasonwestmas
12-26-2015, 08:47 AM
They have the other major packages to derive ideas. And they basically work the same way.
Object/Item mode, Subobject/component mode with (mesh)local space and (scene)world space with a switcher from/to the modes
Everyone will have to get used to going down a subgroup or clicking a switcher to do mesh editing. It won't be like modeller. Pretty sure it will be like modo.

The basic editing tools are generic across appz with minor differences. I hope they are not over thinking any of this.

Well, I'm sure that LightWave will develop its own set of unique yet strong qualities, like I see in every single modeling software I have used. But yeah, at the base of things it's best to keep it simple yet flexible and not overthink it.

nez
12-26-2015, 11:32 AM
It can recognize vertices etc. but that does not mean full modelling tools will come into Layout...they are hoping it will but they aren't sure YET.

We have two mesh systems, one in Modeler & one in Layout, they tackled Layout first, they learned a lot in the process (which is wonderful & exciting) and are hoping to either tackle the Modeler mesh system (still two apps) or expand the new mesh system(one unified app).

When that process begins is a different story. Will it start in 2017, once Layout 2016 functions like Layout 2015 (but with the stack & is faster)... or way down the road?

Well, it is not shown yet, but this is what I understand from the posts for, let's say, a 4-year period:

- " The true limitations of the historical system basically broke the data for scenes and meshes into two distinct pieces: Modeler able to work on the details of the mesh like polygons, vertices and nodes, while Layout was able to animate and deform those meshes. Starting in the next version of Lightwave we are breaking down those barriers to allow Layout to have full access to the entire set of data that forms a scene – including the elements that previously only Modeler understood. "

- " This is a nice step forward architecturally for LightWave Layout because it gives Layout the ability to create geometry and it provides full awareness of vertices, polygons, and edges. Layout now having the ability to actually create geometry is a very significant change because of the major limitations that it removes for LightWave Layout development. "

- " The other mesh system was created primarily to render geometry and allowed interactive deformation and animation of the mesh. However, it lacked the ability to create geometry and it was also “blind” to the basic elements of polygon types, vertex maps and edges in many ways. "

These are all words of Rob Powers in his posts on the blog, so all that I've said here is what I understand from them (marking this as a personal opinion, of course). I also think what I think because of what I see: Houdini, Fabric Engine, Sylyn (not yet, but soon will be): everything is going nodal and procedural, and in the end I see it as something good for the user/artist, because it ditches "this button does this" for "create your button", which is huge.

I don't really know where all of this will end, to be honest. But observing what's going on and reading those posts leads me to think that way. I may also be wrong of course.

lightscape
12-27-2015, 07:49 PM
Well, it is not shown yet, but from the posts is what I do understand in a, let's say, 4 years period:

- " The true limitations of the historical system basically broke the data for scenes and meshes into two distinct pieces: Modeler able to work on the details of the mesh like polygons, vertices and nodes, while Layout was able to animate and deform those meshes. Starting in the next version of Lightwave we are breaking down those barriers to allow Layout to have full access to the entire set of data that forms a scene – including the elements that previously only Modeler understood. "

- " This is a nice step forward architecturally for LightWave Layout because it gives Layout the ability to create geometry and it provides full awareness of vertices, polygons, and edges. Layout now having the ability to actually create geometry is a very significant change because of the major limitations that it removes for LightWave Layout development. "

- " The other mesh system was created primarily to render geometry and allowed interactive deformation and animation of the mesh. However, it lacked the ability to create geometry and it was also “blind” to the basic elements of polygon types, vertex maps and edges in many ways. "

These are all words of Rob Powers in his posts on the blog, so all that I've said before here is what I understand from it ( marking this as a personal opinion, of course ). I also think what I think because of what I see: Houdini, Fabric engine, Sylyn (not yet, but soon will be) everything goes nodal and procedural, and at the end I see it as something good for the user/artist, because it ditches the "this button does this" to the "create your button" wich is huge.

I don't really know where all of this will end, to be honest. But observing what's going on and reading those posts leads me to think that way. I may also be wrong of course.


Yeah, that's why I posted that they're not really showing anything substantial with UGE in the blog. Is it all just concepts and ideas?
Does it actually exist right now in layout?

ernpchan
12-27-2015, 11:45 PM
Does it actually exist right now in layout?

Yes.

sukardi
12-28-2015, 12:04 AM
... Does it actually exist right now in layout?

I saw vertex manipulation in the last video (towards the end) by Lino. So, I am hopeful...

hrgiger
12-28-2015, 03:00 AM
If you're talking about the very end, those are bones being moved, not the vertices directly. And right before that part, he's using EditFX to move vertices, which is in LW already. So even though Layout may now recognize points, polys or edges, it doesn't appear that we can edit those yet.

sukardi
12-28-2015, 03:16 AM
Actually, I was referring to the second-to-last part, where it looks like some kind of proportional editing tool; the vertex number (I assume) comes up when you mouse over it. Maybe I was mistaken. Got to admit that I have not read everything in this thread ...

hrgiger
12-28-2015, 03:49 AM
That's editfx.

sukardi
12-28-2015, 05:07 AM
That's editfx.

Wow. Thanks hrgiger. Time for me to open the manual again ...

bobakabob
12-28-2015, 05:18 AM
If corrective morphs are going to be this simple and efficient to apply, as illustrated in Lino's video, this is a big step forward for CA in LightWave. It will be a significant workflow enhancement to, say, apply facial expressions directly in Layout or adjust existing morphs rather than going back to Modeler. Interestingly, adjusting blend shapes in the latest release of Maya with the new sculpt tools looks similar, so this can't be bad. In the meantime, 3rd Powers' Cage Deformer works well as long as the mesh is economically constructed.

robertoortiz
12-28-2015, 09:29 AM
OK, I have made a mock-up of what I stated before...
I posted:
"This is a great development. Would it be possible in future developments to see on the modifier stack a mini representation of the project timeline? That way you could KEY certain parameters of the items of the stack over time, similar to the timeline in After Effects."
I would add to that the ability to expand individual parameters of the deformation controls and the ability to add keyframes to their parameters, like in After Effects.
If legacy controls can't do this, fine, just don't allow that functionality on those old controls.

If feasible, this kind of control would add a lot of power for doing transformations OVER TIME.

ernpchan
12-28-2015, 09:39 AM
If they can envelope those items it's probably more realistic to make it an envelope in the actual item.

GraphXs
12-28-2015, 10:01 AM
Hmmm, hey Lino, is it possible to add an "EditFX modifier" to the stack? Or a simple way to flip that switch on? I know it exists in the Dynamics tab, but it would be sweet to add it there. Unless it is under the corrective morph modifier?

lino.grandi
12-28-2015, 10:20 AM
Hmmm, hey Lino, is it possible to add an "EditFX modifier" to the stack? Or a simple way to flip that switch on? I know it exists in the Dynamics tab, but it would be sweet to add it there. Unless it is under the corrective morph modifier?

About the "Corrective Morph" modifier you see in the video, that's just a Nodal Displacement modifier properly renamed. Yes, because of course now you can rename any modifier you add in the stack, so its scope it's clear. That's very important, because you can have several modifiers of the same type doing different things.
In the video, as some users have guessed, I'm using ClothFX's EditFX tool. The big difference is that now you can decide when in the stack the EditFX should operate (and of course that's not possible in LW2015). And believe me, that makes a huge difference. We're going to consider any "old" LightWave deformer under a brand new Light! ;)
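To make the ordering point concrete, here is a minimal sketch in plain Python (purely illustrative; this is not LightWave's SDK or its actual evaluation code): a stack is just an ordered list of operations, and swapping two entries changes the result.

    # Illustration only -- not LightWave code. A "stack" is an ordered list
    # of functions applied to the mesh points, one after another.

    def displace_up(points, amount=1.0):
        # push every point up along Y
        return [(x, y + amount, z) for (x, y, z) in points]

    def scale(points, factor=2.0):
        # uniform scale about the origin
        return [(x * factor, y * factor, z * factor) for (x, y, z) in points]

    def evaluate(stack, points):
        for modifier in stack:
            points = modifier(points)
        return points

    base = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
    print(evaluate([displace_up, scale], base))  # the displacement gets scaled too
    print(evaluate([scale, displace_up], base))  # the displacement is applied after scaling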

GraphXs
12-28-2015, 10:47 AM
Sweet! So you can rename all the modifiers? So you're saying we could have multiple Nodal Displacements in the stack? Two separate corrective morphs in the stack? Is it possible to export these modifiers or load them as presets for various easy setups? Like the corrective morph, or easy ocean waves, or whatever a user can come up with?

It would be sweet to be able to export the stack setup as well, to share between scenes. Does this stack also work with point caches? Can those be mixed with corrective morphs or additional displacements?

lino.grandi
12-28-2015, 11:04 AM
It looks like we have some questions here! :D Let's try to satisfy your curiosity!



Sweet! So you can rename all the modifiers?

Yes you can.


So you're saying we could have multiple Nodal Displacements in the stack?

Absolutely yes.


Two separate corrective morphs in the stack?

Sure!


Is it possible to export these modifiers or load them as presets for various easy setups? Like the corrective morph, or easy ocean waves, or whatever a user can come up with?

No, at the moment you can't save out a preset from the modifier stack of an item.


It would be sweet to be able to export the stack setup as well, to share between scenes.

You can do that by using Load Items From Scene to load an object whose stack you want to use, and then doing some copy/paste (or, why not, directly replacing the object that has the stack with the new mesh). Being able to save/load the whole stack would be a nice feature for sure.


Does this stack also work with point caches?

Yes, it does. Both the MD Reader and the MDD Displacement node work perfectly fine for this purpose.


Can those be mixed with corrective morphs or additional displacements?

You can mix and order any modifier with any other.

nez
12-28-2015, 11:53 AM
It looks like we have some questions here! :D Let's try to satisfy your curiosity!

That looks nice! In this line of questions ... would the dynamics act as a modifier? If so, will they be tweakable nodally? Thanks

jwiede
12-28-2015, 12:08 PM
Yes.

Thanks for the clarification!


Instances, Hair and volumetrics are always processed at the end of the relative object deformation (it would make no sense to have an object deformed by dynamics and have FiberFX or Instances applied before that deformation, so to mention one simple case).

Well, yes and no. As Pinkmouse points out, processing them at the end effectively denies them the ability to be part of the input drivers for deformation, and there are definitely times when a user might want or need that. Further, once the possibility of multiple subdivision entries exists, a user may want to apply them "between" subdivision entries, as opposed to always at max subdivision (this is particularly relevant with instances, where poly shape and count affect distribution).

I think having them "always at the end" would be fine for the first implementation, but once the implementation supports cases like multiple subdivision entries, I also believe the ability to control when subsystems like instances, hair, etc. take their "snapshot" of the geometry relative to deformers, subdivision entries, etc. will become a necessary requirement. Beyond the subdivision/distribution-vs-instances/hair issue, such controls also allow fine-tuning of resource consumption, compute time, etc., which is essential in production environments.
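A hedged sketch of this "snapshot" idea in plain Python (hypothetical; LightWave has no such marker today): a sentinel in the stack records the geometry state that instancing/hair should use, while rendering still gets the fully evaluated mesh.

    # Hypothetical illustration of a "snapshot" entry in a modifier stack.
    SNAPSHOT = "snapshot"

    def evaluate_with_snapshot(stack, points):
        frozen = None
        for entry in stack:
            if entry is SNAPSHOT:
                frozen = list(points)          # state handed to instances/hair
            else:
                points = entry(points)
        return points, (frozen if frozen is not None else points)

    def jitter(points):
        # stand-in for any deformer
        return [(x, y + 0.1, z) for (x, y, z) in points]

    def subdivide(points):
        # crude stand-in for subdivision: insert midpoints between neighbours
        out = []
        for a, b in zip(points, points[1:]):
            out.append(a)
            out.append(tuple((ca + cb) / 2.0 for ca, cb in zip(a, b)))
        out.append(points[-1])
        return out

    base = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
    render_mesh, instancing_mesh = evaluate_with_snapshot(
        [jitter, SNAPSHOT, subdivide], base)
    # Instances distribute over the deformed but unsubdivided mesh,
    # while the renderer still receives the subdivided result.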

GraphXs
12-28-2015, 12:34 PM
Sounds AWESOME! Thanks Lino!

jwiede
12-28-2015, 12:38 PM
The new mesh system is now at a stage where the old mesh system isn't involved with Layout at all anymore. There is no conversion on loading. I would not be surprised if there are some particularly weirdly constructed meshes which it doesn't like, but so far they haven't come up in testing. And if there are, you can always fix them up in Modeler.

This brings up a recurrent UX issue in LW: the lack of detail/specificity in errors, and the failure to provide users with the info needed to address them. As you mention, the user can always fix them up in Modeler, but doing so in a remotely efficient manner requires Layout to provide the user with specific info about what caused the problems. In the past, loaders/importers were notoriously opaque in their failures, leaving users with little idea why loading failed. It's a general issue within LW as well, though in other situations there are often (not always; more info on errors is generally needed) other diagnostic approaches available.

For cases such as what you've described, has the loader/importer system been modified to provide clearer diagnostic information about which specific element in the mesh caused problems, in order to facilitate fixing? If not, statements like "you can always fix them up in Modeler" might be theoretically true given infinite time and effort, but in practice need to be recognized as improbable in most cases without additional information being provided.

Just as a suggestion, error-cause info doesn't even necessarily need to be human-parsable (though it should be), so long as it is generated in a way that can subsequently be read into Modeler and _presented_ in a human-comprehensible manner to guide fixing the mesh issues. For example, loader failures could generate a kind of "error sidecar file" which could be opened by Modeler and which would tell Modeler which mesh file was the problem and which part of the mesh was at issue (both for highlighting the issue for the user, and perhaps as a hint to Modeler's importer to "relax" parsing for fixing).

My main point is that error presentation and handling in LW frequently provide inadequate info to the user about what went wrong (or worse, operations too often error out without providing any info at all), making resolution of errors nigh impossible for users, and this needs priority consideration in any code undergoing modification -- the frequent recommendations of "nuke from orbit and start over"-type solutions to errors/failures (amply visible here in the forums) are a classic telltale of inadequate error-resolution info.
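As a purely hypothetical example of that sidecar idea (nothing like this exists in LW today; the file name, keys and error codes below are invented), the loader could drop a small JSON file next to the failing mesh for Modeler to pick up:

    # Hypothetical "error sidecar" a loader could write; everything here is invented.
    import json

    sidecar = {
        "source_file": "Objects/character_body.lwo",
        "loader": "Layout mesh importer",
        "errors": [
            {"code": "DEGENERATE_POLYGON", "polygon_index": 10482,
             "detail": "polygon has two coincident vertices"},
            {"code": "ORPHANED_VMAP_ENTRY", "vmap": "Morph.Smile",
             "vertex_index": 20311},
        ],
    }

    with open("character_body.lwo.errors.json", "w") as f:
        json.dump(sidecar, f, indent=2)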

jeric_synergy
12-28-2015, 12:52 PM
This brings up a recurrent UX issue in LW: The lack of detail/specificity on errors, and failure to provide users with info needed to address them.
::cough:: Rounder. ::cough::

Jarno
12-28-2015, 06:20 PM
would be the dynamics act as a modifier?

The application of the deformation calculated by the dynamics system is a modifier. The modifiers do not feed back into the dynamics calculations, as dynamics is a separate system using Bullet.
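A rough conceptual sketch of that one-way flow (plain Python, not the actual LightWave/Bullet implementation): the simulation runs as its own system, its cached result is applied by one stack entry, and later modifiers never feed back into it.

    # Conceptual only. Dynamics is simulated separately; the stack merely
    # applies the cached result, and downstream modifiers cannot alter the sim.

    def run_dynamics(points):
        # stand-in for the Bullet pass: pretend everything dropped 0.5 units
        return [(x, y - 0.5, z) for (x, y, z) in points]

    def make_apply_dynamics(cache):
        return lambda _points: cache            # just applies the cached shape

    def displace(points):
        # a later modifier: sees the dynamics result, never feeds back into it
        return [(x, y + 0.1 * x, z) for (x, y, z) in points]

    base = [(0.0, 1.0, 0.0), (1.0, 1.0, 0.0)]
    cache = run_dynamics(base)                  # separate system, runs first
    stack = [make_apply_dynamics(cache), displace]

    points = base
    for modifier in stack:
        points = modifier(points)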

---JvdL---

Surrealist.
12-28-2015, 07:28 PM
I think a 2016 open Beta at some point would be a great idea. It would be the next step in this more open line of communication we are getting with LW 3D Group.

Thoughts on this LW 3D Group?

lightscape
12-28-2015, 07:34 PM
Pretty sure they don't want a CORE open beta repeat. They're afraid.

jeric_synergy
12-28-2015, 09:01 PM
"Afraid" is too strong a term: "lacking the patience for this crap" is probably more like it.

lightscape
12-28-2015, 10:04 PM
"Afraid" is too strong a term: "lacking the patience for this crap" is probably more like it.

The creative industry is where you need a thicker skin and should be open to criticism.

Surrealist.
12-28-2015, 11:13 PM
I would not jump to any conclusions and I don't wish to spark a debate. Keep it on topic, it is just a question to the team. Curious if they have thought about this and have any plans.

lightscape
12-28-2015, 11:49 PM
I would not jump to any conclusions and I don't wish to spark a debate. Keep it on topic, it is just a question to the team. Curious if they have thought about this and have any plans.

They already have a few studios that do beta testing. And they have Lino for internal testing.
Would be great if they did open up the beta to current LW 2015 users, though.

pinkmouse
12-29-2015, 04:09 AM
An open Beta would be nice, but I don't see it happening.

However, it would be nice to see an implementation of this (http://www.cs.cornell.edu/projects/stochastic-sg14/) in the new render engine. :)

robertoortiz
12-29-2015, 07:52 AM
Ugh, guys, as a long-time LW user who has been with NewTek through the good times and the bad times in the trenches (and taken quite a bit of flak for it, BTW), I would love to hear some feedback on the ideas I have posted.
Agree, disagree? Right now I feel like a speed bump to the other comments.
Thanks guys,

Here is a link to my last post:
http://forums.newtek.com/showthread.php?149194-New-Blog-Post-The-Modifier-Stack&p=1460638&viewfull=1#post1460638

Having said that, the Idea of an open beta is a good one.
-R

pinkmouse
12-29-2015, 07:57 AM
You can already "key" stuff quite straightforwardly in nodes, but if it adds flexibility, why not? :)

RebelHill
12-29-2015, 08:02 AM
I don't see how you can key a whole modifier... only individual parameters of that modifier. Either way, I find the mockup kind of ugly, stuffing too much into too small a space, not to mention how it'd be inconsistent for some modifiers which may be there but which don't have or need keyable parameters. Better, I think, would be an option on the right-click menu to open a graph, which would load into the GE all the available channels for the nodes (or whatever) contained within that modifier.

ernpchan
12-29-2015, 09:07 AM
+1 to what RH said. We don't need another place to adjust timings.

jeric_synergy
12-29-2015, 10:30 AM
+1 to what RH said. We don't need another place to adjust timings.
and +1 to ernpchan for stating it so clearly.

robertoortiz
12-29-2015, 11:59 AM
Appreciate the feedback guys.
The way I see it, it would be useful to see in one place the different deformations that affect a specific object OVER TIME.
The ability to pick THE FRAME where those deformations will be activated would allow for more intuitive interaction between the different deformation elements of an object.

Again, nodes are great and very powerful, but they require a bit of imagination (not that that is a bad thing).

For quick and dirty interaction I like the way the controls are set up for attributes in After Effects.

RebelHill
12-29-2015, 12:19 PM
Wanting it like that is all well and good, but you have to integrate it somehow into the interface, and stuffed into that little window it's ugly and inconsistent with the rest of the interface... You'd wonder why the same isn't present in the lists for FX, instances, custom objects, etc., etc.

In After Effects this design/layout works because it is centralised, based on the fact that the only things that exist in the timeline are the layers (upon which the effects are applied)... and the effects themselves also centralise under the effects window. You would want something similar for LW: the keys accessible in a centralised location for all things, and that place is the graph editor. Having it all in the one place like that is far quicker and much less dirty.

jwiede
12-29-2015, 12:52 PM
About the "Corrective Morph" modifier you see in the video, that's just a Nodal Displacement modifier properly renamed. Yes, because of course now you can rename any modifier you add in the stack, so its scope it's clear. That's very important, because you can have several modifiers of the same type doing different things.
In the video, as some users have guessed, I'm using ClothFX's EditFX tool. The big difference is that now you can decide when in the stack the EditFX should operate (and of course that's not possible in LW2015). And believe me, that makes a huge difference. We're going to consider any "old" LightWave deformer under a brand new Light! ;)

Lino, can you comment on whether/how much these modifier stack operations (enumerating the stack, reordering the stack, renaming stack entries, writing new modifiers, etc.) are planned to be exposed to third parties via APIs around the release of LW2016? Put more generally, w.r.t. the modifier/deformation stack: if the user can do it in the app, is there an equivalent programmatic means to do the same thing exposed in the APIs? And if so, is the plan to make that available to customers & third-party developers via the SDK around the LW2016 release?

Also, around LW2016's release, will examples be provided in the SDK on coding up new modifiers, accessing geometry info as a modifier, and so forth? Will the various new APIs, and the modifications to existing ones w.r.t. the modifier/deformation stack, all be documented in the SDK docs around LW2016's release?

To clarify, I'm not looking for commitments or a schedule; "around release" just means asking whether the intention is for this stuff to be SDK-exposed and -documented concurrently with LW2016 itself. Also, is the intention to expose it in both the C/C++ and Pythonic APIs, or just C/C++?

Thanks!

GraphXs
12-29-2015, 12:58 PM
Maybe we can have an update to the spreadsheet/scene editor to show keys for displacements? That would be more like After Effects. Or maybe a new view for the graph editor that is a layer-based key view? Assuming you just want to offset multiple keys at once. It would be cool if the LW timeline had a mode switch for seeing morph keys only, or displacement/texture keys only.

kfiram
12-29-2015, 03:05 PM
Lino, I would REALLY like to see a simple "Transform Points" modifier in LW 2016.
Using EditFX is nice, but quite cumbersome.

I'm sure future versions of LW will have awesome modeling tools in Layout, but until that happens, a simple tool that allows us to do what EditFX does in a less convoluted way would be much appreciated.

It could work just like EditFX, with simple Drag/Dragnet-like operators, or it could be slightly more appealing, with the ability to select points, edges or polygons and use something like Modeler's new Transform tool.
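Purely as an illustration of the kind of operation described here (ordinary Python, not an existing LightWave tool or API), the core of a Drag/Dragnet-style "Transform Points" edit is just a falloff-weighted move of points near a picked position:

    # Illustration only. Move points near "center" by "offset",
    # blended out by a linear falloff that reaches zero at "radius".
    import math

    def transform_points(points, center, offset, radius):
        cx, cy, cz = center
        ox, oy, oz = offset
        out = []
        for (x, y, z) in points:
            d = math.sqrt((x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2)
            w = max(0.0, 1.0 - d / radius)      # 1 at the center, 0 at the radius
            out.append((x + ox * w, y + oy * w, z + oz * w))
        return out

    corrected = transform_points(
        [(0.0, 0.0, 0.0), (0.2, 0.0, 0.0), (2.0, 0.0, 0.0)],
        center=(0.0, 0.0, 0.0), offset=(0.0, 0.1, 0.0), radius=1.0)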

Aside from being extremely useful, such a modifier would go a long way in letting people understand the basic concepts and advantages of modeling in Layout and of what a modifier stack can be used for.

Any chance of that happening?

Jarno
12-29-2015, 10:01 PM
Lino, can you comment on whether/how much these modifier stack operations (enumerating stack, reordering stack, renaming stack entries, writing new modifiers, etc.) are planned to be exposed to third-parties

That modifier stack, and the operations such as reordering, is just a list of plugins like motion modifiers and channel modifiers and the current deformation plugins. So whatever is available to manipulate those (mostly through commands) will also work on the modifier stack.

For LWSDK work that I do, my rule is: if it isn't documented, it doesn't belong in the public LWSDK.
Python APIs should be there, as they are generated almost entirely automatically (almost). Though there will be obvious speed penalties and probably no multithreading with Python.
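Purely as a sketch of what that command-driven model implies (every name below is invented; this is not the real LWSDK or its Python bindings), manipulating the stack programmatically would then look like issuing ordinary commands:

    # Invented command names -- NOT the LightWave Python API.
    def issue_command(command_string):
        # stand-in for whatever command gateway the host application exposes
        print("command ->", command_string)

    issue_command('AddModifier "NodalDisplacement"')
    issue_command('RenameModifier 3 "Corrective Morph - Deltoid"')
    issue_command('MoveModifierUp 3')
    issue_command('RemoveModifier 5')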

I do expect some new example code to be added as well. I know the example code in the LWSDK has been neglected for a long time. I recently went through it, removing obsolete examples, reworking the build system so that it is easy to add new examples, and making compiling them part of our automated build process.

---JvdL---

raw-m
12-30-2015, 02:15 AM
Maybe we can have an update to the spreadsheet/scene editor to show keys for displacements? That would be more like After Effects. Or maybe a new view for the graph editor that is a layer-based key view? Assuming you just want to offset multiple keys at once. It would be cool if the LW timeline had a mode switch for seeing morph keys only, or displacement/texture keys only.

Off topic, but someone made an LScript a couple of years ago that added the graph editor so it lined up with the timeline. His original mockup, where it was all seamlessly built into the UI, was brilliant: instant access to all keys. Would love to see that.

Update: found it, vncnt's Legato: http://forums.newtek.com/showthread.php?136828-How-to-change-a-rig-in-45-scene-files&p=1353802&viewfull=1#post1353802