1001 ways to use LW and Blender together

prometheus

REBORN
And some blender pipeline fun..
When exporting that gas solver sim, process it with Fog to Level Set and save it to a mesh, open it in Modeler, save it out as OBJ, and send it to Blender.
Add an empty Volume object, use the Mesh to Volume modifier and target the OBJ, then add a Volume Displace modifier and choose a noise texture of your liking. That part is where Blender is fun, and it would be far more convoluted to approach in LightWave: in Blender you simply add a Remesh modifier, duplicate the OBJ as a segment, again and again, move, scale, rotate etc., and it will remesh and fuse into one mesh if you work in Edit Mode with "Display modifier in Edit Mode" active.
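For anyone who wants to script that Blender side, here is a minimal bpy sketch of the same idea, assuming Blender 2.9x; the OBJ path, voxel amount, displace strength and texture type are placeholder choices, not part of the workflow above.

```python
import bpy

# Import the OBJ saved out from LightWave Modeler (placeholder path).
bpy.ops.import_scene.obj(filepath="/tmp/gas_levelset.obj")
cloud_mesh = bpy.context.selected_objects[0]

# Add an empty Volume object that will voxelize the imported mesh.
bpy.ops.object.volume_add(location=(0.0, 0.0, 0.0))
vol_obj = bpy.context.object

# Mesh to Volume modifier, targeting the imported OBJ.
m2v = vol_obj.modifiers.new(name="MeshToVolume", type='MESH_TO_VOLUME')
m2v.object = cloud_mesh
m2v.resolution_mode = 'VOXEL_AMOUNT'
m2v.voxel_amount = 200          # raise for sharper edges, at the cost of speed

# Volume Displace modifier, distorting the voxel grid with a noise texture of your liking.
noise_tex = bpy.data.textures.new("CloudNoise", type='CLOUDS')
disp = vol_obj.modifiers.new(name="VolumeDisplace", type='VOLUME_DISPLACE')
disp.texture = noise_tex
disp.strength = 0.5
```

The script only sets a starting point; tweaking the values interactively in the modifier panel is still where the fun is.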

While doing this, the volume conversion is maintained: you can move the segments around and scale them, and the volume will follow, with a bit of lag depending on resolution, but it works.
Same with sculpting, using Snake Hook and all kinds of sculpting tools; the volume conversion and its volume displacement adapt while you sculpt.
That said, it is of course smoother not to sculpt and remesh while the volume conversion is active, and only to turn it on when you are done with that.

But sculpting like this, with a Remesh active and a volume generated from it, as well as a Volume Displace to distort that volume... not possible in LightWave.
Even the workflow of hooking up CSG boolean nodes in Blender to remesh, while flexible and powerful, is nowhere near as efficient for simply fusing a mesh as the Remesh modifier used while working in Edit Mode.
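As a rough script sketch of that fuse idea (a variant that joins Object Mode copies rather than duplicating in Edit Mode; the offsets and voxel size are arbitrary placeholders):

```python
import bpy

src = bpy.context.object  # the imported OBJ segment, assumed active and in Object Mode

# Duplicate the segment a few times with arbitrary offsets and scales.
copies = [src]
for i in range(3):
    dup = src.copy()
    dup.data = src.data.copy()
    dup.location.x += (i + 1) * 0.8
    dup.scale *= 0.9
    bpy.context.collection.objects.link(dup)
    copies.append(dup)

# Join everything into one object, then let a voxel Remesh modifier fuse it into one surface.
bpy.ops.object.select_all(action='DESELECT')
for ob in copies:
    ob.select_set(True)
bpy.context.view_layer.objects.active = src
bpy.ops.object.join()

remesh = src.modifiers.new(name="Remesh", type='REMESH')
remesh.mode = 'VOXEL'
remesh.voxel_size = 0.05   # smaller value = finer fused surface
```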

I tend to post expanded image canvases of full screenshots instead of posting several images that each have to load; if that causes slow loading here in the forums, please let me know.
Snake Hook sculpting to deform it further, just testing... the point is the ease of workflow for this stuff, which I would say is pretty much impossible in LightWave:
a process of actual fluid/gas volume simulation, then converting to mesh, then sculpting on that mesh, then converting back to a volume and adding additional noise.

Never mind the looks of it; it's just conceptual, to demonstrate how it all is applied. I didn't try to make anything look great or pay attention to aesthetics here, just technical testing of the features.


lw to blender.jpg
 

prometheus

REBORN
The post above was moved by a moderator, which broke the description of the workflow for using LightWave and Blender together, so here is the additional information that was left behind in that VDB thread.

So the steps below should of course be applied before you get to the point of the Blender VDB import and volume conversion.



Initially simulating in LightWave... from the other thread:
Yes, the basics, got that working.
But hey, Safarifx, we shouldn't hide the workflow from other LW users.
The basics for this kind of stuff should be openly explained, I think.
In this thread there is no mention of Fog to Level Set; you mostly have to find your own way or dig out its usage in the docs, where it is poorly documented.

So yeah, got that working: feed the gas solver density to the Fog to Level Set node, and then that Fog to Level Set into the Saver node.
Add another null if you want, use object replace and the VDB Evaluator, add a VDB Info node and connect its grid output to the OpenVDB grid input; see the images.
And as SafariFX says, if you are pleased with the VDB level set shape, save it out as a transformed object or as sequences.

Images: never mind the resulting shape etc., it's just to showcase the node workflow for anyone to take a look at and try. This is just my setup; SafariFX may have worked differently, at least for the more advanced stuff.
But really, take a look at the first picture. As I have explained before, LightWave has a horrible tick display for the gas solver; why not discard it and apply a slice volume display like Blender has, if possible? In this case I couldn't zoom in properly because the gas solver preview would disappear due to the distance to this static tick display. It would be so much sweeter if it matched Blender's display, which also has options to control the thickness of the density and the slices, and you wouldn't have that problem with zooming.



 

prometheus

REBORN

LOL? What?
You have been having fun in two threads now with very little effort. Why not share your funny moments and make them understandable for others? Why can't you make the effort to post something more interesting? A single post without any context brings nothing to the discussion, nothing of value at all.
If a LOL is directly related to an actual discussion, to something someone said, I can understand your delightful post, but this is just nonsense posting from you.

It just litters the forum with garbage... or did I really miss a connection to the posts we just made and the topics? Just asking, since you should know.
 

LightWaveGuru

Active member
prometheus said: (quoting the Blender pipeline post above)


As an Octane user, I can only yawn, because I can take 3D objects modeled in real time in LW Modeler and render them as clouds in Layout using Octane Volumetrics. If I change the geometry of the object in Modeler, the geometry of the Octane volume changes as well. So stop messing around with your Gender-Blender. LOL :)

So now that we've cleared up the technical stuff I want to see your best cloud renders with your pipeline, and then I'll show you mine!

Come let's have a little battle. ;)

ROFL :)

snip safx
 

UnCommonGrafx

Wandering about
LightWaveGuru said: (quoting the Octane Volumetrics post above)
This could be informative. Curious about your pipeline/process.
 

LightWaveGuru

Active member
Bunny1.jpg
Bunny2.jpg



Come Prometheus, let's play a little. :)

snip safx

BTW

The word "BLENDER" in German means.
foam stick, Bluffer, someone who cheats...did you know that? ROFL :)

 

LightWaveGuru

Active member
This could be informative. Curious about your pipeline/process.

Hi UncommonGrafx,

Yes, I have been an Octane user since Octane version 1.2,
so of course I use the pipeline of the Octane plugin by Juanjo Gonzales for LightWave.
This means you can build TFD, LightWave objects, OpenVDBs and also procedural volumetrics via null objects
with the Octane Volumetrics option.

snip safx
 

prometheus

REBORN
LightWaveGuru said: (quoting the Octane Volumetrics post above)
safarifx guru..

Really?

Have you ever tried Mesh to Volume in Blender? It works in scene context; in LightWave you have to model in Modeler, you cannot model in Layout.
In Blender you have two ways: either put a Principled Volume material directly onto any object (how that differs from Octane I do not know), or use Mesh to Volume, where everything is voxelized and quality depends on voxel resolution rather than only on the volume material's step size.
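For the first of those two ways, here is a minimal bpy sketch of putting a Principled Volume material straight onto an object; the material name, density and anisotropy values are just placeholders.

```python
import bpy

obj = bpy.context.object  # any mesh object you want rendered as a volume

# Build a material whose output is a Principled Volume instead of a surface shader.
mat = bpy.data.materials.new("DirectVolume")
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links
nodes.clear()

output = nodes.new("ShaderNodeOutputMaterial")
pvol = nodes.new("ShaderNodeVolumePrincipled")
pvol.inputs["Density"].default_value = 2.0     # placeholder; quality depends on volume step size, not voxels
pvol.inputs["Anisotropy"].default_value = 0.3  # placeholder
links.new(pvol.outputs["Volume"], output.inputs["Volume"])

# Assign it; in Cycles the mesh interior now renders as a volume, no voxelization involved.
obj.data.materials.clear()
obj.data.materials.append(mat)
```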

Also, I do not know whether you can displace the Octane volume with a displacement modifier as we can in Blender; that is what I asked you to show, but nothing from you.
I have also already challenged you to use the Disney cloud asset to show multiple scattering bounces within the volume. I posted images of that, but you didn't respond.

You have probably seen standard primitive-item clouds from LightWave... that is also what you have to match :)

But you are a bit late: when I asked for the battle, you didn't pick a fight :)
And currently I do not have the time or passion for it; it will come to me later, but not right now, since I have more important things to do than clouds.
So where were you, and others, some time ago when I posted cloud samples here and there, both from LightWave and Blender?
But no, you were busy with your clothes and dresses and the fancy walk, and drooling over liquids :)

As said, in Blender you model directly, adjust the shape inside Blender, and get realtime OpenGL feedback; you can't do that in LightWave, and you can't do it in Layout with rendering either, which you can in Blender, since Blender allows modeling directly in the scene. In LightWave you have to switch to Modeler, save, and sync/switch to Layout.

LightWave is simply missing the direct link that would allow realtime feedback when you model in scene context, since there is none.

Darn... Google image search results for the NewTek forums no longer work for locating threads with image posts; the links to the old forums are broken. I used to use that to look up previous posts.

So..linking to the blender forums then...

But these images are not about the actual volume creation; the volume isn't mine but the Disney asset. It's all about the shading, lighting and rendering. What I need to know is whether Octane can produce proper multiple scattering...

These are the essential lighting and volume shading experiments; let's start with that.

The actual shaping of clouds matters less for the technology comparison, since both you and I can use similar principles and the final results will not differ much, with the possible exception of volume displacement, which I am not certain Octane can do. By that I mean actual full volume displacement, not volume surface displacement or texturing; if you have tried the Volume Displace modifier you know what I mean. I have demonstrated that with images for you in other threads, but they were moved around.

So if we start with lighting and shading: all the images you see here are from Blender's Cycles render, except the one labelled "Hyperion original render", which obviously is the Hyperion render to aim for :)

When you can produce similar, close or better shading and lighting in Octane for this Disney cloud asset, we can start looking into the shaping of clouds and advectional noise/fractal distortions.
 

prometheus

REBORN
LightWaveGuru said: (quoting the bunny renders and "let's play a little" post above)



Well... that is honestly not good. That kind of effect is just noise distortion, not applied very well relative to the mesh volume, and the shading is just effects for show; it shows nothing of the realism of actual cloud shading and lighting.

You wanted a fight: look at my cloud lighting samples and try to match that lighting and shading. As for cloud shapes and distortions, I can do better than that, really.
Honestly, you need to up your game with this.
 

prometheus

REBORN
But don't use a bunny; even Hugh Hefner and I could do that with the same or similar results.
No fighting. Competing, ok ... but fighting? No thank you. :)
It's just a friendly fight, since safarifx is in the mood. I think he and I have a certain tolerance when addressing each other; I know he can come across as rough, but deep inside I think he has a good heart :)

Now, as Monty Python says, and now for something completely different: not a bunny, but an...

elephant1.jpg


elephant2.jpg


elephant3.jpg



And never do your cloud test renders against a black background, nor with neon colors. Your move, safarifx :)
This is as easy as pie to do in what I used, you know, and I get realtime feedback in OpenGL, can choose the noise displacement offset as I like, change fractals etc., and start to remesh and edit this elephant shape directly in scene context.
Now, once I add this shape via the Mesh to Volume modifier, a volume material is set up automatically, so I don't really have to create a new one, and I didn't for this one. If I do add a new material, the default one stays intact, but I can continue to add curves and contrast, increase density, adjust anisotropy, and add Light Path nodes to control the lighting to yet another level.
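To give an idea of that kind of tweak, here is a hedged node sketch in bpy, building a fresh material rather than editing the auto-generated one; the "density" grid name, the socket names and all the values are assumptions based on Blender 2.9x.

```python
import bpy

vol_obj = bpy.context.object  # the Volume object created via Mesh to Volume, assumed active

mat = bpy.data.materials.new("TweakedCloud")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links
nodes.clear()

output = nodes.new("ShaderNodeOutputMaterial")
pvol = nodes.new("ShaderNodeVolumePrincipled")
pvol.inputs["Density Attribute"].default_value = ""  # drive density explicitly instead of the implicit grid lookup
pvol.inputs["Anisotropy"].default_value = 0.4

# Read the voxel density grid and push it through a curve for contrast.
grid = nodes.new("ShaderNodeAttribute")
grid.attribute_name = "density"           # default grid name from Mesh to Volume (assumption)
curves = nodes.new("ShaderNodeRGBCurve")  # shape its curve points in the shader editor afterwards
links.new(grid.outputs["Fac"], curves.inputs["Color"])

# Light Path trick: thin the volume for shadow rays only, to soften and brighten the lighting.
light_path = nodes.new("ShaderNodeLightPath")
shadow_scale = nodes.new("ShaderNodeMath")
shadow_scale.operation = 'MULTIPLY_ADD'   # Is Shadow Ray * -0.5 + 1.0 -> 0.5 on shadow rays, 1.0 otherwise
shadow_scale.inputs[1].default_value = -0.5
shadow_scale.inputs[2].default_value = 1.0
links.new(light_path.outputs["Is Shadow Ray"], shadow_scale.inputs[0])

density = nodes.new("ShaderNodeMath")
density.operation = 'MULTIPLY'
links.new(curves.outputs["Color"], density.inputs[0])
links.new(shadow_scale.outputs["Value"], density.inputs[1])
links.new(density.outputs["Value"], pvol.inputs["Density"])

links.new(pvol.outputs["Volume"], output.inputs["Volume"])

vol_obj.data.materials.clear()
vol_obj.data.materials.append(mat)
```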
 

prometheus

REBORN
Adding a Principled Volume material with Light Path and curve nodes: two different setups with the Light Path. The third image, however, is just an increase of the voxel amount from 200 to 400, so it's hard to tell the difference... but it is a little sharper in edge detail.


elephant4.jpg
 

prometheus

REBORN
Testing Alembic export to blender.
Lightwave 2019.5 - Blender 2.92 Alpha

Findings..

✅ Camera matching: Yes (I believe it comes through to Blender pixel-perfect).

✅ Nodal motion: Yes (nice).

✅ Particle nodal-motion cloning/object clones: Yes, which means you can use nodal motion and clone items onto the particles as object clones.

✅ FX motion: Yes (nice).

✅ Displacement (animated): Yes.

✅ Bone deformation on mesh (animated): Yes.

❌ Rig/bones: No.

❌ Textures: No, but surfaces and UVs come through OK; you can, and would have to, re-assign all textures.

❌ Particles: No (use a particle MDD scan instead and apply Blender's native particle system to that).

❌ Lights: No, but you get a null with position and rotation for every light, also animated if exported as a sequenced cache, so you can parent Blender's lights to that.

Make sure to select all objects when exporting, since the exporter seems to only include what you have selected.



So one workflow/pipeline that will probably work well: set up the motion, camera and cloud mesh objects in LightWave, play it in OpenGL to get a rough preview of it all, send it to Blender with the Alembic exporter, and make sure to set the view to the proper camera (I would suggest deleting any default cameras before importing).
Use Blender's empty Volume object with the Mesh to Volume modifier and Volume Displace, and you have volumetric clouds based on whatever cloud mesh you made in whatever software, while utilizing the nice setup, camera and motion from LightWave.
Renders of that come through OK in my tests.
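A hedged bpy sketch of that Blender end of the pipeline; the Alembic path and the "CloudMesh" object name are hypothetical, and it assumes the LightWave camera arrives as a regular camera object in the Alembic file.

```python
import bpy

# Delete any default camera so the imported LightWave camera is the only one.
for ob in list(bpy.data.objects):
    if ob.type == 'CAMERA':
        bpy.data.objects.remove(ob, do_unlink=True)

# Import the Alembic file exported from LightWave (camera, nodal/FX motion, deforming meshes).
bpy.ops.wm.alembic_import(filepath="/tmp/lw_scene.abc")

# Make the imported camera the active scene camera.
for ob in bpy.data.objects:
    if ob.type == 'CAMERA':
        bpy.context.scene.camera = ob
        break

# Voxelize the imported cloud mesh ("CloudMesh" is a hypothetical object name).
cloud_mesh = bpy.context.scene.objects.get("CloudMesh")
bpy.ops.object.volume_add()
vol_obj = bpy.context.object
m2v = vol_obj.modifiers.new(name="MeshToVolume", type='MESH_TO_VOLUME')
m2v.object = cloud_mesh
disp = vol_obj.modifiers.new(name="Displace", type='VOLUME_DISPLACE')
disp.texture = bpy.data.textures.new("CloudNoise", type='CLOUDS')
```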

With this approach, if you find it easier and nicer to move the camera and objects around in LightWave for flights etc., you can tap into that, and then tap into Blender's better multiple scattering in volumetrics and its faster workflow with Mesh to Volume and fractal displacement of that volume, instead of waiting for noise advection to be calculated onto VDB elements in LightWave. It's just simpler and faster this way, and you get better volume quality as well.

The rest is up to your ability to set a proper real-world scale and to model good cloud meshes.
VDB files with clouds are a different approach; you cannot see them in OpenGL and cannot sculpt-design them either (well, you can, but it takes yet more processing steps to get it right when converting to VDB volumes).

Another case would be motion graphics, where you may want the whole nodal displacement package in LightWave but render it out in realtime with glowing bloom surfaces in Eevee, or with lights if parenting it all can be done smoothly with a lot of lights, or use Cycles' slower global volumetrics.

Compositing Eevee volumetrics or surface-emission bloom renders back onto LightWave renders is also something one may want to explore, since the LightWave and Blender cameras seem to be a perfect match.
 