PDA

View Full Version : Best way to import motion and Morph animations for characters in Lightwave.



paulg625
10-25-2017, 02:31 AM
OK, so now I have my character in LightWave and want to bring in motions from outside. What are the best options?

I know I can use BVH, but I want to be able to retarget. I have NevronMotion, but it seems cumbersome. I think Rebelhill's RHiggit is the best way to go for retargeting. Any info on this would be appreciated.

Second, getting morph animations applied to characters: I want to be able to bring in morph animations I have created and apply them to the same character in LightWave, then be able to polish the animation.

So the end goal is to bring in a BVH motion and a morph animation (maybe MDD?), apply them to a character, then use LightWave tools to polish the animation.

Any help would be greatly appreciated, thanks!!! :D

Spinland
10-25-2017, 04:00 AM
I think Rebelhill's RHiggit is the best way to go for retargeting. Any info on this would be appreciated.


https://youtu.be/6PgNXjHJTv4

paulg625
10-25-2017, 07:32 AM
Thanks for responding. Yes, I have seen this video; it is what makes me think this is a good way to go. Do you use it this way? I also have NevronMotion (by luck, when I upgraded to 2015). But RHiggit seems to have other advantages as well.

Spinland
10-25-2017, 09:44 AM
Yes, I use RHiggit almost exclusively these days for my character work, including mocap. I generally record my own stuff using Brekel and/or Neuron but in a pinch I've leveraged the CM library for quick and dirty stuff and his retargeting tools work a treat.

Greenlaw
10-25-2017, 10:36 AM
For me, the best way to get 'outside' motion onto a LightWave rig has been to prepare the motion in FBX and use Load Items From Scene with Merge Only Motion Envelopes enabled. This option extracts only the animation from the FBX scene and applies it to your existing LightWave character rig. This way, you can set up the character's surfaces and controls only once in LightWave and keep re-using that for each motion you import.

Naturally, the hierarchical structure and bone naming of the rig in the FBX needs to match the LightWave scene exactly for this to work. You can have extra bones and targets in the LightWave scene so long as they don't disrupt the main skeletal structure.

Also, LightWave doesn't support FBX with multiple Takes, so make sure your 'LIFS' scene only contains one Take.

My mocap comes out of iPi Mocap Studio as .bvh, and I retarget that to a rig in Motion Builder 2010 and output the FBX from there. There are other programs that can do this too, for example, 'Webanimate Standalone' and 'Reallusion iClone 7 with 3DXchange'. I've been playing around with these two lately and will post more info about the experience when I get the chance. BTW, iPi Mocap Studio can retarget to an imported character directly and output FBX too...I just use the .bvh-to-MB workflow because I need to use some MB-specific features, and mostly out of habit. (My goal is to eventually move away from my aging MB license.) :)

As shown in above posts, LightWave has this capability too. I think the workflow is a little clunky compared to third-party mocap re-targeting workflows, but it works well enough if your re-targeting needs aren't too extreme. (Note: no slight to RH is meant. He's always super helpful and what he's been able to do for LightWave has been fantastic.)

Any of these systems requires careful planning and following specific rules and naming conventions exactly.

Hope this helps.
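Since matching the bone naming exactly is the make-or-break step, it can be worth diffing the two skeletons before anything reaches LightWave. A BVH file lists its skeleton as plain text (`ROOT`/`JOINT` lines), so a few lines of Python can compare the mocap skeleton against your rig's bone list. A rough sketch (the file path and bone names here are hypothetical examples, not from any specific rig):

```python
# Sketch: list the joint names in a BVH file so you can check them
# against your LightWave rig's bone names before importing.

def bvh_joint_names(path):
    """Return joint names from a BVH hierarchy, in file order."""
    names = []
    with open(path) as f:
        for line in f:
            tokens = line.split()
            # Skeleton joints are declared as 'ROOT <name>' or 'JOINT <name>'.
            if tokens and tokens[0] in ("ROOT", "JOINT"):
                names.append(tokens[1])
    return names

def report_mismatches(mocap_names, rig_names):
    """Print bones present in one skeleton but missing from the other."""
    mocap, rig = set(mocap_names), set(rig_names)
    for name in sorted(mocap - rig):
        print("In mocap but not in rig:", name)
    for name in sorted(rig - mocap):
        print("In rig but not in mocap:", name)

# Hypothetical usage:
# report_mismatches(bvh_joint_names("capture.bvh"),
#                   ["Hips", "Spine", "Head"])
```

It won't catch hierarchy differences, only name mismatches, but that alone heads off most of the silent import failures.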

paulg625
10-25-2017, 10:41 AM
Yes, I use RHiggit almost exclusively these days for my character work, including mocap. I generally record my own stuff using Brekel and/or Neuron but in a pinch I've leveraged the CM library for quick and dirty stuff and his retargeting tools work a treat.

Thank you very much. Cool, I have a Neuron system as well. How are you getting your stuff into LightWave? Are you coming in directly through BVH from the Neuron software? And are you using only RHiggit rigs? I watched some cool stuff where you could modify a character's existing rig, making it more controllable, with the RHiggit software as well.

Just wanting to explore as many workflow options as possible. :)

Greenlaw
10-25-2017, 10:52 AM
One more thing: At some point you might wonder if you should be using LightWave Joints or Z-Bones. Short answer: stick with Bones.

There are technical advantages when using Joints but LightWave has a weird weight map offset that makes them a lot harder to use in LightWave than they should be. This offset issue doesn't exist when using Bones.

(I wish they would just fix Joints or at least give us the option to assign weights the way it works in other programs, but I don't think that's going to happen. FWIW, except for one personal project I've been working on forever, I'm pretty much staying with Z-Bones nowadays.)

paulg625
10-25-2017, 10:55 AM
For me, the best way to get 'outside' motion onto a LightWave rig has been to prepare the motion in FBX and use Load Items From Scene with Merge Only Motion Envelopes enabled. This option extracts only the animation from the FBX scene and applies it to your existing LightWave character rig. This way, you can set up the character's surfaces and controls only once in LightWave and keep re-using that for each motion you import.

Naturally, the hierarchical structure and bone naming of the rig in the FBX needs to match the LightWave scene exactly for this to work. You can have extra bones and targets in the LightWave scene so long as they don't disrupt the main skeletal structure.

Also, LightWave doesn't support FBX with multiple Takes, so make sure your 'LIFS' scene only contains one Take.

My mocap comes out of iPi Mocap Studio as .bvh, and I retarget that to a rig in Motion Builder 2010 and output the FBX from there. There are other programs that can do this too, for example, 'Webanimate Standalone' and 'Reallusion iClone 7 with 3DXchange'. I've been playing around with these two lately and will post more info about the experience when I get the chance. BTW, iPi Mocap Studio can retarget to an imported character directly and output FBX too...I just use the .bvh-to-MB workflow because I need to use some MB-specific features, and mostly out of habit. (My goal is to eventually move away from my aging MB license.) :)

As shown in above posts, LightWave has this capability too. I think the workflow is a little clunky compared to third-party mocap re-targeting workflows, but it works well enough if your re-targeting needs aren't too extreme.

Any of these systems requires careful planning and following specific rules and naming conventions exactly.

Hope this helps.

It helps extremely well, because iClone is where I am starting this workflow. I had a small hurdle with the FBX and feel I'm about halfway across this (it was a bit extreme because it's Toon Mouse from DAZ, imported into iClone and then exported from there). FBX is a way I would like to get stuff in, but I lack the knowledge as to some ins and outs of LightWave. I have also purchased the Faceware plug-in for iClone and was hoping for a good way to get those motions into LightWave as well.

iPi Mocap is great; I didn't initially have a good space to work in and opted for the Perception Neuron when it came out. It works well, but like many mocap systems it has its quirks. I can capture and prep in iClone using the same character I plan on using in LightWave, and I have all the tools needed to clean up the animation there, except maybe mocap jitter cleanup, for which I'm considering WebAnimate.

But if I am using a character out of iClone and bringing the motion out of iClone as FBX, this should limit my naming issues, yes?

Also a question, as this was one thing I am unfamiliar with: "LightWave doesn't support FBX with multiple Takes, so make sure your 'LIFS' scene only contains one Take."

paulg625
10-25-2017, 10:59 AM
One more thing: At some point you might wonder if you should be using LightWave Joints or Z-Bones. Short answer: stick with Bones.

There are technical advantages when using Joints but LightWave has a weird weight map offset that makes them a lot harder to use in LightWave than they should be. This offset issue doesn't exist when using Bones.

(I wish they would just fix Joints or at least give us the option to assign weights the way it works in other programs, but I don't think that's going to happen. FWIW, except for one personal project I've been working on forever, I'm pretty much staying with Z-Bones.)

The problem I have coming out of iClone and trying to come in as Bones is that the bones are all pointed in the wrong direction; the rig is weirded out. I have been coming in as Joints because of this. Do you know a good fix for correcting this problem?

Surrealist.
10-25-2017, 11:45 AM
I don't like using FBX and mixing it up in LightWave. Just me. I prefer Alembic and the MDD loader. Usually I will import the Alembic in a separate scene, then export that for each object as MDD (or bring it in as MDD when possible). Then, in the scene I use for animation and rendering, I just have a character set up with materials. I apply the MDD, use Object Space, and I can move the character anywhere in the scene. I will also usually adjust the start frame and/or timing of the animation.

Greenlaw
10-25-2017, 12:18 PM
Doesn't that depend on what you want to bring into LightWave though?

Sorry, I've got almost no Alembic experience but I thought that was mainly for mesh and displacement animation? For bones animation, I thought FBX was still the way to go.

In my case, I often need the actual bones animation to drive certain effects in LightWave, and I don't think I can do that with just MDD. (Not accurately or efficiently anyway.)

Surrealist.
10-25-2017, 12:23 PM
Exactly. I prefer not to use bones. I know. That was the question. But I found the best workflow was to skip it entirely. Do all of the animation and editing outside LightWave. In LightWave, I just set it up so all I need is the final mesh data.

Just offering my off-topic perspective... lol. Sorry.

But agree. Use RH tools and retargeting if that is what you want to do. I think it would be better to ditch the FBX rig as soon as possible in the process in LW. And retargeting is the way. I have never tried it in LW though.

paulg625
10-25-2017, 12:30 PM
Exactly. I prefer not to use bones. I know. That was the question. But I found the best workflow was to skip it entirely. Do all of the animation and editing outside LightWave. In LightWave, I just set it up so all I need is the final mesh data.

Just offering my off-topic perspective... lol. Sorry.

But agree. Use RH tools and retargeting if that is what you want to do. I think it would be better to ditch the FBX rig as soon as possible in the process in LW. And retargeting is the way. I have never tried it in LW though.

No, please do. This is exactly what I want. I want opinions because at this point I don't have one. I thought about MDD, but this first project I'm doing using a LightWave workflow is Toon Mouse, and I need to be able to animate the tail in LightWave to get proper motion. iClone sucks for multi-bone tail animation. So if I bring it in as MDD, then I really can't edit it, can I? You would really need to polish beforehand, yes?

Greenlaw
10-25-2017, 12:44 PM
Either way is valid...it depends on your intentions.

When I worked at the Box at Rhythm, we did both. In the later years, we did most of our character animation in Maya and exported MDDs for LightWave. This worked for most things. Sometimes I wanted to use dynamics in LightWave, but since there were no bones, I had to bake a selected vertex to a null and then parent the item to the null. This usually works better than you might think, but it has its limitations. For hair with dynamics, I typically needed the bones...dynamics is much faster with bones and proxy collision objects. But depending on your setup, it's possible to do this with MDDs too, so long as the T-Pose is available in the first frame.

In the case of my personal projects, my LightWave rigs are doing all the deformations natively. There are advantages to that when the character needs to interact with things in the LightWave scene, especially dynamics and hair/fur. Or if you just want to edit the motions in LightWave without jumping back to other programs.

I would suggest trying out a bunch of stuff and ask questions here when you get stuck. That's how I learned. :)

Surrealist.
10-25-2017, 02:42 PM
Absolutely, if your intention is to make use of those things. This is why I realize I am being a bit off topic. I don't know, just thought I'd toss it out there.

But if you want more of my un-asked-for-advice - lol....

I have a tendency not to want to mix animation from one source to another. The exception might be bringing mocap data into MotionBuilder, but that environment is tailor-made for importing, mixing, and editing mocap and animation data between rigs and characters. So I want to do all of that and finish it there. Cloth dynamics I also do in Maya. MotionBuilder to Maya is really smooth: I set up a character in Maya with all of the dynamics properties, and I only have to re-import the baked animation data right to the bones. From there it is Alembic out.

Hair is a workflow I have avoided entirely and will continue to do so. I just hate CG hair. But I would likely be trying to do that in Maya or someplace else and importing guides. I do like the Hair tools in Maya.

So if you wanted to really make use of LightWave animation, FX, etc., my suggestion would be Genoma or RH: simply bring in the character mesh and set it up in LightWave. Not the best animation workflow in the world, but really it ain't impossible. And I would be leaning more toward that just to avoid all of the various interop snafus with FBX and bones and all of that.

So that leads me back to: I just import mesh cache data, and then do camera animation and rendering in LightWave.

But Greenlaw is right, and he has a lot of experience with these things, so I guess it just depends on what you want to do.

Greenlaw
10-25-2017, 03:30 PM
The problem I have coming out of iClone and trying to come in as Bones is that the bones are all pointed in the wrong direction; the rig is weirded out. I have been coming in as Joints because of this. Do you know a good fix for correcting this problem?

It has to do with Joints being zeroed out in the T-Pose and bones having specific angles. (I think it's why I stuck with Joints for the Brudders thing--at the time I felt it was too much trouble to re-engineer the rigs.) However, if you set it up right, I don't think it matters because the retargeting process should consider the rotation offsets. I don't think I can do a good job explaining this so hopefully somebody with more expertise can explain.

(It doesn't help that I don't have a whole lot of experience with iClone and 3DXchange yet either.)


...I need to be able to animate the tail in LightWave to get proper motion.
I'm using Motion Builder for tail animation for 'Brudders'. It's 'fake' dynamics though, basically a 'drag' effect applied to each of the tail-bones. (Sorry, I don't remember the details but it was pretty easy to set up.) When it doesn't look convincing or I need a specific motion, I just kill the MB keyframes in LightWave and reanimate the bones manually. But mostly I just let it be what it's going to be.

If you want to automate tails in LightWave there are other options. You might consider using Bone Dynamics in Bullet or in IK Boost. You can probably fake it with Follower too.

For manual keyframing, you can use IK and targets or you could probably use Spline Control. If I'm working directly with the bones, I'd probably use as few bones as possible and make the bone influences a little 'squishy' to make it easier to animate.

Then there's SoftFX but that might have limited use. The nice thing is you can set it up and just let it run. But it's likely to fall apart with big sudden moves since it's basically just swinging around a squishy balloon. TBH, I'm not sure how it will look for a tail but it works okay for faking a cloth sim when you just want a little 'springy' motion in it on top of the bones deformation and you don't have time for setting up full dynamics.

Just some ideas...I haven't actually tested all of the above myself for this purpose.

paulg625
10-25-2017, 04:04 PM
Well, Greenlaw and Surrealist, you guys have given me a lot to think about and to test out. Thanks so much. On my current project I think FBX will be my go-to because I know I will need to work things out in LightWave, but I will also look at MDD. I will look to see what I can find on zeroing out bones, and see if it's possible coming out of iClone or if I will need to re-rig and use RHiggit. I will be asking plenty of questions as I move forward.

Thanks again. And by all means, if you guys have any other thoughts on this subject, please share. As suggested, I'm going to trudge forward until I find my next question. :)

Greenlaw
10-25-2017, 05:30 PM
Sure. I'm sorry I'm not more helpful...I haven't been as active with mocap as I used to be, but I've been easing back into it this past month or so. When I start focusing on iClone 7 mocap i/o for LW, I'll post whatever I learn.

GraphXs
10-27-2017, 08:00 AM
Ya can use both as well, at the same time...as long as the animation is final in the other application. Once, I used an MDD cache to get the perfectly skinned version of the mesh, then used FBX and the bones to help with other dynamic items I needed in LW. It worked great!

paulg625
10-27-2017, 01:38 PM
Ya can use both as well, at the same time...as long as the animation is final in the other application. Once, I used an MDD cache to get the perfectly skinned version of the mesh, then used FBX and the bones to help with other dynamic items I needed in LW. It worked great!

That is the problem with iClone right now: I can't really get to a complete state there. I need to finalize my animations in LightWave to put the polish on them. They are working on a curve editor, so it could get closer, but there are still no real animation handles on the rigs. How did you get both to work together? I thought it removed the bones from the rig when you applied the MDD.

Greenlaw
10-27-2017, 02:09 PM
Generally speaking, once a character is scanned for MDD, the mesh becomes animated with the MDD independently of the bone rig. That's actually an advantage of MDD--in rendering, for example, MDD is more reliable and faster to process because the deformations are 'baked in'. That can free up the computer for more complicated tasks. It also means the MDD (or Point Cache) deformations are always going to be the same no matter which program you're in.

I think what GraphXs means is that additional deformations can be applied to an existing MDD (mixing multiple MDDs with nodes, for one example), and you can still use the original bone rig to drive dynamics independently of the character mesh (that's being animated by the MDD).

That's just a few examples I can immediately think of. The system may not always be intuitive but its uses can be surprisingly flexible once you know a few tricks.
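For the curious, 'baked in' is literal: an MDD file is a simple big-endian point cache--a frame count, a point count, one time stamp per frame, then raw XYZ positions for every point of every frame. Here's a minimal Python sketch of a reader, written from the commonly documented MDD layout, so treat it as an assumption to verify against your own files:

```python
# Sketch: read an MDD point cache. Assumed layout (big-endian):
#   int32 frame count, int32 point count,
#   frame_count float32 time stamps,
#   then frame_count * point_count * 3 float32 XYZ positions.
import struct

def read_mdd(path):
    """Return (times, frames), where frames[f][p] is an (x, y, z) tuple."""
    with open(path, "rb") as f:
        num_frames, num_points = struct.unpack(">ii", f.read(8))
        times = struct.unpack(">%df" % num_frames, f.read(4 * num_frames))
        frames = []
        for _ in range(num_frames):
            flat = struct.unpack(">%df" % (num_points * 3),
                                 f.read(4 * num_points * 3))
            # Group the flat float stream into per-point XYZ triples.
            frames.append([flat[i:i + 3] for i in range(0, len(flat), 3)])
    return times, frames
```

Because every frame stores absolute positions for every point, there's nothing left to evaluate at render time--which is exactly why the playback is identical in any host that reads the cache.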

If you do a search in these forums you can find all sorts of neat examples. (Thank goodness for that...I can't possibly remember all the tricks myself.) :)

Greenlaw
10-27-2017, 02:27 PM
...I thought it removed the bones from the rig when you applied the MDD?
A little more info:

When you create an MDD, the rig isn't removed, it just becomes unnecessary if you apply the MDD to that mesh.

You can choose to remove the rig if you like, but I think it's better to work in two different scenes: one for animation and MDD scanning, and a separate master scene for lighting and rendering, which contains the mesh sans rig with the MDD applied. This way, I have the option to revise the character at any time in a 'light-weight' scene and scan a new MDD. When the new MDD overwrites the previous MDD, it automatically updates the master scene.

(BTW, I do this for dynamics too, not just character animation. I find it faster since I'm asking the computer to do less processor-intensive work at each stage.)

Hope this helps.

paulg625
10-27-2017, 03:13 PM
That sounds like a better workflow, because I also have ChronoSculpt, and it would be nice when needed to export MDD from the animation scene into ChronoSculpt to tweak it if necessary, then back into the animation scene. The dynamics make sense too for the MDD workflow.
Cool, thanks!

A little more info:

When you create an MDD, the rig isn't removed, it just becomes unnecessary if you apply the MDD to that mesh.

You can choose to remove the rig if you like, but I think it's better to work in two different scenes: one for animation and MDD scanning, and a separate master scene for lighting and rendering, which contains the mesh sans rig with the MDD applied. This way, I have the option to revise the character at any time in a 'light-weight' scene and scan a new MDD. When the new MDD overwrites the previous MDD, it automatically updates the master scene.

(BTW, I do this for dynamics too, not just character animation. I find it faster since I'm asking the computer to do less processor-intensive work at each stage.)

Hope this helps.

Spinland
10-27-2017, 07:17 PM
Sorry not to get back to you, Paul (busting *** to make a film festival deadline on one of my VFX gigs, have about zero life right now), but you're in very good hands here. Looking forward to your future success in this! :jam:

paulg625
10-28-2017, 08:36 AM
Sorry not to get back to you, Paul (busting *** to make a film festival deadline on one of my VFX gigs, have about zero life right now), but you're in very good hands here. Looking forward to your future success in this! :jam:

No worries, I can relate to the zero time factor. I run there most of the time myself!!!