How hard is it to use mocap?



rednova
08-08-2013, 04:48 PM
Dear friends:

I am very interested in using real mocap combined with LightWave in the future.
It's possible that in the future I'll be able to come up with the money to get my own mocap
equipment. My question is: how hard is it to learn and use mocap (combined with
LightWave)?
Thank you !!!

nickdigital
08-08-2013, 04:52 PM
Nevron makes it very easy and affordable. So I guess the answer is "Not very."?

Greenlaw
08-08-2013, 07:19 PM
It really depends on what you anticipate doing with mocap and how ambitious your project is. If you're looking to make the next Avatar, well, that's a bit too ambitious for Nevron and a single Kinect. A single Kinect setup will work well if your performances don't require much walking around, and you'll need to avoid motions that may cause self-occlusion. (i.e., no 360 turnabouts--for that, you'll need, at the very least, a dual Kinect setup like iPi Mocap Studio. Here's an example of what would be difficult with a single Kinect: Sister's Mocap Test (http://www.youtube.com/watch?v=jASC8IOsIqY).)

That said, you can actually do quite a lot with a single Kinect setup--Happy Box (https://vimeo.com/channels/littlegreendog/55185005) was originally shot using a single Kinect and it was looking pretty good. The dual Kinect system came out only days after we finished our shoot, so we re-shot much of the film using dual Kinect, but the final film still includes a couple of shots from our original single Kinect shoot. Another example: our current production was shot using iPi Mocap Studio's dual Kinect system but I think many of the shots could have been shot using a single Kinect system like Nevron. Here's a one-minute excerpt from the film: Excerpt from 'Brudders 2' (https://vimeo.com/channels/littlegreendog/68543424). IMO, the only type of shots here that you might have difficulty shooting with Nevron are the ones at the end where Sergeant and Toullie turn 140 degrees away from Sister--all the other motions shouldn't be too difficult.

The tricky part has always been retargeting, which Nevron finally brings to LightWave. This is how you get the mocap data from a 'mocap' rig based on your performers' body proportions to a character's rig, which may be very differently proportioned. Without proper retargeting, you can run into weird problems like skating or sliding feet, limbs moving incorrectly and passing through the body, etc. Previously, you had to do this in Motion Builder or another animation program that features retargeting and mocap editing, export an FBX and then use LightWave's Merge Only Motion Envelopes to transfer the motion to your rigged LightWave character. Now that we have Nevron, LightWave users can capture and retarget directly within Layout and do some basic editing too. This is not a 'Motion Builder killer' by any means but Nevron does make it feasible to work with mocap completely in LightWave.

IMO, working with mocap from capture to finish is a fairly advanced process, even with a single Kinect--there's quite a bit of planning and setup involved even before you start capturing, and you need to understand the limitations of your system and know how to work around those limitations. That said, once NT finishes the docs and we start seeing some tutorials, Nevron should make the process more accessible to intermediate users who have never used mocap before.

If you use LightWave and you haven't done any mocap work before, I think Nevron is a pretty good system to get started with. It's inexpensive, has a relatively short learning curve, and if you happen to outgrow it, you can move up to a more advanced system like iPi Mocap Studio 2 or Optitrack.

For me, Nevron is going to be a nice companion system to my iPi Mocap Studio 2 system because it allows for alternative ways to use the capture data (you can pipe the data into any animateable feature in Layout), plus it has basic face capture.

G.

jasonwestmas
08-08-2013, 08:24 PM
Totally depends on your character designs and what is involved there with your RIGGING setup. So really, the difficulty level is left totally to you and what you want to see in your deformations as he/she/it moves.

cresshead
08-10-2013, 04:30 PM
Depends what you define as "motion capture"... what exactly are you looking to "capture",
such as whole body, mouth, eyes, whole face, head and neck, hands and fingers...

You can motion capture with a game controller to puppet a character, using an Xbox 360 controller or the Sony Move wands... and/or use a Kinect or a 6-camera system.
Lots of options/requirements.

Also, some software allows you to capture using MIDI connections to hardware such as synths, with their multitude of real controllers such as faders, velocity/aftertouch keys, knobs
and buttons, in which you can 'play' the character in real time.

And there's the other option of 'performance capture', with cameras driving blend shapes and face bone rigs.

Ryan Roye
08-10-2013, 06:18 PM
If you just want to edit, clean up, re-target, and re-use mocap on characters, you really don't need much more than IKBooster in most cases. Save programs like Nevron for the process of actual capturing and other stuff best suited to people who work with mocap on a daily basis.


http://www.youtube.com/watch?v=XmAnZssAwxo&feature=youtu.be

Can you believe we've had these tools for 10+ years? :D

geo_n
08-11-2013, 08:32 AM
How good is IKB for mocap retargeting? I watched most of the IKB mocap tuts and it looks rudimentary. Ten years underdeveloped; this could have been something good.

Ryan Roye
08-11-2013, 08:58 AM
How good is IKB for mocap retargeting? I watched most of the IKB mocap tuts and it looks rudimentary. Ten years underdeveloped; this could have been something good.

IKB mocap retargeting (in reference to position elements) is not fully automatic, it is achieved by calculating the rotation movement of the character's limbs and offsetting the parent object by the amount it moves. The nice thing here is that once the process is completed you can re-use the animation in Lightwave anywhere and the character will move relative to their current position in 3d space. Editing is achieved by auto apply with appropriate keyframe falloff settings (the tool I used in the video), but it requires a few other little things for it to work smoothly. Not hard to do, but hard to figure out without the benefit of documentation due to IKB's heavy emphasis on context-sensitive commands. Map motion 2 can be paired with IKB to achieve bone assignments for transfer between different rigs so long as their naming conventions are the same. As I have said before, it does not replace more specialized tools... but if you aren't working with mocap on a regular basis or simply don't want to spend your bucks on camera equipment and software, IKB + map motion handles it pretty efficiently.

geo_n
08-11-2013, 06:07 PM
Will that be on your commercial training? I'm looking forward to it.

Ryan Roye
08-11-2013, 06:53 PM
Will that be on your commercial training? I'm looking forward to it.

Yes, and that's just the tip of the iceberg. I will have more info as production nears completion; the production quality obviously needs to be higher than with my free tutorial content. Because I want people to know exactly what they are getting, I will outline *every* topic I cover for people to see before purchasing so they can know if the tutorial is right for them.

erikals
08-12-2013, 03:41 AM
also see > IKBooster vs NevronMotion (http://forums.newtek.com/showthread.php?136720-IKBooster-vs-NevronMotion)

Ryan, that's great! any estimated release date?

Ryan Roye
08-12-2013, 07:12 AM
Ryan, that's great! any estimated release date?

I'm shooting for an October release. If it gets done sooner, then maybe September.

Rollie Hudson
08-12-2013, 04:46 PM
All great info in this thread - but, suffice it to say, as an early NevronMotion adopter and long-time animator - it's a world of hurt and be very, very afraid. ;-)
That being said, successful taming of mocap data is one of the most powerful aspects of 3D animation and once "mastered," as it were, the sky's the limit for really, really cool character animation! The advice I give myself is simply to keep working at understanding it, rigging, re-targeting, mocap file formats, etc. And to enter this arena with lots of rests and breaks... The great (animators) out here often make it look easy... but I think for many of the rest of us it's a minefield. Just keep your eyes on the prize.

rednova
08-12-2013, 05:09 PM
Dear friends:
I have good news. I only want mocap for fun and personal projects - non-commercial. Only for personal
enjoyment. So I guess this new Nevron thing... if it's more affordable and available... would probably be
good enough for my personal projects.
Thank you !!!

Surrealist.
08-12-2013, 06:45 PM
Something to keep in mind.

Cutting. Editing, basically. Stringing together mocap takes so they're seamless is probably one of the most challenging parts, unless you use MotionBuilder, in which case it is fairly simple. But still something to master.

But especially if you are just playing with it why not open up the world of editing to your palette? By matching action from different camera angles you can string together a series of motions into an action sequence without having to have it all strung together in one motion. I did this using Blender a while back for a client and it worked a charm.

Megalodon2.0
08-12-2013, 07:06 PM
But especially if you are just playing with it why not open up the world of editing to your palette? By matching action from different camera angles you can string together a series of motions into an action sequence without having to have it all strung together in one motion. I did this using Blender a while back for a client and it worked a charm.

This is exactly what we do - not with Blender though, but with Sony Vegas. We use an 8 camera Optitrack setup to acquire specific motions and then edit the final images in Vegas. We have Motion Builder, but use it ONLY for retargeting and getting the data back into LW. We just purchased Nevron to bypass MB and work JUST within LW. Since we only use MB for retargeting, hopefully Nevron will simplify the process. But I think that this would be FAR easier than using MB to string together various motions into one seamless take. Besides... one of the best things in filmmaking is the editing. :)

erikals
08-13-2013, 06:46 AM
Didn't the architect-render guy at SIGGRAPH use LightWave's Motion Mixer on the movie he was working on to blend motions?
I haven't tried Motion Mixer much myself, but it was claimed to be a piece of cake as far as I recall.
This was a low-budget movie though.

My skepticism basically targets the transitions: would they be smooth?
Is there any way to blend motions nicely in Motion Mixer? (curvy transitions)

Jimmy|Rig Pro is another alternative, currently in beta though (there's a link in my signature).

geo_n
08-13-2013, 07:14 AM
You can transition motions with MotionMixer. From a walk to a run, a swing to a punch. It will look awkward if the motions don't blend well. The problem is it's baked animation. Better to have animation layers that are completely editable per layer.

erikals
08-13-2013, 07:47 AM
as for fixing the transitions, alternatives might be >

-merge the two bvh files in notepad (http://forum.maratis3d.com/viewtopic.php?pid=4598#p4598)
the transition between the two clips can then be tweaked in IKB... (a rough script sketch of this merge is below, after the video)

-merge the two bvh files in Jimmy|Rig, i believe JR itself can create a smooth transition.

(edit, yep)

http://www.youtube.com/watch?v=exLXI09rYPk
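
Since a BVH file is just plain text (a HIERARCHY section followed by a MOTION block of frame rows), the Notepad merge linked above can also be scripted. Below is a minimal Python sketch, assuming both clips share the same skeleton and channel order; the file names and function are placeholders, not any particular tool:

    def merge_bvh(path_a, path_b, out_path):
        """Append the frames of clip B onto clip A (both clips must share the same skeleton)."""
        def split(path):
            lines = open(path).read().splitlines()
            m = next(i for i, l in enumerate(lines) if l.strip() == "MOTION")
            # lines[m+1] = "Frames: N", lines[m+2] = "Frame Time: ...", frame rows follow
            frames = [l for l in lines[m + 3:] if l.strip()]
            return lines[:m], lines[m + 2], frames
        header_a, frame_time_a, frames_a = split(path_a)
        _, _, frames_b = split(path_b)
        merged = frames_a + frames_b
        with open(out_path, "w") as f:
            f.write("\n".join(header_a) + "\nMOTION\n")
            f.write("Frames: %d\n%s\n" % (len(merged), frame_time_a))
            f.write("\n".join(merged) + "\n")

    # merge_bvh("walk.bvh", "run.bvh", "walk_then_run.bvh")  # file names are placeholders

The seam will still pop unless the last pose of clip A roughly matches the first pose of clip B, which is where the IKB tweaking mentioned above comes in.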

erikals
08-13-2013, 08:22 AM
it's possible to fix motions, but if you wanna take it further, you can always import it to IKB where you can make perfect, 100% non-sliding, sticking motions. in LightWave (and IKB) you can also add dynamics.


http://www.youtube.com/watch?v=bFA9LJx7OfU

Greenlaw
08-13-2013, 09:56 AM
At the Box, we used Motion Mixer way back when we did our first two Call of Duty commercials (2005) and also one of the Quaker State commercials with the oil horses (2006?). In both cases, we captured 'generic' motions related to the subject matter (combat motions and horse behavior, respectively,) and Motion Mixer worked out great with our data. We mainly used MM to re-time the characters but also to blend motions (for example, 'running' to 'dive' for soldiers, or changing gait for horses.) Later on, we switched to Maya for CA so Motion Mixer was rarely used again in the Box.

At Little Green Dog, we dabbled with Motion Mixer at the beginning of our own mocap productions but eventually switched to Motion Builder because we seriously needed its retargeting features for our 'non-human' proportioned characters, and since Motion Builder features a similar non-linear editor called Story Editor, we switched over to that for our re-timing/mixing needs. In Brudders 2, Story Editor is mainly for re-timing to make sure the motions stay in sync with the music track. If I was still doing this in LightWave, I'm sure MM would work the same way.

TBH, I'm not fully convinced that Nevron or IKB will allow me to retarget data to the 'Brudders' proportioned characters, especially the cats, in the same way Motion Builder does--at least not as quickly or easily. The Brudders characters have very long torsos, short legs and a very wide stance--when I retarget in MB, everything is adjusted to fit the character automatically and I rarely get any slipping or sliding in their gait. (Except for the tiny bit that might exist in the original data.) I'll have to give Nevron's retargeting a try as soon as I get a chance but I have a feeling it's mainly intended for characters that are much closer to normal human body proportions.

The other big reason I use Motion Builder is for its mostly non-destructive system of layering motions and its flexible dynamic 'parenting' and other constraints. This has been a god-send for the shots where the characters are handling musical instruments--in the excerpt (https://vimeo.com/channels/littlegreendog/68543424), for example, in a sequence of shots, Sergeant holds a harmonica in his left paw so it's initially constrained to the paw; then his other paw is constrained to the harmonica as he raises it; then, the harmonica is constrained to his 'lips' when it touches them, and both paws become constrained to the harmonica; finally, his right paw is released from the harmonica and becomes fully driven by the mocap again. To maintain continuity, this was done as one continuous animation even though it's broken into several shots. Admittedly, this was not exactly 'easy' to do in MB but because I could create non-destructive takes and layers, I was free to experiment with different methods--so by comparison, MB made this task much easier to do than if I had tried it in LightWave.

I'm not saying everybody should switch to Motion Builder--just saying there is a good reason MB has continued to be the standard bearer for this type of work, even though AutoDesk has barely updated it since they acquired it back in version 7, and I'm very glad to have access to this tool.

But I do feel Nevron and Genoma are very good, promising steps in the right direction, though there's still a lot of work to be done. The day may actually arrive when I can give up on MB and do it all in LightWave. :)

G.

P.S., Some may be wondering since I have MB, why I'm interested in giving it up. Two reasons: maintenance is expensive and, as mentioned above, it's hardly been updated since AD bought it. Like many users, I feel like AD is letting MB die a slow death while they cannibalize choice pieces of its code and GUI for Maya.

jasonwestmas
08-13-2013, 10:21 AM
I would hope that MB would perform to the same level as its price tag. :) I'm finding a lot of these conversations are just about how much time and money someone is willing to spend on software and tutorials in order to achieve one's vision. I guess that's a good thing to discuss.

Personally I don't think AD is cannibalizing ALL of their software. To me, putting ALL of MB into Maya would be like trying to do ALL modeling in Layout. Eww.

geo_n
08-13-2013, 10:55 AM
it's possible to fix motions, but if you wanna take it further, you can always import it to IKB where you can make perfect, 100% non-sliding, sticking motions. in LightWave (and IKB) you can also add dynamics.


http://www.youtube.com/watch?v=bFA9LJx7OfU

Hopefully Nevron becomes something like Character Studio. Gazillions of animated characters are based on that template.
There would be fewer workarounds and no jumping through hoops. Plus the bonus of not using expensive MoBu. We never needed MoBu since CS does a pretty good job even for multiple-character mocap stuff. If it's good enough for Blur, it's good enough for many, many studios.

Greenlaw
08-13-2013, 10:57 AM
@ jasonwestmas, You're right, that was something I neglected to mention--the cost of the software. We got MB cheaply because we had access to an educational discount (at the time the entire suite of AD tools was something like $400 or $500 for the ed license.) At some point, we're going to need a commercial license though and that might prove prohibitive for us, so we're considering alternatives.

There are a few good alternatives out there but nothing as complete as MB for mocap work. To be fair though, MB is far from complete as a 3D program--no modeling and, believe it or not, its 'basic' rigging tools are actually very limited even compared to LW. Motion Builder's primary super power is retargeting and mocap editing, and I think most users would consider it a specialist tool. (Like ZBrush or RealFlow.)

G.

Edit: From what I've seen, one exception might be Max--but I have no personal experience with Max so I left that out of the above.

Curly_01
08-13-2013, 11:02 AM
I would hope that MB would perform to the same level as its price tag. :) I'm finding a lot of these conversations are just about how much time and money someone is willing to spend on software and tutorials in order to achieve one's vision. I guess that's a good thing to discuss.

Personally I don't think AD is cannibalizing ALL of their software. To me, putting ALL of MB into Maya would be like trying to do ALL modeling in Layout. Eww.


I would use Reallusion iClone. It also has built-in walk generators etc... You can export FBX, BVH, and mix motions. It's cheaper than MotionBuilder. I would use it with NevronMotion: export an FBX/BVH file from it and link it with NevronMotion on a LightWave rig. You could also use Poser or Ikinema. In the future I'm going to use iClone with NevronMotion. I think that is the fastest workflow. There is a guy on the internet called Truebones who has a ton of mocap CDs that are not so expensive. I think you don't need expensive software to use mocap.

geo_n
08-13-2013, 11:15 AM
I would use Reallusion iClone. It also has built-in walk generators etc... You can export FBX, BVH, and mix motions. It's cheaper than MotionBuilder. I would use it with NevronMotion: export an FBX/BVH file from it and link it with NevronMotion on a LightWave rig. You could also use Poser or Ikinema. In the future I'm going to use iClone with NevronMotion. I think that is the fastest workflow. There is a guy on the internet called Truebones who has a ton of mocap CDs that are not so expensive. I think you don't need expensive software to use mocap.

The problem with this kind of workflow is it might be the same as the iPi-to-LW workflow, which is a big pain even when dealing with one character:
importing FBX files that generate temp LightWave scene and object files per motion clip. What a mess. Then you have a master LW file with one rigged character, where you Load Items from Scene (motion only) from the generated LightWave FBX scene, making sure the LW FBX has the same hierarchy as the master LW character rig or else it fails. Too many steps, workarounds and temp files for one mocap clip. And you do this FBX import (motion only) for every character, in separate LW master scenes, one per character.
Better with a future Nevron: click rigged character, apply mocap data, done. Click next rigged character, apply mocap data, done.

jasonwestmas
08-13-2013, 11:51 AM
Yeah, I don't understand the advantage of using iClone with NevronMotion. I think I'd use just one set of tools for generating and editing mocap, not two or more. Maybe a second for slight tweaking but that's it.

Greenlaw
08-13-2013, 12:43 PM
Yeah, I don't understand the advantage of using iClone with NevronMotion. I think I'd use just one set of tools for generating and editing mocap, not two or more. Maybe a second for slight tweaking but that's it.

The 'advantage' may be that iClone is a complete system, including retargeting, retiming and mixing. I'm not sure how good it is and I don't know how iClone's retargeting capabilities compare, but in some demos I've seen it appears to handle 'non-human' proportions fairly well. At Little Green Dog, we have a full license of iClone with 3DXchange Pipeline (the version you need for moving data in and out of LightWave), but we haven't had time to really test it out yet. I'll be sure to post a review whenever I can get to it.

LightWave and Nevron Motion (along with Genoma, Motion Mixer, Morph Mixer and IKB) is starting to get all the pieces in the package but it's not yet a cohesive system--what we have is a whole bunch of separate utilities with differing GUIs and workflows. Hopefully, we'll get a system that's more uniform and complete when version 12 rolls in--we'll have to wait and see.

Again, I'm not trying to discourage anyone from using Nevron Motion--it's actually quite cool and, if you're already a LightWave user, it's inexpensive and it's the only way to record mocap directly into LightWave, and the easiest way to retarget the data to a character (again, in LightWave.) And to be fair, Nevron Motion is technically in public beta at this time--who knows how much it may improve over the next few releases? ;)

G.

erikals
08-13-2013, 01:30 PM
totally forgot this video, Bvh and IKB >


http://www.youtube.com/watch?v=uGRqrydOmV0


more info >
http://forums.newtek.com/showthread.php?135486-Tutorial-Motion-Capture-witk-Ik_booster!-%28with-test-files%29

erikals
08-13-2013, 03:15 PM
combining 2 Bvh files in Notepad >
http://forum.maratis3d.com/viewtopic.php?id=59

Surrealist.
08-13-2013, 06:22 PM
That reminds me I had this tutorial I did for 3D Artist magazine a while ago and they offered a free version online:

http://www.3dartistonline.com/news/2011/03/free-blender-tutorial/

It is not "true" re-targeting, but it does a very good job for most things. Some of this can translate to LightWave.

erikals
08-13-2013, 06:33 PM
nice, how do the two files, urm... "blend" though...?

is it static or can the transition be edited to look smooth? (ease in / ease out)

Surrealist.
08-13-2013, 07:00 PM
Nothing in the tutorial about Blending files. Just basic re-targeting and set up including bvhacker.

Personally I'm not interested in blending hacks as it always looks odd. If I am left with no tools to do it right, I'd opt for editing. For that I simply orient the character on the set so that it is facing the right direction and pick a point in the animation to cut on action. Liberal use of cutaways, reverse angles, changing the camera angle drastically enough and so on. Basically, use the "Five C's of Cinematography" (http://www.freshdv.com/2012/05/five-cs-of-cinematography.html) to your advantage.

It has always been my contention since I started making independent films that the creative use of film technique will save you more money/time than any piece of equipment or tool.

Ryan Roye
08-13-2013, 07:11 PM
combining 2 Bvh files in Notepad >
http://forum.maratis3d.com/viewtopic.php?id=59

... it really surprises me just how many 3D formats just use plain ol' text that's accessible with such simple tools. That's pretty slick!!


Alright... here's something else to add to this topic; I think some of you might find it neat. This comes from my IKB experiment archives and it is all thanks to Larry for catalyzing this workflow, which I now use on the Delura webseries. I didn't aim for perfect results; it's just something I cobbled together in 30 minutes.

http://www.delura.tanadrine.com/image_manualupload/ikbmocaptest.jpg

http://www.delura.tanadrine.com/image_manualupload/IKB_mocapexample.zip

(after scaling, you may need to click the 3d window while in IKB mode for it to update. Real-time feedback is achieved by nudging any item once after loading the file. I don't know why, but this is required to initiate IKB's force refresh properties)

So... in a nutshell:

ADVANTAGES:

- Native! No other tools needed!
- Results in extremely re-usable, mixable, deployable mocap animations that can be modified and adapted in any way desired. Wanna make your mocap run down a flight of spiral stairs or up a ramp? You can do that.
- You can turn, twist, scale, etc. and it will work (if you know the proper apply keys workflow).

DISADVANTAGES:

- Slower process than other commercial solutions (i.e. Nevron). Requires IKBind applications for each step. You can take a shortcut and use/bake motion-modified nulls to automate a large portion of the process, but optimal, zero-foot-sliding results require IKBinding between steps.
- Mostly destructive workflow, especially after IKBind is disabled and one starts doing apply keys edits (for parts of the character that are interacting with something; usually the ground). This is because the IKBind and "Fix" functions clash when used at the same time to control the same areas of the character.

LW_Will
08-13-2013, 09:13 PM
I would hope that MB would perform to the same level as its price tag. :)

Yeah, you would think that, wouldn't you? :-\

Surrealist.
08-13-2013, 11:29 PM
It does. :)

And then some.

geo_n
08-13-2013, 11:42 PM
Alright... here's something else to add to this topic; I think some of you might find it neat. This comes from my IKB experiment archives and it is all thanks to Larry for catalyzing this workflow, which I now use on the Delura webseries. I didn't aim for perfect results; it's just something I cobbled together in 30 minutes.


So you will use the BVH rig itself to drive your character's animations? I saw Larry do it the same way in his videos with IKB. He would fit the BVH rig into a character and save it as a template to import other BVH motions later on.
What if you wanted to use a different rig but use the motions from the BVH?

CaptainMarlowe
08-14-2013, 12:08 AM
To blend motions, I use the dead but still good Animeeple, then export the final BVH and import it into LightWave, before retargeting with Nevron Motion. It works quite well, at least for my needs.

Ryan Roye
08-14-2013, 07:39 AM
What if you wanted to use a different rig but use the motions from the BVH?

So long as the mocap samples one takes from the internet use the same skeleton, these steps would allow it:

-Re-name all of the bones to match the bone names of another, different rig. If LScript Commander is used, this process only ever has to be done once, then it's a 1-click process. (A rough scripted sketch of this renaming step follows after the list.)

-Use Map Motion 2 to transfer the animation.

-Adjust rotations to match the target rig's rest rotation. Again, this only ever has to be done once per rig template. As soon as one knows the difference between the rotational values, one can use negative keyframe space and do a simple copy/paste/apply keys (flat) operation onto a saved "correctional" pose, and that will correct all of the rotational values in the entire rig at once.
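
For illustration, here is what the renaming step might look like if scripted outside LightWave (a rough Python sketch; the bone names in the mapping are hypothetical placeholders, not any particular rig's convention):

    import re

    # Hypothetical name mapping: BVH bone name -> target rig's bone name.
    BONE_MAP = {
        "Hips": "root",
        "LeftUpLeg": "l_thigh",
        "LeftLeg": "l_shin",
        "LeftFoot": "l_foot",
        # ...extend for the rest of the skeleton
    }

    def rename_bvh_bones(in_path, out_path, bone_map=BONE_MAP):
        """Rewrite ROOT/JOINT names in a BVH so they match another rig's naming convention."""
        out_lines = []
        for line in open(in_path):
            m = re.match(r"(\s*)(ROOT|JOINT)\s+(\S+)", line)
            if m:
                indent, kind, name = m.groups()
                line = "%s%s %s\n" % (indent, kind, bone_map.get(name, name))
            out_lines.append(line)
        open(out_path, "w").writelines(out_lines)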

Ryan Roye
08-14-2013, 10:29 PM
It is also worth noting that point-at-target constraints can entirely eliminate the need to even worry about how the axes of the bones in the two rigs differ. A simple LScript Commander macro combined with Matt's Assign tools pack can allow two entirely different rigs to be paired together (if your character's legs rotate on the pitch axis, and the mocap legs rotate on heading, no problemo)... because all that matters is what the bone is pointing at. "Target" bones can be constructed to adjust the length to prevent all flipping.

...Matt has really enabled so many possibilities with his script. :bowdown:

EDIT:

Rough test from my archives. Note that my custom rig is simply pointing at and following the mocap rig.


http://youtu.be/ewsJIMiaLQg
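
For anyone wondering what "all that matters is what the bone is pointing at" boils down to, here is a hedged Python/NumPy sketch of the aim idea: build a rotation whose forward axis points from the bone toward its target, regardless of how the two rigs' local axes differ. This is only the underlying math, not what Matt's Assign tools or LScript Commander do internally.

    import numpy as np

    def aim_rotation(bone_pos, target_pos, up=(0.0, 1.0, 0.0)):
        """3x3 rotation whose +Z column points from bone_pos toward target_pos.
        'up' is a world hint used to resolve roll; it must not be parallel to the aim direction."""
        fwd = np.asarray(target_pos, float) - np.asarray(bone_pos, float)
        fwd /= np.linalg.norm(fwd)
        right = np.cross(np.asarray(up, float), fwd)
        right /= np.linalg.norm(right)
        new_up = np.cross(fwd, right)
        return np.column_stack([right, new_up, fwd])  # columns are the bone's X, Y, Z axes

    # e.g. a thigh joint at the hip aiming at the mocap rig's knee target:
    # R = aim_rotation((0.1, 0.9, 0.0), (0.12, 0.45, 0.05))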

Ryan Roye
08-15-2013, 09:22 AM
Another point at target example. I took some JimmyRig mocap I received from Ramirez and tried it on one of my characters.


http://youtu.be/rn2lrLTzQ7c

erikals
08-15-2013, 10:37 AM
Looks good. I notice the upper arm of the final character seems a bit short, is this correct?

Hope to see this stuff explained more in the video tutorial series...

Ryan Roye
08-15-2013, 11:02 AM
Looks good. I notice the upper arm of the final character seems a bit short, is this correct?

An IKB load pose + apply keys operation is required for optimal results which, as I said earlier, only has to be done once per character. Dijard is a long-legged, long-armed and long-bodied individual, so that's why he looks kinda scrunched into that rig. Don't worry, he has health insurance :D

But yeah it'll all get explained in the IKB tutorial. I think it will be extremely handy to a lot of people to know how to do this stuff natively in Lightwave even if they do own specialized mocap software. I am willing to bet the processes described could be combined with Nevron to, for example, paste in animations relative to the character's current position by just loading an IKB motion file. Nevron could prep the skeleton automatically, and then IKB could make it easier to re-use. That's speculation though... I definitely wouldn't mind seeing more Nevron tutorial content from its users just to see exactly how it works.

erikals
08-16-2013, 01:06 PM
...can't help but think, there must be an easy way to blend 2 bvh files nicely and easily together in LightWave. (after merging the 2 bvh files in Notepad first...)

LW_Will
08-16-2013, 02:21 PM
...can't help but think, there must be an easy way to blend 2 bvh files nicely and easily together in LightWave. (after merging the 2 bvh files in Notepad first...)

Motion Mixer.

erikals
08-16-2013, 03:10 PM
yeah? haven't used it, so it can blend motions fluidly?...

edit: cool, i see now MotionMixer can do it.
when in Layout, press F1 to get the help file, search for "MotionMixer" and read "Blending Motions with Transitions"

that would mean that i'd have to convert the 2 bvh files to 2 hmot files i guess...

this is great, didn't know LightWave with MotionMixer and IKBooster could kick it... http://erikalstad.com/backup/misc.php_files/035.gif

Ryan Roye
08-16-2013, 03:43 PM
yeah? haven't used it, so it can blend motions fluidly?...

I've used Motion Mixer to blend animations; it works, and it is the only interactive way to blend motions in LightWave that I know of... most other methods are "iterative" (i.e. loading a motion file to see the blended result). I think the main thing to get used to is balancing the weight and transition values so that things line up properly. Also, the offset editor has to be used in order to make animation relative to a previous clip.

erikals
08-16-2013, 04:02 PM
what i wonder though, once the MotionMixer files are blended, can i then tweak / overwrite that mocap animation with IKBooster ?

or do i for some reason need to tweak the mocap with IKB first, then blend the motions in MotionMixer... (?)

Ryan Roye
08-16-2013, 07:13 PM
what i wonder though, once the MotionMixer files are blended, can i then tweak / overwrite that mocap animation with IKBooster ?

or do i for some reason need to tweak the mocap with IKB first, then blend the motions in MotionMixer... (?)

It really depends on the workflow someone wants to use... there are tons of possibilities on that front. I prefer sticking with IKBooster for everything, it's just a lot faster and I am never not in total control of the character that way.

Motion Mixer does blend things fairly well... however one needs to understand that having a standardized starting position/rotation for all animations is ABSOLUTELY VITAL to getting usable, clean results. If you take two mocap animations with differing start rotations/positions, Motion Mixer will yield nothing but frustration. So, you can either use the graph editor to adjust the mocap's starting position/rotation before loading it into motion mixer, or you can just move the top item to 0,0,0 HPB/XYZ, quantize keys in GE, and then run an apply keys operation to put that change into the entire animation.

I whipped up this little animation using my two test mocap samples from earlier... only this time the motions are blended via motion mixer. I should note that getting very seamless and slide-free transitions of the character's feet can be very difficult with motion mixer.

Mocap playback blending with motion mixer: 116463
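
To make the "standardized starting position" point above concrete outside the Graph Editor, here is a small Python sketch that offsets a BVH clip so the root starts at X=0, Z=0 (height left alone). It assumes the root's first three channels are Xposition/Yposition/Zposition, which is common but not guaranteed, and it ignores the starting rotation, which would also need zeroing for Motion Mixer to line clips up cleanly.

    def zero_start_xz(in_path, out_path):
        """Offset every frame so the root's first-frame X/Z position becomes 0 (Y left alone)."""
        lines = open(in_path).read().splitlines()
        m = next(i for i, l in enumerate(lines) if l.strip() == "MOTION")
        first = lines[m + 3].split()                  # first frame row
        x0, z0 = float(first[0]), float(first[2])
        for i in range(m + 3, len(lines)):
            if not lines[i].strip():
                continue
            v = lines[i].split()
            v[0] = "%.6f" % (float(v[0]) - x0)
            v[2] = "%.6f" % (float(v[2]) - z0)
            lines[i] = " ".join(v)
        open(out_path, "w").write("\n".join(lines) + "\n")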

Ryan Roye
08-16-2013, 07:56 PM
Extra random note:

You can mirror mocap animations with the Item Target method by childing the original mocap to a null and setting one of that null's axis scale values to -1. Because the bones are just pointing at their targets, it is irrelevant that the mocap's bones end up inverted.
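
In data terms, the scale -1 trick just negates one world axis of every target position; because the bones only aim at those points, the handedness of the source skeleton doesn't matter. A tiny Python sketch of the idea (the array layout and function name are only for illustration):

    import numpy as np

    def mirror_targets(positions, axis=0):
        """Mirror an array of target positions (..., 3) across one world axis.
        axis=0 flips left/right, like parenting the mocap to a null scaled -1 in X."""
        flipped = np.array(positions, dtype=float)
        flipped[..., axis] *= -1.0
        return flipped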

Greenlaw
08-16-2013, 10:05 PM
Yes, as mentioned earlier, Motion Mixer works quite well for, uh, mixing motions. When I was with the Box we used it on the Call of Duty spots and the Quaker State oil horses a few years ago. :)

Motion Mixer is similar to Motion Builder's Story Editor. You have a non-linear timeline where you can place motion 'clips' of your motions. You blend them by overlapping the clips and using transitions, and you can compress and stretch the timing if you wish--it's a lot like editing video but with animation.

G.

erikals
08-17-2013, 06:05 AM
thanks for the info http://erikalstad.com/backup/misc.php_files/smile.gif saved!
this makes things more interesting.

Greenlaw, did you ever consider the MotionMixer + IKBooster approach... ?

extra note, here's a preview of the Keystrainer plugin (32-bit),
might be of use for reducing mocap keyframes...


http://www.youtube.com/watch?v=kv9-EaYcmy8

Ryan Roye
08-17-2013, 08:42 AM
Dodgy posted this a year back. It isn't mocap, but the concept can be applied to it.


http://www.youtube.com/watch?v=1VRHXxSVr_M

geo_n
08-17-2013, 08:48 AM
what i wonder though, once the MotionMixer files are blended, can i then tweak / overwrite that mocap animation with IKBooster ?

or do i for some reason need to tweak the mocap with IKB first, then blend the motions in MotionMixer... (?)

AFAIK, since it's been ages since I've used MotionMixer, you can't overwrite the animation, only the timing (in/out points, speed, stretch). It's not like animation layers where keys can be added to the rig. Genoma has a very basic form of adding one layer of animation on top of mocap.

Ryan Roye
08-17-2013, 10:11 AM
AFAIK, since it's been ages since I've used MotionMixer, you can't overwrite the animation, only the timing (in/out points, speed, stretch).

Though I'd prefer a more convenient solution personally, there are a few options to get around this:

1) You can use the edit mode to change clips individually (if you want to make an alternate of an existing clip, make a new one first). Note that clip editing happens at the beginning of the timeline where the original animation was done, not at the timespan in which the motion mixer clip was used.

2) Use MF Motion Baker when manual control is needed. Because of how relative positioning works, you will need to bake the entire span of motions that happen to be linked together. I don't recommend Motion Mixer's baker because it destroys all other animation in the timeline. This of course should be done after you are satisfied with the placement of the Motion Mixer items and don't intend to make any additional changes there.

There are a lot of improvements that could be made to Motion Mixer to make it a more robust tool... I believe Newtek purchased it from a 3rd party developer though so I have doubts that it'll ever get the updates it really needs.

Ryan Roye
08-17-2013, 01:19 PM
While we're on the subject of IKB + Motion Mixer, I think this plugin from Dodgy is highly relevant:

http://www.mikegreen.name/Files/MGMotionMixerControlGN.zip

It toggles motion mixer on/off to allow for IKB editing.

Greenlaw
08-17-2013, 03:01 PM
thanks for the info http://erikalstad.com/backup/misc.php_files/smile.gif saved!
this makes things more interesting.

Greenlaw, did you ever consider the MotionMixer + IKBooster approach... ?

I did several years ago (around 2008-2009) when I was just getting started with 'homebrew' motion capture and using LightWave 9.6.x, but at the time I found LightWave's 'mocap' support inadequate for my needs. Back then, our biggest problem was with LightWave 9.6's poor FBX support and LW's dependence on too many unsupported plug-ins. I wound up going with Motion Builder, which really paid off when LightWave 10.1 brought out the much improved FBX I/O we have today. (In fact, I nearly left LightWave during the 9.6 cycle and Core, but the release of 10.1 restored my faith.)

Since that time, I grew comfortable with Motion Builder. MB's 'drag and drop' system makes retargeting trivial work even for characters with grotesque body proportions, and the non-destructive mocap editing workflow is wonderful to use--animation layers, multiple Takes, and a huge variety of 'drag and drop' constraints altogether invite experimentation and exploration. If there is one issue that may make me switch to another mocap retargeting/editing system it's the cost of MB's commercial license (we currently have an educational license). When the time comes, I'm not 100% certain we'll upgrade, it really depends on how well we're doing at the time. If we do make the switch, it's more likely that we'll start using Ikinema Webanimate or possibly iClone with 3DXchange Pipeline. (That is, assuming iClone/3DXP works as well as I hope--this software is still untested by us and we won't really have time for it until after 'B2' is finished.)

I'm very glad to see LW3DG focused on developing a cohesive character rigging/animation/mocap pipeline for LightWave, but I think LightWave alone still has a bit of catching up to do to seriously compete with our current iPi-to-MB-to-LW workflow. But having seen LW3DG's recent progress (FBX I/O, Nevron Motion, Genoma,) who knows? When the time comes, maybe LightWave will be ready for us.

Just my two-cents. Others may feel differently of course. :)

G.

geo_n
08-17-2013, 10:38 PM
Btw, iPi has retargeting. It's possible to import any rig, like in MotionBuilder. iPi also has keyframing and basic animation tools. So for people looking for a cheap MoBu that can use different rigs, iPi is pretty good. You just have to tolerate the LW FBX workflow, which could be a lot better. Nevron, AFAIK, can only retarget to its own rig in V1, so it's limited. But it's real-time capture.

Greenlaw
08-18-2013, 12:58 PM
Yes, that's correct--iPi Mocap Studio 2 has its own easy-to-use retargeting system. After capture, simply use Import Target Character with your own character rig in an FBX and then export the retargeted character to a new FBX for LightWave. This part of the process is straightforward, assuming you followed the expected bone naming conventions and hierarchy. If not, you'll need to create a custom template to associate the bones. (IMO, it's easier to just stick with the standard conventions.) After exporting the new FBX, use MOME in LightWave to transfer the retargeted motion data from the FBX to your final character rig in LightWave.

What iPi MS2 can't do is keyframe editing that significantly changes the mocap. You can guide the tracking system to fix tracking errors on the iPi Actor rig but you can't make big animation changes for the final character rig, like fixing penetration issues caused by differing body proportions. For this level of editing you need an animation program like LightWave or Motion Builder.

In LightWave, you can make these changes more easily if you can transfer the motion data from the iPi MS2 FBX file to a rig in Layout (via MOME) that has a layered structure for offsetting rotations and IK controls that can override the motion entirely (like Rhiggit's mocap rig for example,) or use IKB as described by chazriker.

I prefer Motion Builder because, IMO, it's less 'fiddly' and better suited for working with extreme cases (like the cats in the 'Brudders' shorts.) Also, retargeting in Motion Builder is a bit more sophisticated than what's found in iPi MS2 and Nevron Motion.

I'm not sure how well iPi MS2's retargeting compares to Nevron Motion's retargeting--my guess is that they're similar but I really don't know. If I ever find the time, I'll check it out.

I'm actually working on a short video that demonstrates our iPi-MB-LW workflow used for Brudders. Will post it in a couple days.

G.

geo_n
08-18-2013, 09:20 PM
Yeah, iPi is really a good solution and a cheap one that will make MoBu less necessary, especially if the host app already has good native mocap tools. Using iPi with 3ds Max means direct loading of BIP files, which is Character Studio's mocap data. Super easy workflow: no mess, no temp files, no workarounds.
Maybe the LW3D Group can create a template for a Nevron or Genoma rig that iPi can export to directly, similar to BIP files. No more tedious FBX workflow. That way people have the option of a mature dual-Kinect mocap system with iPi. We are getting pro-quality mocap from dual Kinect and 4 PS Eyes using iPi. Before, we had to go to a mocap studio with actors to get stuff done. Now it's doable with "home" setups.

Greenlaw
08-18-2013, 09:56 PM
...No more tedious FBX workflow.

I'm not sure what you mean by 'tedious'. The FBX mocap workflow is to open your rigged character scene in Layout, select Load Items from Scene, choose the FBX with MOME on, click Okay and--voila!--the motion is applied to your character rig. Basically, it only involves opening two files in LightWave. Hardly tedious, IMO.

Naturally, this assumes you've retargeted the motion in iPi Mocap Studio or a program like Motion Builder first. I guess that's the 'messy' part if you don't have an easy way to do true retargeting (i.e., drag and drop in MB or just loading and saving a file in iPi.)

Nevron Motion is supposed to bring retargeting in LightWave but I haven't tried it yet to know how well it works. I hope to spend some time with it soon but just a little too busy right now.

G.

erikals
08-18-2013, 11:26 PM
yeah, i guess geo_n might be talking about the retargeting part...

Greenlaw
08-18-2013, 11:58 PM
Yes, but if you have iPi MS2, you can retarget there, so it should be fairly direct to work with LightWave right now. However, I have to confess, I haven't tried that workflow myself--I've only gone the iPi-MB-LW route described earlier because that's our established workflow and I don't want to experiment with alternative workflows until after we finish our current animation production. As they say, if it ain't broke, don't fix it. Not in the middle of a project anyway.

What Nevron Motion gives you is the option to retarget and edit imported mocap within LightWave, so it should be even more direct for working with any mocap system (especially the native Kinect option.) Assuming Nevron Motion works as advertised, LightWave users have this capability now. Again, I haven't personally experimented with the iPi MS2-to-Nevron Motion workflow yet. We're pretty busy right now so I'm sure somebody else will get to it long before me. :p

G.

geo_n
08-19-2013, 12:57 AM
I'm not sure what you mean by 'tedious'. The FBX mocap workflow is to open your rigged character scene in Layout, select Load Items from Scene, choose the FBX with MOME on, click Okay and--voila!--the motion is applied to your character rig. Basically, it only involves opening two files in LightWave. Hardly tedious, IMO.

Naturally, this assumes you've retargeted the motion in iPi Mocap Studio or a program like Motion Builder first. I guess that's the 'messy' part if you don't have an easy way to do true retargeting (i.e., drag and drop in MB or just loading and saving a file in iPi.)

Nevron Motion is supposed to bring retargeting in LightWave but I haven't tried it yet to know how well it works. I hope to spend some time with it soon but just a little too busy right now.

G.

Wait, are you now loading the FBX file directly with Load Items from Scene into your master rig scene file? You're not temporarily opening the FBX file in Layout, then saving as LWS? I know MoBu has exactly the same workflow as iPi in that it needs to use FBX with LW.
Previously you couldn't get FBX motions directly into Layout; it needed to create a temp LWS file that you would MOME into your master scene file. Is this not the case anymore? If you have dozens of motions and dozens of characters this is really problematic.

Greenlaw
08-19-2013, 01:33 AM
The way I did it in 'Happy Box' (2011) (http://vimeo.com/channels/littlegreendog/55185005) was I opened my character rig in LightWave, used Load Items from Scene to load the Motion Builder FBX, enabled Merge Only Motion Envelopes and clicked Okay. The motion was then transferred to my LightWave rig. Simple as that.

I learned this motion transfer method from Cageman back in late 2010 or early 2011, so it's been that way for a while. Rebel Hill helped me figure out the rigging part. I was using LightWave 10.1 when we made 'Happy Box'. IMO, 10.1's FBX I/O made working with mocap in LightWave a lot easier.

LightWave's motion transfer is a one-step process if you set it up correctly. The MOME option only imports the motion data and nothing else, assuming you have a Lightwave rig ready to receive the motion data. The receiving rig can be different (with additional holder bones, controls, etc.,) so long as the main hierarchy matches and is uninterrupted. To be clear, this is not retargeting, this is just motion transfer--our retargeting and editing is currently done in Motion Builder.

Alisa and I have more or less continued working this way for our current Brudders production. I'll try to get a demo video posted this week.

G.

geo_n
08-19-2013, 01:43 AM
Hmmm... OK, I have to try MOMEing the FBX directly into a master rig. Previously you needed to load the FBX in Layout and SAVE it as LWS, and as we know it creates a lot of junk/temp files per FBX file depending on what's in the FBX. But if you can MOME from the FBX file directly, great!
I got the workflow from watching RHiggit videos done in LW 10. There were some reparenting issues as well before that needed to be fixed manually in Layout, but iPi fixed that part of the FBX problem. And I'm still doing the open FBX, save as LWS, MOME to master file routine up to now, so if your tip works then that saves me a lot of time and the trouble of tracking so many unnecessary files.

Greenlaw
08-19-2013, 02:06 AM
Yeah, I think the RH video you're referring to used an FBX from Animeeple or iPi DMC 1.0, both of which presented issues for LightWave's LIFS MOME.

But all those extra steps you see in that video aren't necessary with a Motion Builder FBX--LightWave loves MB FBX. Just use MOME directly with the MB FBX and, if your rig is prepared for it, the motion transfer should just happen.

BTW, iPi Mocap Studio 2 got fixes to its FBX format specifically for LightWave compatibility so I think a direct MOME transfer should work with that FBX too. (FYI, I haven't tested this myself but other users have reported that it works fine.)

One more thing--you need to be sure there is only one Take in the FBX file. LightWave's motion transfer will choke if there is more than one Take. What happens is that the rig will receive only the first frame of the imported motion and nothing else. Also, be sure the rig's root object has the same name as the root in the FBX--if they don't match, LightWave will import the entire contents of the FBX. (Okay, that was two more things.) ;)

If you follow the rules, the steps are simple and the result is pretty sweet.

G.

RebelHill
08-19-2013, 03:57 AM
Previously you needed to load the FBX in Layout and SAVE it as LWS, and as we know it creates a lot of junk/temp files per FBX file depending on what's in the FBX. But if you can MOME from the FBX file directly, great!

You always could MOME directly from the FBX file; the only issue was that oftentimes (depending on which app/apps things round-tripped through) you would get extra stuff inserted into the hierarchy, which'd throw off the MOME, necessitating the cleanup of those parts for the two to merge correctly... As it happens, even going direct from FBX, you are still getting the intermediate LWS files... LW just opens the FBX, saves it to an LWS somewhere (as usual) and then MOMEs from that.

geo_n
08-19-2013, 04:36 AM
Just tried it. Direct FBX import with MOME does work! Is that in the manual? :D
But just as RH and I said, the FBX from iPi contains slightly different info that makes it not usable directly by MOME. The FBX from iPi has a different root than the original FBX in LightWave. So yes, the extra steps of opening each mocap FBX, saving as LWS, and renaming and reparenting the hierarchy are still there. Takes a minute or two I guess, but it should really take only 10 secs to import mocap. :D
I'm not sure whether it's NewTek or iPi that should fix this. But I'm pretty sure that if FBX import in LightWave had an option to match and fix the hierarchy to the target rig, or just auto-fix it during MOME, it would be a welcome feature. Maybe possible with LScript?
I guess MotionBuilder does a better job with the FBX file format if it doesn't have this issue.

cresshead
08-20-2013, 06:59 AM
I love how people just write in the phrase... "load it into Motion Builder"... it's like 3500 inc VAT, so there's a small stumbling block right there!

Greenlaw
08-20-2013, 08:52 AM
FWIW, you can get an educational license as part of a suite for around $400 or $500, so it's not terribly expensive.

The drawback to this version, of course, is that it's not for commercial use and I don't think you can upgrade it. That will eventually become an issue for us because we do hope to go commercial someday.

There are a few alternatives (Ikinema Webanimate, iClone with 3DX Pipeline,) that don't cost nearly as much as the commercial license of MB, so it's not like you absolutely need MB to do this level of work. That said, MB's system is mature and very solid, even though the program hasn't seen a significant update in many, many years. I think this is why Autodesk can still feel justified in charging so much for it. (I would love to see a serious challenger though.) :)

G.

Ryan Roye
08-21-2013, 09:13 PM
There is almost nothing about mocap that cannot be done in LightWave using native tools with a usable level of efficiency; I'll be covering this subject extensively at some point. I respect the need for tools like Nevron/MB, but I also like low-cost alternatives, and it is fortunate that users of LightWave have so many powerful tools at their disposal for the motion elements of animation.

Fun fact: You can use the Graph Editor's Numeric Scale function (specifically, the "value scale" field) to quickly re-position the XYZ motion of a mocap animation relative to the scale of the object itself and/or its leg/arm length. Combine this with the leave/backtrack footprint features and it makes for a pretty convenient, built-in process. This tool can be combined with the various other methods to help with mocap handling.
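
The "value scale" trick is essentially multiplying the clip's root translation by the ratio of the target character's limb length to the performer's. A rough Python sketch of that idea, with made-up numbers:

    def stride_scale(source_leg_len, target_leg_len):
        """Factor for a clip's root XYZ translation so stride length suits the new rig."""
        return target_leg_len / source_leg_len

    # e.g. a performer with 0.92 m legs driving a short-legged character with 0.55 m legs:
    s = stride_scale(0.92, 0.55)                          # roughly 0.6
    root_xyz = [(0.00, 0.95, 0.00), (0.03, 0.94, 0.41)]   # made-up per-frame root positions (metres)
    scaled = [(x * s, y * s, z * s) for (x, y, z) in root_xyz]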

short223
08-22-2013, 08:40 AM
One more thing--you need to be sure there is only one Take in the FBX file. LightWave's motion transfer will choke if there is more than one Take. What happens is that the rig will receive only the first frame of the imported motion and nothing else.

If you follow the rules, the steps are simple and the result is pretty sweet.

G.

Ahh! This was plaguing me for a day or so! Now I got it working great and am starting to implement a nice workflow between MB and LW. Thanks! A quick question: have you used the Genoma Mocap Rig preset with MB yet? It seems to be coming in much larger in size when compared to the stock mocaps I have been using. It also mentions something about toe or thumb rotations that are off. I did my tests with just the rig by itself so I could easily see the process. Does it matter if the rig is a bit larger or smaller than the eventual control rig that will drive the mocap?

Thanks!
Chris

Surrealist.
08-22-2013, 07:35 PM
By the way, the ED version of MoBu was mentioned. Actually, you can get it for free. That is what I did to do tutorials and learn the software fully before buying.

Usually people get MoBu as part of a suite. There is really not a lot of advantage to purchasing it outright when you consider that you can get a competitive sidegrade from other software, as I did. Not appropriate to go into it further here. But if you PM me I can give you some details about how I saved a ton of money getting my hands on some great software - legally.