XBOX Kinect for LW10.1 / 11?



LW_Will
11-17-2011, 08:12 PM
I'm watching a stream from Japan where D-Storm are demoing LightWave 10.1, AND a plugin that gives it full control of an Xbox 360 Kinect.

Does ANYBODY want to comment? Rob? Matt? Lino??

IF I didn't want this upgrade before, I want it... NOW...

;-)

:lwicon:4evar

LW_Will
11-17-2011, 08:14 PM
Oh, btw... http://www.ustream.tv/channel/interbee-dstorm in case you are watching on Thurs nite.

OnlineRender
11-17-2011, 08:30 PM
Told you it was only a matter of time. I WANT..............

LW_Will
11-18-2011, 12:57 PM
So... yesterday, or earlier today, D-Storm ran a demo of the "Kinect for Lightwave" plug-in. I also downloaded (and ran through Google Translate) the PDF describing the capabilities of the plugin.

The plugin looks amazing. It basically removes even the one extra step that using the Brekel drivers requires.

This might be a huge game changer.

Unfortunately, I don't know when it's coming out... apparently they'll be taking beta testers when the beta starts in December...

So, is this a node for the new Virtual Studio node setup, or is it just a plugin?

I'd like to be on the beta... but how? ;-)

Lewis
11-18-2011, 01:26 PM
Any video? I can't find videos on their web site, only a PDF.

erikals
11-18-2011, 02:54 PM
yes, kinda figured this would happen. D-Storm was part of the LW11 beta team, so they probably made this in the meantime.

have yet to see it in action though...
anyone have a link (video / pdf) ?

LW_Will
11-18-2011, 11:17 PM
Any video? I can't find videos on their web site, only a PDF.

They had a show (with Ms. Kiki... you can tell they love Kiki) on USTREAM - the URL is in my other post - but they don't seem to auto-archive the material.

I saw them do the demo live and it was a bit annoying. The guy running the demo stepped through all of the points about LW10.1. When he and his partner got to the plugin, they were a bit lost. They got it to move and everything, but the character was stuck in place, and the movement was a bit jerky.

But it was driving a character in LW from a Kinect! And it was sweet. :)

According to the options in the PDF, you can tune the sensitivity down. With more experience, they obviously would have used the plugin more efficiently.

We need more data. We need somebody to allow us into the beta! We need to see what this thing will do with LW11!

I'm rambling on... but we do WANT this plugin!!

ps. The website said their office is closed during the InterBEE show. There may be more info on Monday...

LW_Will
11-19-2011, 12:11 AM
Wow... just rechecked the PDF. Seems I missed the final page (Duh! Sorry...), but the plugin is a World Famous D-Storm plugin... totally free! Coming in "early December"...

This is AMAZING!

OnlineRender
11-20-2011, 08:14 AM
Linky??? So are you saying this is free??

KScott
11-20-2011, 09:47 AM
http://www.dstorm.co.jp/archives/press/LW10_KinectP_20111115.pdf

Found this. I'm in if it just costs me a new Xbox and misc. Very cool.

OnlineRender
11-20-2011, 11:57 AM
wow wow wow, I only caught the last 10 minutes of the live stream... that and I couldn't understand what they were saying... but wow wow wow

LW_Will
11-20-2011, 07:08 PM
Run the PDF through Google Translate and you will find that, yes, the plugin is free.

I love DStorm... truly.

LW_Will
11-20-2011, 07:15 PM
wow wow wow, I only caught the last 10 minutes of the live stream... that and I couldn't understand what they were saying... but wow wow wow

Isn't it though? Between this and the PS3 Move setup, we are getting more useful stuff out of LW with inexpensive tools.

CROUTON213
11-20-2011, 10:48 PM
Please count me in on being extremely interested in this kind of technology used in Lightwave, or any 3D program. Here is another link I saw in Post magazine recently.
-qb

http://ipisoft.com/products.php

erikals
11-20-2011, 10:56 PM
 
...for professional stuff i assume iPi is much better.
(we'll have to see though)
 

Greenlaw
11-21-2011, 12:20 AM
 
...for professional stuff i assume iPi is much better.
(we'll have to see though)
 
I'm assuming so because iPi DMC is a 3D mocap tracking system as opposed to a realtime mocap system--tracking in post allows for refinement and higher accuracy. Also, iPi is capable of dual Kinect capture, which is amazingly accurate. Here's an example from iPi with the latest beta:

http://www.youtube.com/user/mnikonov#p/u/0/msRtIZX529Q

That said, the realtime Kinect capture demos I've seen from Jimmy Rig Pro are not bad--probably good enough for previs or better. I'm guessing the direct-to-LW system would be comparable. And I stress the word 'guessing' as I only really know anything about the iPi stuff. :)

G.

LW_Will
11-21-2011, 08:38 PM

...for professional stuff i assume iPi is much better.
(we'll have to see though)


Well... I like what iPi has done... but my main problem with the iPi solution is the two-step processing. You need to shoot video, then have your system interpret the footage. While you can get some nice fixes to your character's performance (like a spinning character, hands crossing the center line of the body, etc.), it is amazingly slow... IMHO

But... being able to have mocap actually built into your 3D package is, again, a game changer. Can Maya do that? These new D-Storm plugins and the Brekel driver work in real time. Again, a game changer.

While these setups do have limitations, knowing the limitations can allow you to work around them.

erikals
11-21-2011, 08:43 PM
yes, Maya can, but cool to see NT take action on this...

Maya and Kinect,
http://www.youtube.com/watch?v=oiu2-I4Y1MU

Greenlaw
11-22-2011, 06:17 AM
Well... I like what iPi has done... but my main problem with the iPi solution is the two-step processing. ...While you can get some nice fixes to your character's performance (like a spinning character, hands crossing the center line of the body, etc.), it is amazingly slow... IMHO

It's all relative, IMO. Tracking for me takes about 0.6 seconds per frame, which I really don't consider slow.
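(For scale: at 0.6 seconds per frame, a ten-second take at 30 fps - roughly 300 frames - tracks in about three minutes.)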

As for taking time to fix raw mocap data, that's standard procedure for any production pipeline. I don't think I've ever worked on a production that used the realtime mocap data 'as captured'. Realtime mocap data is rarely good enough for production use. From what I've observed at professional motion capture stages, the realtime data is used for 'on the spot' review purposes for the director and clients but the actual production data still goes through a manual 'cleanup' process after the capture session is over.

G.

Greenlaw
11-22-2011, 06:24 AM
But yeah, the thought of being able to do this directly in LW is pretty cool. :)

3DBob
11-24-2011, 09:43 AM
I've been playing with the LW11 studio tools today, controlling light intensity and bone positions with my SpaceNavigator - and then seeing this... this is very good news indeed.

I'm getting very usable results from JimmyRig Pro right now, and seeing things live with your actual character is really powerful..... For instance, as an actor you can immediately compensate for characters that are thinner or fatter than you are, because you can see the action in front of you in realtime. Rarely is an actor the exact build of a character, and this is where blind capture fails, leading to so much cleanup down the road.

3DBob

Greenlaw
11-24-2011, 01:00 PM
I'm getting very usable results from JimmyRig Pro right now, and seeing things live with your actual character is really powerful.....this is where blind capture fails, leading to so much cleanup down the road.
You're right, there is that advantage with realtime feedback. My characters are nowhere near 'human' proportioned and I had to learn how to adjust my performances to suit them. And of course that was after seeing all the things that could go really 'wrong' with these characters after tracking them. :)

That said, after a bit of practice, I'm finding that I don't get many fails or have to do a whole lot of cleanup any more.

G.

LW_Will
11-26-2011, 12:09 PM
I'm assuming so because iPi DMC is a 3D mocap tracking system as opposed to a realtime mocap system--tracking in post allows for refinement and higher accuracy. Also, iPi is capable of dual Kinect capture, which is amazingly accurate. Here's an example from iPi with the latest beta:

http://www.youtube.com/user/mnikonov#p/u/0/msRtIZX529Q

That said, the realtime Kinect capture demos I've seen from Jimmy Rig Pro are not bad--probably good enough for previs or better. I'm guessing the direct-to-LW system would be comparable. And I stress the word 'guessing' as I only really know anything about the iPi stuff. :)

G.

I stand corrected, sir. The Dual Kinect system is by far the best for a personal MoCap system. Bar none.

And for Xmas this year, the Kinect is $100 (US) everywhere (in the US... sorry, don't know about the rest of the world).

Between this iPi system and the D-Storm plugin for Lightwave, I wholeheartedly suggest that EVERYBODY go out and get at least one Xbox 360 Kinect. And then make some epic animations!

LW_Will
11-26-2011, 12:16 PM
Bob, Law, I cannot agree with you both more. For me, the point of making this work was, in the beginning, to create personal pieces that bridged the larger chunks of mocap available from Carnegie Mellon or another pre-made library.

Now, with the plugin or the dual Kinect, it seems that making actual pieces of animation that I supplement with the CM data or library data is more than possible.

It is a good time to be alive! ;-)

3DBob
11-26-2011, 12:27 PM
Whilst iPi is the only 2 K solution at the moment (though off-line) and the DStorm is 1 K..... Do not be so sure that iPi will hold its crown for too long.

3DBob

OnlineRender
11-26-2011, 01:39 PM
I thought it was free because it used OpenNI drivers and Brekel source code?

Anyway, quick question: does the Move sync with a Bluetooth adapter on the PC?

https://store.dstorm.co.jp/blog/wp-content/uploads/2011/11/IMG_2008.jpg

erikals
11-27-2011, 02:32 AM
Whilst iPi is the only 2 K solution at the moment (though off-line) and the DStorm is 1 K..... Do not be so sure that iPi will hold its crown for too long.

3DBob

isn't iPi more like 1K max?
http://www.ipisoft.com/sales.php

also interesting is $300 Jimmy|RIG
http://www.origamidigital.com/typolight/index.php/pricingorder.html

where does it say that Dstorm Kinect is 1K though? couldn't find that info anywhere... linky?

 

tyrot
11-27-2011, 04:05 AM
I am also digging these Kinect solutions. Jimmy Rig Pro shows great potential...
what to choose... iPi? waiting for DSTORM? or JimmyRig Pro?

Also I wonder: is Kinect + JimmyRig Pro + IKBooster (for adjusting) enough for a decent mocap workflow in LW?

3DBob
11-27-2011, 05:16 AM
erikals:

I was referring to the number of Kinects the solution supports without having to spell it out, not the number of thousands for the solution. Sorry for the confusion.

3DBob

LW_Will
11-27-2011, 05:55 AM
I am also digging these Kinect solutions. Jimmy Rig Pro shows great potential...
what to choose... iPi? waiting for DSTORM? or JimmyRig Pro?

Also I wonder: is Kinect + JimmyRig Pro + IKBooster (for adjusting) enough for a decent mocap workflow in LW?

You know, I think it might! We just need all the pieces so that we can develop the workflow.

Really want to use IK Booster for this...

LW_Will
11-27-2011, 05:58 AM
I thought it was free because it used OpenNI drivers and Brekel source code?

Anyway, quick question: does the Move sync with a Bluetooth adapter on the PC?

https://store.dstorm.co.jp/blog/wp-content/uploads/2011/11/IMG_2008.jpg

See, this guy from the D-Storm event isn't using a PS3 Move and a Bluetooth adapter, he's using the Kinect. See? He's not signalling for a touchdown - that maneuver tells the Kinect to look at him! ;-) I think you can use a Kinect AND the PS3 Move at the same time, or in succession... got an idea for that... ;-)

LW_Will
11-27-2011, 06:05 AM
erikals:

I was referring to the number of Kinects the solution supports without having to spell it out, not the number of thousands for the solution. Sorry for the confusion.

3DBob

Or the resolution or the scanning ratio of the Kinect. I dare say a single Kinect is NOT 1k in resolution. Far from it.

No, I'd say that if we want to refer to a multi-Kinect setup as opposed to a single Kinect, we use the "x" designation.

The iPi solution is 2x Kinect, the D-Storm is 1x Kinect. Problem solved. ;-)

sami
11-27-2011, 08:45 AM
We've created a full-blown real-time avatar character animation system using the OpenNI drivers and Kinect (running in Unity, using LightWave-created character rigs with bones and weight maps exported to FBX). It worked so well that we used it as a co-host for an awards show. No lag (which made me feel awesome when we were at a WETA presentation recently and saw that their gazillion-$ performance capture previz system for Tintin had at least a full second of lag - lol!), and we used a Wii controller for other effects the performer could trigger.

So I imagine that with Python or the C++ SDK it wouldn't be too hard to do this in LW directly in a rudimentary way (Unity is made to be realtime; LW is not). One would also probably need a Wii remote or something to let the self-performing animator trigger acting "takes" of the realtime performance - starting and stopping the recording so you could later review and tweak the performance capture. This aspect itself would not be trivial to design well, but it seems straightforward to develop.
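
To make that concrete, here's a rough Python sketch of the "trigger takes" idea - just the buffering logic, with the skeleton source and the trigger button as hypothetical stand-ins (poll_skeleton() and button_pressed() are not from any real SDK):

import time

class TakeRecorder:
    # Buffers skeleton frames between start/stop triggers so a solo
    # performer can review, keep, or discard each take afterwards.
    def __init__(self):
        self.takes = []       # finished takes, each a list of (time, joints)
        self.current = None   # frames of the take being recorded, or None

    def toggle(self):
        # Wire this to a Wii button (or any trigger the performer can reach).
        if self.current is None:
            self.current = []                 # start a new take
        else:
            self.takes.append(self.current)   # stop and keep the take
            self.current = None

    def feed(self, timestamp, joints):
        # Call once per skeleton frame from the realtime loop.
        if self.current is not None:
            self.current.append((timestamp, joints))

# Hypothetical realtime loop:
# rec = TakeRecorder()
# while True:
#     if button_pressed():                      # e.g. a Wii remote button
#         rec.toggle()
#     rec.feed(time.time(), poll_skeleton())    # joints from the Kinect driver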

I have not checked out iPi so I don't know what they are doing with Kinects, but I can tell you from our experience that Kinect is far from the holy grail, because:

* Kinect has limited "bones" it tracks - in fact it doesn't really track bones; it tracks centers of mass and interpolates bone rotations where it can. This doesn't always translate easily to the rigs animators are used to using in LW.

* Kinect doesn't currently give any hand or foot rotations at all (the MS SDK does, to a very limited extent - don't count on animating bank rotation on any bones, at least not with any control). But the MS SDK is in some ways more primitive than OpenNI/NITE and doesn't provide joint rotations, just joint translations, so you need a full IK solution or to be using joint bones, not z-bones (game engines like Unity seem to work better with z-bone rigs, maybe because they're more "lightweight" on deformations and CPU than full IK?).

* We used a wireless mic to drive the jaw bone by volume, and we are still trying to think of the best way to do mouth shapes (no, I don't want a head-mounted camera with face dots).

* Don't count on multiple spine bones. You will get one rotation for the center of the torso.

* Kinect also doesn't give head and neck rotations. We tried faceAPI webcam tracking, but all webcam face tracking is racist ;) so actors with tan skin lost tracking a lot, plus running around a 3x4m area with quick movements loses tracking too. We ended up using a hack: an Android phone's gyroscope duct-taped to a baseball cap (the iPhone gyro drifted too badly), sending rotational data over wifi to complement the Kinect data in realtime and give the character head rotations. The bummer is that no phone gyroscope currently on the planet gives perfect rotational data. You rotate 90 degrees to the left and the Galaxy S will say 78 degrees, maybe, and -90 ends up being -120, maybe, so you constantly recalibrate (rough sketch of that correction after this list). We could buy wireless sensors in the high-end range (>$2000) and spend a couple of weeks programming noise filters etc. on their low-level API, but it doesn't seem worth it.

* Legs are also very shaky on the Kinect (notice the MS Kinect avatar is a seated character) and a bit unstable. Wear tights so it can see your knees - baggy jeans make your knees flop about - and bank rotation on the hips when you turn your body >75 degrees is wonky. Forget turning all the way around (maybe possible with 2 Kinects?).

* Crossing legs or arms, or turning backwards, where there is depth confusion or an occlusion, also leads to instability. We suspect 2 Kinects at 90 degrees (if the IR patterns don't interfere with each other) might help with that.
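
On the gyro recalibration point above: here's a minimal Python sketch of the kind of per-direction correction being described. The reference readings are just the example numbers from this post; a real setup would re-measure them whenever the performer re-zeros.

# Readings observed during calibration turns (example values from above):
# a +90 degree turn read as +78, a -90 degree turn read as -120.
LEFT_REF = 78.0
RIGHT_REF = -120.0

def corrected_heading(raw_deg):
    # Scale positive and negative headings separately, since the gyro
    # error is asymmetric. Drift still accumulates, so re-zero often.
    if raw_deg >= 0.0:
        return raw_deg * (90.0 / LEFT_REF)
    return raw_deg * (-90.0 / RIGHT_REF)

print(corrected_heading(78.0))    # -> 90.0
print(corrected_heading(-120.0))  # -> -90.0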

Overall the Kinect is awesome for (realtime) character animation, but it's in its infancy and by no means bulletproof. I'm looking forward to new firmware and/or the 2nd-gen device. But speaking as a LightWaver who has been in the trenches in live production with a Kinect avatar system - it's a fragile thing with a lot of constraints. Not complaining, just trying to share what I know and save someone from our troubles.

I'm keen to see what you've mentioned, but this plugin would have to be pretty magical to be this animator's silver bullet. Maybe to animate cameras or positions, or in combination with physics systems, it could be cool (we put spring joints on the hands and hair locks so they didn't look so stiff when the rig was animating)...

Anyone else here have any better experience with the Kinect? :)

sami
11-27-2011, 09:08 AM
Btw, that calibration pose the dude is using is needed by the OpenNI/NITE drivers to start skeleton tracking. With earlier drivers you had to stand like that for 10+ secs, stepping forward and back with your hands up, to get it to pick you up as a skeleton to track. Newer versions of the driver pick you up in under 2 secs. With the MS SDK you don't need a calibration pose at all; it's instant-on as soon as you walk into the performance area. Tracking 2 people slows it down and causes significant lag, even for stick figures...

Also, if I were him I wouldn't be wearing such a baggy jumpsuit; that will cause problems when your arms hang close to your body :-p

Greenlaw
11-27-2011, 09:44 AM
This morning I posted the final version of Happy Box, our first movie based on my webcomic Brudders. It was created using Lightwave 10.1 with iPi DMC and two Kinect Sensors. You can watch it here:

Brudders in 'Happy Box' - Final Version (http://www.youtube.com/user/littlegreendogmovies#p/u/0/gy3hSz7_zb8)

Enjoy! :)

G.

3DBob
11-27-2011, 11:51 AM
Hi Greenlaw,

I like it.

B

OnlineRender
11-27-2011, 12:17 PM
This morning I posted the final version of Happy Box, our first movie based on my webcomic Brudders. It was created using Lightwave 10.1 with iPi DMC and two Kinect Sensors. You can watch it here:

Brudders in 'Happy Box' - Final Version (http://www.youtube.com/user/littlegreendogmovies#p/u/0/gy3hSz7_zb8)

Enjoy! :)

G.

COOL STUFF!:thumbsup:

Dexter2999
11-27-2011, 01:19 PM
This morning I posted the final version of Happy Box, our first movie based on my webcomic Brudders. It was created using Lightwave 10.1 with iPi DMC and two Kinect Sensors. You can watch it here:

Brudders in 'Happy Box' - Final Version (http://www.youtube.com/user/littlegreendogmovies#p/u/0/gy3hSz7_zb8)

Enjoy! :)

G.

"Like"'ed and "Favorite"'ed

OnlineRender
11-27-2011, 05:10 PM
"Like"'ed and "Favorite"'ed

google +'ed :dance:

Greenlaw
11-27-2011, 07:09 PM
Thanks guys! Happy to hear you enjoyed that. :D

sami
11-27-2011, 10:26 PM
This morning I posted the final version of Happy Box, our first movie based on my webcomic Brudders. It was created using Lightwave 10.1 with iPi DMC and two Kinect Sensors. You can watch it here:

Brudders in 'Happy Box' - Final Version (http://www.youtube.com/user/littlegreendogmovies#p/u/0/gy3hSz7_zb8)

Enjoy! :)

G.

Nice! I like it stylistically. :)

LW_Will
11-27-2011, 11:31 PM
This morning I posted the final version of Happy Box, our first movie based on my webcomic Brudders. It was created using Lightwave 10.1 with iPi DMC and two Kinect Sensors. You can watch it here:

Brudders in 'Happy Box' - Final Version (http://www.youtube.com/user/littlegreendogmovies#p/u/0/gy3hSz7_zb8)

Enjoy! :)

G.

Dude! Fabulous work!

And I liked the mocap too. ;-)

This is how great work is done, people!

Well written, characters that have different personalities and some really, really great Lightwave work.

THAT is what we've been looking for! Somebody who knows how to do this and IS doing it!

Great job!

LW_Will
11-27-2011, 11:34 PM
REDACTED

Dude... you've got to do a write up. We need to develop workflows and you seem to have one.

Please! If not for me, then the community!!

;-)

dwburman
11-27-2011, 11:55 PM
Demo showing iPi dual Kinect test with the performer turning all the way around.

http://youtu.be/msRtIZX529Q

Greenlaw
11-28-2011, 12:07 AM
Demo showing iPi dual Kinect test with the performer turning all the way around.
Yes, this works very well. Here's my experience: The chainsaw dance in Happy Box was originally recorded using a single Kinect with plans to cheat a 360 degree turn by rotating the base in Layout. But then dual Kinect support became available in iPi DMC and I found that I no longer needed to fake this action. :)

G.

Greenlaw
11-28-2011, 12:18 AM
BTW, I do not have enough space in our living room for a full 90 degree setup--my setup is probably less than 70 degrees.

Actually, I just now checked my test video Dual Kinect Sensor Test (http://www.youtube.com/user/LGDTestTube#p/a/u/1/XLWygMW12HI), and you can see the camera positions in this video at around 0:34 and it's clearly less than 70 degrees, but 360 degree capture was still no problem. (Visually, I'm not sure this is even 45 degrees but if anybody here is really interested I can find out the exact angle by opening the calibration scene file used for the Happy Box motions.)

Technically the current iPi DMC software allows you to use a 180 setup for full 360 capture but iPi Soft isn't recommending this setup yet. (Keep in mind that dual Kinect support is still considered 'a very beta' feature.)

G.

sami
11-28-2011, 04:15 AM
BTW, I do not have enough space in our living room for a full 90 degree setup--my setup is probably less than 70 degrees.

Actually, I just now checked my test video Dual Kinect Sensor Test (http://www.youtube.com/user/LGDTestTube#p/a/u/1/XLWygMW12HI), and you can see the camera positions in this video at around 0:34 and it's clearly less than 70 degrees, but 360 degree capture was still no problem. (Visually, I'm not sure this is even 45 degrees but if anybody here is really interested I can find out the exact angle by opening the calibration scene file used for the Happy Box motions.)

Technically the current iPi DMC software allows you to use a 180 setup for full 360 capture but iPi Soft isn't recommending this setup yet. (Keep in mind that dual Kinect support is still considered 'a very beta' feature.)

G.

We were going to attempt to use dual Kinects in our setup and compare and contrast (maybe average) data where confidence on a joint rotation is low or occluded, but for our initial show we were happy with what we used and could work around not fully turning around.

I was pretty sure they would both be addressable by the same machine - and if they weren't, we'd use a separate laptop and send data over UDP to the other machine - but I was concerned that the 2 Kinects' IR projection patterns would interfere. So if iPi is using them together, they don't? Is the firmware not confused by the additional IR light on the scene?

Yes, in that video it looks like 2 Kinects at maybe 45-60 degrees from each other? That will probably give less limb jitter, I suspect, but how can that help with arms or legs crossing or going behind the torso from the one Kinect's view? It seems like at least 90 degrees - or 180, if that's possible with the IRs aimed at each other - would get you more accurate data?

If your living room's too small for the full Kinect area, have you tried the Kinect Zoom lens? We had some luck with it (it compresses the space and lets you be closer to the Kinect). (Also heard that new firmware coming for the Kinect does tricks approaching macro-lens capability, where you can get up to 50mm to it!) We had a bit more leg jitter when we tried the Kinect Zoom lens, and we had the room anyway, so we flagged that - but it might help in tight spaces.

Another couple of questions for you:
1) (Maybe this is one for the iPi guys.) Why is the iPi performance space with the Kinect (as listed on their website) only 2m x 2m? We were able to get a good 3m x 4m cone (to the Kinect), which gives a bit more depth for longer motion as well.

2) What heights did you prefer for the Kinect? We settled on roughly a 2m-high tripod (sandbagged on a table), aimed slightly down toward the floor (adjusted for the performer's height and arm stretch length).

thanks

sami
11-28-2011, 04:38 AM
Dude... you've got to do a write up. We need to develop workflows and you seem to have one.

Please! If not for me, then the community!!

;-)

Once the video of the show is edited and posted I will give a link to it and answer any questions, but much of it was custom Unity C# code, plus a couple of minor apps to send Kinect or phone-gyroscope head data via OSC.
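
For anyone wondering what "send head data via OSC" amounts to, here's a minimal Python sketch using the python-osc package (our apps were custom; the address pattern, port, and rotation values here are made up for illustration):

from pythonosc.udp_client import SimpleUDPClient

# The machine running the avatar listens on this address/port (hypothetical).
client = SimpleUDPClient("192.168.1.50", 9000)

# Pitch / heading / bank in degrees, e.g. read from the phone gyro.
client.send_message("/head/rotation", [12.5, -40.0, 3.0])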

I did notice, and was disappointed but not surprised, that iPi does NOT do head and neck rotations. We found that this looked WAY too robotic and not life-like for realtime use of an avatar - the head rotations even seemed to evoke gender! I.e., when we "manly" guys were driving the female 3D character it just didn't look right. We thought it was the model or the motion, but it turns out there are tons of female motion subtleties. When a female performer took the stage, the subtle head tilts and everything made the character come to life for us.

Also, this was confirmed at a WETA presentation I saw recently, where they told us how they did the Planet of the Apes performance capture. They pretty much said: "Use big people for big characters, use skinny people for skinny characters, and try to make the performer's body match the avatar as closely as possible. The sheer physics of them makes a difference beyond the acting ability of the performer."

I found this fascinating and, from my limited experience with it, quite true. Though admittedly my use was for a realtime live situation, NOT a capture-and-then-hand-tweak-the-head-animation kind of situation.

LW_Will
11-28-2011, 08:52 AM
Once the video of the show is edited and posted I will give a link to it and answer any questions, but much of it was custom Unity C# code, plus a couple of minor apps to send Kinect or phone-gyroscope head data via OSC.

Thank you.


I did notice, and was disappointed but not surprised, that iPi does NOT do head and neck rotations. We found that this looked WAY too robotic and not life-like for realtime use of an avatar - the head rotations even seemed to evoke gender! I.e., when we "manly" guys were driving the female 3D character it just didn't look right. We thought it was the model or the motion, but it turns out there are tons of female motion subtleties. When a female performer took the stage, the subtle head tilts and everything made the character come to life for us.

Hmm... Fascinating. Maybe, knowing you are a guy, when you see the data you are seeing yourself in the data? Maybe? ;-)


Also, this was confirmed at a WETA presentation I saw recently, where they told us how they did the Planet of the Apes performance capture. They pretty much said: "Use big people for big characters, use skinny people for skinny characters, and try to make the performer's body match the avatar as closely as possible. The sheer physics of them makes a difference beyond the acting ability of the performer."

I found this fascinating and, from my limited experience with it, quite true. Though admittedly my use was for a realtime live situation, NOT a capture-and-then-hand-tweak-the-head-animation kind of situation.

This is indeed fascinating. There are a lot of aspects to consider, and it is all amazing. There is a lot to learn and understand about mocap in general, and about Kinect and Move with Lightwave specifically.

Greenlaw
11-28-2011, 09:49 AM
90 degrees is supposed to be the ideal with the current beta but you make do with what you can get. :)

Some users have experimented with the zoom lens, with very poor results. I think the problem is distortion, same as when tracking 2D data. My guess is that iPi Studio would have to be aware of the level of distortion and 'undistort' the image to resolve the volume calculations. I think the developers are aware of this but feel there are more pressing issues to address first. (And they would be right.)

FWIW, I don't feel I need to position the cameras any wider for my current projects. The results I got with my current setup are perfectly fine for the type of projects I'm doing at home, and I'm assuming results will be even better with this latest release. If I do need more coverage and greater accuracy, I also have the six-PS3-Eye setup available, which has a significantly larger capture space than Kinect. To see an example of how it resolves crossed arms, please look at this video I created about a year and a half ago:

http://www.youtube.com/user/LGDTestTube#p/u/4/G2KLtGsl-L0

This was a very early test using only four PS3 Eye cameras, and near the end of the video you can see me cross my arms. This test was created to find the actual 'limits' of the system and while it does reveal some real issues, it also shows many surprising strengths with the system.

IMO, if you are able to install both setups, you get a system capable of a broad range of motions at an incredibly reasonable price.

(In case you're wondering, I did not use the six camera setup on Happy Box because Happy Box was specifically designed to test the single Kinect setup. It was just a lucky coincidence that dual Kinect support became available during production.)

Anyway, I'm happy to say that, for me anyway, the dual Kinect system worked extremely well. While making Happy Box, getting motion capture on the characters actually became the least of my concerns. :p

The real bottleneck for me now is render time. Lightwave worked as well as expected (meaning it was absolutely fantastic for this project), but I really need to get more render horsepower in here for future projects. (Or maybe consider using a game engine.)

G.

3DBob
11-28-2011, 09:53 AM
Yes - the wrong actor gives the wrong result - I call it WEIGHT, GAIT, TRAIT.

I've linked below renders of our very first uncleaned Jimmy Rig / single-Kinect tests. These were done following 2 mins of manually adjusted autorigging, saved immediately to LW and rendered without any adjustment.

There are basic cleanup tools in Jimmy Rig at the moment - but we are doing a key reduce, an outlier cull and IKBoost in LW to animate from these captures. This may not be so necessary in the very near future.......

Roach (http://dl.dropbox.com/u/13098777/Roach_JRs.mov)

Raxx (http://dl.dropbox.com/u/13098777/rax_motiontest.mov)

The extreme leg kicks in the Raxx test were done by the actor; he had a poor choice of clothing on and the Kinect was not in a good place. I think it creates a wonderful puppet-on-a-string look that we might like to work with..... Don't fight it, I say - use it. For close crops it works well. Locking down the legs and just using the top half is an option.

Enjoy!

3DBob

Greenlaw
11-28-2011, 12:02 PM
IMO, physical acting ability is actually far more critical for good performances than the 'right' body type. A very good actor or talented puppeteer can mime almost any body type and behavior, including physically improbable ones.

(Andy Serkis as King Kong is a pretty decent example.)

G.

Greenlaw
11-28-2011, 12:21 PM
Oh, and I meant to comment on the clips: Those guys are awesome! :)

3DBob
11-28-2011, 12:56 PM
Thanks Greenlaw,

And you're right, a good actor can do so much - but having mocapped a female and a male, the hips just appear to move differently. I wrote a much longer post with qualifications; however, I had to move location, the forum logged me out, and I lost my post.

Andy Serkis is great - but Planet of the Apes, King Kong and Gollum - well... hardly Marilyn Monroe, Jackie Chan and Jordan (the basketball player). I've been acting since the age of 4, but I don't pretend to be able to do ballet, ju-jitsu or wrestling.

A good actor can play a lot, but there are limitations.

3DBob

Greenlaw
11-28-2011, 02:26 PM
Oh, I wouldn't argue that there will be limitations no matter how good the actor is. There are obvious reasons why sports video games tend to use the actual 'sports stars' for performance capture. But then again, these guys are not being captured for their acting abilities but rather authenticity in a specific field.

IMO, it's ideal to have a physical actor who can totally get into the personality of the character, and imagine the shape and build of the character to mime it appropriately. That's why I mentioned puppeteers...this is what they're expected to do.

I guess when you get down to it, it all depends on your goals and what the motion is intended for. Some people will need highly accurate sports action for their productions; I need convincing motions of cats playing musical instruments and cute toddlers with giant heads.

Well, that's my two-cents anyway. :p

G.

geo_n
11-28-2011, 07:59 PM
kinect to lw for lw 9.6
http://www.youtube.com/watch?v=Cj9qi5doWec

The Dommo
12-02-2011, 01:10 AM
Nice examples, 3DBob ;)

3DBob
12-02-2011, 03:14 AM
Cheers Dom, it's a start.

My current thinking is to do

- previs capture in JR / ipi
- minor adjustments and motion blending and basic head animation in JR
- Key reduction and minor tweaks in LW

Then either
- export to FBX
- Load motion into animeeple / motion builder & finesse
- export to FBX
- Rig char with RHR
- import and apply motion to RHR embedded FBX rig
- use RHR for final tweaks

or

- add finger and facial controls
- IKBoost and animate in LW

3DBob

3DBob
12-02-2011, 06:36 AM
This was done by someone more than a year ago with 4 Sony cameras and iPi:

iPi (http://www.youtube.com/watch?v=LiSYCD6fUD0&feature=related)

It is a raw iPi mocap solve imported via MotionBuilder into LW.

3DBob

tyrot
12-02-2011, 06:50 AM
3D Bob
for a guy who has never tried either iPi or Jimmy Rig... which one should I go for? I know they are different, but... what good things can you say about Jimmy Rig?

And would it be SO hard to make a video tutorial about it? With various steps..

Guys, really, I have never been this excited about character animation.

BTW I was watching 3D Thunders' (thanks mate!) Motion Mixer tutorials... How do these things work together without going out of LW?

JR Pro / LW / Motion mixer or IKBOOST? what to do?
thanks again for this awesome thread..

3DBob
12-02-2011, 11:07 AM
JR autorigs in seconds (albeit handless etc., a bit like a basic FBX rig). You can quickly adjust the result, blend animations together, create walk/run cycles and blend those, get it to walk around and over uneven ground (even adjusting parts of animations), and bake it all directly to an LWS you can immediately render. If you wanna tweak it, just IKBoost and you're away - or parse the motion and apply it to an RHR Pro rig for fine control, including all of the niceties he has put in that rig.

Also, because JR lets you animate live with Kinect, you can see results immediately - and I mean that. This stops you having to guess where the physical parts of your character line up with your actor's performance.

I've just knocked up a quick OGL render... Just imported the character into JR, autorigged with minor adjustments (3 mins total), loaded a motion from the library and saved the scene to LW. In LightWave Modeler I added the most awful mouth-closing morph ever and dropped a few keyframes in Morph Mixer. The still rendered in 1.5 mins; the OGL preview, about 1 second a frame.

Still (http://dl.dropbox.com/u/13098777/Look_Around_StillRender.png)

OGL Render (http://dl.dropbox.com/u/13098777/Look_Around_OGL.mov)

iPi can do many Sony cameras at high frame rates for accuracy and coverage, and *currently* is the only solution to offer 2 Kinects. For me though, I want realtime feedback.

3DBob

Greenlaw
12-02-2011, 11:28 AM
Hi Tyrot,

Jimmy|Rig Pro and iPi DMC take very different approaches towards similar goals and I think which you choose may depend on how much you want to spend and the results you are expecting. Be aware that both programs are still in beta as far as Kinect support goes.

AFAIK, iPi DMC is the only program currently with dual Kinect support, so if you need that level of accuracy, it's a no-brainer. This applies to iPi Basic and Standard, not Express.

For single Kinect capture, there are many considerations between the two programs:

Jimmy|Rig has mocap mixing and editing tools; iPi does not. Both programs will apply mocap to your rigs though. Jimmy|Rig applies the data directly to a rig it generates, while iPi transfers the data to your own rig via Collada or FBX. If you want to edit iPi data, you need to use another application like Motion Builder, Animeeple, or you can apply the data to a rig with controls like Rebel Hill's and edit the data in Lightwave.

Jimmy|Rig has an autorigger and exports directly to .lws; iPi does not. That said, you give up a bit of flexibility by going with J|R's 'do it all for you' solution. Also, J|R does not support FBX and it's been made clear that it never will. I think the choice here is between convenience vs. flexibility (J|R will create a complete rigged and animated character and export a Lightwave scene; iPi gives you raw data and it's up to you to apply it and make it work in your 3D app of choice.)

Jimmy|Rig is realtime; iPi is a post tracking system. One gives you instant gratification, the other gives you greater accuracy.

I'm not sure how the cost will compare eventually.

Dual Kinect support and the 'pro' version of iPi will be more expensive obviously, but it's the most powerful solution. If you throw in Motion Builder, it can be quite a bit more expensive. MB isn't absolutely necessary but it can sure make life easier if you have 'oddly proportioned' characters like my guys in 'Happy Box'.

I think single Kinect with iPi is currently more expensive than Jimmy|Rig Pro, however, Kinect support might not be included in the final release version of Jimmy|Rig Pro; instead, it sounds like it will be released as part of a third yet-to-be-announced product, and I don't think the price has been announced either.

There may be more current info on the Origami-Digital website and I suggest you check it out before deciding. (It's been a few weeks since my last visit.) If you have any iPi questions, feel free to ask. I can also answer some J|R questions but my knowledge about it isn't as current at the moment.

Hope this helps.

G.

Greenlaw
12-02-2011, 11:42 AM
I just noticed that iPi DMC is on sale now: http://ipisoft.com/sales.php

You can get the single Kinect Express version for $276, dual Kinect Basic for $416.50, and Standard (dual Kinect and six PS3 Eye support) for $696.50. Good deals! :)

Greenlaw
12-02-2011, 12:08 PM
Just to add a little more confusion to the mix, I should add one more point about J|R Pro's motion editing capabilities: It may not be as fully featured or as sophisticated as MB but it is definitely a lot easier to learn, easier to use, and it's a lot cheaper too. Again, it all comes down to how much you want to spend and how much time you can put into your projects.

G.

3DBob
12-02-2011, 12:13 PM
And as another thing to add to the mix: what if I could supply a wireless real-time finger capture solution for under $250 per hand? Would that be something people might be interested in?

3DBob

Greenlaw
12-02-2011, 12:39 PM
3DBob,

Question: Does J|R Kinect capture head movements?

That's still unavailable in iPi, and I've had to make do using aim constraints. Aim constraints are easy to set up and work very well, but there are a few shots in 'Pooper' where I wish I had been able to capture the head-bobbing that was synced to music.

G.

3DBob
12-02-2011, 03:30 PM
Not yet. However, you can add a manual modifier and put in some head animation before exporting to LWS.

3DBob

tyrot
12-02-2011, 05:28 PM
3DBob and Greenlaw ..

THANK YOU MATES..! Very exciting times for mocap in LW. I guess I will jump into JimmyRig and start working there. When I need to, I'll check back on iPi.
Please do not let this thread die... there is such cool information in here!

Greenlaw
12-06-2011, 06:27 PM
I checked in at the J|R forums at Origami Digital, and now it sounds like they are adding FBX support to J|R Pro after all. Those guys!

G.

3DBob
12-07-2011, 04:33 AM
Hi Greenlaw,

Yeh, sorry, forgot to mention that. It is FBX - but not a standard FBX rig. However, we have had no problems saving the FBX and loading it into Animeeple, retargeting, and then saving the animations - now in standard FBX rig form - to Collada, which imports into LW quite nicely.

This cuts out the need to save to LWS and then export from LW to FBX.

We are testing applying these motions to RHR FBX core version.

You can potentially expect JR Pro to capture some head movements in an upcoming version.

On a related note, you could strap a Move to your head and combine that with the Kinect to capture body and head. I'd like to see 3 Moves - one on the head and one on each foot - combined with Kinect and a bit of judicious IK; you should end up with rock-solid feet and head tracking too. Of course the capture area is limited, but you could still do a lot with it. This could be done with the Studio tools and the Japanese Kinect plugin very soon.

Also - I have seen USB3 webcams running at 600 FPS, which poses interesting possibilities - or running more modest frame rates of 60/120 FPS at stupid resolutions. If these were just tracking active LEDs and dumping XY coordinates over IP from several calibrated cameras, you could end up with very accurate, broad-capture-area mocap.
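
The "dumping XY coordinates" part is trivial bandwidth-wise. As a Python sketch, one UDP datagram per camera per frame could look like this (the packet format and addresses are entirely made up):

import socket
import struct

def send_points(sock, addr, camera_id, frame, points):
    # Header: camera id, frame number, point count; then float32 x,y pairs.
    payload = struct.pack("<HIH", camera_id, frame, len(points))
    for x, y in points:
        payload += struct.pack("<ff", x, y)
    sock.sendto(payload, addr)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# Two LED centroids seen by camera 0 on frame 1200, sent to the solver box.
send_points(sock, ("192.168.1.10", 5005), 0, 1200, [(512.0, 384.5), (700.25, 120.0)])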

3DBob

Greenlaw
12-07-2011, 11:34 AM
Thanks for the update. This all sounds very interesting indeed. Please keep posting on your progress.

That reminds me - and this doesn't affect anything we've discussed - but the PS3 Eye cameras have a maximum framerate of 120 fps. Sounds good, but unfortunately it's not very practical because of the amount of grain this speed produces. That said, PS3 Eye cameras have been around for a long time now, so I imagine newer web cameras should produce higher quality images at these speeds than the PS3 Eye could.

Slightly off-topic: my several-years-old Sony Handycam CX7 can shoot 240fps for about 3 seconds, resulting in about 11 seconds of awesome slow-mo. It's a little grainy, but the grain in this case is easy to clean up with Neat Video.
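(The arithmetic roughly checks out: 3 seconds at 240 fps is 720 frames, which plays back as about 12 seconds at 60p - presumably the camera's output rate.)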

G.

3DBob
12-07-2011, 12:51 PM
This is the bad boy that interests me (due soon):

FL3-U3-13Y3M/C-CS: Cypress VITA 1300 CMOS, 1/2", 4.8 μm

Global shutter, 1280 x 1024 at 150 FPS

No wobble-vision (no rolling shutter!), 1/2" USB3 camera


Here are some of their current models

http://www.ptgreystore.com/category/69-usb-30-cameras.aspx

3DBob

Greenlaw
12-07-2011, 01:24 PM
Drool. That's awesome!

This is just me, but I've always been into using 'cheap' consumer cameras and seeing just how much I can squeeze out of them. Right now, I'm planning to shoot a green screen project using a Sony Bloggie Duo (http://www.amazon.com/Sony-Bloggie-Camera-White-NEWEST/dp/B004H8FNCG/ref=sr_1_2?ie=UTF8&qid=1323289293&sr=8-2), which is a teeny pocket camera that can shoot full 1080p. Yes, there are serious limits with this device, but for a variety of reasons (mainly compactness and ridiculous price) I'm still pretty excited about using it. I'll let you know how that turns out. :)

G.

3DBob
12-07-2011, 01:47 PM
The Sony does 120 FPS at 320 x 240, I think....

I would check out the new GOPRO HD Hero 2

Hero 2 (http://gopro.com/cameras/hd-hero2-outdoor-edition/)

These are small physically and economical at $299, and do:

HD RESOLUTIONS:
1080p: 1920 x 1080, 30 FPS
960p: 1280 x 960, 48 FPS + 30 FPS
720p: 1280 x 720, 60 FPS + 30 FPS

STANDARD DEFINITION RESOLUTIONS:
WVGA: 848 x 480, 120 FPS + 60 FPS

They do have crappy, fuzzy H.264 encoding (me being a snob), BUT apparently they have a clean uncompressed HDMI RGB out - I can't quite believe that - and if true, you could couple it to this:

Hyperdeck Shuttle (http://www.blackmagic-design.com/products/hyperdeckshuttle/)

For uncompressed portable HD recording - though the GoPro is probably 8-bit, I don't know.

3DBob

Greenlaw
12-07-2011, 02:03 PM
Thanks for the tip. That looks like it's built to take abuse which is very appealing to me. (One of the reasons I like 'cheap'.) Could be my next camera.

As for fuzzy compression, after I clean up my 1080p output, I scale it down to 720p and hit it hard with my heavy 'comp stick', so I'm not terribly concerned about that. :p

G.

3DBob
12-08-2011, 11:56 AM
OK, an update.

1. Load the character into JR
2. Autorig
3. Animate via Kinect, tweak/blend - make sure you have a T-pose on the first frame (for ease). Export to FBX.
4. Import into Animeeple. Animeeple correctly assesses the appropriate bones exported from JR. Go to the T-pose tab and select the first frame.

Then you have your motion in Animeeple.

You can then load in the rig you have exported from LW as FBX and assign the correct bones as required. If you download the RHR FBX rig, this is a doddle - pretty much automagic.

Then you can drag your motion onto your matched LW rig to apply it - tweak in Animeeple if necessary - or save to Collada.

I then loaded the Collada file into LW. You need to make sure the bones are children of objects and sub-objects with the same names as those in your main animation, so that you can transfer the envelopes. Save as LWS.

Load the character you want to apply the motions to, then use Load From Scene with "load motion envelopes" to apply the mocap to your character.

Done. Well - hopefully.

Ooo, and yeah, here is an update of our first dancing freak.

Dance (http://dl.dropbox.com/u/13098777/Raxx%20Dance%20V2.mov)

Enjoy!

3DBob

3DBob
12-12-2011, 07:26 AM
Got this rather short but "Sign of the times" email just now.....

Hi all,

We have some bad news. We are shutting Animeeple down. Animeeple as a business has not been as successful as we had hoped. Thank you for your support and encouragement. We've enjoyed working with you.

We will keep the Animeeple servers online until January 31st, 2012. You can download content and use the Animeeple software until then.

If you bought the FBX exporter and need it after January 31st, please contact us at [email protected]

- The Animeeple Team

Does this mean that the current version will stop working in the very near future?

Basically, if something does not pay, sooner or later it will fold. Unless you are a big bank, of course.

On the upside, here is a new video showing some of the cool new features in the JR Pro beta:

Jimmy Rig new plugins (http://www.youtube.com/watch?v=bFA9LJx7OfU&feature=g-all)

3DBob

erikals
12-12-2011, 08:59 AM
i'm aiming for JimmyRig for sure,

btw, any plans on adding a second Kinect for more accuracy?

3DBob
12-12-2011, 09:08 AM
I can't say because development plans are subject to change, but I believe that is the intention. I guess the more people that buy an app the more incentive there is to take it forward.

3DBob

Greenlaw
12-12-2011, 12:14 PM
Man, that's a shame. I did wonder if this was going to happen, because the developers' attention has been focused more on smartphone apps this past year than on Animeeple. Plus, Okan and Leslie are really nice people who were quite helpful in my early efforts at 'homebrew' mocap, so even after I started using MB for Brudders, I was wishing great success for Animeeple.

IMO, Animeeple is a good idea that just needed a little polish. I wonder if it would do better as a commercial standalone application? Oh well.

The market is still wide open for other clever programmers who want to take on Motion Builder, which for most people is too expensive and way more 'program' than they need or want to learn. So far, Jimmy|Rig seems to be filling this niche nicely.

BTW, I should point out that iPi DMC has its own built-in retargeting system, which seems to work well for many artists who use it. Basically, you import a rigged character via FBX or Collada, retarget the motion, and then export a new FBX or Collada for your 3D program of choice. I haven't used it myself yet--I used MB to retarget and edit the motions for Brudders--but if I get any time off this month, I'll take a closer look at the native iPi retargeting tools and let everybody know how they compare.

G.

3DBob
12-12-2011, 12:46 PM
Hi Greenlaw,

That would be really helpful.

3DBob

Greenlaw
09-21-2012, 09:41 AM
Heads up! I mean it--iPi DMC is finally getting head tracking before the end of this month.

http://forum.ipisoft.com/viewtopic.php?f=2&t=6310&p=12135#p12135

This is almost perfect timing for us. We intended to shoot new mocap for the second Brudders short next week using DMC 2.0 but if we have to wait a few extra days to use this feature, that's cool. :)

G.

erikals
09-22-2012, 11:26 AM
very-very nice.

does this go for dual kinects too?..

Greenlaw
09-22-2012, 02:53 PM
The tracking system in iPi Studio is essentially the same regardless of how you record the motion data with iPi Recorder, so the answer is probably yes. The difference will be in the quality of the track--six PS3 Eye cameras should be better than dual Kinect, dual Kinect should be better than single Kinect.

I'll post more reliable info once I get my hands on the update.

G.

Greenlaw
09-30-2012, 04:04 PM
The first iPi DMC 2.0 beta featuring Head Tracking was released today (build 139). I downloaded it and will be giving it a spin later today. According to the build notes, this version doesn't track Y rotation (heading) yet, but it's a start. I'm actually okay with that because what I really need right now in our current production is X rotation (pitch). It can also track Z rotation (bank), so even better!

I'm very excited about this, especially since we were preparing to shoot mocap for the next Brudders short this week anyway. Stay tuned. :)

G.

Greenlaw
09-30-2012, 05:00 PM
I ran a quick test a few minutes ago using some existing iPi Recorder data (dual Kinect configuration.) Head tracking in the current beta does seem to be working!

In this first test, I got a few pops in the head motion with the default Configurable Jitter Removal setting of 0 (first notch), but after dialing it up to 2 (middle notch) for the head, the motion looks pretty smooth and realistic to me. No tracking for heading (Y rotation) yet, but we already knew that because it's mentioned in the build notes.

Overall motion tracking seems better but since the build notes don't mention this, it's probably just my imagination. More later. :p

G.

massmusic
09-30-2012, 08:12 PM
I ran a quick test a few minutes ago using some existing iPi Recorder data (dual Kinect configuration.) Head tracking in the current beta does seem to be working!

JR Pro Beta w/ Kinect has never worked on any system I've run it on. I was told back in April a fix was in the works. Maybe time to go with iPi.

Greenlaw
10-02-2012, 01:03 AM
iPi updated their iPi Recorder program today. Some UI and visualization improvements. Also improved support for the ASUS Kinect clone.

BTW, the devs mentioned in their forum that Y rotation for the head is coming later. Just keeps getting better. :)

G.

erikals
10-02-2012, 01:19 AM
great stuff..

hand rotation is the next step? :]
nah, probably too tricky.... ;]

very nice, a needed feature indeed.

Greenlaw
10-02-2012, 02:16 AM
great stuff..

hand rotation is the next step? :]
nah, probably too tricky.... ;]

Actually yes. I'm not sure how they will pull this off--maybe it will use the PS Move or Wii controller? That's just a guess--it's possible that it won't involve extra hardware at all.

Before the end of the year they intend to implement prop tracking (instruments, swords, rifles, etc.) and multiple-performer tracking.

G.

Greenlaw
10-23-2012, 05:52 AM
...I'm not sure how they will pull this off--maybe it will use the PS Move or Wii controller? That's just a guess--it's possible that it won't involve extra hardware at all.
It's confirmed! The upcoming iPi DMC with wrist and prop tracking will use PS Move or Wii Motion Plus, your choice. I asked if this required the use of a separate game console and the answer was no--all you need is a Bluetooth receiver for the PC. (A Bluetooth USB stick will do just fine if your computer doesn't have this feature built in.) Also, you can use multiple devices at the same time--at least one for each hand, I guess.

I asked if one controller was any better than the other and this was their response: "PS Move should provide more accurate orientation value due to having built-in magnetometer. Wii Motion Plus does not have one, so there is an accumulation of error during prolonged usage. Though we had not yet carried out thorough tests."

We don't have PS Move devices here in our home studio, but we do have two Wii Motion Plus controllers. I can't wait to try this myself. The PS Move can be purchased for under $30, so I may look into trying that controller shortly after the software release.

G.

erikals
10-23-2012, 09:01 AM
yeah, PS Move devices are quite cheap, so I guess that would be the best bet...