
View Full Version : Hack turns Kinect into 3D video capture tool



caesar
11-15-2010, 06:15 AM
http://www.engadget.com/2010/11/14/hack-turns-kinect-into-mindblowing-3d-video-capture-tool/

Very cool, watch the 1st video

erikals
11-15-2010, 09:09 AM
haha, that was fun :]

cresshead
11-15-2010, 01:09 PM
looks REALLY promising actually...maybe 2 of these cameras would enable motion capture to be pretty solid...even 1 camera seems to do an amazing job...just need some tracking data for joints along with the depth


http://www.youtube.com/watch?v=pk_cQVjqFZ4&feature=player_embedded

OnlineRender
11-15-2010, 01:16 PM
Quote: "This is the same man who got the iOS devices syncing with Ubuntu! What a lord!"

http://www.omgubuntu.co.uk/2010/11/open-source-kinect-driver-demoed/

erikals
11-15-2010, 01:28 PM
but can it compete with...
http://www.youtube.com/watch?v=dTisU4dibSc

cresshead
11-15-2010, 01:38 PM
but can it compete with...
http://www.youtube.com/watch?v=dTisU4dibSc

if price and availability are high on your list...then I'd say the Kinect hacking/programming we may see could be a real option for many people here...the New Scientist vid is nice but it's not gonna be £130 for the hardware [Kinect] vs the New Scientist video [8 cameras] :D

erikals
11-15-2010, 02:50 PM
not sure, seems to be only $15 for some of the source code...
http://portal.acm.org/citation.cfm?id=1360697

cresshead
11-15-2010, 05:04 PM
http://www.ipisoft.com/products.php

on its way to this budget system

erikals
11-15-2010, 06:46 PM
yes, the best way is actually to use iPi together with Syflex and Sign Track...

for scanning, laser scanners have become quite affordable... ($600)
http://www.david-laserscanner.com/

OnlineRender
11-19-2010, 08:56 AM
http://uk.gamespot.com/xbox360/action/adrenalinmisfits/news.html?sid=6284291


MIT student links Microsoft's motion-sensing camera to iRobot, uses it for 3D mapping, gesture command.

shrox
11-19-2010, 01:26 PM
I was guessing someone would do that. We've had one since the release and it is just begging to do some mocap.

Philbert
11-20-2010, 06:10 PM
Yes I was waiting for a mocap hack to come out of Kinect since it was first introduced as Natal.

OnlineRender
11-23-2010, 09:05 AM
Microsoft approves of Kinect open-source projects

http://uk.gamespot.com/xbox360/action/adrenalinmisfits/news.html?sid=6284388

cresshead
11-23-2010, 10:12 AM
Excellent news from Microsoft, so we can now hope to see a motion capture setup from those chaps at http://www.ipisoft.com/products.php in 2011.

erikals
11-23-2010, 02:59 PM
Great! :]

Philbert
11-23-2010, 03:38 PM
I just saw this hack today, it looks pretty close to a mocap set up already, just needs to record the motion.

http://www.vimeo.com/16985224

cresshead
11-23-2010, 04:13 PM
I just saw this hack today, it looks pretty close to a mocap set up already, just needs to record the motion.

http://www.vimeo.com/16985224

wooo we're there!
great stuff!:thumbsup:

Tonttu
12-13-2010, 11:24 AM
Controlling a 3D-monster doggie with puppeteering: http://www.youtube.com/watch?v=g-JSaRHhdzU&hd=1

Pretty cheap way to do awesome stuff in the future? Like TAFA on steroids?

Greenlaw
12-13-2010, 11:39 AM
That's fun. It reminds me of this open source project but made better now: Animata (http://animata.kibu.hu/index.html)

G.

Heldure
12-15-2010, 09:29 AM
that would be amazing! less expensive for indie studio, and more people including me could learn mocap.

Greenlaw
12-15-2010, 11:50 AM
I have to admit, I've been skeptical of Kinect, especially for traditional full body mocap like that created using iPi Studio. The problem with Kinect is that you can only use one camera angle in a room because it relies on IR for depth data, and having only one camera reduces the accuracy of the tracking. For example, the Kinect cannot see body parts that are occluded by the performer's body. Compare this to multiple PS3 Eye cameras (four to six) arranged in a full or semi-circle, which can record the entire body. Also, there's the issue of cost versus performance: for the price of a single Kinect camera you can buy five PS3 cameras.

But...after seeing these realtime puppeteering demonstrations the Kinect obviously shows potential use for a very different area of mocap animation.

To be fair to the Kinect, I should also point out that, while the PS3 Eye camera setup for iPi is a lot more accurate, it also requires a lot more space, and recording and tracking six streams of synchronized video requires an above average computer and graphics card. My system currently takes up an entire two car garage, and that's only just big enough for full body capture. (See here for an example: iPi Studio Markerless Mocap Test (http://www.youtube.com/user/LGDTestTube#p/a/u/0/G2KLtGsl-L0).)

The Kinect setup, however, will probably work with an average gaming computer and a space as small as an average living room. I imagine this setup will be more appealing to 'mainstream' and hobbyist users.

Professional and semi-pro users like indie filmmakers, small studios and individual freelancers will most likely want the precision that comes with using multiple PS3 cameras though.

But, if the Kinect system proves to be well-suited for realtime puppeteering, users like me will probably wind up wanting to use both systems.

Anyway, thanks for posting these links. I have to say I'm now a lot more interested in seeing where the Kinect goes with mocap technology than I was before. :)

G.

Philbert
12-15-2010, 12:04 PM
I wouldn't expect Kinect with a mocap setup to be the greatest in the world or anything. When I first started at DAVE School in 2002, all they had for mocap in house was a magnetic system where the performer had to basically stand in one place. I expect a little something closer to that; anything more is a bonus.

Titus
12-16-2010, 12:21 PM
I have to admit, I've been skeptical of Kinect, especially for traditional full body mocap like that created using iPi Studio.

I've a setup with iPi Studio right now. The couple of tests I did weren't really great. Do you have any tips?

Greenlaw
12-16-2010, 04:09 PM
I've a setup with iPi Studio right now. The couple of tests I did weren't really great. Do you have any tips?
Hi Titus,

I started a thread here on the topic: iPi Studio Desktop Motion Capture Tests
(http://www.newtek.com/forums/showthread.php?t=114498&highlight=ipi)

Feel free to ask any questions there. I recently added two more PS3 cameras to my set up but probably won't get around to testing them for a while. In the meantime, I've been spending a lot of time getting the iPi Studio mocap data onto my LW rigs and characters and having some good success with it. (Tip #1: Wait for Lightwave 10. You can do it in 9.6, but it's gotten a lot easier with 10.)

I'll post more videos when I get the chance. (Probably soon.)

G.

Tonttu
12-18-2010, 10:46 AM
Is there an open source alternative for iPi Studio?

Titus
12-19-2010, 07:58 PM
that would be amazing! less expensive for indie studio, and more people including me could learn mocap.

You can have a mocap suite for less than a grand using iPi Studio, and guess what! It supports Kinect.

Tonttu
12-20-2010, 02:03 AM
You can have a mocap suite for less than a grand using iPi Studio, and guess what! It supports Kinect.

Aha, Express edition with Kinect support, Availability: Coming soon.

cresshead
12-20-2010, 02:53 PM
kinect to bvh - motion builder demo
http://www.youtube.com/watch?v=rJGJBpz_Oec&feature=related

kinect to blender - point cloud
http://www.youtube.com/watch?v=yZSXXFwsyhc&feature=player_embedded

kinect -ogre 3d
http://www.youtube.com/watch?v=Zl6O-Rf52Co&feature=related

MikuMikuDance with OpenNI(Kinect) test 2
http://www.youtube.com/watch?v=bQREhd9iT38

http://www.primesense.com/images/siteCont/Content_63.6.jpg

http://www.primesense.com/images/siteCont/Content_63.5.jpg

Philbert
12-20-2010, 05:23 PM
That one with Miku chan was great. Looked like it was pretty much on the money.

He even shared the code:
http://www.geocities.jp/higuchuu4/index_e.htm

cresshead
12-21-2010, 10:27 AM
live link to 3dsmax

only working with arms so far but "F" that looks cool!
http://www.youtube.com/watch?v=lLK3E_vMGBU

cresshead
12-21-2010, 06:10 PM
more demos

http://game.g.hatena.ne.jp/Nao_u/20101221#p1

cresshead
12-22-2010, 07:17 AM
2 kinects working together...early days but THEY WORK...people were saying that no way would the IR cameras be able to work with 2 kinects...err..they ARE here!

http://www.youtube.com/user/okreylos#p/u/3/5-w7UXCAUJE

erikals
12-22-2010, 07:21 AM
2 kinects working together...early days but THEY WORK...people were saying that no way would the IR cameras be able to work with 2 kinects...err..they ARE here!

http://www.youtube.com/user/okreylos#p/u/3/5-w7UXCAUJE

hah, i was just about to say it will first show its true potential when they connect several Kinect scanners together....

...and here we go... hehe :]
cool! :]

Greenlaw
12-23-2010, 04:13 PM
The iPi guys have iPi Studio working with Kinect now:

iPi Desktop Motion Capture: Kinect Test 2 (http://www.youtube.com/watch?v=GCu8KTrC4sc)

G.

cresshead
12-23-2010, 04:36 PM
.....THUD!
....that was me...falling to the floor!

Greenlaw
12-23-2010, 06:37 PM
That was their second test. Here's the first one from a couple of days ago:

iPi Desktop Motion Capture: Kinect Test 1 (http://www.youtube.com/user/mnikonov#p/u/1/Ey3xJOf9xDI)

Those guys are amazing.

G.

Philbert
12-23-2010, 06:38 PM
Time to start saving my pennies; by the time I have enough saved up it may be ready.

I wonder how much iPi plans to charge for this though. Their current software is about $1000 if I recall correctly, while it looks like there may well be open source options using Kinect that will be free.

Titus
12-23-2010, 06:42 PM
Time to start saving my pennies, by the time I have enough saved up It may be ready.

I wonder how much iPi plans to charge for this though. Their current software is about $1000 if I recall correctly, while it looks like there may well be open source options using Kinect that will be free.

More like 500 for the basic, and 250 if you're a student :).

Philbert
12-23-2010, 06:44 PM
Yeah I was looking at the Standard one that is $995 since it seemed like that is really what you need to get anything decent. Though I could be wrong.

The basic is actually $595 plus the cost of cameras. If it needs multiple Kinects that could be another $500 right there.

cresshead
12-23-2010, 06:55 PM
only 1 kinect needed..£129...

as SOON as it's available to buy and can spit out 3dsmax biped friendly bvh's i'm having one!

OnlineRender
12-25-2010, 07:27 AM
Santa made sure there was a Kinect under the tree this year 'cheers Santa' .....

but sadly my wife says I need to wait a week before I can start some hacking :( ....

I gather it's possible to corrupt the firmware in the device; I may need a backup Kinect.

Philbert
12-25-2010, 07:58 AM
I don't think it's possible since you're not actually hacking the device, only the drivers on your own computer. Of course I'm no expert.

OnlineRender
12-25-2010, 08:05 AM
You need more than just a Kinect; you need to be able to capture, read, and write data to the Kinect. A USB port alone doesn't cut it (it's fine for using it with open-source drivers though).


http://www.youtube.com/watch?v=Brpu30vjCa4&feature=related

OnlineRender
12-26-2010, 01:48 PM
http://code.google.com/p/kinemote/

BETA!

OnlineRender
12-26-2010, 02:07 PM
http://www.primesense.com/?p=514

u want this

OnlineRender
12-27-2010, 02:33 AM
Sat up to 4am last night hacking the motor functions..... wow, I hear you say... I think that's as far as I can push it :) it gets too heavy for me after that.


PS: there is a tut on how to do this ............ Google is your friend.


Here is the tut: http://www.ladyada.net/learn/diykinect/

10x easier in Linux for obvious reasons, but in Windows it's not that difficult.

If you don't own a USB analyser (we all should, but they're $400), there are lib files with explanations.

cresshead
12-28-2010, 06:23 PM
An exoskeleton mocap system is $8000.
Note the exoskeleton option gives you realtime feedback in the computer's viewport, unlike markerless mocap, which has to analyse each frame offline for each camera.

http://www.animazoo.com/images/stories/gypsyfront3.jpg
http://www.animazoo.com/index.php/gypsy-7

iPi mocap with support for 6 PS3 cameras is $995 [reduced from $1495 for a limited time]
http://www.ipisoft.com/sales.php
the $995 is for the software
a PS3 camera is $39.74, so 6 cameras would be $239
add 6 stands @ $30 for $180
total: 995 + 239 + 180 = $1414

from the video demo's on the web it appears that the kinect option may also be realtime which will be pretty amazing.


iPi Express for the Kinect hasn't had a price listed yet...but say it's $499
Kinect is $149
so a Kinect system may be around $650....all depends on the price of the software mind you! :)
please note that with a Kinect-based option you have to 'act toward the camera'; with a 4 or 6 camera option using the PS3 cameras you get 360 degrees of capture...but you'll also need an empty double-garage-sized space to set this up

exoskeleton clothing systems are around $25,000
http://www.animazoo.com/index.php/igs-190-m

http://www.animazoo.com/images/stories/igs190_1.jpg

camera-based systems that use markers start at $40,000
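Tallying the figures above (all prices as quoted in this thread; the iPi Express price is the $499 guess from this post, not an official figure):

```python
# Rough cost tally using the prices quoted above. The iPi Express price
# ($499) is a guess from this post, not an official figure.
ps3_system = {
    "iPi Studio (6-camera version)": 995,
    "6 x PS3 Eye @ $39.74 (rounded up, as above)": 239,
    "6 x stand @ $30": 180,
}
kinect_system = {
    "iPi Express (guessed price)": 499,
    "Kinect": 149,
}
print(sum(ps3_system.values()))     # 1414
print(sum(kinect_system.values()))  # 648
```

So roughly $1414 for the 6-camera PS3 rig versus ~$650 for a single-Kinect setup, if the software price lands near that guess.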

cresshead
12-28-2010, 06:39 PM
more vids>>

http://www.youtube.com/watch?v=-6LBkyZvCZU

running live in maya...lightwave version PLEASE!
http://www.youtube.com/watch?v=oiu2-I4Y1MU

Greenlaw
12-28-2010, 07:41 PM
a ps3 camera is $39.74 per camera so 6 cameras would be $239
add 6 stands @ $30 for $180

Looks like the cameras have gone up in price: I bought my PS3 Eye cameras for about $30 a piece from Amazon about a year ago.

I also bought c-stands from Cowboy Studio (http://www.amazon.com/CowboyStudio-photo-Light-Stands-case/dp/B001WB02Z4/ref=sr_1_27?ie=UTF8&qid=1293588657&sr=8-27) that cost two for $24 (or $12 each,) with a handy carrying case. (That's six for about $75!) These are sturdy, lightweight metal, and really well built. The head does not rotate like a tripod head, but it doesn't matter because the camera base rotates.

As an alternative, you can use security camera mounts and mount the cameras to the walls. I'm going to do this eventually.

To mount the cameras you'll want to add an $18 Helicoil kit (http://www.amazon.com/gp/product/B0002SRE8Q/ref=wms_ohs_product) that lets you tap the bottom of the cameras so you can mount them to the c-stands. This is easy to do because the PS3 Eye cameras already have a hole in the metal base (it's under the sticker.) I shot a video of the process; I need to edit it and then I'll post it on our website. A single kit is good for 12 cameras, so I guess you could split the cost with another iPi user.

Some users use zip ties or tape to mount the cameras to tripods or c-stands, but I can't see how this could possibly be more stable or convenient than properly mounting them using the Helicoil kit.

Finally, you need a set of 16ft USB repeater cables (http://www.newegg.com/Product/Product.aspx?Item=N82E16812224004). These are long cables that have powered boxes on the end that extend the signal beyond normal USB cables. You can daisy-chain up to five, but you'll never need it that long. Typically, I daisy chain a pair for the farthest cameras, and use just one for each of the closer cameras. The cables cost about $10 a piece if you shop around (I get mine from Newegg.com.)

I think that's it. I'm too lazy to do the math right now but totaled up it's still pretty cheap.
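Since the post above skips the math, here is a rough tally of the accessory prices quoted in it. The cable count is an assumption (a daisy-chained pair for each of the two farthest cameras, one cable for each of the four closer ones):

```python
# Rough tally of the PS3 Eye accessory prices quoted above. The cable
# count (8) is an assumption: a daisy-chained pair for each of the two
# farthest cameras, one cable each for the four closer ones.
cameras  = 6 * 40   # PS3 Eye, ~$40 each at the late-2010 price
stands   = 3 * 24   # c-stands sold two for $24
helicoil = 18       # thread-tap kit, good for 12 cameras
cables   = 8 * 10   # 16 ft USB repeater cables, ~$10 each
total = cameras + stands + helicoil + cables
print(total)  # 410
```

So call it a bit over $400 in hardware before the iPi Studio license itself.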

BTW, even though I currently have six cameras, the test videos I posted were done using four cameras. The results using four cameras are really not bad, so if you're really on a budget you can save a bit of money by getting the four-camera version of iPi Studio ($595).

G.

Greenlaw
12-28-2010, 07:53 PM
Technically, you can shoot an iPi video session anywhere so long as you don't have a reflective background (like windows,) or colors that confuse the tracker. Some users shoot in their backyard early in the morning when it's still overcast. Of course, ideally, you want a more controlled environment. It looks like the iPi guys just shoot in an empty meeting room or in their offices. I shoot in a two car garage with a backdrop, which takes a bit more work to set up but it works well. (I just leave the rig up for the duration of a project.)

cresshead
12-28-2010, 07:53 PM
Q. using the ps3 option, how long does it take to analyse, say, a 10 second session to get the motion up and running on playback in an OpenGL viewport after the capture?

Q. how do you commence 'capture' in your system...
1. do you have it count down to going live
2. a USB foot switch to record
3. another person running the recording session in the room with you

cresshead
12-28-2010, 08:00 PM
another question and observation....!!

head tracking...i see reading the forums that the ipi system does not do head tracking...so we get static heads...

over on the behind-the-scenes footage of the Killer Bean Forever DVD, he used SynthEyes to track his head movements to give his characters 'life'.

I'm wondering if that's another thing to look into in 2011...
and then there's lipsync! [EEEK!]

erikals
12-28-2010, 08:09 PM
...and finger movements /animation... [agh!]... :]

what's the best way for hand animation? manual posing? pose library?
afaik there is no good mocap finger trackers available...

cresshead
12-28-2010, 08:12 PM
...and finger movements /animation... [agh!]... :]

what's the best way for hand animation? manual posing? pose library?
afaik there is no good mocap finger trackers available...

there are mocap gloves...NOT cheap though!
http://www.metamotion.com/images/wireless_CG.jpg

http://www.metamotion.com/hardware/motion-capture-hardware-gloves-Cybergloves.htm

i think the whole performance capture thing is coming to the indie animator in 2011...mainly thanks to the hi profile of Avatar being so successful and spurring on companies such as http://www.ipisoft.com/

re face/lipsync...
there was mimic for lightwave
http://www.daz3d.com/i/software/mimic
note that you'll need access to lightwave 8 for mimic to work.

for other apps re lipsync voice o matic for 3dsmax, maya and softimage
http://www.di-o-matic.com/products/plugins/VoiceOMatic/#page=overview

erikals
12-28-2010, 09:03 PM
ZignTrack looks to be best for facecapture, unless the "manual" way is used using TAFA...

i still strive to locate a good hand mocap option though,
i have a homemade project on it, but it's not moving too fast, it's aiming towards using webcams to record fingermovements by placing it on the wrist, long story...
it's sort of a spin-off of the avatar facecap method...

we'll see,...

erikals
12-28-2010, 09:31 PM
then maybe blend it with something like this,...

http://www.youtube.com/watch?v=783UG-eJgt8
http://www.youtube.com/watch?v=qM-X1UAeDoE

though the fastest most secure option is to use pose saves i guess,...
http://www.youtube.com/watch?v=HClYdWYz8f4

cresshead
12-28-2010, 09:34 PM
ZignTrack looks to be best for facecapture, unless the "manual" way is used using TAFA...

i still strive to locate a good hand mocap option though,
i have a homemade project on it, but it's not moving too fast, it's aiming towards using webcams to record fingermovements by placing it on the wrist, long story...
it's sort of a spin-off of the avatar facecap method...

we'll see,...

ZignTrack doesn't offer much in the way of connecting to other apps unless you're talking about buying Motion Builder or using Poser.

http://www.zigncreations.com/tutorials.php

Motion Builder = too costly for me.
Poser = too poo for me to use.

erikals
12-28-2010, 09:37 PM
true, i wish it was realtime, maybe in the future...

Greenlaw
12-29-2010, 02:24 AM
Q. using the ps3 option how long does it take to analyse a say a 10 second session to get to motion up and running on playback in a open gl port after the capture?
For my current project, I'm shooting fairly long sequences that are cut down to individual shots. For example, the first sequence runs a little over 800 frames per character. BTW, tracking speed is highly dependent on your graphics card. iPi Software recommends a GTX 260 or better. Here's a comparison chart: Video Card Comparisons (http://www.ipisoft.com/en/wiki/index.php?title=Cameras_and_accessories#Video_Card )

A typical tracking session using a quad core computer with a Geforce GTX 260 goes like this:

Calibration - This is a short video where you walk around the active performance space holding a small Maglite in 'candle' mode. Here's a description of my typical video: I like to draw a 'table' with the four legs touching the floor, and a spiral from top of the table to the center spot on the floor. My typical calibration video takes about 10 minutes to process. When this is done, I set the scale of my scene (usually based on the known height of one camera,) and save the data. If you didn't move the cameras during the rest of the session, you can reuse the calibration data over and over again.

Tracking - Now you load a performance video, where you recorded your 'mocap'. The video should have some 'clean' video near the beginning (you can set the range for it,) a T-Pose, and multiple takes of the action. The clean segment is used as a difference matte, which in theory allows you to shoot 'anywhere'.

The first step is to 'fit' the rig into the T-Pose. To do this, just move the rig over the performer where he assumes the T-Pose, and click the Refit Pose button a couple of times. The rig's height will probably need adjusting to get a good fit. You will also want to alternate between Refit Pose and Analyze Actor. (Analyze Actor samples colors from the video and examines the lighting.) When you get a good fit, save the Actor file. You can reuse this data throughout the session.

Okay, now the actual tracking. Workflow can vary, so I'm going to try to generalize. Set the In and Out point of a 'take'. Decide if you want Feet and Shoulder tracking enabled. At the start of the action, you can pose the rig to fit the pose of the performer. If it's a T-Pose, this is easy, but it means tracking extra data. If you cut in on the action, you need to pose the rig to roughly fit the action pose and hit Refit Pose. What I usually do because I'm lazy is I turn off all the 'quality' settings, do a fast track from the main T-Pose to the beginning of the action, and stop the track at the first frame. Then I turn on the quality settings (i.e., feet and shoulders, etc.), click Refit Pose, and then do my actual tracking. Click Track; tracking will stop automatically when it hits the Out marker.

When tracking is done, you can scrub through your video, tweak the pose at trouble spots and retrack forwards and backwards from the tweaked frame.

There will be some jitter, so you may want to run Jitter Removal on some, if not all, of the tracked data. Jitter Removal helps a lot with stabilizing the feet, but in the current release, it's too aggressive and may hammer some of your other motions. iPi Soft is aware of this problem and they're supposed to be fixing this now. They're also adding Jitter Removal that can be isolated to specified joints, which should be very useful.

There is a second 'post enhancement' feature called Trajectory Smoothing. This option is a global and non-destructive setting that smooths your motion 'on-the-fly'. You will probably want to set this to 2 or 4. Unlike Jitter Removal, you can change this setting without retracking the action and see the results immediately.

Tracking at high-quality seems to take about 2.5 seconds per frame with my system. IMO, this is not bad, but users with a GTX 480 are reporting that they can track multiple frames per second! Clearly, you want the best gaming card you can afford.

If you have multiple 'takes' in your video clip, there is a second timeline where you can set In and Out points for each take. With planning, this can speed up your workflow.

When you're all done, you can export a 'take' as .bvh, .dae, and a few application-specific formats, by right-clicking the 'take' range on the timeline and selecting Export. (Note: you can also export an OpenGL preview of a 'take' if you wish. This is how my iPi test videos were created.)
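At the ~2.5 seconds per frame quoted below for this hardware, a back-of-the-envelope estimate for the 800-frame sequence mentioned above:

```python
# Back-of-the-envelope tracking-time estimate from the figures in this post:
# ~2.5 seconds per frame on a quad core with a GTX 260.
seconds_per_frame = 2.5
frames = 800  # the sequence length mentioned earlier in this post
minutes = frames * seconds_per_frame / 60
print(round(minutes))  # 33
```

So roughly half an hour of offline tracking per 800-frame sequence on that card, not counting calibration or retracked trouble spots.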


Q. how do you commence 'capture' in your system...
1.do you have it count down to going live
2. a usb foot switch to record
3. another person running the recording session in the room with you
I just hit 'Record' (spacebar), allow a second or two for the clean video, get into the shot, assume a T-Pose and perform. When I'm done, I walk over to the workstation and hit 'Stop' (spacebar again.) iPi Recorder automatically names and saves the file. Since I can select the ranges for tracking, I don't worry too much about 'unusable' video data.

Another option I'm going to try on my next session is voice command. I discovered that the PS3 Eye camera has an active microphone which is compatible with the voice command features in Windows 7. In theory, I can use this to start and stop the Recorder, which will reduce some of the 'trash' video. And because iPi Recorder will automatically save and name files, this part doesn't need to be supervised at the computer.


another question and observation....!!

head tracking...i see reading the forums that the ipi system does not do head tracking...so we get static heads...

over on the behind-the-scenes footage of the Killer Bean Forever DVD, he used SynthEyes to track his head movements to give his characters 'life'.

I'm wondering if that's another thing to look into in 2011...
and then there's lipsync! [EEEK!]

In general, I'm not too worried about head tracking...I just target the head and keyframe what I need. Even at work, we sometimes wind up reanimating the head because the actor wasn't looking exactly where the character needed to be looking.

That said, I wish I had head tracking for our short film. Our sessions are recorded in sync to music, and it turns out the performer's head is very noticeably moving in time to the music, something I have to now keyframe by hand because the current version of iPi DMC ignores this data.

The developers are very aware of this limitation and are working to fix it. They got shoulder tracking working just a few months ago, which I imagine is harder to track, so I think head tracking is not far behind.

For face tracking, you're probably better off using SynthEyes. I have an older version of Zign Track but never found the time to make it work for Lightwave. The current version of iPi Studio doesn't have a mode for face tracking, and I don't think they're working on one. (Not yet anyway.)

iPi doesn't track hands and fingers either. They haven't promised fingers, but I imagine if they had a mode for tracking hands in a separate 'hands' session, you could merge that track to the body capture. I believe this is how the folks who do our mocap (usually Giant) used to do it several years ago, though nowadays they can capture hands and fingers (and face!) as part of the full body performance. Personally, I'd be happy if iPi just added wrist/hand rotations to iPi DMC.

For the Brudders movie, we're using MagPie Pro to animate the face and lipsync. When you see the character designs, it's obvious why we chose this system. We also have TAFA but it's not appropriate for this project. For automated face animation and lipsync, I experimented with DAZ Mimic for Lightwave several years ago, and I thought it worked reasonably well for the time you put into it. (Never actually used it on a production though.) IMO, none of the low-end automated facecap and lipsync systems ever look as good as manual key framing though.

Obviously, I'm not comparing any of the above to what Giant uses. Their system is absolutely state-of-the-art meant for feature film quality work, and the cost and crew size reflect that. What my wife and I are working on is quite literally a two-man garage production, so we try to keep our cost and expectations in perspective. (Well, my wife seems better with that than I am.) :)

G.

Edit: Made a few corrections to the above.

Greenlaw
12-29-2010, 02:53 AM
I should mention a few more features of iPi DMC. It can export .bvh and .dae, and some specialty formats. No .fbx yet, but they're working on it. If your application can export a proper .dae, you can retarget your data directly to the imported rig from within iPi DMC. iPi DMC doesn't have much in terms of editing the retargeted mocap though. Also, Lightwave's .dae file appears to be totally incompatible with iPi DMC.

The main problem I'm having with the current release of iPi DMC is jitter. They recently added improved Jitter Removal, which does work but, as I already mentioned, it's too aggressive for my tastes. Unfortunately, the current build seems to have more jittering in the raw data than past releases of DMC did, so right now I'm damned if I do and damned if I don't. But the iPi guys are fully aware of this and they've stated that a fix is on its way for the next release. (They're pretty good about posting frequent updates, by the way.)

What I'm trying to do now to work around the problem is to output two passes, one with and one without Jitter Removal. By merging the data selectively, this should allow me to stabilize the lower body and feet with Jitter Removal, and have less aggressive smoothing for the upper body using only Trajectory Smoothing. I'm pretty sure I can do this through cutting and pasting with IK Boost, but I've decided to learn Motion Builder so I may wind up doing all my editing in that program instead.
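The selective-merge idea above could be sketched like this, assuming each pass ends up as per-joint keyframe data; the joint names and the data layout here are hypothetical, not iPi's actual export format:

```python
# Sketch of merging two tracked passes per joint: jitter-removed data for
# the lower body (stable feet), lightly smoothed data for everything else.
# Joint names and the keyframe representation are hypothetical; they are
# not iPi's actual export format.
LOWER_BODY = {"Hips", "LeftUpLeg", "LeftLeg", "LeftFoot",
              "RightUpLeg", "RightLeg", "RightFoot"}

def merge_passes(jitter_removed, smoothed):
    """Both arguments map joint name -> keyframe list; the result takes
    lower-body joints from the jitter-removed pass, the rest from the
    smoothed pass."""
    return {joint: (jitter_removed if joint in LOWER_BODY else smoothed)[joint]
            for joint in smoothed}

# Tiny illustration with two-joint 'passes':
a = {"Hips": [1.0], "Chest": [2.0]}   # jitter-removed pass
b = {"Hips": [9.0], "Chest": [8.0]}   # trajectory-smoothed pass
print(merge_passes(a, b))  # {'Hips': [1.0], 'Chest': [8.0]}
```

In practice the cut-and-paste would happen in IK Boost or Motion Builder as described above; this just shows the per-joint selection logic.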

BTW, have you seen this? Non-Destructive Live Retargeting — Maya 2011 New Features (http://www.youtube.com/watch?v=l1hV8BqsCEA)

It's stuff like this that makes me feel like I've been trying to do everything the hard way. What really caught my interest in this video is how in just a few minutes the artist corrected the mocap for a grotesquely proportioned character, which is exactly the issue I'm dealing with right now. I've been able to do some of this kind of editing using IK Boost, but my workflow is not nearly as elegant or efficient as what I see in this video.

Unfortunately, it sounds like this Maya feature isn't quite compatible with iPi DMC's mocap data yet. Something to do with iPi's joint rotations not being normalized. I'm sure they're working on it though.

G.

Greenlaw
12-29-2010, 03:24 AM
I purchased two additional PS3 Eye cameras from Amazon only last November, and I just checked my receipt: $30.57 for each camera. Man, did they raise the price just in time for the holidays or what? (Current Amazon price in late December is $39.63.)

Maybe the price will come down again in a few months.

G.

Philbert
12-29-2010, 06:53 AM
LOL On the news just now they were showing that an Austrian game company is using the Kinect to make an adult game.

Greenlaw
12-29-2010, 11:11 AM
BTW, have you seen this? Non-Destructive Live Retargeting — Maya 2011 New Features (http://www.youtube.com/watch?v=l1hV8BqsCEA)
...Unfortunately, it sounds like this Maya feature isn't quite compatible with iPi DMC's mocap data yet. Something to do with iPi's joint rotations not being normalized. I'm sure they're working on it though.

Update: As a public test, the iPi guys posted a Motion Builder friendly rig that you can import to iPi DMC and retarget the data to. This should make the Maya NDLR system work with iPi DMC. I haven't tried it myself yet...will probably have to bring my wife into it; she's the Maya expert.

Seeing how iPi Studio is starting to work directly with other 3D apps, I hope Newtek will work more closely with them (and Animeeple) for direct Lightwave support. In general, it would be cool if they could help more third party programs export to .lws directly. (Like Vue, SynthEyes, and PFHoe Pro do, for example.)

G.

Edit 1: Here's the link for the iPi data for anybody else who can use it: iPi to Maya/Motion Builder workaround (http://www.ipisoft.com/forum/viewtopic.php?f=6&t=4806&sid=e05608b824d26395997b775315b63ce4&start=10)

Edit 2: I wonder if this workaround will help with some of the problems I was running into with LW 10. Will take a look at it if I find the time.

Titus
12-29-2010, 09:30 PM
BTW, have you seen this? Non-Destructive Live Retargeting — Maya 2011 New Features (http://www.youtube.com/watch?v=l1hV8BqsCEA)

This is very heavy stuff. Worth considering now I have my mocap system running.

erikals
01-06-2011, 10:05 PM
new app exports kinect motion data as bvh files

http://www.cgchannel.com/2011/01/new-app-exports-kinect-motion-data-as-bvh-files/
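For anyone curious what an exporter like that has to produce, here's a minimal sketch of the BVH text format: a HIERARCHY section declaring the skeleton and its channels, followed by a MOTION section with one row of channel values per frame. The two-joint skeleton and channel layout below are purely illustrative assumptions, not the actual output of the app linked above.

```python
# Minimal sketch of the BVH format a Kinect-to-BVH exporter emits.
# The 2-joint "Hips/Spine" skeleton here is illustrative only.

def write_bvh(frames, frame_time=1.0 / 30.0):
    """Build a BVH document for a toy 2-joint skeleton.

    `frames` is a list of 9-value channel rows:
    root Xpos Ypos Zpos Zrot Xrot Yrot, then child Zrot Xrot Yrot.
    """
    header = """HIERARCHY
ROOT Hips
{
  OFFSET 0.0 0.0 0.0
  CHANNELS 6 Xposition Yposition Zposition Zrotation Xrotation Yrotation
  JOINT Spine
  {
    OFFSET 0.0 10.0 0.0
    CHANNELS 3 Zrotation Xrotation Yrotation
    End Site
    {
      OFFSET 0.0 10.0 0.0
    }
  }
}
"""
    motion = ["MOTION",
              "Frames: %d" % len(frames),
              "Frame Time: %f" % frame_time]
    for row in frames:
        motion.append(" ".join("%.4f" % v for v in row))
    return header + "\n".join(motion) + "\n"

# One frame of rest pose, hips 90 units up:
doc = write_bvh([[0, 90, 0, 0, 0, 0, 0, 0, 0]])
```

Because the skeleton and the per-frame rows are plain text like this, any app that reads BVH (Lightwave included) can consume the result, which is why BVH keeps turning up as the lowest-common-denominator mocap interchange format in this thread.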

 

Greenlaw
01-07-2011, 01:06 AM
I posted this info elsewhere, but a new version of iPi Studio was released yesterday. For more info, here's the link to my post: iPi Update (http://www.newtek.com/forums/showthread.php?t=114498&page=2)

Starts at #19.

This update is pretty fantastic. Not only have they fixed the 'excessive jitter' bug from the previous build and improved overall tracking precision, they added a new tool called Configurable Jitter Removal which lets you vary the amount of Jitter Removal on different sections of the body. I tried this version with the 'stretch and walk' session I created to test rig limits and deformation, and the results are significantly better and more accurate to the original performance. (I'll post a comparison video soon.)

Dare I say it? I'm starting to feel that iPi Studio's mocap quality is, within reason, good enough for professional work.

On a side note, I added an SSD to my computer which should allow me to capture full res data at 60fps from six synchronized PS3 cameras. I'll post comparison results from a six camera session next week.

G.

Edit: I forgot to mention, it now exports Motion Builder friendly .bvh files too!

Philbert
01-07-2011, 01:28 AM
So it could be used directly with LightWave then?

Greenlaw
01-07-2011, 03:33 AM
So it could be used directly with LightWave then?
I'm not sure which app you're asking about (NIUserTrackerToBVH or iPi Studio) but, yeah, if it exports .bvh, you can use that format in Lightwave. iPi Studio has always done this.

It's retargeting to an existing 'LW native' rig that's been the tricky part. In theory, you should be able to do this within iPi DMC by importing a Lightwave Collada file and transferring the motion, exporting it back out as a Collada file for Lightwave, and then loading the motions from that file directly to your Lightwave rig using Load From Scene Load Only Motion Envelopes. I haven't had any luck getting a Lightwave Collada file into DMC yet though.

Thanks to help and advice from several mocap experts in the HardCORE forums, I learned that the best way to retarget iPi DMC .bvh data onto a Lightwave character rig is through Motion Builder via FBX, then using Load From Scene Load Only Motion Envelopes to apply the motion to the final animation rig in Layout.

This method should work the same way with Animeeple, but at the moment Animeeple's FBX format is partially incompatible with Lightwave 10. But the developer is aware of the problem, and hopefully there will be two good mocap retargeting tools for Lightwave 10 shortly. :)

As an alternative, you can work directly with the .bvh data in Layout but if you need to get that data onto an existing Lightwave character rig, especially if it has unique body proportions, the process isn't as streamlined as using a third party program like MB or Animeeple.

Well, as far as I know anyway. I'm still learning a lot of this stuff as I go.

G.

cresshead
01-07-2011, 04:43 AM
I'm quite hopeful that now that Newtek have Rob Powers [Avatar VAD top guy] at the top of Lightwave development, tasks such as throwing motion capture onto existing Lightwave characters will be as simple as 'load motion capture and apply to character', and the days of jumping through several other apps or hand-crafting each and every import will be a distant memory.

Currently I think most Lightwave artists would agree that it's painful/slow: either in the time it takes to get it working per character and per motion capture file with just native Lightwave, or in the pain in your wallet to buy Autodesk Motion Builder, which is $4000.

just had a look at the UK pricing for Motion Builder: £3700 plus 20% VAT, so around £4400 inc VAT.
so it's quite an investment...however, looking at the alternatives, such as an exoskeleton motion capture option at $8000, it's still the cheapest option around when you add in the ipisoft Kinect or PS3 camera options.

you could opt to go and invest in project messiah and then export/link to Lightwave for $599, as there are apparently better BVH import tools in messiah.


If someone can demonstrate in a video that it's simple to use just Lightwave on its own to load 5 or 6 different motion capture files [from different libraries] onto a non-standard [cartoon-proportioned] character, I'll eat my words and buy a humble pie too. Currently, as I understand it, the process is time-consuming and laborious, with some BVH files having different bone setups and scaling.

currently this keeps me using another app as the motion capture load/edit app for my character needs...I then import to LW as MDD or point cache for rendering.

artstorm
01-07-2011, 04:53 AM
Jasper Brekelman posted this video the other day where he shows his development project that captures a realtime stream with the Kinect and feeds it directly into MotionBuilder:
http://www.brekel.com/?p=145

And followed up that post a few hours ago that he will release it later this week or next week, to play around with:
http://www.brekel.com/?p=152

Despite its limitations and quirks this can be a great way to get some hard data captured to use as a starting point to work from. And the price can't be beaten at this time if you already have MotionBuilder (which plays very nice with LightWave 10), so I'll for sure try this out as soon as it's downloadable.

Philbert
01-07-2011, 09:37 AM
I'm not sure which app you're asking about (NIUserTrackerToBVH or iPi Studio) but, yeah, if it exports .bvh, you can use that format in Lightwave. iPi Studio has always done this.

It's retargeting to an existing 'LW native' rig that's been the tricky part.

I meant NIUserTrackerToBVH mainly, but anything that exports BVH. I tried it last night with a BVH file I had lying around. I didn't bother applying weights or special bones or anything, just loaded the BVH and the character model pretty much, and it was cool to see how well it worked with just that. I had never considered transferring the motion to a different rig.


I'm quite hopeful that now Newtek have Rob Powers [Avatar VAD top guy] in place at the top of the Lightwave development such tasks as throwing motion capture onto existing lightwave characters will be as simple as 'load motion capture and apply to character' and the days of jumping thru several other apps or hand crafting each and every import will be a distant memory.

It's a nice thought but, and correct me if I'm wrong, I don't think Rob had anything to do with mocap on Avatar. As I understand it the VAD team did pretty much only backgrounds and camera related work.

cresshead
01-07-2011, 10:15 AM
I meant NIUserTrackerToBVH mainly, but anything that exports BVH. I tried it last night with a BVH file I had lying around. I didn't bother applying weights or special bones or anything, just loaded the BVH and the character model pretty much, and it was cool to see how well it worked with just that. I had never considered transferring the motion to a different rig.



It's a nice thought but, and correct me if I'm wrong, I don't think Rob had anything to do with mocap on Avatar. As I understand it the VAD team did pretty much only backgrounds and camera related work.

i thought VAD covered all aspects of previz, live in-camera for the director, so he saw the characters 'live' on set when the mocap actors were performing on stage; that would need rigged, mocap-ready characters to be created and exported to the live system.

Greenlaw
01-07-2011, 11:17 AM
Currently i think most lightwave artist would agree that it's painful/slow in either the time it takes to get it working per character and per motion capture file with just native Lightwave or you have the pain in your wallet to buy Autodesk Motion builder which is $4000.
Right now I'm feeling the situation is better than what I thought it was, thanks mostly to help from several rigging and mocap experts in these forums who took an interest in my project.

The MB approach is very good, but expensive unless you can qualify for an educational license. For example, my wife sometimes teaches Maya and lighting at schools like Gnomon, which allows her to buy software at educational rates. Even if you take a short course at a local community college, you may qualify.

A great alternative will be Animeeple. It's free and I'm pretty sure the FBX bug I'm running into will be fixed quickly once I get my data to them. (I'll send it to them this weekend. My time for getting this stuff together is a bit fractured at the moment.)

Using either of these programs, it should just be a matter of loading your LW FBX, loading the .bvh, dragging the .bvh on top of the LW rig, and then exporting. Then in Layout, you open your character rig and use LFS MOME to apply the motions.


If someone can demonstrate in a video that's it's simple to use just lightwave on it's own to load 5 or 6 different motion capture files [from different lib's] onto a non standard [cartoon proportioned] character i'll eat my words and buy a humble pie too, currently as understand it the process is time consuming and a laborious task...
SplineGod has a method that only uses Lightwave. I think it's a good approach for a small number of shots and a compressed schedule, but it's too work-intensive for a large project. What Lightwave really needs is a way to retarget .bvh data to a rig from within Layout, the way Animeeple, MB and iPi DMC do.

That said, I think Animeeple (when fixed) and IK Boost will give you most of what you need for almost free. (Animeeple is free but the FBX plug-in costs $50.)


currently this keeps me using another app as the motion capture load/edit app for my character needs...i then import to lw as mdd or point cache for rendering.
When I was getting very frustrated with Lightwave 10, I seriously considered starting over using Messiah. From what I was seeing at Setup Tab, the .bvh import seems better thought out in that program. My second alternative was to switch to Maya for mocap and animation like we did at work, and just use Lightwave for lighting and rendering.

But then the artists in HardCORE helped me figure out a good workflow for using FBX and LW 10. I'm still planning to use Messiah for my next keyframe animation project (the way long overdue 'RVJ',) but I think I can safely stick with Lightwave + Animeeple or MB for the next mocap one (probably more Brudders shorts.)

BTW, Rebel Hill posted two new Lightwave FBX videos on his YouTube channel. Definitely worth checking out. Here's the link for parts 3 and 4: Lightwave and FBX Part 3 (http://www.youtube.com/user/RHLW#p/a/u/0/1iqFP3I5-78)

Over in HardCORE, Cageman prepared a couple of videos just for me the other day, which were every bit as helpful and actually changed my mind about rejecting LW 10 for our project. I can ask if he will post the links outside of HardCORE.

G

Philbert
01-07-2011, 11:21 AM
i thought VAD covered all aspects of previz live in camera for the director so he saw the characters 'live' onset when the mo cap actors were acting on stage, that would need riggred mo cap ready characters to be created and exported to the live system.

You may be right. I wasn't there. ;)

cresshead
01-11-2011, 06:22 PM
lipsync demo using kinect

http://kotaku.com/5730112/this-is-how-kinect-turns-your-face-into-a-talking-avatar

http://kotaku.com/5726116/avatar-kinect-is-a-new-xbox-live-gold-perk-and-a-fancier-chatroom

Greenlaw
01-23-2011, 10:02 AM
Andrew at iPi Software posted this today:


Hello!

As to demo: we have published the following video (several weeks ago): http://www.youtube.com/watch?v=GCu8KTrC4sc.

As to exact date: it depends. We still have some issues which involves research with unpredictable results. Also we don't decide yet about beta version: to publish or not to publish release candidate.

As to comparison with plugin for MB from brekel.com: our main advantage is higher robustness and accuracy and as a result more realistic and solid result animation. Also our decision is not limited by one 3d package. Meanwhile disadvantage is processing speed (offline, not real time). Note that this is only our observations and we hope you'll be able to make your own view as soon as we publish our new software.


The original post is here: iPi Kinect Info (http://www.ipisoft.com/forum/viewtopic.php?f=2&t=4858&sid=7c953eacc9ca16da89a87201480c5f86&start=10)

G.

Greenlaw
01-23-2011, 10:03 AM
Oh, and this was from the day before:

Sorry for delay in reply. Right now we are in a hurry to release Kinect version in early February. Currently our main focus is robustness of tracking for Kinect. And we perform progress in this direction every day.

As to characteristics of Kinect version.
Pros:
* there is no need in large space, tripods, USB-cables, fast computers for recording, monochromatic clothing, complex camera setup and calibration
* easy to use
* for tested set of motions quality of tracking is comparable with 4 camera version (but we need to perform more tests with the latest version of algorithm)
Cons:
* capture volume is limited by 2x2 meters
* self-occlusion problem (one point of view) thus there is no actual 360-degree freedom
* tracking speed in current implementation is about speed of version for 3-4 cams
G.

Greenlaw
01-23-2011, 10:18 AM
In another post he stated that the price will be in the $300 to $400 range.

Personally, I don't think I will be getting a Kinect version for some time because I already have the PS3 setup that can capture larger performances more accurately than the Kinect, but the idea of having a 'portable' mocap system that you can use in a tiny space is quite appealing.

Sigh! Maybe someday. :)

G.

Edit: Hmm. Since it's only a single camera system, I wonder how well it performs with a duo core laptop. That could make it a truly portable system. For comparison, my original PS3 tests (two years ago) with iPi Studio were recorded using a laptop, and that handled four cameras. (But at only 320 x 240 at 30fps. Moving up to quad core and SSD allows me to do six cameras at 640 x 480 at 60fps. I imagine Kinect data can get a bit heavy but then again it is only one stream of data.)

artstorm
01-23-2011, 02:03 PM
I just started experimenting with a Kinect myself. I got it earlier this week for this very purpose and today I had some time to start exploring it. I emptied out a small room to make the setup in (an older quad core machine, a desk, a monitor, the kinect and empty space).

I just had it up and running with Brekel's software a few moments ago, and first impression is... works really well!
Screenshot:
http://dl.dropbox.com/u/7970511/temp/johan-doing-mocap-on-a-shoestring.jpg

I had just emptied the room when taking that shot (it was used for storage before, so tons of stuff to carry out), so I still have some draping and stuff left to do to make it as easy as possible for the Kinect to work.

I'm going to evaluate and see how far I can take it tonight, to get a better picture overall of what one can expect.

The software might support more Kinects in a future version, and if it continues to work well I might then add another Kinect to deal with occluded areas; fortunately I have a much larger room available to move the "stage" into if needed.

@Greenlaw: I think I'm making somewhat of a similar journey to you at the moment, setting up a pipeline and workflow with LightWave as the centerpiece to make a short movie. I intend to start documenting the progress as well so I can share my findings too. :)

The Brekel software works really nicely so far, and the realtime feedback is cool. But I'm very interested in keeping an eye on the iPi Kinect software as well to see what they come up with; the YouTube clip posted looked really nice and clean. And the less time needed to clean up the data, the more time can be spent on detailing other aspects.

cresshead
01-23-2011, 02:12 PM
as long as the kinect version runs on vista and can output Biped-friendly (3ds Max) BVH files i'm "in"
:thumbsup:

cresshead
01-23-2011, 02:24 PM
tutorial>>

http://www.reubenfleming.co.uk/tuts.html



3ds Max beta version...it's not ready for prime-time use for sure, but it takes out the need for
Motion Builder...no price as yet, and it's not fully working, so I'll keep an eye on it as it develops.

http://mocap.cguse.com/en/

Greenlaw
01-23-2011, 03:50 PM
@Greenlaw: I think I'm making somewhat of a similar journey to you at the moment, setting up a pipeline and workflow with LightWave as the centerpiece to make a short movie. I intend to start documenting the progress as well so I can share my findings too.
That's awesome! Good luck with your project!

I was putting together a production journal but I haven't made it public yet because I kept stumbling and changing course with the mocap workflow. I finally have a solid system in place though, so I'll probably start posting pages soon.

G.

mrbones
02-03-2011, 10:46 AM
Why are all your avatars cats?

http://www.animstreet.com/animations/6783

Greenlaw
02-03-2011, 11:21 AM
Avatars? We ARE cats!

Philbert
02-03-2011, 11:26 AM
Why are all your avatars cats?

http://www.animstreet.com/animations/6783

That link gave me a malware warning about the unity web player.

mcsdaver
02-08-2011, 09:58 AM
I have seen some cool work using Kinect with a free program called Miku Miku Dance, also called MMD.
Motion files can be made, edited and used in MMD.
Someone made a program to convert motion files from MMD to work in Lightwave.
MMD is used to make animated dances and funny videos.
Adding Kinect just added so much more power to this free program.
Using motion files from MMD in Lightwave would add more power to Lightwave.
The program to convert the motion files for use in Lightwave was made by a Japanese programmer. He is not good with English, so I am having a hard time using his program.
If anyone has figured out how to convert from MMD to Lightwave let me know.

Philbert
02-08-2011, 03:44 PM
Yeah, either I or someone else posted links to download MMD from his website earlier in the thread. I don't think I saw the LW import though.

cresshead
02-15-2011, 05:59 PM
looks like the ipisoft kinect version is entering beta next week, so with luck not too long to wait for the commercial release to surface.

Greenlaw
02-19-2011, 07:10 PM
I launched iPi DMC today, saw there was an update, clicked Update, and guess what? iPi DMC now includes the much anticipated Kinect support. I haven't tried it yet because I don't own a Kinect camera, but this is really tempting me to get one. :p

G.

Greenlaw
02-20-2011, 05:18 AM
Build 113 is faster but less accurate. iPi is aware of the problem and already working on the issue. In the meantime, back to trusty old Build 112. :)

BTW, I was told that the Kinect enabled update for iPi Recorder should be out any day now. Need to find somebody with a Kinect camera fast!

G.

cresshead
02-20-2011, 05:51 AM
Build 113 is faster but less accurate. iPi is aware of the problem and already working on the issue. In the meantime, back to trusty old Build 112. :)

BTW, I was told that the Kinect enabled update for iPi Recorder should be out any day now. Need to find somebody with a Kinect camera fast!

G.

Borrow a kinect...some kids you know must have one! Let's see what you think of it!:thumbsup:

Philbert
02-20-2011, 06:08 AM
Probably gathering dust. lol I've heard complaints from game review sites that people have gotten tired of the games that launched with Kinect.

serge
02-20-2011, 07:00 AM
I've heard complaints from game review sites that people have gotten tired of the games that launched with Kinect.
Physically or mentally?

Philbert
02-20-2011, 07:13 AM
perhaps "getting bored with" is better wording.

jasonwestmas
02-20-2011, 07:23 AM
perhaps "getting bored with" is better wording.

I'm certainly unamused. :) Though potentially the technology bears merit.

cresshead
02-20-2011, 07:54 AM
My friends who have a Kinect love it so much they've not put any other game type in...just Kinect games. They think it's amazing and are eager for a golf game or any sports-type games. There was a discussion down at Gamestation Melton Mowbray where the Kinect was overwhelmingly well received by gamers and the staff compared to the Sony Move.

Greenlaw
02-20-2011, 09:08 AM
Build 113 is faster but less accurate. iPi is aware of the problem and already working on the issue. In the meantime, back to trusty old Build 112.
Never mind. I did a new track, this time loading an actual-size Actor rig into the project (iPi Soft previously recommended a rig scaled slightly taller than actual size), and the new results are very accurate. That is comparable to 112's accuracy. (And possibly even better now?)

So all is good with 113 (apparently.) Hopefully, that's all it was. Waiting to hear from iPi since they were able to reproduce the tracking errors I described at their forums.

Looking forward to the six-camera shoot this afternoon. Stay tuned. :)

G.

jasonwestmas
02-20-2011, 10:00 AM
My friends who have a Kinect love it so much they've not put any other game type in...just Kinect games. They think it's amazing and are eager for a golf game or any sports-type games. There was a discussion down at Gamestation Melton Mowbray where the Kinect was overwhelmingly well received by gamers and the staff compared to the Sony Move.

good news!

Greenlaw
02-20-2011, 04:28 PM
Okay, I'm going back on my last post (again.) The tracking issue in 113 is not as bad as I thought but it's just problematic enough that I've gone back to 112. Interesting thing is that one of the errors I spotted this time is being caused by self-occlusion (even with four cameras) and it exists even in my 112 track. Since I'm testing with six cameras today, it will be interesting to see if the two extra cameras will fix this problem. (If not, it's no big deal. This particular error is easily fixed in MB, Animeeple, and even Lightwave.)

This does illustrate an important advantage of the PS3 Eye camera setup: with the PS3 cameras you can solve self-occlusion problems by adding more cameras and positioning them strategically to prevent the problem. With the Kinect system, you're either stuck with what you get from the single camera, or, if you're really hardcore about it, you can shoot the same performance two or more times from different angles with the Kinect and then blend sections of performances to fix the occlusion errors. (Best done using Motion Builder, but you can probably do this in Lightwave too using Motion Mixer.)
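To make the blending idea concrete, here's a toy sketch of the crossfade at the heart of it. Real tools like Motion Builder or Motion Mixer blend per-joint rotation channels (ideally as quaternions); this scalar version, with made-up take data, only shows the linear weighting between two takes over an overlap window.

```python
# Hedged sketch of "blend two takes to fix occlusion": ease one motion
# channel from take A into take B over an overlap window of frames.
# Scalar-only; real mocap blending works per joint, usually on quaternions.

def crossfade(a, b, start, length):
    """Return a curve that follows `a`, linearly blends into `b`
    over frames [start, start + length), then follows `b`."""
    out = []
    for i in range(len(a)):
        if i < start:
            w = 0.0                              # pure take A
        elif i >= start + length:
            w = 1.0                              # pure take B
        else:
            w = (i - start) / float(length)      # linear ramp
        out.append((1.0 - w) * a[i] + w * b[i])
    return out

# Two hypothetical takes of one rotation channel, 10 frames each:
take_a = [0.0] * 10
take_b = [10.0] * 10
blended = crossfade(take_a, take_b, start=4, length=4)
```

The window length controls how abrupt the hand-off looks; a longer ramp hides the splice but smears any fast motion inside it, which is why the splice point is usually picked at a quiet moment in the performance.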

G.

robertoortiz
02-21-2011, 12:46 PM
All of the amazing hacks people have unleashed on Kinect to do crazy, weird, awesome stuff? It's about to get more insane, because it won't just be hackers doing the noodling. Microsoft's going to release a full SDK for Kinect on Windows 7 that lets developers access more than just the messy raw data people have been extracting, allowing devs to start coming up with even wilder, more sophisticated Kinect apps.

The goal of the new SDK, coming this spring, is to "meet the needs of higher-level people who don't care how the camera works" but just have an amazing idea of what they wanna do with Kinect, says Microsoft's head brain Craig Mundie. The SDK will provide drivers and libraries that live above the raw data, allowing devs to fully tap the RGB camera, mic array, and motors that move Kinect around.

Initially, it'll be for enthusiasts and academics, not people who'll be making money off their Kinect apps—but there's a plan for those folks in the future.


http://gizmodo.com/#!5766328/the-full-power-of-microsoft-kinect-is-about-to-be-unleashed-with-an-sdk-for-all

Greenlaw
02-21-2011, 05:02 PM
Yes, this is opening an interesting door for homebrew productions.

I'm actually considering getting a Kinect to complement my current PS3 system. It occurred to me that because the Kinect is capturing only one camera's worth of data, it may actually work well with my old duo-core laptop, and the idea of a fully mobile mocap studio is intriguing.

G.

Greenlaw
03-09-2011, 01:36 PM
Our Kinect arrived a couple of days ago. I didn't know when I'd have time to try it with iPi DMC at home so I brought it into work today. I'm using it with an old laptop running XP Pro (x86), and it was simply a matter of installing the latest Recorder software and plugging the Kinect into a USB port. Immediately I saw the depth data in realtime on the laptop screen. More info here:

http://newtek.com/forums/showthread.php?t=114498&page=5&highlight=ipi

The res is 640 x 480 and the framerate is 30fps. The PS3 Eye cameras can do 60fps, which is preferred for eliminating motion blur. I'm not sure if 30fps is the Kinect's limit or if this is because I'm using my old laptop to capture with. I should know for sure tonight when I can check it against a modern desktop computer.

I'm amazed that this system is allowing full-body capture in my 'smallish' office. I'm standing only about 8 feet from the device.

I haven't done any actual capture yet; maybe after lunch. More later.

G.

Mr Rid
03-09-2011, 03:05 PM
http://www.kinect-hacks.com/kinect-news/2011/03/08/kinect-motion-capture-animated-series-under-hud

Greenlaw
03-09-2011, 03:54 PM
Posted some more info at the link above. I forgot to bring in an eSATA cable so I can't capture yet. (USB is definitely not fast enough.) Setup was remarkably easy though so I will continue this at home tonight.

G.

Greenlaw
03-09-2011, 04:27 PM
Wow. In the time I was out to lunch they posted another software update for iPi Recorder. This version is supposed to have improved Kinect support and uses different Kinect drivers. More later.

erikals
03-09-2011, 04:39 PM
is there a price on it yet?...

Greenlaw
03-09-2011, 04:50 PM
They haven't nailed down the price for the Express version (Kinect only) but it's supposed to be somewhere between $300 and $400 depending on how well it works and how many features they wind up with in version 1.0.

FYI, Kinect support is included with the Standard version (what I use,) so you get both systems.

G.

erikals
03-09-2011, 04:54 PM
hm,
but the Kinect can't possibly beat a 4-cam setup, can it...?

so the standard edition might be the way to go?

Greenlaw
03-09-2011, 06:58 PM
No, I don't think the Kinect will be as 'production worthy' as the multi-camera PS3 Eye setup (up to six cameras in the Standard edition.) The main reason is that the Kinect cannot resolve occlusion and also it has a limited range (about 5m vs the PS3's 20m.) I'm not 100 percent sure yet but Kinect's frame rate may also be limited compared to the PS3 camera (30fps vs. 60fps*).

That said, what the Kinect gives you is convenience. You can use it in a small room, you don't need to light your performance (technically you don't need any light,) and you don't need special clothing. It also doesn't require as much horsepower for capture and tracking since there is less data to process.

The reason I decided to get the Kinect in addition to the PS3 system I already have is that it takes a bit of time to plan and set up the PS3 sessions. With the Kinect system, it's pretty much plug-and-play. For some types of motions that may be all the effort I want to put into it.

I don't know how accurate the Kinect system is compared to the PS3 setup yet. For certain types of motions, it could be just as good. I'll have the answer to that soon.

G.

*Technically, the PS3 Eye Camera is capable of 120fps, but the iPi guys say that at that speed the image quality gets too grainy for accurate tracking.

erikals
03-10-2011, 05:01 AM
haha, great news... :]

jasonwestmas
03-10-2011, 10:04 AM
*Technically, the PS3 Eye Camera is capable of 120fps, but the iPi guys say that at that speed the image quality gets too grainy for accurate tracking.

So what exactly does less accurate tracking mean? Would the tracking points on the rig be shakier, or would frames be overlooked and missed by the camera?

Greenlaw
03-10-2011, 11:03 AM
Not completely sure since they've limited the framerate in iPi Recorder to 60fps. Just have to take their word for it. In any case, six 640x480 video streams at 60 fps is pretty demanding on any single computer...I don't think most current desktops could possibly capture that much data at 120fps.
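The "pretty demanding" claim is easy to put numbers on. A rough back-of-envelope, assuming uncompressed 8-bit grayscale frames (an assumption; iPi Recorder may store Bayer or compressed data, which would change the figures):

```python
# Rough sustained-write estimate for multi-camera capture, assuming
# uncompressed 8-bit single-channel frames (assumption, not iPi's
# actual on-disk format).
def capture_rate_mb_s(cameras, width, height, fps, bytes_per_px=1):
    """Raw data rate in MB/s for `cameras` synchronized video streams."""
    return cameras * width * height * fps * bytes_per_px / 1e6

rate_60 = capture_rate_mb_s(6, 640, 480, 60)    # ~110.6 MB/s
rate_120 = capture_rate_mb_s(6, 640, 480, 120)  # ~221.2 MB/s
```

Roughly 110 MB/s sustained at 60fps already explains the SSD and eSATA remarks earlier in the thread, and doubling to 120fps lands well beyond what a single consumer hard disk of the era could sustain.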

BTW, earlier I posted incorrect info about the maximum capture areas. According to the products page at ipisoft.com, the maximum capture area for the Kinect version is 2 x 2 meters; for the PS3 Eye it's 6 x 6 meters.

G.

Greenlaw
03-10-2011, 11:31 AM
More info here about what I'll try to do today with the Kinect:

http://www.newtek.com/forums/showthread.php?t=114498&highlight=ipi&page=5

G.

geo_n
03-10-2011, 11:40 AM
http://www.kinect-hacks.com/kinect-news/2011/03/08/kinect-motion-capture-animated-series-under-hud

That's not bad at all for internet indie films on a budget.
But I notice they're always in the front view and just standing and waving arms. Will it capture more complex actions?

Greenlaw
03-10-2011, 11:50 AM
I don't know about this system but the iPi version does allow some walking, at least that was the case in their early demos. I think it would break as the actor turns away though. I'll let you know what happens here.

Funny thing is that when I look at the mocap we've done so far for the Brudders short, most of it is basically "...standing and waving arms." I'll try to be more ambitious about our performances when we get to our next mocap project. :)

G.

P.S., I just looked at those early iPi Kinect tests again and the feet are a bit slippy. I don't know if this is characteristic of Kinect mocap but it might be the reason why the characters in HUD don't walk around. Hopefully, the feet problem will be improved by the first 'non-beta' release.

OnlineRender
03-10-2011, 11:52 AM
Wow. In the time I was out to lunch they posted another software update for iPi Recorder. This version is supposed to have improved Kinect support and uses different Kinect drivers. More later.

exciting times

Greenlaw
03-10-2011, 12:03 PM
I just watched the whole video. Oh, my...I can really identify with these guys. For us too, producing a short with a 'garage mocap' system has turned out to be way more work than what my wife and I imagined. However, now that we have our pipeline and most of our assets built, I think the effort will pay off for future animated 'Brudders' projects. (I sure hope so anyway.) :)

G.

jeric_synergy
03-10-2011, 01:05 PM
In digital, the first thing is just as hard as, say, claymation. It's the SECOND thing that pays off.

Greenlaw
03-11-2011, 10:20 PM
Finally got to do a Kinect test today. More info here:

http://newtek.com/forums/showthread.php?t=114498&page=5&highlight=kinect

G.

cresshead
03-12-2011, 04:31 AM
almost realtime mo cap for modo and daz3d in the works

http://www.kinect-hacks.com/kinect-hacks/2011/03/08/kinect-daz-and-modo-real-time-motion-capturealmost

cresshead
03-12-2011, 04:35 AM
blender and kinect mo cap

http://www.kinect-hacks.com/blender

http://www.kinect-hacks.com/kinect-hacks/2011/02/12/controlling-virtual-robot-kinect-and-blender

cresshead
03-12-2011, 05:09 AM
come on newtek, get your hands out of your pockets...create a kinect connection before the others take the lead...your app was and is always touted as 'out of the box'...get a kinect plugin sorted out.

erikals
03-12-2011, 08:19 AM
does LW have any ability to receive and show realtime input though?

cresshead
03-12-2011, 09:37 AM
does LW have any ability to receive and show realtime input though?

if it doesn't, it should be in their PLAN to add it as soon as possible:thumbsup:

newtek could create drivers and a capture/edit plugin no doubt..they just need to open their eyes and see what's cooking around them...there are some anime character dance games that have live capture with kinect already..they cost $30...so what's stopping newtek from doing the same?

if newtek's focus is the lower end of the market and not competing with autocash then they should pick this up and run with it before others come on the scene and take that market.

KScott
03-12-2011, 10:25 AM
Hi, late to the conversation, so if it was mentioned I missed it. Can you get still object data, like point cloud info? Like a scanner. I guess it would be, could you?

Kevin

Greenlaw
03-12-2011, 10:40 AM
I haven't been keeping up with the open source development but I thought I saw this done in one of the demos. Anybody know?

As far as iPi Studio goes, it looks like it builds a point cloud of the shoot space and performer but it uses this data internally for calibration and tracking; it doesn't allow you to export this data for other applications. IMO, I'm not sure the data would be very useful for anything besides motion capture because it's very low-res (the video source that the geometry is created from is only 640 x 480). Well, maybe that's good enough if you intend to use the data for camera mapping but that's a whole other topic.

I'll try to record a session today if I have time. I'm curious to see if it will capture and track my four year old daughter.

G.
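Since the depth image is just a 640 x 480 grid of distances, turning it into a point cloud is one line of pinhole-camera math per pixel. A minimal Python sketch, using approximate (uncalibrated) intrinsics of the kind commonly quoted for the original Kinect depth camera; a real scanner pipeline would calibrate the unit first:

```python
# Approximate intrinsics for the Kinect's 640x480 depth camera.
# These are ballpark community values, not exact: calibrate for real work.
FX = FY = 575.8
CX, CY = 319.5, 239.5

def depth_to_points(depth_rows):
    """Back-project a depth image (list of rows, metres) into (x, y, z) points.

    Standard pinhole model: x = (u - cx) * z / fx, y = (v - cy) * z / fy.
    Pixels with zero depth (no reading) are skipped.
    """
    points = []
    for v, row in enumerate(depth_rows):
        for u, z in enumerate(row):
            if z > 0:
                points.append(((u - CX) * z / FX, (v - CY) * z / FY, z))
    return points

# Tiny demo: a 2x2 patch of a flat surface 2 m from the sensor,
# with one dead pixel (zero depth).
patch = [[2.0, 2.0], [2.0, 0.0]]
pts = depth_to_points(patch)
print(len(pts))   # 3
```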

jburford
03-14-2011, 03:45 PM
does LW have any ability to receive and show realtime input though?

wouldn't that be LW10? I remember watching the Videos of them moving the new supported camera around in studio and instant feedback within Layouts Scene.......

OnlineRender
03-14-2011, 03:51 PM
wouldn't that be LW10? I remember watching the Videos of them moving the new supported camera around in studio and instant feedback within Layouts Scene.......

do you know how much that camera costs? I want it for Kinect and I want it now :) no seriously ! ...


it would be easier, and reach a much larger user base, to use the iphone's gyroscope for moving a real time camera around . . . but for bvh you have to go kinect, although there is a realtime scanner for the iphone now .

walfridson
03-14-2011, 03:53 PM
lw10 realtime input.. that would only be motion modifier. single item/object position and rotation, not like a realtime point cloud.. unless I've missed something major?..

edit: oh looked at the blender video... that would be enough actually. :)

jburford
03-14-2011, 03:56 PM
just look at http://www.youtube.com/watch?v=JhJauu_vB2A , as well as information about...

Support for the InterSense VCam virtual camera system used in feature film Virtual Art Departments (VAD)

which is hardly low end.......

so....

walfridson
03-15-2011, 03:48 AM
mm well, if you want something like the blender video - driving nulls in realtime you could do that in 9 as well.
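On the "driving nulls in realtime" idea: the receiving side of any of these setups is usually just a socket listener that maps incoming joint positions onto scene items. Here's a generic Python sketch with a made-up packet layout (real tools like Brekel define their own protocols, so everything here is hypothetical, not any particular app's wire format):

```python
import socket
import struct

# Hypothetical wire format: one joint per UDP packet, little-endian:
#   joint_id (uint32) followed by x, y, z (three float32s).
PACKET = struct.Struct('<Ifff')

def read_joint(packet):
    """Decode one joint packet into (joint_id, (x, y, z))."""
    joint_id, x, y, z = PACKET.unpack(packet)
    return joint_id, (x, y, z)

def serve(port=7000, handler=print):
    """Listen for joint packets and hand each one to a callback.

    The callback is where you'd move a null/locator to the received
    position inside whatever host app you're scripting.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(('127.0.0.1', port))
    while True:
        data, _ = sock.recvfrom(PACKET.size)
        handler(*read_joint(data))

# Demo without a network: round-trip one packet through pack/unpack.
packet = PACKET.pack(3, 0.1, 1.5, -0.25)
print(read_joint(packet))
```

Note the float32 round-trip: values like 0.1 come back slightly off, which is normal for single-precision streams.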

Greenlaw
03-23-2011, 12:07 PM
iPi Soft officially released today the Kinect-only version called iPi Studio Express. Price is $395 and early adopters get a 30% discount ($276.50) till April 22. The full version with both six-camera PS3 Eye support and single-camera Kinect support is on sale too (normally $995, now $696.50) till April 22. More info here:

iPi Software Products (http://www.ipisoft.com/products.php?utm_source=iPi+Soft+Newsletter+2011&utm_campaign=8ccb462dc8-Newsletter+2010-12+NY+2011+Discounts&utm_medium=email)

As some of you know, for several months I've been using the Standard version in production of a short film and I recently added a Kinect to our toolset, so feel free to ask any questions about the system here or in the thread on testing iPi Studio for Lightwave (http://www.newtek.com/forums/showthread.php?t=114498&page=5&highlight=ipi).

G.

Greenlaw
03-23-2011, 03:32 PM
Latest video demo from iPi Soft: iPi DMC Demo 2 (http://www.youtube.com/watch?v=7ssb0ZN1MSA&feature=uploademail)

It doesn't say so, but I know this was not done with Kinect. The video does show what you can do with the four- and six-camera PS3 Eye setup. The quality of the somersaults surprised even me, and I've been using this system for a while.

G.

Philbert
03-23-2011, 04:49 PM
Yeah it's obviously not from Kinect due to the 4 thumbnails in the corner. It does look very good though.

ianr
04-02-2011, 08:59 AM
FOR NEWTEK STAFF 2 READ PLEASE:

'Cresshead's' post nails it: an asked-for & needed Newtek Kinect plug-in surely properly 'bookends' the InterSense VCam
in the Rev 10 package and segues into the lightwave ethos, better than 3D Mice tagged on after everyone else?
(Lo, my SpaceBall gathers dust!)

Look you'all at Newtek, 'Greenlaw' done all your R&D, hasn't he just?
Just sprinkle a little bit of 'garage-inspirational obtainability' into the next point release.
Then you'll see in the forums a flourish & kudos for LW10 work using this set-up for start-up gamershops, etc.

Making this plug-in cannot be the FBX nightmare; that's past.

Let us get some more Mojo magic in there!

Newtek has wanted to push char, and this pushes char, maybe with an iPi bundle?:thumbsup:

Titus
04-02-2011, 09:51 AM
Latest video demo from iPi Soft: iPi DMC Demo 2 (http://www.youtube.com/watch?v=7ssb0ZN1MSA&feature=uploademail)

It doesn't say so, but I know this was not done with Kinect. The video does show what you can do with the four- and six-camera PS3 Eye setup. The quality of the somersaults surprised even me, and I've been using this system for a while.

G.

What's the benefit of going from 4 to 6 cameras?

erikals
04-02-2011, 12:09 PM
less jitter :]

cresshead
04-02-2011, 12:22 PM
What's the benefit of going from 4 to 6 cameras?

more coverage/less occlusion

Titus
04-02-2011, 12:39 PM
more coverage/less occlusion

I've 4 cameras and mocap is pretty decent right now, with 6 cameras I need a bigger room, that's why I'm asking.

Greenlaw
04-02-2011, 07:34 PM
The main reasons for six are as mentioned above: improved accuracy of the capture and less jitter. I was using iPi with four cameras in a semi-circle, and getting very good results. When I added two more cameras, I just inserted them within the semi-circle, and it did reduce jitter.

The reason I haven't expanded the semi-circle with the two additional cameras is because of my small shoot space. Plus, the bright lights in front of the stage may interfere with cameras that face them. But I've been thinking that if I position them up high enough and looking downwards (i.e., not into the lights,) it might work just fine. (I'll write more about this when I get to test it.)

BTW, the Configurable Jitter Removal in recent versions of iPi DMC is so good that you can get very stable results with just four cameras.

G.

OnlineRender
04-03-2011, 01:23 AM
ok going to mess around with kinect again , anybody want help me get it working with LW?

erikals
04-03-2011, 03:37 AM
what i'm really curious about is if iPi is looking into connecting several Kinect cameras...

edit:
Q: Can I use 2 or 3 Kinects with iPi DMC?
A: No. Current version supports only one Kinect sensor.

from this FAQ page,
http://www.ipisoft.com/en/wiki/index.php?title=FAQ


OnlineRender
04-03-2011, 03:40 AM
I have seen 2 kinects working together and the results were awesome

OnlineRender
04-03-2011, 03:41 AM
http://www.youtube.com/watch?v=9MwS_nk9n2A like so Multiple Kinects - 3D Body Scanning Demo by [TC]2.avi , it's not scanning that I'm interested in; getting bvh data into LW is my goal ...blender first perhaps

OnlineRender
04-03-2011, 01:46 PM
GreenLaw seen you over at ipi , what kinda info will you give a newbie and what do you know about exporting the rigs to LW ? using Kinect !

Greenlaw
04-03-2011, 02:58 PM
what i'm really curious about is if iPi is looking into connecting several Kinect cameras...

edit:
Q: Can I use 2 or 3 Kinects with iPi DMC?
A: No. Current version supports only one Kinect sensor.

from this FAQ page,
http://www.ipisoft.com/en/wiki/index.php?title=FAQ

 

I was told that their plan is to support two Kinects but that will come after they add head tracking. (This is just me: I don't care about two Kinects right now but I do need head tracking as soon as possible.) :)

G.

Greenlaw
04-03-2011, 03:46 PM
GreenLaw seen you over at ipi , what kinda info will you give a newbie and what do you know about exporting the rigs to LW ? using Kinect !
There are a lot of options for this with varying degrees of simplicity/complexity and quality in the results.

The simplest approach that should work (but didn't at the time of testing) is to import a Lightwave DAE of your character directly into iPi DMC and retarget your data there, and then export a DAE from iPi DMC back to Lightwave. (Or even better, use Merge Only Motion Envelopes to get the data onto a control rig in Lightwave.) If you need to edit the motion in Lightwave without a full control rig you can do it easily using IK Boost. That's it. Well, it should be anyway. The problem I ran into was that DAE from Lightwave didn't work very well in DMC. This was some time ago and before I understood the necessity of explicit weights, so maybe I need to test this again.

If anybody can get this working, the iPi Studio/Lightwave combo has the potential to be the cheapest and easiest mocap solution, and could make Lightwave an ideal program for 'home brew' mocapping because of IK Boost (even in its underdeveloped and long neglected state,) and a great renderer. Well, okay, cheaper if you ignore Blender I mean.

There is hope though: iPi Soft is working on FBX support, so maybe we'll be able to get useable Lightwave data into iPi Studio yet.

Another approach is to output a BVH file for Lightwave. Raw BVH however has limited support in Lightwave. You can 'hammer' the BVH rig to fit your character and then use IK Boost to edit, which is perfectly fine for 'one-shots' but it's not very efficient if you have a lot of shots, and certainly a PIA if you have a lot of different characters with different body types.

If you have Messiah, there is a more elegant solution. Messiah has tools that let you reference an imported BVH and bake the data to your character's rig. You can use Messiah's own animation tools to edit the motion, and the result can be sent back to Lightwave either as .mdd (preferred) or a rigged mesh. I haven't done this myself so I can't go into detail about how well this works. BTW, Maddox reported that Messiah does well for auto-weighting and FBX support, so be sure to check out his blog for more info (http://themaddoxcorner.blogspot.com/2011/03/messiahs-crash-course-video-tutorial.html).

What I'm doing is more convoluted than any of the above but it gives me a lot of control over the results. I'm modeling in Modo, setting up a control rig in Lightwave and using Lightwave's native weighting for final anim, exporting the rig to Maya via FBX for painting explicit weights for Motion Builder, importing that rig and mesh to Motion Builder for receiving the iPi BVH motions, using Motion Builder to edit and add animation enhancements, exporting an FBX from Motion Builder for Lightwave, and then finally using Lightwave's Load From Scene - Merge Only Motion Envelopes to import the data back onto the original Lightwave rig. This is expensive because of all the AD programs involved and obviously requires you to learn a lot of additional programs, but I'm very excited about the results I'm getting and, for me anyway, it's all been worth it. (As a bonus, everything I've been learning on this project is paying off at work too.)

One more solution is a variation of my current workflow using Animeeple instead of Motion Builder. Animeeple is a free program which can retarget iPi BVH to a FBX or DAE of a rigged character from Lightwave. Animeeple is not as robust as Motion Builder but it has some nice tools for editing your motions. You can then export an FBX or DAE from Animeeple for Lightwave, and then import just the joint motions (as described above) directly onto your Lightwave rig. In my experience, I had much better luck using FBX over DAE for this but the FBX feature is a $50 option. IMO, it's worth it, but I ran into a small problem with Animeeple's FBX for Lightwave 10. I haven't come back to this problem for some time so I can't tell you what the current state is but I'm going to take another look at this after I get our short film finished. If anybody gets this working well, it could be another strong free/almost free solution for getting iPi mocap to Lightwave.

G.

P.S., FYI, there is no technical difference between PS3 Eye and Kinect motion data after you get things tracked in iPi DMC. In fact, even though the input data is different, the tracking tools work pretty much the same way with either, and DMC produces the same type of data for both in the end.
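For anyone poking at the BVH files these tools export: the HIERARCHY section is plain text, so a quick sanity check of the joint list takes only a few lines. A minimal Python sketch assuming a well-formed, standard BVH (a real retargeting pipeline deserves a proper parser):

```python
# List the joint names in a BVH file's HIERARCHY section: enough to
# sanity-check an iPi/Brekel export before attempting any retargeting.

def bvh_joints(text):
    """Return joint names in declaration order (ROOT first)."""
    joints = []
    for line in text.splitlines():
        tokens = line.split()
        if tokens and tokens[0] in ('ROOT', 'JOINT'):
            joints.append(tokens[1])
    return joints

sample = """HIERARCHY
ROOT Hips
{
    OFFSET 0 0 0
    CHANNELS 6 Xposition Yposition Zposition Zrotation Xrotation Yrotation
    JOINT Spine
    {
        OFFSET 0 10 0
        CHANNELS 3 Zrotation Xrotation Yrotation
        End Site
        {
            OFFSET 0 10 0
        }
    }
}
MOTION
Frames: 0
Frame Time: 0.0333333
"""
print(bvh_joints(sample))   # ['Hips', 'Spine']
```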

OnlineRender
04-03-2011, 06:15 PM
cheers for the detailed info much appreciated

bvh data :

http://img593.imageshack.us/img593/3629/biped.jpg

OnlineRender
04-03-2011, 06:24 PM
I have bvh data looking at it atm

OnlineRender
04-03-2011, 06:56 PM
http://img816.imageshack.us/img816/5619/mescano.jpg

cresshead
04-04-2011, 06:18 AM
i see Onlinerender is playing a 'bit' part here :D
mind you his talent appears to have some 'depth'

OnlineRender
04-04-2011, 06:47 AM
needed a subject kid was closest to hand ........

http://img97.imageshack.us/img97/8075/lara2l.jpg

http://img9.imageshack.us/img9/3426/larai.jpg

OnlineRender
04-04-2011, 07:46 AM
I look like I'm made out of Lego , and yes its all about me

http://img854.imageshack.us/img854/4857/3dme1.jpg

OnlineRender
04-04-2011, 10:48 AM
I suppose its a form of ART , not very pretty art , but still art ......

http://img231.imageshack.us/img231/7862/3dme2.jpg

erikals
04-04-2011, 11:58 AM
ditto that,.... post #153 sure looks like lego... :]

never saw brekel, looks fun :]
http://www.brekel.com/?page_id=155

but early beta for sure... :]
http://www.youtube.com/watch?v=MsOGfG54Z78


OnlineRender
04-04-2011, 12:14 PM
http://www.youtube.com/watch?v=rD-pkt5zg0Y&feature=player_embedded

Shawn Farrell
04-04-2011, 09:40 PM
THIS IS GENIUS.:agree:

Greenlaw
04-07-2011, 01:49 PM
Check this out:

Danse Kabyle (http://vimeo.com/22028464)

A beautifully designed trailer by an artist using iPi Studio with three cameras. This was created for the current iPi contest. The artist admits some of the motions still need tweaking, but even so it looks awesome.

Be sure to check his making of video here:

Making of Danse Kabyle Part 1 (http://vimeo.com/22036677)

Note the clever use of the system for the 'free fall' shots.

G.

curious
04-09-2011, 06:27 AM
Oh wow, this all looks very exciting, indeed.

You guys saw this?:
http://vimeo.com/21676294

Philbert
04-09-2011, 06:33 AM
Yeah that's pretty neat, it's just a shame that the Reprap used produces such low quality.

OnlineRender
04-10-2011, 05:42 AM
I'm working on capturing decent bvh data atm; the truebones dev is also on the brekel board helping out. Capturing 3d scans with a kinect can only go so far, esp with the latency and the kinect tech's scope; you might as well set up several 500Ds and get pretty accurate results.

erikals
04-10-2011, 06:04 AM
or go for this one,
http://www.david-laserscanner.com/

névada
04-14-2011, 09:48 AM
Hi!
i just installed the Brekel Kinect.
That works really fine!
i tried several ways to scan space with cheap machines,
for example with cameras: PhotoSculpt Textures; Autodesk Photo Scene Editor; Photosynth.
The best results i had were with ARC_3D_Webservice using samples of a video travelling.
i rebuilt the mesh with Meshlab which is really useful.
Brekel Kinect exports quad meshes and point clouds in .ply too but i don't get the colored points!
Did somebody try?
thanks

vink
04-14-2011, 12:29 PM
Hi!
i just installed the Brekel Kinect.
That works really fine!
i tried several ways to scan space with cheap machines
for example with cameras. PhotoSculpt Textures; Autodesk Photo Scene Editor ; Photosynth
The best results i had were with ARC_3D_Webservice with samples of a video travelling
i rebuilt the mesh with Meshlab wich is realy usefull.
Brekel Kinect exports quad mesh and point clouds in .ply too but i don't get the colored points!
Did somebody try?
thanks

If you want to make 3d scans based on photos you should try AgiSoft PhotoScan

névada
04-14-2011, 03:38 PM
I didn't try AgiSoft PhotoScan
there are many that do the same (photomodeler..)
But Autodesk Photo Scene Editor, Photosynth, ARC_3D_Webservice are free!
they made me discover Meshlab (free too) which rebuilds pointclouds
but not only...
Some research gives me hope for quad-based modelling simplification (for Zbrush):
http://meshlabstuff.blogspot.com/2009/12/practical-quad-mesh-simplification.html
You can merge multiple scans :
http://www.youtube.com/watch?v=4g9Hap4rX0k&feature=mfu_in_order&list=UL
The Kinect directly in Meshlab:
http://www.youtube.com/watch?v=fRfWIQUDa68&feature=player_embedded
or
http://www.youtube.com/watch?v=gu5Ywwb4RaU&feature=player_embedded#at=23
I'm really enthusiastic about that kind of 3D viewing (over with toys, too old)

in all these examples the colors come from the pixels and could be "printed" on the pointcloud
but i can't get it working on my ply exported from Brekel Kinect (developed for MotionBuilder Live!!!!)
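FWIW, the colour-carrying ASCII .ply layout that MeshLab reads is simple enough to write by hand, so comparing an exporter's header against a known-good one is a quick way to see why colours go missing. A small Python sketch (the property names and order are the conventional ones MeshLab expects; this is not Brekel's actual exporter):

```python
# Write an ASCII PLY point cloud with per-vertex colour.
# MeshLab reads vertex colour from uchar red/green/blue properties
# declared after the float x/y/z properties, as below.

def write_colored_ply(path, points):
    """points: iterable of (x, y, z, r, g, b) with r/g/b in 0-255."""
    points = list(points)
    header = [
        'ply',
        'format ascii 1.0',
        'element vertex %d' % len(points),
        'property float x',
        'property float y',
        'property float z',
        'property uchar red',
        'property uchar green',
        'property uchar blue',
        'end_header',
    ]
    with open(path, 'w') as f:
        f.write('\n'.join(header) + '\n')
        for x, y, z, r, g, b in points:
            f.write('%g %g %g %d %d %d\n' % (x, y, z, r, g, b))

# Two points: one red, one green, both 1.5 m from the sensor.
write_colored_ply('cloud.ply', [(0, 0, 1.5, 255, 0, 0),
                                (0.1, 0, 1.5, 0, 255, 0)])
print(open('cloud.ply').readline().strip())   # ply
```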

mrbones
04-16-2011, 09:24 PM
Heres my one camera kinect and ipisoft.

http://www.youtube.com/watch?v=eB0mD-JLUY4&feature=player_profilepage

Short, but it gives you an idea of the power and accuracy of this software.

mrbones
05-02-2011, 11:53 AM
Whoops I meant this link. CLICK HERE (http://bit.ly/iLmvUv)

cresshead
05-02-2011, 03:28 PM
SDK for Kinect
http://research.microsoft.com/en-us/um/redmond/projects/kinectsdk/

Philbert
05-02-2011, 03:32 PM
That's funny, I was just at my local Game Stop a few minutes ago noticing that they have like 4 used kinects. There were 2 prices though, $99 and $119; I didn't quite understand the difference, I'll have to look it up. I wonder if MS will release a Kinect without the Xbox branding.

cresshead
05-02-2011, 03:37 PM
That's funny I was just at my local Game Stop a few minutes ago noticing that they have like 4 used kinects. They were 2 prices though, $99 and $119, I didn't quite understand the difference though, I'll have to look it up. I wonder if MS will release a Kinect without the Xbox branding.

difference in price might be lack of box/manual and/or the free pack-in game

i'm gonna get a kinect this week and give it a go for mo cap with breckel open source software.

http://www.brekel.com/?page_id=243

Philbert
05-02-2011, 03:39 PM
The girl there said one was made for the 360 slim model (the cheaper one). But I'm not sure if that means anything for hooking it up to a PC.

Edit: after googling around to find the difference, the only answer I'm seeing is that there is no difference. Though the girl at the store may have said something about one coming with an AC adapter. People in my search results seemed to think you wouldn't need the A/C though, since PCs put out more USB power than the xbox does. I guess if they have both I could just get the cheaper one and if it doesn't work exchange it for the other one.

Greenlaw
05-02-2011, 06:14 PM
FYI, my Kinect uses an AC adapter when running from my laptop. The device does quite a bit (it even has a motorized servo,) so I don't think it can run purely on USB power. I haven't tried but I'm almost sure it needs the power supply.

If you guys get a Kinect be sure to let us know. I'd love to see what you do with it.

As for my own project, lately I've been busy painting textures for the environments and digging into Vue 9.5, so no new mocap sessions for a while.

G.

cresshead
05-09-2011, 03:48 PM
not THAT much of interest unless you also run 3dsmax but i made a quick n simple vid on importing bvh from the ipisoft sample bvh's into 3dsmax...note "Poly" was meowing in the background as i made the video...she was wanting attention i think..typical female!

http://www.youtube.com/watch?v=5iaH9HxDPS0

cresshead
05-10-2011, 12:36 PM
Q.
is it just me or do most, if not all, of the kinect motion capture demo vids and sample BVH files look like they were animated with IKbooster?...in that the feet just will not STICK and constantly drift and rotate.

i have yet to see any file either JUST stand there or walk properly.
is it any good long term...do we just need to wait on refinement of the apps and general development, or should we just pony up for a multi ps3 eye cam setup and a FAST computer to process the streams of video?

i'm talking about the kinect version of ipisoft and the open source brekel app.

cresshead
05-15-2011, 06:21 PM
some of these vids are looking much more respectable

http://www.youtube.com/watch?v=pRCFIt1gEdk

manholoz
05-16-2011, 12:54 PM
Can you do a complete turn mocap with the kinect and the ipisoft combo? My intuition is that with only one motion capturing device, you cannot, but I could be wrong. (Like for a salsa dancing spin sort of mocap)

Greenlaw
05-20-2011, 12:19 PM
Here's a good video on using iPi DMC with Kinect for machinima:

How To make Machinima using kinect (Motion Capture Tutorial) (http://www.youtube.com/watch?v=pY2HZs6HWcQ)

A little vague at points but otherwise a nice presentation.

G.

Greenlaw
05-20-2011, 12:43 PM
Can you do a complete turn mocap with the kinect and the ipisoft combo? My intuition is that with only one motion capturing device, you cannot, but I could be wrong. (Like for a salsa dancing spin sort of mocap)

It would be a little tricky with Kinect. First, occlusion would be an issue, but that said iPi DMC does its best to interpolate what it can't see. If your limbs aren't hidden for long, it can sometimes carry on the action, and tracking backwards at trouble points can sometimes help. To a degree you can even manually pose what iPi can't see (you might call this supervised tracking) to guide its interpolation. Major fixes need to be done in another app like Motion Builder, Animeeple, or Lightwave (using a control rig or IK Boost.) The second problem is frame rate: the Kinect only does 30 fps, which isn't very fast for capturing fast motion.

The PS3 Eye setup is more suitable for this type of performance because you can use up to six cameras in a full circle (even half circle works well,) and the framerate is 60 fps. The downside is that it requires a lot more space and more computer horsepower.

Both are good systems but have different technical strengths and weaknesses. If you can't get both camera systems for iPi DMC, you'll have to decide which one is more appropriate for your needs. Cost is more or less equal: PS3 Eye cameras are much cheaper than a single Kinect, but for the PS3 Eye system you also need to add the cost of repeater cables, steady mounts, good lighting, and appropriate wardrobe. Both systems require a very fast drive. (SSD or RAID is recommended. IMO go with an SSD; they're cheaper now and you don't really need a huge one if it's dedicated to iPi Recorder capture.) You might be able to get away with a fast SATA drive for Kinect capture or for PS3 Eye capture set at lower quality, but sustaining a constant data rate might be less reliable. Using a SATA 6Gb/s or USB 3 drive is another possible alternative to SSD, though I still recommend the SSD for this purpose. (For SATA 6Gb/s or USB 3, you may need to add a special controller card to your computer.)

But before you spend any money, try borrowing a Kinect if you don't already own one and try it out with the trial version of iPi DMC.

If I have time this weekend (which I probably won't because of work,) I'll do a simple 'spin' test with the Kinect and let you know what happens.

G.

Edit: BTW, iPi Software is working on dual Kinect support which may solve some of the occlusion issues. I'm guessing it might be equivalent to four PS3 Eye cameras arranged in a semi-circle but with smaller capture space.
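The drive-speed advice above comes down to simple arithmetic. A back-of-envelope sketch assuming fully uncompressed frames (iPi Recorder compresses on the fly, so real rates are lower, but the ratio between the two setups is the point):

```python
# Rough capture bandwidth for the two iPi camera setups, assuming
# uncompressed frames. Shows why a slow drive survives a single Kinect
# but chokes on a six-camera PS3 Eye rig.

def stream_mb_s(width, height, bytes_per_px, fps, cameras=1):
    """Raw stream rate in MB/s (MB = 2**20 bytes)."""
    return width * height * bytes_per_px * fps * cameras / 2**20

# Single Kinect: 640x480 depth (16-bit) + 640x480 RGB (24-bit) at 30 fps.
kinect = stream_mb_s(640, 480, 2, 30) + stream_mb_s(640, 480, 3, 30)

# Six PS3 Eyes: 640x480 RGB (24-bit) at 60 fps each.
eyes6 = stream_mb_s(640, 480, 3, 60, cameras=6)

print(round(kinect))   # 44  (MB/s)
print(round(eyes6))    # 316 (MB/s)
```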

Philbert
05-20-2011, 12:48 PM
But before you spend anything, try borrowing a Kinect if you don't already own one and try it out with the trial version of DMC.

This is a good idea; they have some used kinects at my local Gamestop and GS has a 7 day return policy on used stuff for any reason. So that gives a week to try it and see if you like it. The downside is that it's a little pricier: a Kinect on Amazon is about $100, Gamestop's Kinect (with power cord) is $120.

Greenlaw
05-20-2011, 01:29 PM
This is a good idea, they have some used kinects at my local Gamestop and GS has a 7 day return policy on used stuff for any reason. So that gives a week to try it and see if you like it. The down side is a that it's a little pricier, a Kinect on Amazon is about $100, Gamestop's Kinect (with power cord) is $120.
That's really not bad I think. Only months ago I paid about $135 for mine through Amazon. It came with a power adapter but also a game that I haven't played. (I don't own an Xbox; if I did I'd never have time to play it anyway.) :)

Philbert
05-20-2011, 01:46 PM
Yeah the Amazon one has the game, the GS one does not, both are used.

Here's the Amazon link, you can see the used price right now is even $93 (plus shipping)
http://www.amazon.com/Kinect-Sensor-Adventures-Xbox-360/dp/B002BSA298/ref=wl_it_dp_o?ie=UTF8&coliid=I2X273SNGXS7E0&colid=BM1ZCPMDAU1F

OnlineRender
05-20-2011, 04:18 PM
I bought 3 ps3 cameras for a total of £34, bargain.

mrbones
05-28-2011, 12:07 PM
Again, Check out http://www.truebones.com They provide sales and free training/support for the new Kinect iPiSoft Mocap Combo!

OnlineRender
05-28-2011, 12:29 PM
Again, Check out http://www.truebones.com They provide sales and free training/support for the new Kinect iPiSoft Mocap Combo!

You STILL working with Berkel ?

mrbones
05-28-2011, 01:53 PM
You STILL working with Berkel ?

Nope, I am not in any way using brekel at the moment. To be honest,

I found it doesn't track accurately. It doesn't track feet either, which doesn't bode well for my dancing animations.

However it is interesting when in MB. But MB is not FREE!

I will keep an eye on the brekel software for future use.

Just not right now.

mrbones
05-28-2011, 02:01 PM
Thanks Cress,

I appreciate your comment.


some of these vids are looking much more respectable

http://www.youtube.com/watch?v=pRCFIt1gEdk

mrbones
05-28-2011, 02:05 PM
Hi Philbert,

Just a reminder that the used versions may not come with a Power adapter.

The new ones do come with the needed power adapter supply, plus a nice game. I haven't played it yet; I don't have an xbox.



Yeah the Amazon one has the game, the GS one does not, both are used.

Here's the Amazon link, you can see the used price right now is even $93 (plus shipping)
http://www.amazon.com/Kinect-Sensor-Adventures-Xbox-360/dp/B002BSA298/ref=wl_it_dp_o?ie=UTF8&coliid=I2X273SNGXS7E0&colid=BM1ZCPMDAU1F

Philbert
05-28-2011, 02:16 PM
I'm going to assume that if Amazon says it comes with a power adapter, then it probably does, even if it's used. Amazon probably requires that sellers follow the description the site has posted.

mrbones
05-28-2011, 02:31 PM
Hi Phil,

I would assume so too, except we couldn't find any info to support that claim.

As it stands, I would buy a new one, so as not to take any chances.

Cheers


I'm going to assume that if Amazon says it comes with a power adapter, then it probably does, even if it's used. Amazon probably requires that sellers follow the description the site has posted.

Philbert
05-28-2011, 03:25 PM
Yeah well everything I do is on a budget so if I can save a few bucks here or there that's what I'll do.

OnlineRender
05-28-2011, 04:34 PM
Yeah well everything I do is on a budget so if I can save a few bucks here or there that's what I'll do.

agreed, I do not know many "3D artists" that have the dosh to go mental.

wrong thread, but carrying on the topic of buying mocap hardware.

if I can do it via ipi or brekel or any app under 1k I will . homebrew FTW

Greenlaw
06-09-2011, 09:32 AM
Some info of interest from Michael of iPi Software:


Support for multiple Kinects is in the works and will take a month or two to release. We expect it to have accuracy comparable to a multi-camera system based on PS3 Eye cameras. But capture area will be smaller.
That's pretty cool. This may bring the Kinect system closer to becoming a 'pro' tool.

Last night I finally got around to playing with a single Kinect again, this time using my Tablet PC. First I tried recording to a USB drive but this could only sustain recording for about 37 seconds, then IO would drop to zero. However, if I recorded to the internal system drive (a SATA3 7200 rpm drive) the Kinect could capture motion data at 30 fps indefinitely. It's not recommended that you record to your system drive, but I guess if you really have to, it can work. At least in this case.

Normally for the Kinect, I use an old laptop with an external eSATA mini-RAID I built (two 7200 laptop drives,) which works very well. According to iPi, you can use a very modest cpu for Kinect but you do need a reasonably fast drive and faster than USB IO if it's an external drive. (My guess is that USB 3.0 for external storage would be fast enough but I don't have a USB 3.0 interface for either of my mobile devices.)

G.

Philbert
06-09-2011, 09:48 AM
Normally for the Kinect, I use an old laptop with an external eSATA mini-RAID I built (two 7200 laptop drives,) which works very well. According to iPi, you can use a very modest cpu for Kinect but you do need a reasonably fast drive and faster than USB IO if it's an external drive. (My guess is that USB 3.0 would be fast enough but I don't have a USB 3.0 interface for either of my mobile devices.)

G.

Yes, I tried it with a 4 year old laptop and it ran smoothly, but the drive speed was a problem.

Anyone have suggestions for what to do with the point cloud data? I have Meshlab installed but haven't looked up anything about that yet. Figured I might as well ask since I was already typing.

Greenlaw
06-09-2011, 09:58 AM
Hmm. Just learned that they make inexpensive USB 3.0 Express Cards for laptops. My mini-RAID works fine but I might have to give USB 3.0 a shot; I've been in need of additional fast storage for this computer anyway.

At a glance, the cards look like they're about $30 or so. I'll look into this further later today and let you know what I decide to do.

G.

Philbert
06-16-2011, 05:21 PM
So the official beta SDK is now available:

http://research.microsoft.com/kinectsdk/

cresshead
06-18-2011, 12:55 PM
new sdk video

http://www.youtube.com/watch?v=37Mrprtn1uE

Philbert
06-18-2011, 01:34 PM
Well it does look very accurate, we'll see how it does with recording.

mrbones
08-08-2011, 02:44 PM
There has been an update to the iPiSoft for Kinect demo.

They have removed the 5 second limitation for kinect users! :help:

Visit my web to get the new demo here.

http://www.truebones.com

Greenlaw
08-10-2011, 11:30 AM
iPi Software posted an interesting Siggraph video yesterday. They're having random visitors step in front of a Kinect and generating mocap and retargeting in DMC to characters right there on the event floor:

Motion Capture of Siggraph 2011 Visitors (http://www.youtube.com/user/mnikonov#p/u/0/98tRav8uN30)

Bear in mind that what you see here is more or less 'raw' mocap data retargeted to oddly proportioned characters directly in the DMC tracking program, which is definitely not an animation or animation editing program. Considering that, these results are really quite good.

FYI, iPi Software announced a short time ago that the dual Kinect system was going to be available for beta testing soon, but I think they're using the single Kinect system in this video. Can anybody present at Siggraph confirm this?

G.

mrbones
08-11-2011, 12:17 AM
Heres the direct link on the NEW iPiSoft demo.

http://tinyurl.com/6dwsor

Thanks and Cheers

Philbert
08-11-2011, 12:20 AM
That link didn't work.
http://www.ipisoft.com/downloads_index.php

Greenlaw
10-12-2011, 07:14 PM
Today, iPi released a beta version of iPi DMC with dual Kinect Sensor support. I wasn't going to test this until after I finished Happy Box for Gothtober 2011, but now that it's available, I just couldn't resist; I picked up a second Kinect Sensor this afternoon. (What makes this funny is that I don't own even a single Xbox.) :p

Happy Box was written around a single Kinect, and the motion capture actually turned out pretty good. There is one complicated shot (for a single Kinect) that I wanted to reshoot, so the timing for this release is perfect. Will post some notes about the dual Kinect Sensor session later.

In the meantime, here's some official info about how the dual Kinect Sensor setup works:

http://www.ipisoft.com/en/wiki/index.php?title=Quick_Start_Guide_with_Dual_Kinects

G.

erikals
10-13-2011, 05:01 AM
i was wondering when they were gonna get to that, dual Kinects...

it'll be interesting to see later on what will be better, multiple Kinects, or multiple Cameras...
 

Greenlaw
10-13-2011, 09:20 AM
It's too soon for me to say with certainty (especially since I haven't even opened the box for the second Kinect yet), but here's what I know about iPi's Dual Kinect Sensor (DKS) support so far:

- A month or so ago, iPi Soft suggested that the quality would be comparable to using four PS3 Eye cameras.

- Based on the 'Quickstart Guide', the second Kinect has to cover the same space seen by the first Kinect, just from a different angle (offset by about 90 degrees). A second configuration allows 180 degrees (the sensors facing each other for full 360-degree coverage), but they don't recommend this for the beta yet.

- Another iPi user reported that the capture space might be smaller than with a single Kinect. This isn't fully confirmed yet, it was just a first impression.

So, DKS is probably as accurate as four PS3 Eye cameras arranged in a semi-circle or less (though it may eventually cover 360 degrees), but limited by a smaller capture space than you get with four PS3 Eye cameras, and possibly smaller than with a single Kinect.

Based on my experience, improved precision is a big deal and a fair price to pay for reduced capture space. Single Kinect capture on a character can look a little 'wobbly' when the final render camera is close up, and it is especially noticeable if the character is supposed to be relatively still. I imagine the improved precision with DKS should reduce or eliminate this 'wobble' effect.

(Just to be clear, 'wobble' is not the same as 'jitter' which can be cleared up using iPi's Configurable Jitter Removal. What I call 'wobble' is smaller and slower oscillating motion in the spine that becomes more pronounced in the shoulders. I think this is a tracking artifact that comes from tracking a fairly low-res 3D volume of the performer generated from the single Kinect data. Dual Kinects should refine the volume detail and coverage, so 'wobble' and 'jitter' should get reduced.)

I was having a noticeable 'wobble' problem on two shots for Happy Box, and I planned to fix this by simply eliminating keyframes when I got back to these shots (the character is mostly 'still' in these shots.) Before I do that though, I think I'll try recapturing the motions using the DKS setup first, assuming I don't run out of time for this project. :p

There is also one shot where there is a lot of motion which I knew would lead to problems caused by momentary occlusion, but I also knew I could easily edit around the problems 'in camera' so you wouldn't see them. I'm curious, however, to see if DKS will eliminate these occlusion errors to begin with.

I'll let you know how it all works out for me.

G.

erikals
10-13-2011, 12:01 PM
 
maybe the Kinect will be improved in not too long, hopefully... ;]
http://news.cnet.com/8301-17938_105-20026315-1.html

it's very cool to see how this evolves... :]
i might go with the Kinect solution... (as of space)

...microscopic apartments here in Norway/Oslo i tell ya' :]
 

Greenlaw
10-13-2011, 12:33 PM
i might go with the Kinect solution... (as of space)

...microscopic apartments here in Norway/Oslo i tell ya' :]
It's perfect for small spaces! I shot all of Happy Box in my living room with the Kinect and laptop on the dinner table. Distance to subject was about 2.2 m and there was room to move closer or farther. (A bit closer if I didn't raise my arms above my head.) Dual Kinect sensors may require more space but only a little more I think.

Useful tip: You don't want to stand too close to a wall or furniture...the 'human' volume iPi Studio generates may 'blob' into other volume masses, affecting tracking results.

G.

erikals
10-13-2011, 12:47 PM
thank you for the tip,...

it won't be just yet, but definitely looking into it, and will probably mix the motions in JimmyRig Pro...

Greenlaw
10-13-2011, 01:17 PM
Speaking of which, the current Jimmy|Rig beta features Kinect support. This is a temporary feature just for the duration of J|R Pro beta testing, and it will be included as part of a different upgrade path for J|R. (Something above 'Pro' I think.)

I haven't tried it yet but it looks interesting. The devs took a realtime approach (as opposed to tracking video data in post like in iPi DMC) which is pretty cool but I imagine some precision might be sacrificed for this kind of feedback. But for artists who really want instant gratification this could be just the thing. :)

G.

erikals
10-13-2011, 01:57 PM
now,... iPi capture quality inside Jimmy|Rig, that'd be nice,... ;] :]

Greenlaw
10-13-2011, 03:00 PM
The sample video they posted a few weeks ago actually looked pretty good. You can see it here: kinect to animate multiple characters with Jimmy Rig (http://www.youtube.com/watch?v=mBWEu_V0IrQ&feature=related)

I'd like to try this myself but that will have to wait until I'm done with this project. Taking just a few hours today to try out dual Kinect in iPi DMC is already stretching my personal 'bandwidth'. :)

G.

Greenlaw
10-13-2011, 03:46 PM
I just completed a bandwidth test using dual Kinect Sensors with my laptop...no good. I was hopeful but not especially optimistic about this setup because this is a very old laptop and I think using even a single Kinect with it was pushing it. :p

Fortunately, my main desktop computer sits just outside the living room so I can test dual Kinect Sensors with that computer tonight. Unlike the laptop, I'm not expecting any problems with this computer.

Stay tuned!

G.

erikals
10-13-2011, 04:10 PM
yeah, i seem to remember you had a kick*** computer config... :]

erikals
10-13-2011, 04:29 PM
edit: scratch that...

(ps. JR will soon support FBX)

Greenlaw
10-13-2011, 10:19 PM
yeah, i seem to remember you had a kick*** computer config... :]
Well, when it comes to computers, kick***ness only lasts about six months. :P

G.

Greenlaw
10-13-2011, 11:27 PM
If anybody is still interested, I'm continuing this line of discussion here:

iPi Studio Desktop Motion Capture Tests (http://forums.newtek.com/showthread.php?t=114498&page=9#post1188413)

G.

Greenlaw
11-01-2011, 03:48 PM
I almost forgot to mention it here: a preview version of Happy Box premiered at Gothtober last Saturday. To see the short, just go to gothtober.com (http://gothtober.com) and click on the 29th. Happy Box is an animated short based on my webcomic (http://www.littlegreendog.com/comics/brudders/brudders054.php#.TrBpfILw9IE) and it was entirely 'puppeted' using one or two Kinect Sensors and iPi Studio. The results were assembled and rendered in Lightwave 10.1.

I want to point out that this is a 'preview' version of Happy Box as I had to cut a lot of corners to make the deadline for Gothtober. The final version will have lip sync animation, a few bug fixes and improvements, and a tighter edit. It will come out later this month, and a full HD release should follow in December. In the meantime, please enjoy this version for what it is. :)

If you're curious, there's more info about the preview release on my blog:

'Happy Box' Preview is Live at Gothtober! (http://littlegreendog.blogspot.com/2011/10/happy-box-preview-is-live-at-gothtober.html)

The final version of Happy Box will appear at www.littlegreendog.com. I'll be sure to make that announcement when it's ready.

G.

Philbert
11-01-2011, 04:01 PM
That's looking pretty good. Do you intend to do lip sync, perhaps with drawn-on mouths like they are here?

Greenlaw
11-01-2011, 04:10 PM
Yes! I actually had about half the lip syncing done last week but when it became obvious that it wouldn't be ready for the event, I decided to leave lip sync out entirely. It just looked weird to have only half the short with lip sync animation, and leaving it out completely made it look more like I made a 'creative decision.' :p

I'm finishing up the lip sync animation this week, along with many bug fixes and other improvements. I'll let you know when the 'final' version comes out.

G.

cresshead
11-01-2011, 04:12 PM
nice work, looking forward to the HD tweaked version soon also

Greenlaw
11-01-2011, 04:21 PM
Thanks!

Happy Box was a lot of work in a short time and it got a bit stressful towards the end but right now I'm actually eager to get started on my next 'mocap' project. I made a lot of mistakes on this one that cost me more time than I'd care to admit but hopefully I learned a few things and the next project will go more smoothly.

G.

erikals
11-01-2011, 05:42 PM
Cute! :]

a bit tighter edit sounds good.
liked the idea :]

Philbert
11-01-2011, 07:22 PM
Looks like the commercial SDK is officially announced.
http://www.engadget.com/2011/11/01/kinect-commercial-sdk-coming-in-2012-video/

Shawn Farrell
11-06-2011, 09:52 AM
Shot this with a kinect 3D...edited with VideoStudioProx4

https://www.facebook.com/photo.php?v=2373051364218:spam:

OnlineRender
11-06-2011, 10:20 AM
Shot this with a kinect 3D...edited with VideoStudioProx4

https://www.facebook.com/photo.php?v=2373051364218:spam:

What do you mean, 'shot this with Kinect'? The resolution is 320x240. You mean you pointed it at your desktop and filmed VLC media player?

I shot this with kinect http://www.youtube.com/watch?v=rD-pkt5zg0Y

cresshead
11-06-2011, 10:38 AM
Shot this with a kinect 3D...edited with VideoStudioProx4

https://www.facebook.com/photo.php?v=2373051364218:spam:

...do not bother coming here spamming, thank you. I am not impressed.

jasonwestmas
11-06-2011, 02:30 PM
Shot this with a kinect 3D...edited with VideoStudioProx4

https://www.facebook.com/photo.php?v=2373051364218:spam:

what does bad '80s music have to do with Kinect?

OnlineRender
11-06-2011, 03:32 PM
this technically has more relevance to the thread http://www.damnlol.com/pics/60/5584de1263e458a2eee9f642e37398d5.gif

OnlineRender
11-06-2011, 04:12 PM
back on kinect watch this http://mrdoob.com/

Greenlaw
11-06-2011, 04:24 PM
this technically has more relevance to the thread http://www.damnlol.com/pics/60/5584de1263e458a2eee9f642e37398d5.gif

Trying to decide if that's wonderfully freakish or freakishly wonderful. :)

jasonwestmas
11-06-2011, 05:39 PM
this technically has more relevance to the thread http://www.damnlol.com/pics/60/5584de1263e458a2eee9f642e37398d5.gif

If anything got me interested in compositing that just did lol!

Shawn Farrell
11-08-2011, 06:39 AM
SHUTUP BEFORE I VIDEO PROOVE YOU GUILTYWITH MY 3D STEREOSCOPIC KINTECT! *****!:dito:

Vagrants
11-08-2011, 08:26 AM
DSTORM is going to announce the new kinect motion capture plug-in for LW ?
http://www.dstorm.co.jp/event/interbee2011/

OnlineRender
11-08-2011, 08:37 AM
SHUTUP BEFORE I VIDEO PROOVE YOU GUILTYWITH MY 3D STEREOSCOPIC KINTECT! *****!:dito:

you're getting mixed up between stereoscopic capture and RGB depth maps http://en.wikipedia.org/wiki/Kinect
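For what it's worth, the Kinect doesn't capture a stereo pair at all: it reports an 11-bit raw disparity value per pixel. A rough sketch of converting that to metric depth, using an empirical tangent fit circulated by the OpenKinect community — the coefficients are community-measured, not an official Microsoft spec:

```python
import math

# Hedged sketch: a community-derived approximation (OpenKinect wiki)
# for converting the Kinect's 11-bit raw disparity value to metres.
# The 0.1236 / 2842.5 / 1.1863 coefficients are an empirical fit,
# not official calibration data.

def raw_disparity_to_meters(raw):
    if raw >= 2047:          # 2047 flags 'no reading' in the 11-bit stream
        return None
    return 0.1236 * math.tan(raw / 2842.5 + 1.1863)

# Depth grows nonlinearly with the raw value:
near = raw_disparity_to_meters(600)
far = raw_disparity_to_meters(900)
```

So 'depth map' really means a per-pixel range image, which is a different animal from left/right stereoscopic footage.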

jasonwestmas
11-08-2011, 10:02 AM
DSTORM is going to announce the new kinect motion capture plug-in for LW ?
http://www.dstorm.co.jp/event/interbee2011/

I may have to learn Japanese to read the manual though. =)

Philbert
11-08-2011, 10:40 AM
Well, here's the translation of the page at least.

http://translate.google.com/translate?sl=ja&tl=en&js=n&prev=_t&hl=en&ie=UTF-8&layout=2&eotf=1&u=http%3A%2F%2Fwww.dstorm.co.jp%2Fevent%2Finterbee2011%2F

erikals
11-08-2011, 10:55 AM
DSTORM is going to announce the new kinect motion capture plug-in for LW ?
http://www.dstorm.co.jp/event/interbee2011/


LightWave / Kinect

Title: Introduction to the new features of LightWave 10 and a motion capture plug-in using Kinect

Speaker: Kohara Aikawa (D-Storm)
Content outline:

New LightWave 10 features such as the Viewport Preview Renderer (VPR), stereoscopic rendering, and linear workflow will be introduced. In addition, a LightWave plugin that incorporates motion information from the Kinect will also be shown.

hard to say, might be a demonstration of Jimmy|Rig Pro

LW_Will
11-08-2011, 11:08 PM
Wow! It might be a "LightWave plugin that incorporates motion information from the Kinect!!"

DUDE! This is AWESOME!

I liked the iPi version for Kinect, and the Brekel plug-in for MoBu is awesome, but if this works directly in LightWave... a-bloody-mazing! Cannot wait!

erikals
11-08-2011, 11:37 PM
that's a BIG if... :]
(but not saying it can't be...)