View Full Version : Ipisoft Question

01-22-2013, 03:53 PM
I'm about to buy my iPi Soft rig, but I have a question about USB cables.

PlayStation Eye cameras can capture up to a 20 m area, but which USB cable will I need?

Kinect allows a 7 m area. Does Kinect use a USB port? Where can I buy these cables?

01-22-2013, 04:02 PM
You want USB repeater cables... the iPi wiki will give you all the details... you can get them on Amazon, etc.

01-22-2013, 11:16 PM
Yes, repeater cables. They are also known as active USB cables. This is the kind I use:


I probably own at least a dozen by now--these are fairly cheap and I've been using them for years.
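For planning a long cable run, here's a rough rule-of-thumb sketch in Python. It assumes the common USB 2.0 limits (about 5 m per passive cable segment, and a maximum chain of five repeaters, since each active cable behaves like a one-port hub); the function and constants are illustrative, not from any iPi documentation.

```python
# Rough helper for planning a USB run to a mocap camera.
# Assumptions: ~5 m max per passive USB 2.0 segment, and each
# "active" (repeater) cable re-drives the signal like a one-port
# hub, with the USB spec allowing at most 5 hubs in a chain.
import math

MAX_SEGMENT_M = 5   # passive USB 2.0 cable length limit
MAX_REPEATERS = 5   # USB hub-depth limit

def repeaters_needed(distance_m):
    """How many 5 m active repeater cables a run of distance_m needs,
    or None if the distance exceeds the hub-depth limit."""
    segments = math.ceil(distance_m / MAX_SEGMENT_M)
    if segments > MAX_REPEATERS + 1:
        return None
    return max(segments - 1, 0)

print(repeaters_needed(12))  # 12 m run -> 2 repeater cables
print(repeaters_needed(4))   # short run -> 0, a plain cable is fine
```

So a camera 12 m away would need two active cables chained onto a regular one, under these assumptions.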

Today I found a new use for a repeater cable: if you attach a Bluetooth adapter to one, you can bring it closer to your performance space for more accurate capture data from your PS Move or Wii Motion Plus devices (for wrist and prop tracking).

If your computer has USB 3.0, they now sell a USB 3.0 version here:


A bit more expensive. I've never used this model so I can't vouch for it.

Good luck.


01-25-2013, 10:05 AM
Does Kinect use a USB port?

Yes it does.

01-25-2013, 10:40 AM
If you're going to use Kinect (1 or 2 sensors) with iPi DMC, make sure you get it bundled with the power supply. The Kinect power supply has a special connector that adapts the Kinect to USB.

Also, you may want to consider Kinect for Windows over Kinect for Xbox. The Windows version is more expensive, but I'm finding a few compelling reasons to use it over the Xbox version. The other day, one of my Kinect for Xbox sensors failed, so I swapped in the Windows version that we've been using for 3D scanning and face capture experiments. When I plugged it in, I was surprised to discover that many additional controls became available in iPi Recorder, but only for the Kinect for Windows sensor.

Even with the default settings, the RGB channel looks significantly better in low light than with the Xbox version, which may become important when the developers begin utilizing the new Align RGB feature to assist in tracking (most notably, for the head). Additionally, if you use hand/prop tracking devices like the PS Move or Wii Motion Plus, you'll need to visually reference the RGB channel for initial hand/prop alignment.

Here's an example of what I mean:


There are other reasons to get the Windows version. The original reason we got a Kinect for Windows was to use its 'Near Mode' for enhanced face tracking--this feature is unavailable in the Xbox version. It's also useful for scanning objects using ReconstructMe, which I don't think is compatible with the Xbox version. Finally, Microsoft officially supports only Kinect for Windows on the Windows platform--this isn't a technical issue, because Kinect for Xbox operates just fine under Windows, but it may be important to some users for warranty reasons.


01-25-2013, 11:01 AM
Interesting stuff, G. I'd have mentioned the power supply, but I assumed they all came with one (both the ones I've got did - Xbox versions).
If you get a spare 5-10 minutes sometime, I'd love to hear the different uses you have found for the Kinect (new thread?), and what software you are trying out. My Kinects have been gathering dust a bit since we got them about a year ago, so I'd love to put them to more use! (Though I did try using one to take a still, bobbly 3D model... it was good reference, but barely usable for anything else.)

We tried iPi on the trial, but at the time (on a short deadline, as ever) we couldn't get the kind of results we wanted. I should give it another try, actually. I think the real problem we had was getting the result to work correctly in Lightwave, if memory serves. The results in iPi looked and worked great with a proxy, though.

Anyway, yes, any interesting uses to share?


P.S. I've used the one Kinect with Jimmyrig, and for simple uses, that works great!

01-25-2013, 11:31 AM
I hope to start posting a lot more info about our current production on the website in the next few weeks, which should include more mocap info. I'll make an announcement when that's up.

Anyway, I found iPi DMC works great with Lightwave, but I use it by way of Motion Builder--mainly because our characters have 'non-human' proportions, and Motion Builder is able to retarget to them almost perfectly. iPi Studio can also retarget data to other rigs, but its built-in retargeting system is more limited and there are no tools for editing animation. (iPi DMC's focus is primarily on raw capture, tracking, and basic clean-up and retargeting; it's expected that you will edit the motions in your animation program of choice.) Regardless of how you choose to retarget, the key is to use FBX and shared rig hierarchies to get the data into Lightwave. I've had the best success importing FBX motion data using Load Items From Scene - Merge Only Motion Envelopes to import only the animation onto a rigged character in Lightwave. If you set this up correctly, it works flawlessly every time.
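Since the merge only works when the rigs share the same hierarchy, here's a hypothetical little sanity check (not an iPi or Lightwave API) that compares two rigs modeled as {bone: parent} dictionaries before you attempt the transfer. All bone names below are made up for illustration:

```python
# Hypothetical sketch: verify two rigs share the same bone hierarchy
# before merging FBX motion with "Load Items From Scene - Merge Only
# Motion Envelopes". Rigs are plain {bone: parent} dicts; the root
# bone's parent is None.

def hierarchy_mismatches(source, target):
    """Return bone names whose presence or parent differs between rigs."""
    bones = set(source) | set(target)
    missing = object()  # sentinel distinct from any real parent (or None)
    return sorted(b for b in bones
                  if source.get(b, missing) != target.get(b, missing))

mocap_rig = {"Hips": None, "Spine": "Hips", "Head": "Spine"}
lw_rig    = {"Hips": None, "Spine": "Hips", "Head": "Spine"}
bad_rig   = {"Hips": None, "Chest": "Hips", "Head": "Chest"}

print(hierarchy_mismatches(mocap_rig, lw_rig))   # [] -> safe to merge
print(hierarchy_mismatches(mocap_rig, bad_rig))  # bones to fix first
```

An empty result means the hierarchies line up; any names returned are bones you'd want to rename or re-parent before importing the motion.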

We recently got an alternative to Motion Builder but we haven't tried it in our iPi DMC to Lightwave workflow yet. I'll post info whenever we can get to that but right now we're too busy finishing our current movie project.

Another alternative is Ikinema WebAnimate--I've never used this tool but I believe some users here have had success using it to get iPi mocap data into Lightwave.

If you're curious, there's already a big thread on the topic of using iPi DMC with Lightwave here:

iPi Desktop Motion Capture Tests (Page 20) (http://forums.newtek.com/showthread.php?114498-iPi-Studio-Desktop-Motion-Capture-Tests/page20&highlight=ipi+kinect)

This thread goes back a long way in iPi-to-Lightwave development, and a lot of the early pages contain obsolete info, so you'll probably want to skip ahead to the later pages.

FYI, I found that the current release of iPi DMC has a few tracking issues, but the developers are aware of them and are working on the problem as we speak. It's not a complete show-stopper, since we can still capture our motion data just fine, but we'll have to hold off on tracking it until they release the fix. The iPi guys are actually quite prompt about squashing bugs, so hopefully that will happen very soon.


01-25-2013, 12:00 PM
Regarding 3D scanning, I just answered this exact question on the iPi forums the other day, so here's a 'cut and paste' for your enjoyment. :)

We're using ReconstructMe and Agisoft PhotoScan. We got the idea to try this ourselves after reading 1k0's blog (http://1k0.blogspot.com/) (1k0 is another avid iPi DMC user and a very talented artist.) We haven't spent a lot of time with either program yet because we're busy trying to finish our current movie but here's what we know from brief testing so far.

ReconstructMe is interesting because of the immediate feedback and results. The detail is not high-def and you'll want to retopologize the result, but it really is not bad as a base to sculpt from. (Like, say, in ZBrush or 3D Coat.) IMO, the main issue is the lack of texture data. If you want to scan a human, you'll want to construct an adjustable support structure for the subject. We built one inspired by the one we saw on 1k0's website--it's basically two light stands ($12 each from Cowboy Studios), two extra-firm foam softballs, and a coupler with a threaded rod inserted and glued into each ball with E6000. It's pretty solid and just what we needed--cheap and effective. Here's a pic of our human scanning support rig:


PhotoScan is much more capable but also requires a lot more work. It does not use the Kinect--you capture a series of photographs of your subject, and the software reconstructs the object from the photos with a high-res texture applied. I found the most time-consuming part is setting up masks for the object, so you may want to do your shoot in an uncluttered room.

We haven't used either in our productions yet, but these tools are in line for R&D for future projects.

Our other Kinect-related R&D project is face capture. We were experimenting with Brekel Pro Face but had to put it aside for now because we don't need face capture for our current project.


01-27-2013, 11:20 PM
Great stuff, thanks Greenlaw--very interesting. It makes you wonder what kind of technology we'll have available to us in another 5-10 years!