
View Full Version : MYCAP Studio 2012 Pro v1.2 now available!



vnaber
06-18-2012, 07:37 AM
MYCAP Studio 2012 Professional v1.2 is now available. Version 1.2 allows you to export your 3D motion capture data to .LWS for maximum compatibility with LightWave.

About MYCAP Studio 2012 Professional:

MYCAP Studio 2012 Professional is a powerful software solution for true 3D motion capture.

Unique features:

-Fast camera calibration
-True 3D capture using two cameras (webcam/DV/DSLR)
-Automatic marker identification and stereo matching
-Kalman filtering for clean, professional quality data
-Global motion stabilization
-BVH, FBX, LWS, MA, ASC and CSV support out of the box
-Compatible with Blender, 3ds Max, Maya, LightWave 3D and Cinema 4D

MYCAP Studio 2012 Professional works with cheap markers and does not require special lighting (it even works outdoors).

Limitless applications:

-Digital puppetry/animation
-Human and animal gait analysis
-Kinematic modeling
-Robotics
-Structural analysis
-Multiple 3D point tracking
-3D scene reconstruction

Price: 99 Euros (excludes cameras)

Videos: https://vimeo.com/user10779198/videos

For more info and sample data files visit the official site: www.rebanrobotics.com

Ernest
06-18-2012, 07:50 PM
Oh, the swallowing looked awesome! I'll definitely try it. Hopefully the upper face can look as good as the bottom.

vnaber
06-19-2012, 02:20 AM
We used limited marker density on the cheek area, so this could definitely be improved. We'll be posting a 'making of' video soon so stay tuned to our Videos channel: https://vimeo.com/user10779198/videos

DigitalSorcery8
06-19-2012, 01:48 PM
This looks EXTREMELY promising.

Looking forward to seeing "the Making of." :thumbsup:

JamesCurtis
06-19-2012, 02:09 PM
Can this do full body tracking too?

vnaber
06-19-2012, 02:26 PM
Yes, full body capture is also possible. The following video may give you a good idea of the possibilities: https://vimeo.com/41201849

OnlineRender
06-19-2012, 02:28 PM
Just a heads up, but ESET NOD32 (which can be a little touchy at times) really doesn't like the MYCAP_Studio exe...

silviotoledo
06-20-2012, 09:08 AM
cool! really loved it!

How is the LWS transfer? Will it create the bones, or just export markers (nulls)?

Does it do retargeting for LightWave?

vnaber
06-20-2012, 10:08 AM
Hi Silvio. MYCAP Studio exports the motion capture data as nulls. You can have a look at some sample data files here: http://rebanrobotics.com/downloads.html

rcallicotte
06-20-2012, 10:51 AM
This is significant. It's very interesting. Out of curiosity, do you have any demos of a person jumping around or doing somersaults or, basically, anything that would show a greater range of motion?

Greenlaw
06-20-2012, 12:20 PM
MyCap looks pretty cool!

My wife and I use iPi DMC 2.0 for our short film projects (see link below for our first film.) This system works great for body capture, however it's not designed to capture 'closeup' data for hands and faces. Naturally, we're curious to try your product as a companion to our current system.

We'll be looking at MyCap soon. :)

G.

vnaber
06-20-2012, 03:40 PM
This is significant. It's very interesting. Out of curiosity, do you have any demos of a person jumping around or doing somersaults or, basically, anything that would show a greater range of motion?

All our tests have been performed using standard webcams running at 30fps max. High speed motion will require cameras with higher frame rates, tighter synchronization and more advanced lighting. We currently also only support stereo camera arrangements, so marker occlusions are inevitable when performing somersaults.
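
(Aside for the curious: recovering a true 3D marker position from two matched 2D views is classic linear triangulation. A minimal sketch with idealized normalized cameras follows; this is the standard textbook method, not MYCAP's actual code.)

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one marker seen by two calibrated
    cameras. P1, P2: 3x4 projection matrices; uv1, uv2: image coords."""
    # Each view contributes two linear constraints on the homogeneous 3D point.
    A = np.array([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    # The point is the null-space direction of A (last right singular vector).
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]   # back to Euclidean coordinates
```

With noise-free matches this recovers the point exactly; with real marker detections it gives the least-squares best fit, which is why calibration quality matters so much.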

vnaber
06-20-2012, 03:47 PM
MyCap looks pretty cool!

My wife and I use iPi DMC 2.0 for our short film projects (see link below for our first film.) This system works great for body capture, however it's not designed to capture 'closeup' data for hands and faces. Naturally, we're curious to try your product as a companion to our current system.

We'll be looking at MyCap soon. :)

G.

Loved the short!

Greenlaw
06-20-2012, 03:58 PM
Thanks! Glad you enjoyed it.

I was wondering if you can make MyCap compatible with PS3 Eye cameras. iPi DMC uses this USB camera and it easily captures at 60fps. (Technically, the Eye can record up to 120fps but apparently the video is too grainy at that speed for tracking.)

G.

vnaber
06-20-2012, 04:05 PM
I don't have much experience with the PS3 Eye cams. We do have a free stereo video capture application that you can download to try with your cameras: http://rebanrobotics.com/downloads.html

The stereo video capture application works with just about any webcam and supports up to 640x480 video resolution.

Greenlaw
06-20-2012, 04:30 PM
Cool, I'll have to check it out. We do have a few Logitech cameras, which is what we started out with but replaced with PS3 Eye cameras when those became supported.

So at least we may put those cameras to use again. :)

G.

Pamukkedi
06-21-2012, 01:43 AM
Hi Greenlaw, nice to meet you here. Well, I work with LW, Modo and 3ds Max/MotionBuilder. We have an Animazoo Gypsy and now iPi Soft.

My question to vnaber: can I use FBX from your software to drive facial expressions in MotionBuilder? Personally, I don't like facial bone rigs because retargeting is an issue... it always looks like rubber faces and a lot of the acting is lost. Have you done tests with the FACS system?

vnaber
06-21-2012, 09:59 AM
Hi Greenlaw, nice to meet you here. Well, I work with LW, Modo and 3ds Max/MotionBuilder. We have an Animazoo Gypsy and now iPi Soft.

My question to vnaber: can I use FBX from your software to drive facial expressions in MotionBuilder? Personally, I don't like facial bone rigs because retargeting is an issue... it always looks like rubber faces and a lot of the acting is lost. Have you done tests with the FACS system?

The FBX format is compatible with 3ds Max. You can try out some of our sample data files here: http://rebanrobotics.com/downloads.html
Unfortunately we don't have any experience with FACS.

silviotoledo
06-21-2012, 02:50 PM
I think several bones on a face without a rig will always give a rubbery animation. A rig is always required, so the nulls will drive the rig itself, but if you're doing monsters you can get interesting results if you replace each null with a bone. Anyway, too many bones would be required and LightWave would run too slowly.

It's easy to retarget in LightWave if you move the nulls' pivots, but a good rig is still required! To get better results the nulls must drive some morphs too.

I've not seen a face rig in LightWave that good. It seems layered deformation (which LightWave doesn't have) makes a big difference.

The key is using some nulls to drive morphs; that's the best option in LightWave.
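
(The nulls-to-morphs mapping Silvio suggests is simple to express: project each null's displacement from its rest position onto a chosen direction and normalize it into a 0..1 morph weight. A hypothetical sketch; the names and scaling are illustrative, not any actual LW plugin.)

```python
import numpy as np

def morph_weight(null_pos, rest_pos, axis, full_disp):
    """Map a tracking null's displacement to a 0..1 morph weight.
    axis: direction in which the morph 'opens' (e.g. jaw-down);
    full_disp: displacement (in scene units) that means weight 1.0."""
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    disp = np.asarray(null_pos, dtype=float) - np.asarray(rest_pos, dtype=float)
    w = np.dot(disp, axis) / full_disp   # signed displacement along axis
    return float(np.clip(w, 0.0, 1.0))   # clamp to the valid morph range
```

For example, a chin null that has moved halfway down its full range would drive a "jaw open" morph at 50%.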

Greenlaw
06-21-2012, 03:49 PM
I'm curious: can you go into more detail about the technical setup for the Prometheus Unbound (http://vimeo.com/43848223) demo? It really looks quite good (and the actor's performance is great!). Is this setup an example of bones driven by the nulls? Was there much cleanup involved?

Thanks in advance.

G.

RebelHill
06-21-2012, 04:09 PM
So this basically just outputs a 3D scene containing the markers as animated nulls... it doesn't put out any kind of bone rig/rotations, etc...

If that's the case then it's pretty much useless in LW alone and needs an in-between like MotionBuilder.

Ernest
06-21-2012, 06:08 PM
If that's the case then it's pretty much useless in LW alone and needs an in-between like MotionBuilder.

Or RHiggit 2.0 ;)

vnaber
06-22-2012, 03:12 AM
To answer some of your questions: the facial motion capture data for Prometheus Unbound was captured in a single two-minute take using two webcams. The data was stabilized using MYCAP Studio Pro's stabilization feature (to eliminate head motion), then filtered using MYCAP Studio Pro's second-order Kalman filter. No manual cleanup was required.
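
(A "second-order" Kalman filter here presumably means a constant-acceleration motion model applied per marker coordinate. A rough sketch of that kind of smoother; the tuning parameters are illustrative, not MYCAP's internals.)

```python
import numpy as np

def kalman_smooth(zs, dt=1.0 / 30, q=1e-4, r=2.5e-3):
    """Smooth one coordinate of a marker track with a constant-acceleration
    ("second-order") Kalman filter. zs: noisy positions; q: process noise
    variance; r: measurement noise variance (both tunable)."""
    F = np.array([[1.0, dt, 0.5 * dt * dt],   # state: position,
                  [0.0, 1.0, dt],             # velocity,
                  [0.0, 0.0, 1.0]])           # acceleration
    H = np.array([[1.0, 0.0, 0.0]])           # we observe position only
    Q = q * np.eye(3)
    R = np.array([[r]])
    x = np.array([zs[0], 0.0, 0.0])           # initial state estimate
    P = np.eye(3)                             # initial state covariance
    out = []
    for z in zs:
        x = F @ x                             # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R                   # update with measurement z
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (np.array([z]) - H @ x)
        P = (np.eye(3) - K @ H) @ P
        out.append(x[0])
    return np.array(out)
```

Run independently on x, y and z of each marker, this knocks the jitter out of webcam tracks while preserving smooth motion.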

The resulting data was imported into Blender 2.6 as .BVH (effectively replacing each null with a bone). A few extra bones were added manually for the assumed rigid bodies (skull, eyes, jaw and chest) as well as a few bones to control the neck. Motion capture data was parented to the skull.

All facial muscles were controlled using stretch-to constraints on the data. The marker on the chin was used to guide the motion of the jaw. Head motion and eye saccades were added later using separate motion capture data.
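
(The stretch-to behavior described above is easy to picture as geometry: each muscle bone keeps its head fixed, aims at its tracked marker, and scales along its length by the current distance over the rest distance. A sketch of the math only, not Blender's API.)

```python
import numpy as np

def stretch_to(head, rest_tail, marker):
    """Return (unit_direction, stretch_factor) for a bone whose head stays
    at `head`, originally pointing at `rest_tail`, now aimed and stretched
    toward `marker` (Blender's Stretch To constraint, in essence)."""
    head = np.asarray(head, dtype=float)
    rest_len = np.linalg.norm(np.asarray(rest_tail, dtype=float) - head)
    v = np.asarray(marker, dtype=float) - head
    dist = np.linalg.norm(v)
    return v / dist, dist / rest_len   # new aim direction, length scale
```

So a cheek bone rigged this way bulges and relaxes automatically as its marker moves toward or away from the bone's origin.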

Greenlaw
06-22-2012, 11:08 AM
Well, I'm impressed! Thanks for the details.

I'm under a heavy crunch at work right now but I hope to take a close look at MyCap as soon as I get some of my life back again.

G.

silviotoledo
06-24-2012, 03:40 AM
Tried it.

The cloud of nulls is very hard to match to the character in LightWave itself. Can't you add a 3D mesh mask like Maskarad? That would make visual adjustment easier.

We can retarget in LightWave just by moving the nulls' pivots. I tried 30 bones on the character's face and it looks like jelly. I tried individual weight maps and it still deforms like jelly. Many more bones would be required. I think the best way is to build a rig and use the nulls only to drive it, so we would not need more than 16 or 20 nulls.

The solution needs a prebuilt, adjustable LightWave rig, so we would place markers that match the rig controls.

It would be amazing if the great rig artists we have here would provide a face rig based on bones that can be driven by these points.

silviotoledo
06-24-2012, 04:01 AM
This is almost 8 years old, done by Brazilian artist Kris Costa (antropus), who is now a senior modeler at ILM (he modeled the Hulk for The Avengers).

He used 120 bones on the face and 16 control points in Maya, driven by little elements in a joystick interface:

VIDEO HERE:
http://www.antropus.com/private/custom_portfolio/videos/plumber640x480.avi

I wonder why we can't replicate this in LightWave.

RebelHill
06-24-2012, 04:57 AM
It would be amazing if the great rig artists we have here would provide a face rig based in bones that can de driven by these points.

I've said it numerous times, Silvio (and you still don't seem to believe me), but it's NOT POSSIBLE IN LW.

There's no way to retarget, aside from taking all the necessary measurements between markers, mapping them proportionally to the target face, and then setting up all the necessary percentage offsets between marker and controller, and doing all this manually, every time, for every individual character.

OK, so it's possible to retarget, with a huge amount of work.

But this whole "moving the null's pivot" thing... that DOESN'T retarget the motion either... it just moves the pivot point. The motions (scale-wise) are still the same... there's NO proportional rescaling (which is what a retarget is).
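
(RebelHill's "proportional rescaling" can be stated compactly: measure the source face's extent at rest, measure the target's, and scale every per-frame marker delta by that ratio before applying it to the target's rest pose. A hypothetical sketch of exactly that mapping, not an existing tool.)

```python
import numpy as np

def retarget_frame(frame, src_rest, dst_rest):
    """Proportionally remap one frame of markers (N x 3) from a source face
    onto a target face: deltas from the source rest pose are scaled per axis
    by the ratio of the two faces' rest-pose extents (assumed non-zero)."""
    frame = np.asarray(frame, dtype=float)
    src_rest = np.asarray(src_rest, dtype=float)
    dst_rest = np.asarray(dst_rest, dtype=float)
    src_span = src_rest.max(axis=0) - src_rest.min(axis=0)
    dst_span = dst_rest.max(axis=0) - dst_rest.min(axis=0)
    scale = dst_span / src_span          # per-axis proportional factor
    return dst_rest + (frame - src_rest) * scale
```

This is exactly the per-character measurement-and-offset work described above; moving a null's pivot changes none of these deltas.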

This, or any other software that outputs solely tracked markers, is useless for anything inside LW alone; there's just NO WAY to do anything meaningful with it.

And again... bone face rigs in LW are a DEAD END. Have you ever seen one? And I mean a good, widely functioning one, not just some little thing where someone's deformed a face-shaped mesh with a handful of bones in a basic manner... I mean a full-on rig? No... neither have I. The reason: it's a dead end. LW's bone system, whilst versatile for sure, is a little tricky for getting the right kind of deformation in a face mesh to begin with, but even if you do, LW lacks the kind of controllers needed to pose such a structure properly and easily. So even if you skinned such a thing successfully, animating it would be a nightmare.

vnaber
07-16-2012, 03:46 PM
I'm curious: can you go into more detail about the technical setup for the Prometheus Unbound (http://vimeo.com/43848223) demo? It really looks quite good (and the actor's performance is great!). Is this setup an example of bones driven by the nulls? Was there much cleanup involved?

Thanks in advance.

G.


Have a look at Prometheus Unbound (the making of)
https://vimeo.com/45826244

Regards,

Vincent

Greenlaw
07-16-2012, 06:19 PM
Yes, I did, and I was even more impressed. :)

silviotoledo
07-18-2012, 07:57 PM
In Blender it works very well, but I'd like to see the same done in LightWave.