View Full Version : Perception Neuron Realtime Support



Zimtower
02-01-2016, 02:11 PM
https://neuronmocap.com/

Integrate realtime streaming from PN, then I'll buy your product.

mch
02-06-2016, 04:42 PM
Hi Zimtower !

Yes - mocap live streaming with Perception Neuron directly into Lightwave 3D would be a tremendous feature !!!

Perception Neuron works very well with iClone 6.4 and their beta mocap live-streaming plug-in.
I used the 32-Neuron version for some testing :-)

I guess Lightwave could also work with BVH live streaming, since it already supports Kinect input.
Perception Neuron mocap live streaming would be great !!!


Nevron Motion is fine for retargeting moves via BVH import, but it lacks full support for fingers as well.
The complete rig implementation would also be very useful . . .
Please support the full body specs including fingers in NevronMotion !!!
Adding, renaming and editing items to create custom setups would be a helpful addition !


So.
+1 for realtime support !!!!!


Regards,

Marcus

erikals
02-07-2016, 01:59 AM
should be easy...
maybe in LW2017 ?


Perception Neuron works very well with iClone 6.4 and their beta mocap live-streaming plug-in.
I used the 32-Neuron version for some testing

Cool http://erikalstad.com/backup/misc.php_files/king.gif


Please support the full body specs including fingers in NevronMotion !!!

hm, curious as to why it didn't... :/

mch
02-07-2016, 02:01 PM
should be easy...
maybe in LW2017 ?

;-) - I love optimists . . .


Cool http://erikalstad.com/backup/misc.php_files/king.gif

I missed the kickstarter campaign and ordered the Perception Neuron suit late November 2015.
One day before Christmas it was delivered . . . :-)




hm, curious as to why it didn't... :/

I guess because most MoCap systems don't record fingers.
Traditional optical mocap systems like Vicon cannot do this, and most mocap libraries don't include it either.
Until now, fingers generally had to be animated by hand . . .

The Perception Neuron delivering full body and fingers at this price point is a huge achievement.

Since the BVH standard allows for mocap skeletons with fingers, it would be nice to include the maximum possible rig and simply uncheck what is not needed in the Nevron Motion retargeting dialogue.
The Genoma Nevron Motion rig has all fingers included. Now Nevron Motion needs just a longer list for retargeting.

As Ryan Roye told me, maybe Lino already has it in the making, and we all hope for the best . . . ;-)


Regards,

Marcus

Ryan Roye
02-08-2016, 12:14 PM
hm, curious as to why it didn't... :/

to be fair, Nevron came out like 2 years before Perception Neuron became publicly available, and it is going to take some time for word to spread that it is as solid as the developers claim.

mch
02-08-2016, 03:46 PM
to be fair, Nevron came out like 2 years before Perception Neuron became publicly available, and it is going to take some time for word to spread that it is as solid as the developers claim.

I totally agree.
Since there have been several mocap crowdfunding projects around for some time which are still struggling to deliver and perform properly, you can't expect full support from the start.

In contrast to these other systems the Perception Neuron MoCap suit is a rock solid product in my opinion.
Their software is still evolving and they work hard on improvements.
But you could not tell before actually testing this for yourself.
Not to forget, it took them a long time to deliver the first units to backers, and now to regular customers like me . . .

It's good to see that Reallusion managed to build a stable live mocap plug-in for their realtime animation software (iClone 6.2+).
Proof of concept. It works.
Just like Nevron Motion performs well with Kinect in its different flavors.
So it is a logical step to enhance the rig retargeting set in NevronMotion with fingers, and of course wishful thinking to get this plus realtime support for Perception Neuron.
But to my simple-user understanding, integrating BVH live mocap streaming into Lightwave should be a feasible task, similar to the existing Kinect support.
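For what it's worth, a BVH live stream is at its core just lines of channel values, one line per frame. A minimal Python sketch of parsing one streamed frame (assuming six channels per joint, as Perception Neuron's BVH output can provide; the joint count, channel layout, and sample numbers here are illustrative assumptions, not any plug-in's actual protocol):

```python
# Toy sketch: splitting one line of a BVH motion stream into
# per-joint channel tuples. Assumes every joint streams 6 channels
# (XYZ position + XYZ rotation), which is an assumption about the
# stream configuration, not a fixed rule of the BVH format.

def parse_bvh_frame(line, channels_per_joint=6):
    """Split a whitespace-separated BVH motion line into
    per-joint tuples of channel values, in hierarchy order."""
    values = [float(v) for v in line.split()]
    return [tuple(values[i:i + channels_per_joint])
            for i in range(0, len(values), channels_per_joint)]

# Example frame carrying data for two joints (made-up numbers):
frame = "0.0 90.0 0.0 10.5 -3.2 0.7  0.0 0.0 0.0 5.0 0.0 0.0"
joints = parse_bvh_frame(frame)
print(len(joints))        # 2
print(joints[0][3:])      # rotation channels of the first joint
```

In a live setup the `frame` string would arrive over a socket from the streaming application instead of being hard-coded, but the per-frame parsing would look much the same.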

To map the motion data of the fingers, an update of the (internal) retargeting lists is needed in NevronMotion.
This cannot be done by the user at the moment.
Editing items for custom rigs would of course be the most flexible way, combined with proven presets covering what users mostly need.
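Conceptually, such a retargeting list is just a map from mocap joint names to rig bone names, with unchecked joints skipped. A toy Python sketch (all joint and bone names are hypothetical examples, not NevronMotion's actual internals):

```python
# Hypothetical retargeting map extended with finger joints.
# Names are illustrative only, not NevronMotion's real lists.
RETARGET_MAP = {
    "RightHand": "hand_R",
    # The finger entries a longer retargeting list would add:
    "RightHandThumb1": "thumb_01_R",
    "RightHandThumb2": "thumb_02_R",
    "RightHandIndex1": "index_01_R",
    "RightHandIndex2": "index_02_R",
}

def retarget(frame_rotations, mapping=RETARGET_MAP, enabled=None):
    """Copy rotations from source joints to target bones,
    skipping any source joints the user has unchecked."""
    enabled = mapping.keys() if enabled is None else enabled
    return {mapping[j]: rot for j, rot in frame_rotations.items()
            if j in mapping and j in enabled}

src = {"RightHand": (0.0, 45.0, 0.0),
       "RightHandIndex1": (10.0, 0.0, 0.0)}
print(retarget(src))
# {'hand_R': (0.0, 45.0, 0.0), 'index_01_R': (10.0, 0.0, 0.0)}
```

Passing an `enabled` set would correspond to unchecking joints in a retargeting dialogue: anything not in the set is simply left out.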

Anyway.
NevronMotion is a useful tool and I still have to learn a lot about rigging in general and setting up a decent mocap pipeline.
Thanks for all the information from the wavers out there :thumbsup:. . .

Regards,

Marcus

Zimtower
02-09-2016, 03:27 PM
https://www.youtube.com/watch?v=Xc4Gn9ZyUoQ

I would love to be able to do this directly in Lightwave, rather than relying on Maya.

mch
02-09-2016, 04:42 PM
I would love to be able to do this directly in Lightwave, rather than relying on Maya.

Hi Zimtower !

Really nice test !!!

Well - Motionbuilder and faceshift are out of reach for me.
MoCap Live Streaming into Motion Builder from Axis Neuron - quite impressive !

https://www.youtube.com/watch?v=hBPeVMKBpFg
You already included facial motion capture at the same time as the body mocap - amazing !!!

And yes.
I would also like to do this directly in Lightwave without an import/retargeting chain.
Realtime live feedback on your actual characters is a huge step...

Since Perception Neuron is widely accepted by now and the live streaming possibilities to other programs are already expanding, it would be nice to have Lightwave 3D on board :lightwave


Regards,

Marcus