TAFA 2 - not yet...



erikals
12-14-2019, 07:07 AM
is there going to be a TAFA 2 ... ?

...perhaps, some of you might find this interesting.

https://i.imgur.com/bcwLfNX.gif


copied from > http://www.macreitercreations.com


(Long overdue) Update June 6, 2019:

Maybe I'm slow, but I just now found out about Emscripten. C++ to JavaScript compilation with OpenGL and OpenAL support, and near-native speed. Has been used to port various things, including both Unity and Unreal Engine to the web. Hmmm... So, TAFA is written in (ancient) C++ and OpenGL.

Its audio handling is finely tweaked Win32 multimedia code, so that would all have to be redone (I looked briefly at OpenAL back in the day, so I'm not opposed).
Possibly more problematic is that all of the non-3D windows are currently Win32 GDI code, which would require a lot of rework.
And, of course, there's the whole issue of 'files'. How would I load or save them? Would I have to require cloud storage? Don't want to force storage on my servers... And the resulting code would all be running locally in your browser anyway. Just have to see how the file system inside Emscripten works with local files and browser sandboxing.

If I can recover some of my earliest morphing code via an archaeological expedition into old hard drives, I might be able to find a simple OpenGL-only morphing codebase that I could use to test just that core against Emscripten...

Not sure if any of this is worthwhile. I mean, it would be cool to see TAFA running under Linux, on a Mac, and on phones/tablets. But as time marches into the future, I keep seeing so much more advanced technology for facial animation -- automatic face tracking to match a performer's behavior, or engines that can take text and generate both natural speech audio and natural facial expressions. Very cool, but I can easily see how these kinds of tools could render traditional facial animation tools irrelevant within the next few years.

And, clearly, I'm not so speedy at creating products in my 'free' time these days. But I'm still curious. If I come up with anything, I'll make sure to put it up here for others to play with.

-----

Hello everyone.

A recent malware scan noticed that my MediaWiki had been compromised and malware installed. I've needed to do something with the site for a long time, anyway, so rather than just trying to clean up and go back to my stale old style, I thought I'd take this opportunity to experiment with some newer website techniques and systems. Hopefully, if I stick with stuff that my hosting provider sets up, their auto-update systems will keep me protected :) In the interim, this low-key placeholder will have to do.

I guess this also serves as a way -- not a great way, admittedly -- to show that I (and TAFA) are still here. No real news at this time, which is part of why the website got so dusty. 1.2.2.8 still works, though I've heard that the new Object file format in LW2018 isn't supported. Fortunately, saving as the older style still works. Just another impetus to find a way to make something happen with TAFA 2, I guess.

As always, thanks for your interest! I'll try not to let the garden get so weed-covered again.

Mac

P.S. Wow, has this trivial exercise reminded me how much I hate HTML. Pointed two different mobile browsers at this page. One has unreadably small text, even after I jumped the font size to 4 (mid-range), and if I zoom in it won't re-wrap the text. But at least the paragraph breaks are visible. On the other, the text is nice and big, but didn't change when I set the font size. Unfortunately, it barely shows any paragraph break at all, making the text hard to read in a different way. Hopefully whatever browser you're using isn't choking too badly.

Ryan Roye
12-14-2019, 08:07 AM
Although it requires scripting or programming knowledge to use outside of Unity/Unreal, I'm using Oculus's lipsync tech for character animation work now (it does not require a headset at all... and it's free). It gets ~90% of the work done with just an audio clip and computes instantly.


https://www.youtube.com/watch?v=4JGxN8q0BIw

Typically the main issue with things like these is the unnatural "stutter lip" effect, yet if you smooth things out too much you lose detail that's required to make it work. So, right now there are still some things you just have to hand-animate in terms of lip syncing, but the amount of that work is shrinking.

So the competition TAFA 2 faces makes marketability increasingly difficult.

ianr
12-14-2019, 10:00 AM
I like Mac; he's a charming guy in my dealings with him.


If you're getting this thread, Mr. Mac, selling it to LW3DG makes sense.

It was built for LW morphs & has the best morphing joystick after
Joe's LipService. Approach Chuck B.

good luck!

tyrot
12-14-2019, 12:48 PM
TAFA is the ONLY software that makes sense - I still use it daily... and I would sell my kidney for TAFA 2 in a second.

erikals
12-14-2019, 07:05 PM
for semi-realism I just don't feel there are any good alternatives...

I'm also checking out Papagayo now, but it's a whole other story, since for "Project X" I'm looking into more of a stop-motion style.



https://www.youtube.com/watch?v=4hEgE3m50Hw

erikals
12-14-2019, 07:26 PM
here's an example from another application, but the same principle applies for Papagayo (LW) >


https://youtu.be/g1AWrwr6yXk

... additional tweaks necessary.

erikals
12-14-2019, 07:48 PM
adding: found an old TAFA example by coincidence >


https://www.youtube.com/watch?v=ca6OOmewMIg

TheLexx
12-14-2019, 07:55 PM
Same deal, shown with the base software head, but it apparently does have an export path to LightWave, mentioned here (http://www.annosoft.com/lightwave-3d%c2%ae-support).



https://www.youtube.com/watch?v=lzLqzSW8Hh0

tyrot
12-14-2019, 08:15 PM
think about it - adding iPhone X features, plus adding cartoonize effects, FBX export - there you go, a perfect solution for everything. I hope he considers TAFA 2 for LightWave or even Unity :) - it would be an awesome asset...

erikals
12-14-2019, 08:23 PM
Same deal, shown with the base software head but apparently does have an export path to Lightwave,..
yes, but... I couldn't find the cost.

that said, Papagayo is free.


think about - adding IphoneX features
I looked a lot into it, but it's not there yet.  (imo)
[yet]

Rayek
12-14-2019, 11:55 PM
P.S. Wow, has this trivial exercise reminded me how much I hate HTML. Pointed two different mobile browsers at this page. One has unreadably small text, even after I jumped the font size to 4 (mid-range), and if I zoom in it won't re-wrap the text. But at least the paragraph breaks are visible. On the other, the text is nice and big, but didn't change when I set the font size. Unfortunately, it barely shows any paragraph break at all, making the text hard to read in a different way. Hopefully whatever browser you're using isn't choking too badly.

Well, duh: he didn't declare a viewport meta, so mobile browsers on small screens render it full size. That's really basic.
It shows he has been out of the loop of web dev for a long, LONG time.

Anyway....

I for one would love to see a new version of TAFA, but upgraded to integrate with modern facial motion capture, and/or through the use of a simple webcam tracking your hand movements and/or facial expressions. I think TAFA 2 could then be used to polish the acting, add additional movements, and then export for use in any 3D software.

But I agree with Ryan. I am unsure whether there's still a viable market for TAFA 2. Things are progressing quite fast.

By the way, what the heck happened to Thirdwishsoftware and Magpie Pro? Does anyone know? I attempted to contact them a few years ago, but never got any response. The site was pulled offline a while ago as well.
I was hoping they'd open source it, but seeing they were a small company run by two people (I think), the software seems now forever lost. A shame.

TheLexx
12-15-2019, 11:27 AM
But in practice, when we talk about face capture, not being Weta Digital, we are actually talking about our own faces, aren't we? Which presents a problem, because we are not exactly Laurence Olivier. Or are we talking about exaggerated theatrical or cartoon-style performances? My real question is whether a workflow using audio track analysis alone is obsolete going into 2020, or is more sophisticated audio extraction possible, allowing an automated emotional range? A bit like Mimic Pro using AI. Is such a thing possible? :)

raymondtrace
12-15-2019, 11:47 AM
By the way, what the heck happened to Thirdwishsoftware and Magpie Pro? Does anyone know? I attempted to contact them a few years ago, but never got any response. The site was pulled offline a while ago as well.
I was hoping they'd open source it, but seeing they were a small company run by two people (I think), the software seems now forever lost. A shame.

I believe they offered an early version as freeware. Maybe that's still lingering out there on the internet.

However, we are definitely heading toward puppetry (mocap and audio interpretation) over keyframed animation.

Greenlaw
12-17-2019, 11:23 AM
By the way, what the heck happened to Thirdwishsoftware and Magpie Pro? Does anyone know? I attempted to contact them a few years ago, but never got any response. The site was pulled offline a while ago as well.
I was hoping they'd open source it, but seeing they were a small company run by two people (I think), the software seems now forever lost. A shame.

Magpie has been gone for many years and, yeah, I wish it went open-source too.

I own two licenses of the Pro version but at the moment, only the license on my old Wacom tablet computer is still active, and I haven't used that computer for a couple of years. I can't activate Magpie Pro on my newer computers because the activation server is gone. :(

I used Magpie Pro for some of my short films. I had a good workflow in place but, TBH, Magpie's UI was very dated and there was very little development being done in the final years Third Wish was still active.

Compared to the other programs mentioned here, Papagayo is closer to how Magpie worked, but it has only a small subset of Magpie Pro's features.

I'm looking forward to TAFA 2 whenever it comes out. TAFA doesn't really do the same thing Magpie did but it's a cool program. Thanks for posting the updated info about TAFA 2. I haven't visited Mac's website since shortly after the hack. Glad to see some activity there, even if the message is from last June. (I hope somebody can help him with the website.) :)

erikals
12-17-2019, 10:58 PM
testing this now...
http://moviemation.de/facial-motion-capture-software-en.php


https://www.youtube.com/watch?v=61uwy0S1NHE

erikals
12-26-2019, 04:47 PM
this was quite good also >


https://youtu.be/cb3jvt4H2Rs


update: to export the animation, I noticed that a $300 3DXchange FBX plugin is needed. ouch!  https://i.imgur.com/Q3dxkGq.gif
the main app is iClone 7 at $150, so a total of $450. a bit too much for some, I'd suppose.

erikals
12-26-2019, 06:54 PM
and for the animators, both good tech and info >


https://www.youtube.com/watch?v=vniMsN53ZPI

Greenlaw
12-27-2019, 10:08 AM
Typically the main issue with things like these is the unnatural "stutter lip" effect, yet if you smooth things out too much you lose detail that's required to make it work. So, right now there are still some things you just have to hand-animate in terms of lip syncing, but the amount of that work is shrinking.

I've always felt that way about auto-lipsync and mocapped lipsync in general. I still experiment with these types of tools from time to time but, to me, the lipsync never looks as good as keyframed lipsync, and trying to fix the data is often more trouble than it's worth.

Right now I'm experimenting with the latest versions of Reallusion's iClone/Headshot for 3D and Cartoon Animator 4 for 2D. Some of the new tools will be incredibly useful in combination with other tools I use (I've been using iClone 7 with Pipeline to edit mocap from iPi Mocap for Lightwave for some time,) but I'm still skeptical that I'll get decent quality lip-sync from the face capture tech in each program. I guess we'll see.

One product Reallusion sells that I'm curious about is their Leap Motion plugin for capturing hand and forearm rotation.


https://www.youtube.com/watch?v=RZjkbp1nrw0

Has anybody here used this? What do you think?

It's on sale now and the individual parts to add Leap Motion input seem reasonably priced ($100 for the Motion Live universal mocap plugin, $99 for the Leap Motion Profile, and about $150 for the device,) but added together along with iClone and Pipeline, it does get a bit expensive. Since I already owned iClone and Pipeline, I almost got the Leap Motion add-on the other day, but then I decided I have my hands full with too many tools to learn and projects to work on right now. Still curious about it though.

TheLexx
12-27-2019, 10:25 AM
FWIW, there is also Bannaflak (https://vimeo.com/user84869517). Here is a test in Blender but the makers have tested and listed Lightwave (http://www.bannaflak.com/face-cap/index.html).


https://www.youtube.com/watch?v=n1YeZbeCPuw

erikals
12-27-2019, 11:48 AM
FWIW, there is also Bannaflak
Thanks, looks nice for quick stuff.  https://i.imgur.com/bcwLfNX.gif


[Leap] Has anybody here used this? What do you think?
I haven't used it, but I read yesterday that it had quite a few glitches and jitter.
I'm on the fence on that one...

- it cannot be attached to a mocap suit, making movement very limited.
- and if you could, you'd need two.
I've seen VR experiments with them on a suit, but this is also limited.
So, again we see tech that is a bit 'meh'...  https://i.imgur.com/xbmO2c4.gif
No winner, and no clear second place...
Not saying it is a hack, but it certainly comes with a percentage of "gotcha".

+ need to hook up the Perception Neuron I bought years ago...  https://i.imgur.com/4UKo6V4.gif
that's the best solution, I guess, but it costs $$

erikals
12-27-2019, 12:24 PM
regarding hand mocap, making your own glove isn't that hard, if you have the time....  https://i.imgur.com/5O6mwtQ.png


https://www.youtube.com/watch?v=oBpehYPtOAA

Greenlaw
12-27-2019, 01:24 PM
I haven't used it, but I read yesterday that it had quite a few glitches and jitter.
I'm on the fence on that one...

- it cannot be attached to a mocap suit, making movement very limited.
- and if you could, you'd need two.
I've seen VR experiments with them on a suit, but this is also limited.
So, again we see tech that is a bit 'meh'...  https://i.imgur.com/xbmO2c4.gif
No winner, and no clear second place...
Not saying it is a hack, but it certainly comes with a percentage of "gotcha".


In the iClone demos I've seen, only a single Leap device is used, and it sits on the desk in front of the user as the user is seated; it's not attached to the body at all. I don't believe Leap Motion is meant to be attached to anything because the device is essentially a mouse alternative that tracks your hand movements and finger gestures to operate the computer.

The way it's used with iClone is it tracks your hand and forearm (wrist) rotations with Leap and re-targets that data to your character's body mocap data captured through other means (i.e., Perception suit, iPi Mocap Studio, etc.,) iClone has a pretty decent mocap editing toolset and it's designed to let you 'lego-construct' any captured motions coming from various sources. So in this case, the idea is you can replace existing forearms data with Leap Motion mocap capture. From the demos I've seen, you can even record new hand/finger capture interactively while character's body motion is in play in the program.

The iClone program actually does a whole lot more than that but it's always been the ability to edit imported iPi Mocap Studio data for use in LightWave that's my primary interest in this package.

Finger capture is one of the things iPi Mocap Studio lacks. I can capture wrist rotation in iPi Mocap Studio using PS Move and Switch JoyCons, but not fingers. I know the iPi devs got a Leap Motion to test a few years ago but they haven't released anything for it yet. I'm guessing they probably intended to create something like what Reallusion has created for iClone.

That said, I'm still on the fence about this device for hand/finger capture too. I'll post again about this if I decide to, uh, make the leap. :D

erikals
12-27-2019, 01:39 PM
maybe >
https://www.captoglove.com/product-category/consumers


https://www.youtube.com/watch?v=NHQzsP3YPM0

Greenlaw
12-27-2019, 02:23 PM
Bannaflak seems pretty neat!

BTW, I forgot to mention that I got the Kinect 2 capture plugin for iClone the other day. (It was on sale.) I'm not expecting anywhere near the accuracy of iPi Mocap Studio with multiple Kinect 2 sensors, but maybe it will be good for creating quick test data. Will post a video when I have anything to show.

Also, whenever I can get around to it, I'll post some iClone face capture tests too. (I only have the laptop webcam version though; don't own an iPhone X to use the phone's face tracking tech. The latter looks pretty good but I'm not buying a brand new phone just to do face capture.) :)

tyrot
12-28-2019, 05:54 AM
Bannaflak seems pretty neat!

BTW, I forgot to mention that I got the Kinect 2 capture plugin for iClone the other day. (It was on sale.) I'm not expecting anywhere near the accuracy of iPi Mocap Studio with multiple Kinect 2 sensors, but maybe it will be good for creating quick test data. Will post a video when I have anything to show.

Also, whenever I can get around to it, I'll post some iClone face capture tests too. (I only have the laptop webcam version though; don't own an iPhone X to use the phone's face tracking tech. The latter looks pretty good but I'm not buying a brand new phone just to do face capture.) :)

Well, I did. I made some initial tests with the FBX spit out from the iPhone, using Face Cap. It was good. The FBX morph animation loads directly into MorphMixer, so that's good. I was going to ask which facial capture tool for iPhone I should buy, iClone or Face Cap. I haven't fully tried iClone yet. Maybe I should just go with Face Cap.

The thing is, I also want to fix the facial performance in TAFA, and I have no idea how to match the morph names, etc. I must make more tests...

Greenlaw, I think the technology of the iPhone X-11 (I have the 11, same sensor) is very cartoon-friendly too. That is why I like it. It is not like the lifeless facial capture clones.


erikals
12-28-2019, 07:40 AM
The thing is i wanna also fix facial performance in TAFA - i have no idea how to match the morph names - etc.. I must make more test...
check video 7 - http://www.ta-animation.com/FA/tutorial1/index.htm
i've also backed them up here > https://drive.google.com/file/d/1RadFFIn8aogUy_qSeZhc-IhqWiY8J6m_/view?usp=sharing

for fast morph split, buy TA tools
https://www.liberty3d.com/store/tools/ta-tools

TheLexx
12-28-2019, 08:08 AM
Are the TA tools all up to date for LW2019 ?

erikals
12-28-2019, 11:06 AM
Yes :) But in the 11.6 format.

tyrot
12-28-2019, 07:08 PM
thanks erikals - you are a living LWWIKI :)

erikals
12-28-2019, 07:46 PM
not too far a stretch from the truth, perhaps...  https://i.imgur.com/tJGL61i.png

sometimes feel like a 24/7 test machine.  https://i.imgur.com/bcwLfNX.gif  https://i.imgur.com/QS5J6ck.gif

ianr
12-29-2019, 09:36 AM
JALI - Anybody know the pricing?

I understand it's MAYA & Unreal

erikals
12-29-2019, 11:39 AM
[JALI Lip-Sync]

After researching a bit, I found that it is not released yet, but not too far from launch.

They seem to avoid the word "affordable" when it comes to the price, which could indicate that it will be somewhat expensive.

http://jaliresearch.com

Qexit
12-30-2019, 05:45 AM
JALI___Anybody know the pricing?

I understand it's MAYA & Unreal

From the JaliResearch website:

'Pricing and Licensing

We’ve worked with a number of different customers from mobile technologies, animated and VFX studios, independent and AAA game studios. If we’ve learned one thing, it’s that one size does not fit all.

Downloadable plug n’ play solutions fall short of expectations because they fail to recognize this about their customers.

Whether you are looking to animate hours of run time character dialogue in multiple languages, humanize an interactive consumer avatar or streamline your character animation workflow in an animated series, we can help you select the license option and solutions to achieve your unique creative and technical goals.'

There's a button under this text that allows you to request a quote, so it looks like they vary the price according to the customer. As a hobbyist, you should be able to get a reasonable price, Ian.

jwiede
12-30-2019, 02:59 PM
They seem to avoid the word "affordable" when it comes to the price, this could indicate that it would be somewhat expensive.

o/~ She saw him standing in the section marked: "If you have to ask, you can't afford the lingerie." o/~

Places that don't list an upfront price are rarely any reasonable definition of "affordable" in pricing.

erikals
12-30-2019, 08:36 PM
request a quote
yepzy, usually means $$$

TheLexx
12-31-2019, 07:59 AM
A thousand dollars for just one blow morph? Not worth it. Okay, I'm going, mod delete... :D

erikals
01-06-2020, 11:30 AM
NEON



https://www.youtube.com/watch?time_continue=19&v=Q6f6EXX-79w

erikals
01-06-2020, 11:58 AM
https://www.neon.life

13 hours to go...


https://fdn.gsmarena.com/imgroot/news/19/12/neon-ces-2020/-1200x900m/gsmarena_006.jpg

MacReiter
01-08-2020, 10:33 AM
Well, duh: he didn't declare a viewport meta, so mobile browsers on small screens render it full size. That's really basic.
It shows he has been out of the loop of web dev for a long, LONG time.

To be honest, I was never "in the loop" of web dev ;) Just not of interest to me at all... Even when my page wasn't pure text, it was hopelessly dated -- brushed metal backdrop about 5 years after that was "cool". Ah well.

And I agree with the concerns of viability for TAFA 2. Granted, Real Life (TM) has been kicking me around a bit since the late 2000s, but I also have been trying to figure out "what makes TAFA unique?" I know that, at a high level, it stands out because it lets you get from zero to production in a very short time period (I had one customer who purchased it on a Saturday morning because they had a video due Monday morning that they weren't going to get finished in time. They bought it, installed it, learned it, and completed their video on time. Very few tools hit the ground running that fast.) But, as a developer, looking at all the deep fakes and Neons and other advanced systems coming online, what can I bring to the table now, 15 years later?

I _think_ (and, of course, I'd love to hear what users think or feel) that the major benefits of TAFA are basically two things:
1. Real time performance -- scrubbing, looping, and real-time animation editing with sound to polish details as much as you need. This seems useful for any animation or production tool.
2. A spline system that (mostly) just "does what you mean." This, more than anything, seems to be what makes keyframe animation in TAFA so smooth. 80-90% of the time, splines don't need to be tweaked at all.
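TAFA's actual spline code isn't public, but the "does what you mean" behavior described above is characteristic of splines that derive their tangents automatically from neighboring keys. A minimal sketch of the idea using a Catmull-Rom spline over a morph track (the function names and key format here are illustrative assumptions, not TAFA's code):

```python
import bisect

def catmull_rom(p0, p1, p2, p3, t):
    """Evaluate a Catmull-Rom segment between p1 and p2 at t in [0, 1].
    Tangents come from the neighboring keys, so no manual handle tweaking."""
    t2, t3 = t * t, t * t * t
    return 0.5 * ((2 * p1)
                  + (-p0 + p2) * t
                  + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t2
                  + (-p0 + 3 * p1 - 3 * p2 + p3) * t3)

def sample_track(keys, t):
    """Sample a morph track (sorted list of (frame, value) keys) at frame t.
    End segments reuse the first/last key as the missing neighbor."""
    frames = [k[0] for k in keys]
    i = max(1, min(bisect.bisect_right(frames, t), len(keys) - 1))
    f0, v0 = keys[max(i - 2, 0)]
    f1, v1 = keys[i - 1]
    f2, v2 = keys[i]
    f3, v3 = keys[min(i + 1, len(keys) - 1)]
    u = (t - f1) / (f2 - f1)  # normalized position within the segment
    return catmull_rom(v0, v1, v2, v3, u)
```

The curve passes exactly through every keyframe and stays smooth in between, which is roughly the behavior described: most of the time the default interpolation needs no tweaking at all.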

I did some experimentation with puppetry for a friend. Basically, it let you map any axis of any DirectX controller to any morph or morph pair, and map any controller button to morph keyframe insertions (so you could tap a blink in at a particular frame, and the spline system would smooth it just like any other dropped-in keyframe). It worked pretty well, but I never got it polished up. But that might be a direction to consider...
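That puppetry experiment isn't available, but the axis-to-morph-pair mapping it describes could be sketched roughly like this (the class, morph names, and deadzone value are all assumptions for illustration, not Mac's actual code):

```python
class AxisMapping:
    """Map one controller axis (-1..1) to a morph pair:
    negative deflection drives one morph, positive drives the other."""
    def __init__(self, neg_morph, pos_morph, deadzone=0.05):
        self.neg_morph = neg_morph
        self.pos_morph = pos_morph
        self.deadzone = deadzone  # ignore tiny stick noise near center

    def apply(self, axis_value, morphs):
        """Write the current morph weights for this axis into `morphs`."""
        v = axis_value if abs(axis_value) > self.deadzone else 0.0
        morphs[self.neg_morph] = max(0.0, -v)
        morphs[self.pos_morph] = max(0.0, v)
        return morphs

# hypothetical mapping: one stick axis drives a Frown/Smile pair
mapping = AxisMapping("Frown", "Smile")
```

Each frame, the controller axes would be polled and run through their mappings; button presses would then just insert the current weights as keyframes for the spline system to smooth.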

I have wondered about auto-lip-sync. The argument for TAFA was always "if you can get good quality in TAFA in 10 minutes, what's the benefit of getting bad quality in 10 seconds?" But I have to wonder, now: If I could extract phonemes from the audio, apply some of Timothy's/Emily's heuristics for discarding the phonemes that lead to stutter-face, and then let the splining system do its normal job, could I have the best of both worlds?
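The heuristics mentioned above aren't specified anywhere in the thread, but one plausible anti-stutter filter simply drops phonemes too short to read on the face and merges consecutive repeats, leaving the spline system to smooth what remains. A sketch under those assumptions (the 0.06 s threshold is a guess, not a TAFA value):

```python
def filter_phonemes(phonemes, min_duration=0.06):
    """Reduce 'stutter face' in auto-lipsync input.
    Each entry is (start_sec, end_sec, label). Drops phonemes shorter
    than min_duration and merges consecutive duplicates."""
    out = []
    for start, end, label in phonemes:
        if end - start < min_duration:
            continue  # too brief to read on the face; let the spline coast
        if out and out[-1][2] == label:
            out[-1] = (out[-1][0], end, label)  # extend the previous key
        else:
            out.append((start, end, label))
    return out
```

The surviving phonemes would then become ordinary morph keyframes, so the same splining that smooths hand-placed keys would smooth these too.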

Oh, geez... Why do I start thinking down these lines again? WTH, other things I considered:
* skeleton animation - thinking puppetry, mostly, mapping controller axes to joints. Multiple configurations, so that you could have a body map to rough in gesticulations, then a hand map to tweak out pointing and such.
* cross platform - TAFA is hopelessly tied to Windows(TM). It's written using low-level Win32 SDK and Windows Multimedia calls. That did allow me enough control to get the audio sync I needed in the real-time performance (which is non-trivial), but it also means that, as Microsoft seems determined to kill Windows, I'm at somewhat of a dead end. I don't actually use Windows for anything myself any more, so a cross platform solution is definitely needed. Would also like to not have to deal with file formats at the byte level myself any more ;) Current thoughts are Unity or possibly Godot. Hopefully, they should provide tight latency audio control...
* face tracking - lots of sample code suggests I could get something running. Don't know if direct mapping is the best solution (see others' comments about "we're not exactly Sir Olivier"), or if something like the puppetry mappings would be better. Hmmm... I like that. Each character could have their own weighting for scaling from your performance into their personality. Then you act within your normal range, and let the system stretch or dampen the responses in the character.
* multi-tool - I know these are the NewTek forums, and I certainly want to support LightWave, but I have never wanted to be tied _into_ LW exclusively. Whatever engine I look at would need to support a decent range of object, scene, and animation formats. My understanding is that FBX seems well supported across tools now, which is a situation that simply didn't exist back in 2005. But I freely admit I haven't been tracking 3D tools all that closely.
* randomness/tics - something I wanted to try in TAFA but didn't get in place is a "random" system to add small fluctuations and tics to morph tracks. I've seen what this can do, and done well it makes an amazing difference to feeling "alive" (because we're soup technology, so we're never totally still or as precise as a robot)
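One common way to implement the randomness/tics idea in the last bullet is low-pass-filtered noise overlaid on each morph track, so a "held" pose never sits perfectly still. A sketch with arbitrary illustrative parameters (nothing here is from TAFA):

```python
import random

def add_life_noise(values, amplitude=0.03, smoothing=0.85, seed=1):
    """Overlay small, smoothed random fluctuations on a morph track.
    `values` is a per-frame list of morph weights in [0, 1];
    amplitude and smoothing are guesses chosen for illustration."""
    rng = random.Random(seed)
    noise = 0.0
    out = []
    for v in values:
        # low-pass filter: mostly the previous noise, a little new jitter,
        # so the fluctuation drifts instead of buzzing frame to frame
        noise = smoothing * noise + (1.0 - smoothing) * rng.uniform(-1.0, 1.0)
        out.append(min(1.0, max(0.0, v + amplitude * noise)))
    return out
```

Because the filtered noise stays within ±1, the track never deviates from the keyed value by more than `amplitude`, which keeps the tics subliminal rather than distracting.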

I dunno. Lots of stuff that sounds fun. Just, realistically, I can't promise anything at all. It's a good week when I get the free time and mental energy together at the same time to spend a couple of hours doing something interesting. And that just won't cut it for this kind of development. TAFA was three years of 16 hour days, real work and TAFA, and I realistically can't see it happening again. Maybe I'm wrong -- maybe Unity makes things so blissfully easy that I could actually make progress in those little windows of time. That would be great. But I've been wishing that for 10 years now, and we see how far I've gotten, so...

On the other hand, coming back to these forums and seeing some of the things people have said about TAFA has given me a warm feeling I haven't had in a long time, so I do want to say THANK YOU (which, hopefully, I've said better in my "set it free" thread :) )

Greenlaw
01-08-2020, 11:10 AM
Hey Mac!

Great to see you here and thank you very much for sharing your thoughts about the future of TAFA 2 (or whatever it becomes.)

Yeah, I completely understand how life tends to get in the way of personal work. Lately I've been putting a lot of effort into just being able to work on my personal projects. And once I get the time, I'm often too pooped to do anything with it. :)

Wishing you great luck on your project, and I'm looking forward to seeing what you come up with.

erikals
01-09-2020, 09:19 PM
* face tracking - lots of sample code suggests I could get something running. Don't know if direct mapping is the best solution (see other's comments about "we're not exactly Sir Olivier"), or if something like the puppetry mappings would be better. Hmmm... I like that. Each character could have their own weighting for scaling from your performance into their personality. Then you act within your normal range, and let the system stretch or dampen the responses in the character.
this is the future, I believe.

erikals
01-11-2020, 12:02 PM
1. NEON unfortunately did not deliver what we had hoped. The marketing was quite hyped.
The marketing videos showed actual people talking; not fair, almost snake oil.  https://i.imgur.com/0MfIYXF.gif
https://www.youtube.com/results?search_query=Neon+Samsung

2. TAFA is fantastic; open-sourcing it seems too gracious. On the other hand, the code is tricky to break down, from what I understand?
So maybe one would have to go another route?

MacReiter
01-11-2020, 06:21 PM
1. NEON unfortunately did not deliver what we had hoped. The marketing was quite hyped.
The marketing videos showed actual people talking; not fair, almost snake oil.  https://i.imgur.com/0MfIYXF.gif
https://www.youtube.com/results?search_query=Neon+Samsung

I had wondered. I've used http://thispersondoesnotexist.com, and some of its results are impressive for stills, but even there you start to see the games it's playing. Going full body, full expression seemed a tremendous leap, and I couldn't see any telltales in the videos I was seeing. Well, except that the people's clothes and stance are slightly weird, but nothing in the face.


2. TAFA is fantastic; open-sourcing it seems too gracious. On the other hand, the code is tricky to break down, from what I understand?
So maybe one would have to go another route?

Dunno how tricky it is. It's just in a state of flux, as I was trying to get to v1.5... Gotta find it, first, though. Made lots of backups and scattered them around, but if I was too clever for myself in password protecting those backups, I may be out of luck :facepalm:

erikals
01-12-2020, 07:27 PM
if you guys haven't already, download this archive >
TAFA Tips Tricks
https://tinyurl.com/T-TT-download

erikals
01-12-2020, 09:54 PM
just to add: interesting to see The Irishman still using morphs, not bones. [7:50 into the video]



https://www.youtube.com/watch?v=OF-lElIlZM0


https://forums.newtek.com/attachment.php?attachmentid=146603&d=1578887679

MacReiter
01-12-2020, 10:05 PM
My understanding was that most of the advanced "match a real person" stuff was morph based. One kinda cool bit was including textures with morphs, so that fine details get morphed as well as the shape.

I know there was a push a while back for essentially virtual animatronics -- bone movement with associated muscle flexing and then soft body deformation. It all sounded like it was going to be great, but every video I ever saw was still hydraulic jerkiness inside a bag of gelatin...

But my knowledge is out of date. Haven't kept up with cutting edge the last several years...

erikals
01-12-2020, 10:24 PM
morphs / bones

both still have their gotchas, I think.

from what I understood, they used an algorithm to calculate the in-between morphs (the non-linear morphs, so to speak).
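The actual algorithm isn't described here, but a common scheme for non-linear in-betweens drives a corrective shape from the primary morph's weight, peaking partway through the transition so the deformation path curves instead of moving linearly. An illustrative sketch (the names and the triangular ramp are assumptions, not the production technique):

```python
def inbetween_weight(primary, peak=0.5):
    """Weight for an in-between (corrective) shape driven by a primary
    morph weight in [0, 1]. Ramps 0 -> 1 as the primary approaches `peak`,
    then back to 0 at full deflection."""
    if primary <= peak:
        return primary / peak
    return (1.0 - primary) / (1.0 - peak)

def blend_vertex(base, primary_delta, inbetween_delta, primary):
    """Final vertex = base + primary morph delta + in-between correction."""
    w = inbetween_weight(primary)
    return tuple(b + primary * d + w * c
                 for b, d, c in zip(base, primary_delta, inbetween_delta))
```

At 0% and 100% of the primary morph the corrective contributes nothing, so it only reshapes the middle of the transition, which is what makes the in-between "non-linear".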

for skin-glide I would personally have used animated UV morph maps
https://forums.newtek.com/showthread.php/85719-Fun-with-Nodes-UV-morphing


facial animation is interesting stuff, with an overwhelming amount of data/info to take into account.