Adding geometry to follow morph targets



mummyman
10-16-2015, 07:33 PM
Not sure if there is a way to do this, but I'm revisiting an old model that already has a bunch of morph targets set up for the face. I'm thinking of using 3rd Powers' tools to wrap polys around the face like mummy wrappings. Is there a way to apply them all to the base mesh where I want, then have them follow the morph targets automatically? I'm thinking most likely not, unless I bake them in as displacements or paint them in ZBrush, which I don't want to do. Any ideas? I don't really want to redo all the morphs.

Just a side project.. but I hit a snag like this within minutes of testing some ideas. Thanks for any suggestions..

Brett

Greenlaw
10-17-2015, 12:40 AM
Here are a few suggestions.

If you have DP MetaFit installed, try this:

[attachment 130376: demo scene download]

The contained demo scene looks like this:

[attachment 130375: demo scene screenshot]

On the far left is a head object with a single morph target to open the mouth. On the far right is a form-fitted bandages object with no embedded morph target. And in the middle is the bandages object attached to the head object with its mouth open. The head's morph displacement is driving the deformation of the non-morphing bandages. Is this what you're asking for?

What's unseen in the bandages object is a copy of the vertices from the head object. The vertices are present to allow DP Metafit to accurately drive the bandages.

This scene should be interactive--my system isn't the fastest in the world but it can run this in realtime. If your mesh is really high res though, expect a long 'initialization' process when you open the scene or activate DP Metafit. DP Metafit uses this time to scan the object so it can operate in realtime or near realtime.

This is essentially what the old FX_MetaLink tool did except with DP MetaFit you don't need to scan an MDD using ClothFX to make this work. Plus DP MetaFit is nodal.

When you're finished, you will probably want to scan an MDD of the whole thing to keep the scene 'snappy'. After applying the MDD, you should disable MetaFit or remove it to keep it from performing the now unnecessary scan.

There's also a native MetaLink node, but based on past experience with that tool, it probably won't work like it does in the above demo. However, it's been a few years since I last tried MetaLink, so it might be worth a shot. I set up the demo scene with DP MetaFit because I knew this tool would work.

Alternatively, you can use Bullet Deforming to drive the bandages. Just be aware that if the face is subpatched, Bullet will only see the unsubpatched mesh for collision. You'll need to bake the face to MDD for this to work, though, and Bullet may be slower than usual with this setup.

IMO, the DP MetaFit suggestion may be easier and more reliable.

Hope this helps.

G.

Greenlaw
10-17-2015, 12:54 AM
BTW, 3rd Powers' Heat Shrink Plus might be ideal for fitting your bandages. That's what I used to wrap the bandages in the above demo.

I used this tool in 'Belt' mode with the bandages laid out flat in front of the face. If I had pushed the tool a little farther, it could have wrapped the bandages completely around the head. Granted, I did a sloppy job because I didn't want to spend more than a few minutes setting this up, but with proper modeling and an honest effort, the Heat Shrink Plus tool is capable of much better results.

G.

jeric_synergy
10-17-2015, 01:07 AM
First time I've heard of "DP MetaFit". The description seems quite.... parsimonious. Is it straightforward to use?

mummyman
10-17-2015, 06:28 AM
Carp!.... I mean, CRAP! I've never heard of DP MetaFit either. I'm already using Heat Shrink Plus.. great plugins. This is exactly what I'm trying to do. I'll have to try this. Thank you!!!


And, if this does what I think it does as well as I hope.... I should kick myself. I've been doing FX MetaLink for years, with LOTS of files to manage. THANK YOU...


My god.... this works like a charm. Hope it holds up with multiple objects.. which I'm sure it will. I can't believe I didn't know about this sooner! I'm not huge on node stuff...especially the deformation nodes. Now I will be. Thanks again! Awesome

jeric_synergy
10-17-2015, 10:37 AM
Mummyman, if you have a quick example (test objects) to post, I for one would be interested.

Denis sneaks one by us again! :thumbsup:

mummyman
10-17-2015, 11:11 AM
[attachments 130388, 130389: test renders]

Well, animating them is a different story. Sliding the morph target sliders works great, but when I animate the morph over, say, 20 frames, there is slipping, or a delay.. I guess I can bake or something...but it's a great start. You can see the lower bands pass through the jaw when the mouth opens. This was taken from a 20-frame render. When the jaw lowers, the bands "lag" and then catch up. But it's off.

jeric_synergy
10-17-2015, 11:26 AM
Maybe an interpolation error?

BTW, I'm sure Denis would appreciate test scenes.

RebelHill
10-17-2015, 11:34 AM
Turn on Studio Live.

Kevbarnes
10-17-2015, 01:08 PM
Turn on Studio Live.

I haven't tested this but............

I recently watched a vid on YouTube where the guy was using DP Kit's Point Info node with a vertex index to attach a null so it follows a mesh displacement.

This initially had the same problem you describe.

He switched on Motion Blur with a setting of 0 (zero), and this forced a frame-by-frame update of the node.

.. worth a try?

Kev

Greenlaw
10-17-2015, 10:41 PM
I guess I should have posted a link. MetaFit is part of DP Kit. Like all of Denis' plugins, it's free, but please send him a donation if his tools are useful to you. (There's a PayPal 'Donate' link on his home page.):

http://dpont.pagesperso-orange.fr/plugins/nodes/Additionnal_Nodes_2.html

G.

Greenlaw
10-17-2015, 10:56 PM
First time I've heard of "DP MetaFit". The description seems quite.... parsimonious. Is it straightforward to use?

I guess so. Here's the nodal setup in that scene file I posted:

[attachment 130393: MetaFit node setup screenshot]

It's much simpler to use than the combination of ClothFX and FX_MetaLink anyway.

Note that there is a Weight map option--this allows you to constrain where the effect takes place.

Also note that, unless you REALLY need it, you should leave After Deformation disabled. You may lose some accuracy when it's disabled, but enabling it may drag down the performance.

G.

Greenlaw
10-17-2015, 11:10 PM
Regarding the lag, do you really need the bandages enabled during animation? IMO, you should just leave MetaFit switched off until you finish animating your morphs. Consider the bandages an effects animation and leave them disabled until you get all of the main character animation locked.

The same goes for other effects animations like clothing, hair, and attached accessories. Save those for after you finish the character animation. Trying to apply effects while you're still animating the character is not efficient and can waste a lot of time.

Just a suggestion.

G.

mummyman
10-18-2015, 11:02 AM
Yes, I would keep them disabled, but I was testing motions out...so regardless of when it's turned on, it was still lagging.. I guess it's better to find out now during testing. I'd need to find a fix if I continue doing anything with it. I'll have to try the motion blur trick. Haven't had time to mess with it since I posted the 2 images. Thank you all for the help!

jeric_synergy
10-18-2015, 11:39 AM
I for one await the results of either the Studio Live or MB tests.

EDIT: on the lagging: what kind of curve does the morph slider have? Is placement accurate at keyframes but not between?

It might be you'd need to bake the curves (ugh!) to make it work. Bleagh. --Is there a way to UNbake curves? I mean, you could bleed away keys or whatevs (seems laborious), but is there something guaranteed to replicate the placement & values of keyframes? (Ain't going to happen, but an extra "bit" {literally} on a keyframe to differentiate baked keyframes from manual keyframes would make that possible.)

Greenlaw
10-18-2015, 12:17 PM
Oh, duh...here's an even more obvious tip:

Use DrainBGVmap or Weighter 2 to simply transfer the morph data to the bandages. Once you have the morph positions (and weights if needed) transferred to the bandages, you can move them to the same layer as the head and animate them using the same morph targets in MorphMixer.

The two plugins are similar tools but they each have some unique features.

DrainBGVmap is free but it only runs in x32 Modeler. It's easy to use and works really well but, being an x32 plugin, you may run into memory issues with dense models or lots of fur fibers. You can download it from Dodgy's website.

Weighter 2 is a commercial plugin from Liberty3D and it runs in x32 or x64. It works super well too but can produce different results, so it's good to have both tools handy. The big issue I have with Weighter 2 right now is that it's incompatible with LightWave 2015. The devs are working on a fix but in the meantime, you'll want to keep LightWave 11.x around.

BTW, DrainBGVmap is how I transfer morph targets and weight map data (as well as UV map coordinates) from my character meshes to FiberMesh guides for FiberFX. (FiberMesh guides do not export from ZBrush with this data.)

Both tools are very useful and highly recommended.

IMO, this approach makes the most sense for what you're trying to do.

G.

Edit: here's an old video for Weighter 2 by the original developer. The Liberty 3D version has a newer interface but the basic functionality is the same: https://vimeo.com/12838390

Liberty 3D's link is: http://www.liberty3d.com/

Here's Dodgy's website where you can download DrainBGVmap: http://www.mikegreen.name/

mummyman
10-19-2015, 04:09 AM
Neither the Motion Blur trick nor Studio Live seemed to work. They're fine if you do an F9 render, but for some reason the lag happens when rendering the sequence. I'll have to zip up the scene/objects if anyone wants them. Maybe I'm on an older version of the plugin? I'll have to check that. Greenlaw, I'll have to try those other plugins and see if I can get this to work.

RebelHill
10-19-2015, 04:21 AM
No... it's nothing of the sort; it's LW not evaluating 'in time' a displacement that's driven by the displacement of some other mesh. If push comes to shove, just MDD out the lagging mesh and reload it with a 1-frame offset.
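
If you'd rather shift the file itself than set an offset in the reader, here's a minimal Python sketch of that trick. It assumes the commonly documented MDD layout (big-endian: int32 frame count, int32 point count, float32 frame times, then xyz float32 triples per point per frame) -- check it against your own files before trusting the output, and flip the slice direction if your lag runs the other way.

import struct
import sys

def offset_mdd(src_path, dst_path, shift=1):
    with open(src_path, 'rb') as f:
        frames, points = struct.unpack('>ii', f.read(8))
        times = f.read(4 * frames)            # frame times, kept as-is
        frame_size = points * 3 * 4           # xyz float32 per point
        data = [f.read(frame_size) for _ in range(frames)]

    # Advance the sampled positions by 'shift' frames so playback lands
    # one frame earlier; the last frame is duplicated to pad the tail.
    shifted = data[shift:] + [data[-1]] * shift

    with open(dst_path, 'wb') as f:
        f.write(struct.pack('>ii', frames, points))
        f.write(times)
        f.write(b''.join(shifted))

if __name__ == '__main__':
    offset_mdd(sys.argv[1], sys.argv[2])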

mummyman
10-19-2015, 04:24 AM
[attachments 130424, 130425, 130426: objects and scene file]


Here are the 2 objects and scene file.

No... it's nothing of the sort; it's LW not evaluating 'in time' a displacement that's driven by the displacement of some other mesh. If push comes to shove, just MDD out the lagging mesh and reload it with a 1-frame offset.

I was thinking this was the route I'd have to take. It's nice to be able to make any object now and add this effect! I usually do tons of testing before committing to anything. Thanks for the help

mummyman
10-19-2015, 04:25 AM
I also thought it might be the base object being frozen polys versus a subpatched original mesh. That wasn't the case; I tried both. The base head was the frozen mesh I used to Heat Shrink the pieces to. It's been a fun little test!

jeric_synergy
10-19-2015, 08:55 AM
If the lag persists, I'd say get those test files to Denis with as complete a description as possible. He's really good about fixing things.

Greenlaw
10-19-2015, 09:10 AM
I usually do tons of testing before committing to anything.
Me too, even when I have a super tight schedule--and maybe because I usually do have a super tight schedule, I'll make time for a lot of testing. There's no worse feeling than crashing and burning when you think you're in the home stretch, especially when you realize the problems could have been avoided with a little better planning.

G.

hdace
10-19-2015, 10:06 AM
I haven't tested this but............

I recently watched a vid on YouTube where the guy was using DP Kit's Point Info node with a vertex index to attach a null so it follows a mesh displacement.

This initially had the same problem you describe.

He switched on Motion Blur with a setting of 0 (zero), and this forced a frame-by-frame update of the node.

.. worth a try?

Kev

I haven't seen the video you mention, but I use Point Info to animate items almost every day (for a special purpose which would take too long to explain). Anyway, it's not really necessary to bother with the Motion Blur trick if you bake whatever is being controlled by Point Info. If you're controlling an object, bone, or null, MentalFish Baking works best, and the lagging problem magically disappears. Then you remove the rig. I would have thought it could work for controlling points on a different object too. If you've animated joint bones with no length, they in turn can be used to control individual points in a different object, like the bandages. Then you can scan the bandages in ClothFX and use the resulting MDD to control the bandages and remove all the bones.

mummyman
10-19-2015, 11:45 AM
Thanks.. I'll try baking it out tonight. I usually try not to bake things because it's just one more thing to manage and keep track of. But whatever works to get a shot(s) done.

jeric_synergy
10-19-2015, 11:47 AM
Thanks.. I'll try baking it out tonight. I usually try not to bake things because it's just one more thing to manage and keep track of.
Not to mention, doesn't it really lock you into one singular path?

I understand that it might be necessary (or simply desirable for speed) but it always seems so irrevocable...

mummyman
10-19-2015, 12:08 PM
I don't usually do character work... but exactly.. the workflow is totally different for baked files IMHO. Especially for a generalist using LightWave. So many things to "know" and keep track of.

hdace
10-19-2015, 12:33 PM
RebelHill explained some stuff to me about baking a couple of years ago. Ever since, I've been doing it more and more, to the point where I'm baking stuff several times a day. I'm also doing a lot more special effects with my character animation that really demand it. Once bones and other stuff are baked in, you can remove various rigs, and suddenly the scene seems to have a great weight lifted from its shoulders. You can start focusing on scenery and rendering and not have to worry whether a character's clothes or hair or something is suddenly going to fly off into infinity! I now don't know what I'd do without it.

You can even bake separate objects that are part of a Bullet sim. That can be really cool.

Greenlaw
10-19-2015, 12:41 PM
FWIW, except in the most basic setups, I always bake my characters to MDD, especially if there's hair/fur or dynamics involved. Most of the time, it's faster to render characters this way because mesh evaluations and other time-consuming processes can be skipped, and the animation certainly becomes more stable and predictable for network rendering. This practice is fairly standard at many places I've worked too.

In LightWave, baking to MDD is not 'irrevocable'. All you need to do is disable or remove the MD Reader and re-bake after you've updated your animation. No biggie really. Tip: be sure to use the MD Multi-Baker and MDD Multi-Loader tools. These tools can really speed up your workflow, especially if you have multiple characters, complicated characters, or dynamics with lots of objects. Works great for imported Maya characters too!

FYI, setting each object's preview subdivision level to match its render level can significantly speed up renders too. You only need to do this just before submitting the scene. It's not too much of a hassle if you're using the Scene Editor, but at the Box we had an in-house scene submit tool that matched the levels throughout the scene when it was submitted. This way, we didn't have to change the scene files we worked in. IMO, this feature/option should be standard in LightWave.

G.

mummyman
10-19-2015, 12:45 PM
Baking is one thing that turned me OFF from looking more into Janus...even though I enjoyed how Janus looked / worked. But I had a recent medical project and I had to bake almost everything into XSI. So I learned a LOT... thank god for MultiBaker!

hdace
10-19-2015, 01:19 PM
...setting each object's preview subdivision level to match its render level can significantly speed up renders too. You only need to do this just before submitting the scene. It's not too much of a hassle if you're using the Scene Editor, but at the Box we had an in-house scene submit tool that matched the levels throughout the scene when it was submitted. This way, we didn't have to change the scene files we worked in. IMO, this feature/option should be standard in LightWave.

G.

I've been in the habit of double-checking this immediately prior to rendering for several years now. I agree a global switch would be great. Somebody write a script?
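
Something like this might do as a starting point -- a hedged Python sketch that rewrites the scene as a copy rather than editing in place. It assumes the .lws stores per-object "SubPatchLevel <display> <render>" lines, which is how the scene files I've looked at write it; open one of your own scenes in a text editor to confirm before relying on this.

import re
import sys

def match_levels(src_path, dst_path):
    # Copy the render level (2nd number) into the display slot (1st number).
    pattern = re.compile(r'^(\s*SubPatchLevel\s+)(\d+)(\s+)(\d+)\s*$')
    with open(src_path) as src, open(dst_path, 'w') as dst:
        for line in src:
            m = pattern.match(line)
            if m:
                line = m.group(1) + m.group(4) + m.group(3) + m.group(4) + '\n'
            dst.write(line)

if __name__ == '__main__':
    match_levels(sys.argv[1], sys.argv[2])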

jeric_synergy
10-19-2015, 02:15 PM
:stumped: So, somehow having the preview (subd, right?) different than the render level incurs a "significant" time penalty??? Weird. :eek:

Greenlaw
10-19-2015, 02:27 PM
It's easy to see what I mean if you open LWSN and watch what happens during a render when the levels are matched and when they are not. The time is compounded when more objects using subdivision surfaces are present, and of course when the render levels are increased.

Basically, the conversion process that occurs for each sub-D object on every frame of the render doesn't happen when the subdivision levels are equal.

G.

P.S., this is why I hate it when an IT guy hides this stuff from the artists. It makes it harder to optimize a render.

jeric_synergy
10-19-2015, 02:52 PM
Oh, I don't doubt you (what am I, crazy?), it's just that I don't know why the s/w would be evaluating the display level at all during a render. Double my confusion as to why SN would be doing anything display-related.

Greenlaw
10-19-2015, 03:24 PM
I agree. Once the scene is loaded to LWSN, you would think it would just use the render level.

Maybe it has to do with Subdivision Order or because the render levels are keyframeable, but now I'm just tossing out jargon. I don't really know why it needs to do that--I'm just a button pusher. :)

mummyman
10-19-2015, 05:16 PM
No... it's nothing of the sort; it's LW not evaluating 'in time' a displacement that's driven by the displacement of some other mesh. If push comes to shove, just MDD out the lagging mesh and reload it with a 1-frame offset.


Seems like this worked...baking, then offsetting it by 1 frame. Thank you all for the responses. Glad it's a "faster" method. Love these forums.. (sometimes)

jeric_synergy
10-19-2015, 05:52 PM
So, to get to basic principles: it's accurate at keyframes, not in-between? That yells "interpolation" to me.

erikals
10-19-2015, 05:53 PM
Bryphi testin'


https://www.youtube.com/watch?v=E2gVyyMT8W4

mummyman
10-20-2015, 05:20 AM
Yes, I just saw that this morning via Facebook. How awesome is that? I didn't check my Gmail all weekend, so I didn't see this. I had absolutely NO clue MetaLink worked that way, or I'd be using it way more. Glad to keep learning this stuff. I've been using it for stuff already baked out and high rez / low rez. This is pretty fabulous. Thanks Bryphi and everyone for the help on this. Maybe it should be posted over at the thread about little-known LightWave stuff? Man, I'm sure they'll be looking into this / fixing this tedious workaround soon.

Now I can use my low rez model for the scan! Awesome stuff..

jeric_synergy
10-20-2015, 05:56 AM
:foreheads 38 messages and none of us thought of this before Bryphi? --I'd say we've got a training problem.

mummyman
10-20-2015, 07:16 AM
Yup... lol. And I was using MetaLink for a huge project. It all depends on what you know and how you use things. I've never needed it to work that way.. I use it in a totally different way. But now that I know, it's locked into my training! Crazy.

hdace
10-20-2015, 10:48 AM
It never ceases to amaze me how NewTek shoots itself in the foot. I've tried MetaLink loads of times but could never get it to work because I didn't understand the important role of ClothFX. That's because the manual doesn't explain it clearly. Also, it's not intuitive. Also, ClothFX's role is completely unnecessary (as mentioned in the above video).

Unfortunately, I still don't think I can use it. I use DP Point Info a lot (as I mentioned earlier). One of its many uses is our main character's beard. He has a long beard made of 2-poly guides. We use MDDs generated in Face Robot to animate the face. These MDDs are applied using DP MDD Pointer because of its streaming and time mapping features (time mapping is very important for maintaining accurate lip sync). I should do some testing (which is crazy since we're in the last two or three weeks of animation on this project), but I'm assuming that turning ClothFX on, scanning, and then turning it off (especially since our scenes tend to be very long) could cause a conflict with MDD Pointer. Our beard animation rig is a little cumbersome to apply, so it would be nice to use MetaLink, but right now I'm skeptical.

jeric_synergy
10-20-2015, 10:52 AM
OK, so I decided to research MetaLink some more and watched a couple of videos, esp. Proton's, and now I'm wondering: why is Metalink_Morph required when using morphing with FX_Metalink? Why isn't it just built into MetaLink?

Does it incur such an overhead penalty that the Ancient Ones decided to split them up? Or is it just some sort of ugly bandage on Metalink?

Seems like just one more damn thing (OMDT) we have to remember to make things work.

AND ::wags finger in air:: using ClothFX to read displacements is a bit of a training issue, ::sheesh::. A cut-down version that JUST read displacements, called, ohhh, "Displacement Scanner", might be more self-explanatory.

(crosspost) I see hdace agrees with me. SERIOUSLY, some of the terminology used by the coders is rubbish. And no matter how fine the writing in the dox is, MORE EXAMPLES are better, and they could be linked if only the dox were extensible (and moderated).

"Shooting themselves in the foot" is right on the money: relying on super users like RH, RR, and Bryphii (and SplineGod, RIP) to explain the poorly named and organized system is not a good idea.

Greenlaw
10-20-2015, 11:27 AM
Actually, I'm surprised these tools are coming as such a surprise...they've been around for about 15 years.

When I was with the Box, I routinely used ClothFX with FX_Metalink, mainly for attaching hair guides to MDD-scanned characters. It's really pretty simple to use, but the key thing is that you MUST use ClothFX to scan the MDD. MDD readers other than the original 'FX_' readers cannot read this data properly. Denis Pontonnier once explained to me that ClothFX appears to do an additional process when it writes and reads its MDD file, so he had to come up with his own take on that process when he created DP MetaFit. (This is what I meant on the first page when I compared MetaFit with FX_Metalink.)

BTW, if you have an .mdd from Maya or another program, to use it with FX_MetaLink you will need to re-scan the object with ClothFX and write out a new 'FX'-blessed MDD file.

The main advantage with DP MetaFit and the native Metalink node is that they don't require scanning with ClothFX. However, there are some tradeoffs. With DP MetaFit, there's the 'scanning' delay I mentioned. I can't recall the issue I had with the native Metalink tool--I think it had to do with a sub-D order problem I was having with the Brudders characters and fur guides, which is when Denis turned me on to his tool.

I agree that FX_Metalink Morph should be part of the tool and not a separate plugin you add on top. But, again, this was a plug-in written over 15 years ago so there may have been technical reasons for breaking it down like this.

It's been a long time since I last used FX_Metalink Morph. I recall I used it for Gloria the hippo in a commercial for a Madagascar video game. I was using either ClothFX or SoftFX for her belly jiggle, and FX_Metalink Morph allowed me to animate her facial animations on top of the dynamics. It worked brilliantly for the time...but nowadays there are better ways to do this.

I think the biggest problem with the FX_ tools was the documentation when they came out--it listed a description of each tool but offered little in the way of 'real world' examples. Also, the plug-ins were written by a Japanese developer, and I think a language barrier kept many of their features from being adequately translated to English. In fact, we had the good fortune to meet the developer at the Box, where he explained some of his tools' mysteries in person--that was a big eye-opener for me and some of the other fx guys present.

Okay, having just written that, I guess I'm NOT surprised that these tools are a surprise. Back then, NewTek really should have done a better job showcasing and explaining these tools...but I suspect at the time, they probably didn't know what they had. :)

G.

jeric_synergy
10-20-2015, 11:50 AM
Okay, having just written that, I guess I'm NOT surprised that these tools are a surprise. Back then, NewTek really should have done a better job showcasing and explaining these tools...but I suspect at the time, they probably didn't know what they had. :)
G.
Gee, why does that sound familiar? ::cough:: IKB ::cough::. :devil:

Repeatedly, one of the super-users shows off a long-existing feature and it's a revelation: my latest example was Ryan R's bafflement at the ASSIGN TOOLS--there they are, buried in obscure menus (or sometimes not even assigned to a menu)..... the foot-shooting continues.

I hope the current batch of beta testers are as whiny and demanding as I am. :grumpy:

Greenlaw
10-20-2015, 11:58 AM
If I'm not mistaken, IK Boost was created by the same developer, so probably the same translation issues.

erikals
10-20-2015, 12:09 PM
As far as I recall, yes, same developer. Ino, I think; he works at D-Storm (yep, just checked... Daisuke Ino).

Even with the quirks, I still love IKB, the FX tools, and MetaLink... Awesome work!

Ino Rocks!! http://erikalstad.com/backup/misc.php_files/king.gif http://erikalstad.com/backup/misc.php_files/035.gif

Greenlaw
10-20-2015, 12:32 PM
Yes, I agree! He was like Worley, Fori, Dave Vrba, Prem, and the other early pioneering third-party power developers. I could not have come this far with LightWave without his influence. :thumbsup:

At least Dave is still around and active (and on the inside now.)

G.

erikals
10-20-2015, 01:20 PM
Absolutely one of the LightWave Masters http://erikalstad.com/backup/misc.php_files/smile.gif

jeric_synergy
10-20-2015, 01:36 PM
It's a pity they couldn't be bothered to hire a competent translator.

I blame Topeka/Austin.

Greenlaw
10-20-2015, 02:30 PM
Yes. That was a long time ago though. You could argue that they should still make an effort to update the docs for these tools but, like I said, these are REALLY old tools--the current LW3DG team might rather be focused on modern and better ways to do these things. (As they should be.)

G.

jeric_synergy
10-20-2015, 02:42 PM
As you know, I would argue that the dox should be group-enhanced and moderated, so all the typing HERE could go to some good use. 8~

And I specifically said "Topeka" to differentiate from the current management. OTOH, they had YEARS of Splinegod telling them what they had in IKB, and they didn't do jack about that either.

I think BeeVee is doing a fine job, but the fact is there's just no way he can cover everything adequately, which is why leveraging the user base, in a moderated manner, would pay off big time. One of the biggest shortcomings of the dox is the lack of examples, which can't even EXIST in any number until a feature is released. Linking to examples on the internet is the easiest form of documentation enhancement, but apparently even that is too much. --For that matter, the dox could link to LW3DG's own demos, but nooooooooooooooooooooooooo.

hdace
10-20-2015, 03:36 PM
Topeka? I don't get it. I'm in Manhattan, 60 miles from there. What's the connection with Austin?

Greenlaw
10-20-2015, 03:49 PM
I agree.

BeeVee has done a marvelous job of completely reworking and rewriting the manuals, making them far more accessible to new and veteran users. I have nothing but high praise for his work.

But you're right, there needs to be some kind of user managed information repository online. Just the other day I was commenting in another thread about how I feel like I keep posting the same information over and over again. It would be nice to have a place where info and tutorials could be posted, edited, cataloged and managed by users. Searching for info in these forums can be so hit and miss.

Unfortunately, an open community wiki might become too chaotic--it would clearly need to be moderated.

I was hoping Light-Wiki would become such a place but it seemed to lose steam after the hacking incident. I'm not sure what the status of that project is.

G.

Vong
10-20-2015, 04:39 PM
Topeka? I don't get it. I'm in Manhattan, 60 miles from there. What's the connection with Austin?

I believe jeric was referring to Newtek starting off in Topeka, Kansas and then moving to Texas. Although it's not Austin, it's San Antonio. :D

hdace
10-20-2015, 07:20 PM
Holy Cow. I can't believe I never knew that. Of course, I was living in London way back then...

jeric_synergy
10-20-2015, 07:54 PM
Unfortunately, an open community wiki might become too chaotic--it would clearly need to be moderated.
As I always say, and as I said in my two-page physical letter to NewTek dated March 3, 2009, which proposed a position of "Manager of Dynamic Documentation".

I sent it to Jay Roth, Jim Plant, and Chuck Baker.

++++++++

San Antonio!! Gahd, 'wayyy worse.