
View Full Version : More lies!



Freak
02-15-2003, 03:33 PM
----------------------------------------------------------------------------------
"The new NVIDIA Cg development solutions represent a huge step forward in graphics programming. Finally, we'll be able to achieve incredible realism without tying our applications to low-level hardware features that restrict the lifespan of the software. Programmer productivity will skyrocket and DCC technology users are going to love the proliferation of high-fidelity visualizations. LightWave users should applaud the efforts by NVIDIA that have brought us to this point."

Brad Peebler
President
Luxology, developers of NewTek LightWave 3D
-----------------------------------------------------------------------------------


Make of it what you will, I just don't want to know anymore...

XSI 3.0 here I come.... (Does anyone know if Avid still exchanges copies of LW for a reduced XSI price?) I believe they had some kind of deal going. (Can you exchange multiple copies?)

Mike_RB
02-15-2003, 04:05 PM
"LightWave users should applaud the efforts by NVIDIA that have brought us to this point"

Where is that quote from?

Mike

Meshbuilder
02-15-2003, 04:08 PM
It's from a Nvidia webpage..

http://www.nvidia.com/view.asp?IO=cg_testimonials

But we don't know how old this page is.. But who cares, really??

rabid pitbull
02-15-2003, 04:09 PM
omg will it ever stop? :rolleyes:

biting my tongue.... hanging by a thread now. :(

WilliamVaughan
02-15-2003, 04:34 PM
....I read that back when I first started working at NewTek....

Freak
02-15-2003, 06:14 PM
Hmmm, perhaps NewTek should contact 3D World and NVIDIA and tell them who the maker of LightWave 3D really is...

Having quotes like that publicly listed can be great for lawsuits... :)

Especially with Chuck trying so hard to maintain a one-owner policy.

When will it stop?
Most likely when princess Peebler stops opening his big mouth..

The question that gets raised is:

Where the hell are my Cg shaders?

If the president (huh!) of LightWave tells me
to be thankful to NVIDIA for all my DCC hardware optimizations,
I'd actually like to use them first!

As NVIDIA also seems to think that LightWave is being built by Luxology, I guess I could assume that NT has had no contact with NVIDIA, nor do they have any Cg shader support planned, no enhancements to hardware support, etc...

But the company that's "not" officially making LightWave
has "publicly" endorsed NVIDIA hardware support
for a product they have nothing to do with.

Sorry, but it makes me laugh....... (ahahahaheheheheh) see!

I'm just rapidly losing interest left, right and center, and in all dimensions too..

Yes, I am being a troublemaker (like Woody Woodpecker)

Because vBulletin sucks: it won't accept my cookies, so I have to log in for every message I post, and then it can't find the thread, so I actually have to log in twice to post one message.
I don't have this problem on CGTalk.

Also, Avid is no longer taking trade-ins of LW dongles. Arrgghhhh!!!

It's one of those days! :)

lwscottk
02-15-2003, 09:54 PM
...watch out for that door!...

Freak
02-15-2003, 11:46 PM
Originally posted by lwscottk
...watch out for that door!...


WHACK!!!!!........ Watch out for the what? Oh, I see!


:)

mav3rick
02-16-2003, 05:00 AM
hey CIM :) I totally agree with you, and heh, nothing more to add..... it is exactly what I also think will happen, or is happening.....

Beamtracer
02-16-2003, 06:10 AM
Yeah, that NVIDIA statement is really old. I saw it some months ago after doing a Google search on Luxology.

More recently, Brad Peebler has refused to comment on the NewTek / Luxology saga.

Mike_RB
02-16-2003, 08:15 AM
CIM, I think you sound a little *too* sure of yourself. I really don't think Lux is working on LW development at all.

Mike

hrgiger
02-16-2003, 08:22 AM
None of you know what is really happening so why speculate?

McLeft
02-16-2003, 10:20 AM
>XSI 3.0 here i come....

I've been exploring XSI and can say it has a lot of its own shortcomings.
Its modeling workflow is not as fast as LW's, and it's missing some important tools such as drills, slicing and so on. It has some good modeling stuff that LW doesn't have, but the overall workflow is slower (though for character modeling it's pretty good).
It doesn't have proper motion blur (I mean curved motion blur).
It has quite limited support for HDRI.
Dynamics are quite slow, even compared to MD... and a turbulence field won't work with cloth, but will with particles, for example.
Though the Render Tree is amazing and gives you a lot of control.
The PPG (the Property Editor, in LW terms) lets you make changes to multiple items at once... really convenient, and you don't have to use the Spreadsheet, which doesn't have access to all the params anyway.

Well, whatever, but I guess you won't be able to get away from LW that easily :)

hrgiger
02-16-2003, 10:56 AM
Well put, McLeft, and it's way too expensive.

Freak
02-16-2003, 02:42 PM
Yeah, McLeft:

I have been teaching myself XSI 2.0, and I was impressed at how easy it was to do many things that LW can't do, or can't do easily.

However.... I agree that it's not the duck's guts like everyone would suggest.... It is slower than LW.... (by quite a bit)

That's why it surprises me that everyone wants an integrated package..... (it has some good bits, but I'd prefer them separate) (and fast)

I refuse to model in XSI....... I can't model in an integrated package. (I still use Modeler), and if it gets glued together, I'll go to Wings3D for my modeling.

My workflow in LW is many times faster than in XSI.
But that could just be because I know LW much better.

Having said that, XSI is under active development by one company.... And it seems to have some direction.

I don't have to hear about Peebler every 5 mins,
which is always a plus... ;)

Almost a year since the last minor bug update.
Conflicting stories of development, little or no updates.
Lots of promises of cool tools soon. Still no LScript 2.6.

(New staff, dev team, beta testers) It all takes time to get working together... (and I will upgrade to 8 when it arrives)
but meanwhile I have plenty of time to learn new software.
And XSI is moving forward.

There comes a time when you must put the business before your loyalty to your tools and their developer.

(And I have been loyal to NT, and I do have much respect for Chuck, Deuce, and many of the NT team)

However, unless Avid brings back the promotional trade-in
(LW dongles), I will only be upgrading the one licence of XSI 2.
It's too much money for something I'm only half happy with.
And it's no use having multiple copies until I'm adept with it.

I am not moving on, just starting to add other products into the pipeline..

I'd rather waste money than time...
I find you usually get what you pay for, too.

Chris S. (Fez)
02-16-2003, 06:13 PM
I did the XSI 2.0 tutorials a while back and was concerned with the slowness of SubDs. Avid optimized everything in XSI 3.0. OpenGL and SubDs in XSI 3.0 are MUCH faster than in 2.0, and certainly faster than LightWave's. I still like Modeler better.

Freak
02-16-2003, 06:46 PM
Thanks Chris....

I haven't really had a look at the XSI 3.0 product details or reviews.

There did not seem to be a lot of new stuff that interested me in 3.0. However, speed is always a welcome addition.

Still, it's nice to have forward movement......
XSI 2.0 and XSI 3.0 have been released in the time it has taken for only one version of LW.

It's actually nice to have a community that's not talking about Lux vs NT, but about all the cool new features....
Peace at last......

The ripper
02-17-2003, 03:44 AM
Originally posted by Mike_RB
CIM, i think you sound a little *too* sure of yourself. I really don't think Lux is working on the dev of LW at all.

Mike

Maybe not, but they're for sure developing something better! :D

mattclary
02-17-2003, 05:51 AM
It is old; it's from June 13, 2002. You have to scroll waaay down to read the quote. The reason I give a different link than the one above is that this one has a date in the header.

This dates back to when Lux was new, before NewTek "corrected" Lux's misconceptions.


http://developer.nvidia.com/view.asp?IO=IO_20020612_7133

nonproductive
02-17-2003, 07:04 AM
Originally posted by Freak
It's actually nice to have a community,, thats not talking about Lux Vs NT, but about all the cool new features....
Peace at last......

Sorry - I have to comment on this. You start a thread on NT vs Luxology and then *criticize* the community *because* of those discussions? Ummm...

rabid pitbull
02-17-2003, 02:00 PM
Originally posted by nonproductive
Sorry - I have to comment on this. You start a thread on NT vs Luxology and then *criticize* the community *because* of those discussions? Ummm...

i was thinking the same exact thing...

seems like he is really set on going to XSI and leaving LW behind, but before doing that he wants to cause as much of a ruckus as he can... :rolleyes:

Qslugs
02-17-2003, 05:10 PM
Man, you guys ARE the old ladies in the sewing circle. You guys are as bad as sports fanatics.

Judas cow
02-17-2003, 05:10 PM
What a retarded thread. Hey moderators, can you ban the 13-year-olds from starting such dumbass threads? It's embarrassing.

Skonk
02-17-2003, 05:27 PM
NewTek's name is on LightWave, LightWave is all over NewTek's website, and when you buy LightWave you buy it from NewTek. I don't see anything on Luxology's website, and I don't see Luxology's name on LightWave, so to me it's pretty straightforward.
NewTek makes it.

IMHO.

James..

SplineGod
02-21-2003, 09:03 PM
I noticed that NewTek is looking to hire engineering staff too. :)

Rory_L
02-21-2003, 10:45 PM
Originally posted by Judas cow
What a retarded thread. Hey Moderators, can you ban the 13 year olds from starting such dumbass threads? It's embarassing.

No, please keep it going! It's as good value as the Darwin Awards! ;)

...and just as meaningful.

R

IndianaJoe
02-24-2003, 02:12 AM
If the inverting-core acceptor deflects the complex chronotron-feedback analysis, try to provoke a coil-composition reflex and several quantum biosphere resonances, this will create a restricted isovolumic cochrane graviton-prediction, which ought to in fact dampen the polarizing maintenance-filament formulas. Then attempt a minimum abstract component-delay correction phase to input a reversible lucifugal primary ionization perimeter operation to cancel the celestial info-sphere greenhouse effect level-limits. As you are doing this, set in motion six homeostasis global-attractors from the constant chemical cybernetic-induction elliptical-beam, this will dislodge a krypton placebo-molecule from the kinetic synthesis-accelerator, as a consequence affect the electromagnetic fiber-feedback engineering fractal controls, enhancing the parallel decaying energy diffusion force. Now turn on the hypothetical cryogenic quadrant energy matrix and logistical passive inducers to absorb the decayed ardamantium alignment chronicle containment-conduit effects. As you must have deduced by now, this will redirect the xanthous laser alignments authorization breach causing a zonifugal antimatter breakthrough in the enhanced chaotic krell-dechyon diodes, and compute cycles over cycles of the digivax engineering artificial atmosphere history and then instigate a cosmic brain-cell breakdown of the magnetic-flux generator flow confirmation. But do not aim at creating those NavCom neon-resonating memory utilization rectification reports prior to maximizing the modulating secondary matrix production of that CPU examination docking-module of the long-range field communication cyber-diagram, without first sub-spacing the tachyon-static singularity propulsion plasma intensity spectrum-resolution of the lunar-gas short range nucleogenic projection subdivision, or this will resonate a suspension of the hard-paradigm particle phase neutrino-level mopology of those soft-shell shutdown static xenon.

:D :D :D

mattclary
02-24-2003, 05:56 AM
I guess if you're too much of a geek to ever have sex, you can still make it on the Darwin award list...

faulknermano
02-24-2003, 10:40 AM
Darwin Awards list? Huh! He just pulled that off a Star Trek episode! :p

Judas cow
02-24-2003, 05:21 PM
Originally posted by Rory_L
No, please keep it going! It`s as good value as the Darwin Awards! ;)

...and just as meaningful.

R

Just checked out the Darwin Awards site, LOL!!!