
View Full Version : Were the G5 specs exaggerated?




Ade
06-24-2003, 03:45 AM
lied specs? (http://www.haxial.com/spls-soapbox/apple-powermac-G5/)

Please speculate..

Red_Oddity
06-24-2003, 04:24 AM
Yup, just like all of Apple's benchmarks. Can't blame them though; we all know it's bull, and all companies do it, not just Apple...

http://www.tweakers.net/nieuws/27615

Sorry, it's Dutch, but it gives you something to think about, as the benchmarks there speak volumes...

Lightwolf
06-24-2003, 04:38 AM
There is nothing to speculate. The Intel SPEC marks by Apple are correct, but the assumptions under which they were produced were wrong.
If I used Lightwave for Linux compiled with gcc I might care, but I use Lightwave for Windows compiled with the Intel compiler, so Apple's comparison doesn't have any relevance to me...
Let's all wait for some real benchmarks (oh, and I'd like to see some Opteron benches as well, especially 64-bit benches in September with the 64-bit Windows for AMD).
Cheers,
Mike

Red_Oddity
06-24-2003, 05:13 AM
Yup, to get the full scope of what they did, read this:

http://www.haxial.com/spls-soapbox/apple-powermac-G5/

Apparently they 'cheated' a lot, so these benchmarks are really out of this world...

If only Apple had given us some decent benchmark results, not just comparing G5s to x86 machines but also to G4s, so I know how big the difference is going to be between the G5 and my dual G4 1.42GHz machine. I mean, I spent a sh:te load of money on it, so at least let me know whether I've been screwed in the rear or not... or am going to get screwed over again...

Ade
06-24-2003, 05:16 AM
I just posted that URL... Anyway, is this page right in saying Apple lied? Should I be pissed off at Apple...again...for their BS?
Should we spread this anti-G5 info around the net? Teach Apple a lesson?

I don't know; after all, it's been a while since the Mac faithful could hold their heads up high.

Lightwolf
06-24-2003, 05:23 AM
Hi Ade,
since I'm from the dark side ( -> Intel / Windows)...
That URL isn't the only place pointing out the obvious errors. I would have been quite happy if the G5s actually performed relative to Intel as Apple claims; I like the price, I like the design, but I don't really like the hype, so I'm back to my no-switch position.
As far as spreading it is concerned... well, c't, Germany's largest (and best) computer mag, pointed out the SPEC discrepancies at the same time they spread the G5 news, and it will make the rounds anyhow. Shame on Apple, though, for having to BS like that.
Hey, I'm p***ed off at Apple too, and I'm not even an Apple fan :)
Cheers,
Mike

panini
06-24-2003, 06:07 AM
Of course. Steve is one of the worst liars on the planet.
I wouldn't even trust they had G5 chips in there. It wouldn't be the first time a company used their box with something completely different inside.
Did anybody check?

With every new announcement he says Macs are faster and MHz doesn't matter.

Now at the G5 announcement he says: "We have finally not only caught up with PCs, but leapfrogged them."

So in that sentence right there (the "caught up" part) he admitted to lying all these previous years. Why should we believe him now?

The good news for Apple users is that it will be much better than the G4. On the other hand, you can't buy this thing yet. You can buy a 64-bit AMD chip and stick it into your PC if you really want to; in fact, that has been true since April, so I don't know what Steve is so excited about (not to mention that the 64-bit Athlon is due around the same time as this G5, and the Pentium 5 is also ready any time; Intel simply has no need to introduce it when they can keep making money with the current P4, which right now has no competition).


And the Pixar guy, when he says "this is the fastest computer...", his eyes tell more than any lie detector. Anybody who took Psychology 101 could recognize that one. It's actually funny.

I also checked the Veritest fine print, so yes, they cheated in the whole setup, and they still lost some tests (they just didn't publish those on their site).

Johnny
06-24-2003, 06:12 AM
yup..the time for BS, geek speak, mumbo-jumbo, and funny mirrors is O-VER!

this allegedly super-fast G4 I have on my desk is bought and paid for and does the job, and no high-dollar company will have a place on my desk until there's hard, solid evidence that a truly faster Mac is HERE, in reality: no goofy interpretation of benchmarks, no cheating, no playing around.

we need to de-Enron the bench testing.

Johnny

Ade
06-24-2003, 06:36 AM
Even if Apple released a quad G6, PC ppl would still stay PC.

Johnny
06-24-2003, 06:44 AM
Originally posted by Ade
Even if Apple released a quad G6, PC ppl would still stay PC.

yeah..and though I'm a big fan of Mac OS, I can't say I blame them...

there's nothing compelling to make them flip to Mac. plus, there's been too much in the way of empty promises and snake-oil salesmanship.

I am completely disgusted...

J

Karl Hansson
06-24-2003, 08:07 AM
I think it's too early to attack the G5. If this is a case of BS marketing, it will show soon enough. But right now nobody has access to the new G5, and nobody can make any real judgment about whether it is good enough. All companies have aggressive marketing, and any educated adult should know that you cannot base your purchases on that marketing alone, especially not at these prices. Personally, I don't really care whether the Mac is the fastest computer in the world or not; if it is, it's a very temporary state. The important thing is that the Mac is way up there now and that Apple has a continuing plan (3GHz), something they lacked when releasing the G4. I for one (probably the only one on this forum) am very excited about the new G5.

just my opinion.

skippy
06-24-2003, 08:15 AM
yeah...*IF* Apple is way up there, things might be ok for now.
but I'm not convinced that the G5 really IS 'up there.'

almost every time I've seen a G4 benchmark, it's come with a wagonload of caveats... if this, this, this, this AND that, then the G4 is faster... otherwise, not.

Having a truly blazing Mac is more important with respect to attracting more people to the platform, and another round of RDF will not be a good thing for Mac.

s

Lamont
06-24-2003, 09:55 AM
Anyone can manipulate a graph so it looks like the results are in your favor.

I say wait till it hits the market, one of you guys will get one and tell us how it really is.

policarpo
06-24-2003, 10:42 AM
DUDES!!! get back to making beautiful work.

The new G5 machines will be out in late August and we'll know how they stack up rather quickly.

Hell, build a custom scene now that is just amazing so you can test it on the new boxes from Apple.

Go and make something beautiful and leave the bickering to people who aren't comfortable with their own talent, and evolve into an artist who can pull off something magnificent no matter what kind of computer they use. :)

Lightwolf
06-24-2003, 10:46 AM
Originally posted by policarpo
... and evolve into an artist who can pull off something magnificent ...
I'm no artist, I just need to make money doing 3d ;)
Cheers,
Mike

policarpo
06-24-2003, 10:48 AM
Originally posted by Lightwolf
I'm no artist, I just need to make money doing 3d ;)
Cheers,
Mike

LOL!

well...more power to you....keep making money and doing cool work!!

Darth Mole
06-24-2003, 10:48 AM
Personally, I think it's a really simple equation, irrespective of Apple's (sadly misleading) benchmarketing...

Would I like a faster machine?

Hell, yes.

Do I want to run Windows or Linux?

Hell, no.

Do I believe a dual 2.0GHz G5 is at least 2x faster than my dual 800 G4?

Hell, yes.

Then go to the Apple Store, fill in credit card details and worry about it later. Life's too short. (Got a good price on the old machine too!)

Lamont
06-24-2003, 10:56 AM
Originally posted by policarpo
DUDES!!! get back to making beautiful work.No.

Originally posted by policarpo
The new G5 machines will be out in late August and we'll know how they stack up rather quickly. They are going to be released in August... of 2005.

Originally posted by policarpo
Go and make something beautiful and leave the bickering to people who aren't comfortable with their own talent, and evolve into an artist who can pull off something magnificient no matter what kind of computer they use. :) What?! Are you out of your mind. Everyone knows talent is measured in GHZ, MB and disk-space. Get with the program.

policarpo
06-24-2003, 11:03 AM
SCREW IT ALL!!!!

i'm going back to using Infini-D!!!!

if it was good enough for the early Rustboy then it's good enough for me. :p

Darth Mole
06-24-2003, 11:29 AM
I used to really like Infini-D - I bet it'd fly on a new G5...


Sorry. I'll get my coat.

BrianW
06-24-2003, 12:12 PM
I have never seen this many people bi*ch about something as good as the G5! If all of you feel so badly about the G5, why do you own a G4? Why do you own a Mac, period? The reason I switched from PC to Mac is because of the operating system, not the hardware. I am an artist, not an IT guy (actually I am in IT, but I hate it), and all I want to do is be creative with my Mac, not spend hours or days troubleshooting this and trying to find that driver to download so I can use this piece-of-crap modem that doesn't work with this operating system and this hardware. I have never had these problems with a Mac and we never will. That is what you get when you buy a Mac: peace of mind, stress-free computing. But now that Apple is on top with the OS and the hardware, you still complain! All I can say to those that complain: shame on you! If these machines do not satisfy you, then no computer will! Seems nothing will ever be good enough for you! As for me, I have a $53,000.00 computer budget for this year, and you can bet your a** every dime will be spent on G5s when they are released in August! Way to go, Apple!! I am running with my credit card to the Apple Store!
Brian Warren

policarpo
06-24-2003, 12:15 PM
well....i've used both and never had any big issues with either platform.

so enough with the sob stories about PC/MAC stability issues.

if you set up a computer right, it works.

the G5 is a good thing for Apple. it will be a good thing for people who want something new. the IBM/Apple relationship will help get Apple back on track.

and for the record...owning a Mac doesn't make you more creative. I haven't designed on a Mac for over 6 years...and it hasn't affected my creativity at all. so don't use that type of logic to convince people of why the Mac is better. it's pointless.

use what you can afford, enjoy it and exploit it to the fullest!

BrianW
06-24-2003, 12:41 PM
"and for the record...owning a Mac doesn't make you more creative. I haven't designed on a Mac for over 6 years...and it hasn't affected my creativity at all. so don't use that type of logic to convince people of why the Mac is better. it's pointless."

I never said that owning a Mac would make you more creative, it just gives me more time to create, so it isn't so pointless after all!!:eek:
Brian Warren

policarpo
06-24-2003, 12:49 PM
well...like i said...if you set up a machine right it just works.

9 times out of 10 it's user error that makes a system unstable.

i never really crashed on my Mac and I never really crash on my PC.

sure i reboot...but that's only after it's been on for a few weeks. :)

BrianW
06-24-2003, 01:12 PM
"9 times out of 10 it's user error that makes a system unstable."

I am not trying to start a pissing match with you over PC vs. Mac, but that statement is simply false in my experience with PCs. I don't mean that I know more than you when it comes to PCs; it's just that my personal experience proves that statement wrong. I run a Windows domain where I work, and on that domain are 33 Gateway PCs that are just over 7 months old. More than 2/3 of those PCs have had the power supply or the hard drive go out; just bad design on Gateway's part! Now, all of these machines run Windows XP, which is by far the best Windows yet, and yet almost on a weekly basis they either won't log on to the domain or simply refuse users' passwords and such (not all of our machines do this, just way too many). This has nothing to do with bad user input; that is just Windows. It is getting better, but not good enough for me. The Mac is where I stay to do my work and do what I do!

redlum
06-24-2003, 01:16 PM
What kills me is that the graphics card has half the RAM you'd expect for a new computer. 64 megs? How about 128 megs built in? Now that's a graphics card.

policarpo
06-24-2003, 01:21 PM
dude...i wasn't starting a war with you...i'm just saying that the way you set up your machine determines how it's going to act in the end.

i've had great experiences with Mac and with PC's.

I work in a multi platform environment and our IT guys have everything set up smooth as silk and we rarely have issues with our machines. Whenever we do it's usually the fault of the user since our IT guys set up the machines to last and not collapse since our work is very time critical.

We're in a Dell and Apple environment.

Maybe the Gateway machines are your problem...who knows. To err is human.

serpicolugnut
06-24-2003, 01:23 PM
well...like i said...if you set up a machine right it just works.

Well, if the machine is a Mac, then that statement is true. If your machine is a PC, then that statement is sometimes true.

Case in point: Bluetooth.

Got a new bluetooth phone. Bought a USB bluetooth adapter. It came with Windows drivers. The phone came with Windows software. I installed both the drivers and the phone software. After hours upon hours of trying to get it working, it wouldn't. Called D-Link tech support. They were clueless...

Finally, I plugged the adapter in to my Powerbook. Instantly the System Pref for Bluetooth opens, and it's configured. The phone saw the Mac (and vice-versa) without any problems.

And yes, the Windows box was running XP Pro. Latest SP and all the updates.

I could cite at least 10 other examples where the Mac is just easier to use.

But, hey, if having a wider variety of hardware components to choose from is your thing, then the PC is the way to go. If you don't like wasting your time fiddling with drivers and troubleshooting, go with the Mac.

policarpo
06-24-2003, 01:32 PM
ugh....

man alive....

i think people just like to read their own words on these forums without reading posts or thinking about what they're saying.

i'll say it again:

"use what you can afford, enjoy it and exploit it to the fullest!"

serpicolugnut
06-24-2003, 01:32 PM
What kills me is the graphics card has half the ram you'd expect for a new computer. 64 megs? How about 128 megs built in? Now that's a graphics card.

The Graphics card situation will be the deciding factor on whether or not I will buy a G5 now or wait.

Jobs stated during the keynote that the G5 could use any of the AGP 8x "Pro" graphics cards on the market. To me, that sounds like the ATI FireGL or nVidia Quadro cards... I'd like clarification on this, as it's one of the key performance areas when using a 3D app like Lightwave.

Lamont
06-24-2003, 01:33 PM
Yeah, I agree... hamsters are fascist bastards.

BrianW
06-24-2003, 01:34 PM
"if you don't like wasting your time fiddling with drivers and troubleshooting, go with the Mac."


Amen to That!
Brian Warren

wacom
06-24-2003, 01:41 PM
Read this if you've got some time to waste...sorry for the long post...:o

I feel like an idiot for adding my 2 cents to this whole thing, but...

I agree with many who have already said that we should wait and see when the G5 comes out to see just how fast it is. We need to have someone like Tom at Tomshardware.com do a bench test on it, running Lightwave, Photoshop etc. on different systems (G3, G4, G5, Athlon XP, AMD Opteron, Intel P4s, Xeons, and even Itaniums), and with the 32-bit and 64-bit flavors of different OSs. Then we might start having something to work with in these "chats". Plus, there is a HUGE difference in the Wintel world between one computer and the next, even if they have the same basic specs (an HP Pavilion vs. an Alienware, etc.), so how do we really know which is faster yet?

However, I think it's good for everyone to help curb marketing on any front, no matter where it comes from. It's only healthy in a place like the computer market to help defuse claims. I've used both types of systems, and luckily for me hardware hasn't been as big a deal as the software (the stuff that actually means something), since everything I do runs on each system. I could give a sh*t less about the latest, greatest CPU speeds. If you need that much power for your renders, why don't you just do what Pixar does (no, they didn't make Monsters Inc. on a single G4 ;) ) and invest in a render farm, or use one? I'll bet you your G7 or Itanium 500 that I'll get my frames back faster every time. Plus, a good video card is going to speed up pre-render work way more than a faster CPU...

The biggest pat on the back I can give Apple is for ditching that old, stupid OS they used to have and going with Unix. I'm not one to get sentimental about computers, so seeing that go was a real blessing. Anyone who ran LW on OS 8 can tell ya. The only catch? Steve Jobs makes it sound like they made OS X what it is...and likes to leave out how much the Unix/Linux community has worked for 30+ years to get this far. It's a lot easier to "make" an OS when you only need to design it for a handful of system types and give it a new skin. What's NeXT, Steveo: is Apple going to sell me the iWheel for my car?

Blah blah blah... looks like my 2 cents turned into credit debt... :o

js33
06-24-2003, 01:53 PM
Originally posted by wacom
If you need that much power in your renders why don't you just do what Pixar does (no they didn't make Monsters Inc. on a single G4;) ) and invest in a render farm or using one.

Hehehehe
Or ANY G4 for that matter. Pixar has been using Sun computers to render and work on for years. Maybe the G5 will be used by designers in the future, but the last I heard, they were switching to Intel for their render farm. But who knows, that may all change now. Only time will tell. It doesn't sit very well that a film company owned by Steve Jobs doesn't use Apple computers in any form.

Cheers,
JS

harlan
06-24-2003, 02:24 PM
Actually Pixar has quite a few Apple computers in their operation.

Their render farm is Linux-based; they recently upgraded from the Sun systems, and I'm sure they'll keep their current investment in the render farm for quite some time.

That being said, however, the PowerMac G5 would best serve Pixar, or anyone else like them for that matter, as a workstation and not as render nodes.

It doesn't matter that Steve Jobs handles both Apple & Pixar; it's called "business". Personally I think it's cool that he allows Pixar to use the technology they deem necessary rather than forcing something on them that may not be adequate for their specific needs.

Harlan

Arnie Cachelin
06-24-2003, 02:28 PM
There are lies, damned lies and benchmarks

While the benchmarks were compiled in ways that may not have been optimal for the PC, there are quite a few holes in the 'haxial' critique of the presentation. First, the benchmark graphs could be ignored entirely, and the application performance would be a sufficient indicator of real-world performance. By using open-source compiler versions, Apple ensures that the compilers are not unfairly tricked out to do special things on the exact benchmark code. This is not a fanciful concern, either: look at the various graphics-card benchmark scandals to see this in action.

In addition, the guy makes some statements that seem so clueless that I have to question the rest of his more esoteric points. For example, he claims that the 3GHz Dell P4 is $2k. He forgets to add the extra 7.5GB of RAM, and for good reason: those PCs can't use 8GB of RAM even if you can fit it in. It is also missing a CPU, which makes sense too, since one can't build a dual-P4 box... that is reserved for Xeons.

Then he says most people use integer, single-proc tasks most of the time. This irrelevancy is either deceptive or ignorant. Sure, email, surfing and word processing are used most, but people buy expensive dual-proc machines to use a few special apps which do heavy FP lifting and are multi-threaded (i.e. LW!). The fact that he can't seem to recognize the difference between a P4 and a Xeon also undermines his claims to competence. If he does recognize the difference, then he is intentionally price-matching a single-CPU P4 w/ 512MB against a dual G5 w/ 8GB... silly.
The benchmarks are all biased. It is not clear that the Mac version used AltiVec, and if it didn't, then using SSE2 on the PC would not be fair.
As for 64-bit AMD CPUs, I believe they await a 64-bit Windows and applications to become relevant.

harlan
06-24-2003, 02:32 PM
Who really cares if the tests were exaggerated? Every single computer company alters tests in some way to make themselves appear to have the advantage. That's called "marketing". ;)

Dishonest? Sure, but so are the claims that 500 out of 501 people prefer coke to pepsi.

NewTek fibs a little on their claims. A|W fibs a little on their claims. Everyone does it, doesn't make it right, but it also doesn't mean that ONE company should be singled out for doing the exact same thing that everyone else does just because people fear the possibility of it being accurate.

The G5's are going to be great machines. Are they "better" than an Intel box? That's all relative. Personally, I got rid of every Intel box I owned and use nothing but G4's (with the exception of my VT[2]).

Were those intel boxes faster than my G4? In most things, yep, you bet, but I work far more efficiently on the Mac than I do on Windows so the performance "balance" becomes a factor.

policarpo
06-24-2003, 02:38 PM
hey Arnie...what are you working on these days?

:D

jin choung
06-24-2003, 03:05 PM
www.flay.com

seems that the tests were indeed rigged. and not just by taking away intel advantages, but by creating a totally uneven playing field by optimizing like hell for apple's advantages. real world numbers will probably differ drastically from apple's claims if the 'countering' benchmarks are correct.

most people these days simply say that speed doesn't matter. 'can it do what you want it to do?' is the question. i hear a lot of mac people say this too.

why can't apple just make that their by-line then?

when you rig results that you KNOW are gonna be contradicted the next day....

god i hate steve jobs. that guy's gotta stop admiring his own rectum and pull his head out his ***.

jin

mlinde
06-24-2003, 04:18 PM
Originally posted by harlan
That being said however, the PowerMac G5 would best serve Pixar - or anyone else like them for that matter - as a workstation and not render nodes.

I would disagree. Since Pixar appears to use Maya and custom 3D applications, and they have access to professional-grade video cards for those Sun and Intel boxes, why would they use a workstation with a fast processor but slow visual feedback? The graphics offerings in the G5 are sub-par gaming cards. Some others here disagree (that's cool), but they've probably not had the opportunity to sit in front of a Wildcat- or Quadro-based workstation. Even without the customized, optimized, application-specific settings you can get for these cards and applications like Maya or XSI, they SMOKE the offerings Apple is making. Then realize that the "top" video card in the G5 is an ATi card, and you have a box that appears to have fast processors, lots of RAM, speedy drives, and cheap graphics. Sounds like a perfect render node to me...

Johnny
06-24-2003, 04:23 PM
why can't or won't Apple get cards that 'smoke' the current sub-par game offerings?

would this really add way too much to the list prices of G5s?

J

Darth Mole
06-24-2003, 04:32 PM
Gotta agree: I was disappointed to see the only cards on offer were the stock ATI and nVidia ones. The last thing I want to do is buy a great computer only to have merely adequate OpenGL; that's the thing I interact with most when using apps like LW (and, indeed, the whole UI with Quartz Extreme).

Although I dread to think how much it would all cost with a nice card AND a stack of RAM...

policarpo
06-24-2003, 04:33 PM
All Apple needs to do is make stable drivers for all of these HighEnd cards so everyone can stop whining about the lack of highend cards.

Jeez...isn't that simple enough to state?

Where does one send the petition to get them to realize this?

Hrmmm....you LW Mac OSX users should start a petition....to make this happen...i'd gladly sign it!

Johnny
06-24-2003, 04:34 PM
yeah..at least they could offer 'Good-Better-Best' configs as with the desktop units.

plus, what's a potential Switcher supposed to think? he/she knows what's out there for Wintel...then looks at a Mac and sees toy graphics cards?

sheesh!

J

policarpo
06-24-2003, 04:37 PM
you guys are never happy.

how do you ever get work done?

Karl Hansson
06-24-2003, 04:45 PM
One thing about the haxial test is the fact that the AMD test was made by AMD, the Dell test was made by Dell and the Intel test was made by Intel, while Apple's tests were done by Veritest. What makes you think that AMD's, Intel's and Dell's tests were all that honest? Another thing is that Veritest's integer result for the Dell Precision 650 was 836, while Dell's test for that same machine is 1089; what's up with that? The same goes for Veritest's test where the Dell Dimension 8300 P4 3.0GHz got a result of 889 while Intel's test for the P4 3.0 was 1152?!?! The same goes for the other benchmarks on haxial. Something tells me none of those figures on haxial are comparable. I think this guy is only trolling. By the way, did any of you guys read the Veritest PDF?
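For what it's worth, the two gaps Karl cites are easy to put numbers on. A quick sketch, using only the SPECint scores quoted in this thread:

```python
# SPECint scores quoted in this thread, as (Veritest result, vendor result).
# The figures come from the posts above; nothing else is assumed.
pairs = {
    "Dell Precision 650 (Dell's own test)": (836, 1089),
    "Dell Dimension 8300 P4 3.0GHz (Intel's test)": (889, 1152),
}

for machine, (veritest, vendor) in pairs.items():
    gap = (vendor - veritest) / veritest * 100
    print(f"{machine}: vendor score is {gap:.0f}% higher than Veritest's")
```

Both gaps come out at roughly 30%, which rather supports the point: the Veritest and vendor-run numbers clearly weren't produced under comparable conditions, so lining them up side by side (as the haxial page does) proves little either way.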

Johnny
06-24-2003, 05:00 PM
policarpo;

your comment is fair given this thread.

I can speak for myself and say that I'm almost quaking with glee at the stability and function of OSX...I can keep multiple apps busy for days with nary a burp, and that's a VERY nice change over the way things were, pre-X. I'm happy about it!

but I think it's about time Apple gets the show on the road with;

1. unquestionably powerful hardware. this means no boutique-style interpretation of benchmark tests, but "getting stuff done" benchmarks... as someone pointed out, Apple ought to SHOW a G5 doing 5 big demanding jobs at once and handling it all without breaking a sweat.

2. professional-grade video cards... they expect us to embrace the Mac as the 3D, video and sound standard... how about the cards to go with that, eh? cards are part of pro work... they're pro gear! If they tout the G4/G5 as pro machines, ipso facto, they ought to have the dad-gum pro cards... maybe Apple can't force manufacturers to make cards for the Mac, but maybe they can help make the job easier?

IMO, the kvetching on these threads has to do with hardware issues, and you don't read anyone here complaining that the Mac OSX isn't spot-on, which I think it most definitely is.

J

policarpo
06-24-2003, 05:06 PM
right on.

and what i am suggesting is that the movement start from within the 3D community to get Panther to support all the Quadro, FireGL and Wildcat cards...so the tweebs who think Macs are toys can shudder in horror at the gold that we all know OSX is. :)

Apple has gone this far...why not give them a little grassroots yell to go a little further!

Johnny
06-24-2003, 05:09 PM
does Apple consult with key 3D industry players to find out what the hot sh*t cards are?

and if so...

J

policarpo
06-24-2003, 05:25 PM
wrong answer.

your answer should have been...
"that's a damn fine idea. I remember that it worked for Alias to get them to port Maya to OSX...so why wouldn't it work to convince Apple that supporting these High-End cards is the right idea? Bloody brilliant! Let's do it!!"

WHERE IS ALL THE STEAM!!!

SO MUCH ENERGY IS SPENT MOANING...but none seems to be spent doing...

ho hum....i'm going back to making the coolest work i can...you monkeys have fun.
:D

Johnny
06-24-2003, 05:28 PM
Originally posted by policarpo
your answer should have been...
"that's a damn fine idea. I remember that it worked for Alias to get them to port Maya to OSX...so why wouldn't it work to convince Apple that supporting these High-End cards is the right idea? Bloody brilliant! Let's do it!!"

WHERE IS ALL THE STEAM!!!


well...I wasn't around for that or witness to it, but I have no problem at all typing out an e-mail to Apple encouraging them in this...

do you have a suggested department or e-mail address??

J

policarpo
06-24-2003, 05:32 PM
set up a website.

start a petition.

have people sign it.

get all the LW, EI, C4D, and Maya OSX users to sign it.

submit it to Apple.

do this and you will get your cards.

if not...then just face facts and hope that Apple does it on their own.

Johnny
06-24-2003, 05:35 PM
sounds easy enough..
is there consensus that your list of cards is the one to pursue?

I'm only asking because I'm lost with respect to cards, other than the reviews of Mac offerings seem to be weak.

J

policarpo
06-24-2003, 05:42 PM
i think you'd be good requesting these to be supported on OSX 10.3:

1. PNY Quadro 4 550 XGL, 750XGL, 900 XGL, 380XGL, 580XGL, 980XGL
2. PNY QuadroFX, Quadro XGL, Quadro NVS
3. 3DLabs Wildcat VP Pro, Wildcat 4, Wildcat VP
4. ATI FireGL XL1, FireGLZ1, FireGL8800

sites:
http://www.pny.com/products/quadro/
http://mirror.ati.com/products/builtworkstation.html
http://www.3dlabs.com/product/wildcatvp/vppro/index.htm

I think those are the workstation class cards most HighEnd users would demand.

Johnny
06-24-2003, 05:44 PM
OK..I'll get busy drafting a petition..brief, to the point, professional (not whiny or demanding) and post it here.

maybe not tonight, but soon.

J

policarpo
06-24-2003, 05:45 PM
Then post the link to your petition on this site, and request that 3Dfestival, CGTalk and CGChannel link to it...i can get you a link on CGFocus and InsideCG as well.

just make sure your site will be able to take the hits.

i use Hostned and they seem very reliable and inexpensive.
let's start a revolution!

www.hostned.com

Johnny
06-24-2003, 05:46 PM
I use earthlink..I suppose it can take the hits...

can you advise how to create a petition page?

policarpo
06-24-2003, 05:59 PM
well, i made this one:
http://www.cgtalk.com/showthread.php?s=&threadid=71713

data is data....

i would assume that you'd need to set up a page with a form element to accept the input...there should be some info online on how to do this sort of stuff rather quickly.

or just set up a blog to record all of it. :)

or look for something called a Wiki...it will work to capture comments....and it might be written in Perl...so your server would need to support that.

but i think the Blog would be easiest. Look for Movable Type...

:D
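For anyone who actually builds the petition page, the storage side of what policarpo describes can be tiny: a form in front, a file behind. A minimal sketch in Python, assuming a made-up one-signature-per-line file; the file name and fields are purely illustrative.

```python
import json

def sign_petition(store_path, name, email):
    """Append one signature as a JSON line (hypothetical schema)."""
    with open(store_path, "a", encoding="utf-8") as f:
        f.write(json.dumps({"name": name, "email": email}) + "\n")

def load_signatures(store_path):
    """Read back every signature recorded so far."""
    with open(store_path, encoding="utf-8") as f:
        return [json.loads(line) for line in f]
```

A real site would just put a web form (or a blog's comment box, as suggested above) in front of these two calls before submitting the collected list to Apple.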

mlinde
06-24-2003, 06:02 PM
Originally posted by policarpo
1. PNY Quadro 4 550 XGL, 750XGL, 900 XGL, 380XGL, 580XGL, 980XGL
2. PNY QuadroFX, Quadro XGL, Quadro NVS

Just a tip that these cards are based on the nVidia Quadro series of graphics chips. PNY is a manufacturer/reseller, not the designer of the graphics chip.

Apple already has working relationships with ATi and nVidia. It's just a matter of getting them to take the professional hardware and make a set of cards.

Rey
06-24-2003, 06:05 PM
A rebuttal from the Apple VP of hardware marketing, here (http://apple.slashdot.org/apple/03/06/24/2154256.shtml?tid=126&tid=181). Now maybe the PC weenies will give it a rest. Let us Mac users have our fun.

BTW, I'd be willing to sign a petition to get higher end cards on the Mac (I think Architosh started one a while ago), don't forget that it also needs to be directed at the card manufacturers and not just Apple.

policarpo
06-24-2003, 06:10 PM
Originally posted by mlinde
Just a tip that these cards are based on the nVidia Quadro series of graphics chips. PNY is a manufacturer/reseller, not the designer of the graphics chip.

Apple already has working relationships with ATi and nVidia. It's just a matter of getting them to take the professional hardware and make a set of cards.

Yep. I selected the PNY cards since they have been the standard developer for these cards since ELSA went belly up. I figured that they had worked all the bugs out and could maybe speed up Apple's time to market. :)

cavalos
06-24-2003, 06:52 PM
Just wait for Tomshardware tests :)

panini
06-24-2003, 09:16 PM
Apple's tests were not done by Veritest.

Veritest is there only to supervise and add credibility. Tests are planned out in advance by Apple and done under whatever rules Apple decides.

In any case, Dell's, AMD's and other tests also have somebody like Veritest supervising things, so they are as credible as Apple's claims.

You can get results directly from SPEC's site, you don't need to go directly to a particular manufacturer.

What Apple conveniently fails to mention is that any of us can buy an AMD dual Opteron 64-bit workstation from BOXX today, while this Apple won't ship until September 1st. You also get Lightwave for only $995 with that workstation if you are buying Lightwave.

According to SPEC, this already-available AMD 64-bit BOXX PC is in some situations twice as fast as the still-vaporware G5 Apple demoed.

By the time the G5 is available, the Opteron will be at or over 2GHz, and the AMD 64-bit Athlon 3400+ or better should be out as well.

So not only is this not the fastest, it's not even close to being the first 64-bit workstation. Not to mention that one of Apple's main selling points, the extra-wide bus, is designed by none other than AMD; with some on-chip additions, the Opteron's bus runs at the same speed as the CPU, in other words up to twice as fast as the G5's.

I am waiting to see what the 64-bit Athlon does and will most likely buy a dual workstation with those chips. It shouldn't be more than $2000, or if I upgrade by buying motherboard/chip and memory, possibly less than $1000.

So, I am a potential G5 buyer ( a person looking for a new workstation soon ), but I don't like spending $3000 on a machine slower than a $2000 one. Lying about all this doesn't help Apple's cause either..
Only seeing Lightwave, Photoshop, Vegas running faster on a G5 than on an AMD or P4 platform will convince me otherwise. And I'm talking about real machines optimally set up, not a crippled P4/Xeon against 4 or 8 optimized IBM chips Jobs had running under the hood at his conference.

Ade
06-24-2003, 11:23 PM
Know who would clear all this crap up?!


LUXOLOGY!

They used it, so they would know. I might send 'em an email inviting them here..again..

Chuck, you're the man — have you used a G5, or can you add anything?

Karl Hansson
06-25-2003, 01:23 AM
Reading the PDF, it says that a Veritest analyst conducted the tests. Apple provided all the systems, though Veritest configured the Dell 650 and the Dell 8300.

The only thing that I see Apple may have done wrong in this test is that they added the high-performance single-threaded malloc library. The reason for that, I imagine, was to compensate for not yet having a system that is optimised for the G5 processor. The Veritest PDF also states that the G5 tested was a prototype, and that it ran OS X 10.2.7, which is a very early G5 system. The real G5 system is 10.3.

Also, none of the Dell, Intel and AMD tests on the haxial webpage can be found on the Veritest homepage. In fact I didn't find any similar tests on Veritest's homepage; they are found at www.spec.org. Even though spec.org and Veritest are independent testers (don't know about spec.org), the test methodology may differ, thus producing different results. Those tests cannot be compared to the Veritest test.

Personally I would like to see benchmarks that compare the G4 and the G5.

Ade
06-25-2003, 02:19 AM
Apple's response explains it..
Sounds right to me, as long as those enhancements make it to the actual G5.

slash dot (http://apple.slashdot.org/apple/03/06/24/2154256.shtml?tid=126&tid=181)

Lightwolf
06-25-2003, 04:16 AM
Hi Karl,
the fact is, the official SPEC numbers for the intel processors are different from the official Apple numbers. Apple may have the official numbers for the G5, since it is their machine, but not for the other machines.
SPEC is a benchmark that, rightly so, tests optimized systems with optimized compilers. If Apple has to use gcc for the G5, and the compiled code sucks, then that is Apple's problem, and also the developers' and users' problem, since they get code delivered that could be faster if the technology was there.
SPEC is not a tester like Veritest is. SPEC only defines the benchmark; every vendor can test by himself, but has to publicise the optimizations used to obtain the numbers.
Current processors all rely on smart compilers, RISC even more so than the so-called CISC (and the border is crumbling anyhow; modern CISC processors all have a kind of RISC core).
So, currently the numbers Apple posted are only relevant for intel Linux users running software compiled with gcc.
Us lucky ones with Windows and intel-compiled code (or Linux and intel-compiled code) don't have to worry about those numbers, since they are irrelevant.
I still want to see some real benches, Cinebench being a good start for rendering purposes. Oh, and a comparison to the Opteron as well, please.
Cheers,
Mike

Ade
06-25-2003, 05:13 AM
Apple used the intel SPEC marks provided to them by the bench maker. I bet you that within a week Apple will run the full, comprehensive tests all the sceptics wish for, and I can see the heading for the Apple page -

"This is for all those Doubting Toms"...

Lightwolf
06-25-2003, 05:15 AM
Originally posted by Ade
Apple used the intel spec marks provided to them from the bench maker. ...
... created according to Apple's specs :)

Red_Oddity
06-25-2003, 06:46 AM
Why do we need to petition for better 3d graphic cards on the Mac when all we really should be petitioning...no DEMANDING...is that they give us decent OpenGL, because right now it's downright sh!te...

Also, why do we need to pay so much more for a video card that basically is exactly the same as the pc version, except that they jabbed another firmware rom on it... (for example, my Ti4600 on my PC, 280euros, my Ti4600 on my Mac 580euros, and it took 2 weeks to get to Holland because for some reason they are only authorised in Ireland to open up the Mac case, stick in the Ti4600 card, close the case and send it back to Holland...wtf is that about...)

Anyway, nice to see Apple get off its ***; too bad they f'ed it up by posting those crap benchmark tests. They would have seemed impressive enough without cheating...

policarpo
06-25-2003, 07:59 AM
well...this is too funny...

who has pie on the face now?:p

http://apple.slashdot.org/apple/03/06/24/2154256.shtml?tid=126&tid=181

policarpo
06-25-2003, 08:04 AM
Originally posted by panini
Apple's tests were not done by Veritest.



What Apple conveniently fails to mention is that any of us can buy an AMD dual Opteron 64-bit workstation from BOXX today, while this Apple won't ship until September 1st. You also get Lightwave for only $995 with that workstation if you are buying Lightwave.


I am waiting to see what 64bit Athlon does and most likely buying a dual workstation with those chips. It shouldn't be more than $2000 or if I upgrade by buying MB/Chp and memory possibly less than $1000.

So, I am a potential G5 buyer ( a person looking for a new workstation soon ), but I don't like spending $3000 on a machine slower than a $2000 one. Lying about all this doesn't help Apple's cause either..
Only seeing Lightwave, Photoshop, Vegas running faster on a G5 than on an AMD or P4 platform will convince me otherwise. And I'm talking about real machines optimally set up, not a crippled P4/Xeon against 4 or 8 optimized IBM chips Jobs had running under the hood at his conference.

well dude....LightWave is optimized for Intel chipsets, so I bet a Dual Xeon is still faster than the fastest dual Opteron when you decide to buy a work station.

Lightwolf
06-25-2003, 08:17 AM
Originally posted by policarpo
well dude....LightWave is optimized for Intel chipsets, so I bet a Dual Xeon is still faster than the fastest dual Opteron when you decide to buy a work station.
In 32-bit mode, rendering using LW, the Opteron is a tad slower than a dual Xeon, except for HyperVoxels, where AMD excels.

see
http://www6.tomshardware.com/cpu/20030422/opteron-23.html
for example.

Lightwolf
06-25-2003, 08:19 AM
Originally posted by policarpo
well...this is too funny...

who has pie on the face now?:p

http://apple.slashdot.org/apple/03/06/24/2154256.shtml?tid=126&tid=181
Yup, as I said, Apple's benchmarks aren't relevant, since I use Windows and most apps I use are compiled using the intel compiler, which optimizes.
Maybe Apple should get a decent optimizing compiler for their G5, so that they can play with the big boys on the official SPEC list. ;) After all, they said they could have used a better compiler, so why didn't they?

panini
06-25-2003, 12:51 PM
I'm not buying a machine one day to throw it away the following month.

Xeon being faster than 64bit AMD today means nothing. It's logical to assume that Lightwave will be optimized for 64 bit, maybe even V.8 and then AMD will be quicker.

Also lightwave is not the only app. used.

As for pie, it's definitely an Apple pie and it's on their faces. These excuses make absolutely no sense. It only makes Apple execs and R&D guys come off as retards ( maybe that explains why they have been so far behind all these years ).

Look at this excerpt from an interview with Senior Vice President of Hardware Engineering Jon Rubinstein and some Akrout guy:
These guys sound worse than Beavis and Butthead.

------------------------------
DMN: Now, you're saying it's the first 64-bit desktop machine. But isn't there an Opteron dual-processor machine? It shipped on June 4th. BOXX Technologies shipped it. It has an Opteron 244 in it.

Rubinstein: Uh...

Akrout: It's not a desktop.

DMN: That's a desktop unit.

Akrout: It depends on what you call a desktop, now. These… From a full desktop per se, this is the first one. I don't know how you really distinguish the other one as a desktop.

DMN: Well, it's a dual processor desktop machine, just like that one.

Akrout: It's not 64, then.

DMN: Yes, it's a 64-bit machine with two Opteron chips in it. It started shipping June 4th.

Akrout: They are both 64-bit, but as you know, the PowerPC is RISC architecture and they're more like, kind of CISC architecture. So there's that fundamental architecture difference. So there are some differences. You mentioned how they already have a desktop -- I'll have to double check that. I wasn't aware of that. What we've done here with the G5 -- it provides us with the first 64-bit architecture for the desktop.

Akrout: That we'll double check, but in my mind, it wasn't.


What the &%#$@*????????????

It's like Spinal Tap guy repeating: "but these go to eleven"
( apologies to those not familiar with the Spinal Tap movie; trust me though, it's a good joke )

These guys should be designing spatulas for McDonalds, not computers.

policarpo
06-25-2003, 01:00 PM
uuugggghhhh.....then don't buy one and enjoy your life to the fullest.

Piolla
06-25-2003, 01:35 PM
Originally posted by Karl Hansson
I for one (probably the only one on this forum) am very excited about the new G5.

You're not the only one. I'll get one as soon as they are here!

Lamont
06-25-2003, 01:40 PM
I am a PC user mostly (My B&W G3 350 is just for video capture now). But I am excited for these reasons:

1 - It's a step(leaps) in the right direction for Apple.
2 - Technology advances, regardless of platform are always good.
3 - Mac users need this. The Mac platform can do more than Photoshop...
4 - More applications will move over to OS 10
5 - Hopefully more hardware will move over as well.
6 - I love computers and tech regardless what platform it is...

policarpo
06-25-2003, 01:41 PM
and lets not forget about OSX.

it is really a cool and well built OS. it makes windows feel so archaic in some regards....

we were screwing around with iChat AV and doing peer to peer video conferencing and it was so amazing. it's going to revolutionize the adult entertainment industry....

Lamont
06-25-2003, 01:43 PM
Originally posted by policarpo
it's going to revolutionize the adult entertainment industry.... HAHAHAHA!!!

sketchyjay
06-25-2003, 02:16 PM
I just want to know how much faster they are than a G4. Nice to compare apples and oranges, but what performance gain will I get beyond the dual G4 1.5s we have?

There is another thing that is great about this that the PC users are missing....

the claim that the g5 is faster and 8 gigs of memory will only prompt one thing...

Intel to release the Pentium 5 and up the memory to 8 gigs too.

BTW, Intel is the leader of the pack and will stay there for the foreseeable future. If you ever read any of the articles on tomshardware, most of the Intel chips are crippled (unactivated hyperthreading et al.), so they have lots of hidden tech ready to turn on to boost the speed of their chips. They also wait until absolutely necessary to release their new chips.

I've also noticed they love to let AMD catchup then release something just a little faster. Nothing amazing but just enough to be faster than them.

A few years... well, a lot of years ago, I ran into an Intel engineer wearing a glass pendant with a Pentium chip in it. This was back when 486 chips were king. He told me that they have stuff way more powerful than what you see; they just release it as the market needs it, or as competition requires. So it will be interesting to see how they deal with the G5.

In any case, we ordered one for the office, maxed out on RAM (8GB) and HDs (1TB). I'll let you know when we get it in how it compares to the G4 1.5s we have.

Isn't there a benchmarking scene included on the LW CD? Maybe i'll do some G3s and G4s too, to have a nice range.

jay

policarpo
06-25-2003, 02:21 PM
very cool. definitely let us all know once that puppy arrives. :)

js33
06-25-2003, 02:25 PM
Originally posted by panini

Look at this excerpt from and interview with Senior Vice President of Hardware Engineering, Jon Rubinstein and some Akrout guy:
These guys sound worse than Beavis and Butthead.

------------------------------
DMN: Now, you're saying it's the first 64-bit desktop machine. But isn't there an Opteron dual-processor machine? It shipped on June 4th. BOXX Technologies shipped it. It has an Opteron 244 in it.

Rubinstein: Uh...

Akrout: It's not a desktop.

DMN: That's a desktop unit.

Akrout: It depends on what you call a desktop, now. These… From a full desktop per se, this is the first one. I don't know how you really distinguish the other one as a desktop.

DMN: Well, it's a dual processor desktop machine, just like that one.

Akrout: It's not 64, then.

DMN: Yes, it's a 64-bit machine with two Opteron chips in it. It started shipping June 4th.

Akrout: They are both 64-bit, but as you know, the PowerPC is RISC architecture and they're more like, kind of CISC architecture. So there's that fundamental architecture difference. So there are some differences. You mentioned how they already have a desktop -- I'll have to double check that. I wasn't aware of that. What we've done here with the G5 -- it provides us with the first 64-bit architecture for the desktop.

Akrout: That we'll double check, but in my mind, it wasn't.


What the &%#$@*????????????

It's like Spinal Tap guy repeating: "but these go to eleven"
( apologies to those not familiar with the Spinal Tap movie; trust me though, it's a good joke )

These guys should be designing spatulas for McDonalds, not computers.


Hehehehehe

That is funny. Where's a link to it?

So now all 64 bit PCs are no longer desktop machines?

Also even though the dual Opteron came out in June Apple is still first to the desktop with a 64-bit chip?

This is getting good. It's funny and exciting that new stuff is coming out from all sides now.

Well, Intel has had the Itanium for years, although it was targeted at the server market and the first version didn't go very far. But what about Itanium 2? I guess that will be replacing the Xeons soon.

Cheers,
JS

uberslayer™
06-25-2003, 05:59 PM
It's nice to keep your tools (components) in a pretty toolbox, but when the guy next to you has the power tools you become...........

panini
06-25-2003, 10:32 PM
That interview is on the front page of www.digitalvideoediting.com

I see he added a new negative article about the whole thing.

The interview is the second link on the front page.
Direct link is:
http://www.digitalvideoediting.com/2003/06_jun/features/cw_macg5_interview2.htm

Read the top article on the front page where he has direct comparison between AMD Opteron and G5 ( it's only one page and these are specs I saw a while ago on AMD's and other sites )

Both specs are provided by Apple and AMD ( and both "verified" by independent parties, Veritest in Apple's case, you can see on AMD's site who verified theirs if you care ).

And is anybody wondering why Apple refused to give him a G5 machine to test, and told him he'd have to wait until they shipped just like everybody else?

Maybe preorders wouldn't be so impressive if any independent testing were to be done prior to official release?

Rey
06-26-2003, 12:42 AM
Originally posted by panini
...I am waiting to see what 64bit Athlon does and most likely buying a dual workstation with those chips. It shouldn't be more than $2000 or if I upgrade by buying MB/Chp and memory possibly less than $1000.

So, I am a potential G5 buyer ( a person looking for a new workstation soon ), but I don't like spending $3000 on a machine slower than a $2000 one. Lying about all this doesn't help Apple's cause either..
Only seeing Lightwave, Photoshop, Vegas running faster on a G5 than on AMD or P4 platform will convince me otherwise. And I'm talking about real machines optimally set-up, not cripled P4/xeon against 4 or 8 optimized IBM chips Jobs had running under the hood at his conference.

I just configured a machine on BOXX's website with comparable specs to the high-end dual G5 PowerMac and it was over $4100! LOL! (and it still didn't have FireWire). Stop spreading FUD.

panini
06-26-2003, 06:01 AM
Did Steve Jobs help you configure it?

I have no idea of what you are talking about.

64 bit dual BOXX they have right now as a base model start at $2344. and it has 1GB of RAM. Apple's top of the line has 1/2 that. That right there is $250 worth.

To reach $4000+ you need to use the fastest chips and those are 35% - 60% faster than the dual G5 across the board, in all Spec tests Apple used.

That is NOT comparable in any way to G5. It is far better.
You need to compare the slowest Opterons to the G5.
In that case the BOXX is $300 cheaper compared to a dual G5 with 1 Gig of RAM. That is with a Pioneer DVD-R/CD-RW drive, which I only use for comparison. I already have a better DVD-R/CD-RW drive ( Sony ) than Apple's SuperDrive, and even if I didn't, I could find a better drive/deal elsewhere.

FireWire cards cost $29; is that really a big deal?
Almost everybody already has one or two anyway.

I'd bet pretty much anything that 64 bit Athlon based PCs will have very similar performance to top of the line Opterons and not cost more than $2500 with 1Gig of RAM and other basic components. G5 will be almost $1000 more and slower ( again ).

Also, do not forget: you are buying this BOXX today, while you can't have your Apple G5 until September ( even that is questionable ).
This Opteron may be much, much cheaper by then. This is another typical Apple trick: comparing nonexistent upcoming Macs
to the PCs of last month.

If we are going to go that route, lets talk about Pentium 5 coming out in the first quarter of '04. Same thing, it's due only a few months after G5 launch. Might as well.

policarpo
06-26-2003, 06:42 AM
dude...just buy a BOXX and be happy then. :)

they're great machines.

harlan
06-26-2003, 09:54 AM
If you're so anti-apple then why are you reading a thread in a forum for Apple users?

It's brazenly incompetent people like yourself that ruin the web/community experience for everyone else.

I'm quite certain that BOXX Tech. would prefer to have a "professional" rep them rather than an openly ignorant individual whose knowledge of computer functionality rivals that of a water buffalo.

I move that you be banned from posting until you at least grow out of your training bra.

policarpo
06-26-2003, 09:56 AM
wouldn't it be cool to have a "vote so-and-so off the island" feature on these boards? if 20 people think you're a royal pain, they could suspend your posting rights, or outright ban you...

that would be cool!

i think it would make for a healthier environment.

harlan
06-26-2003, 10:00 AM
hehehe...that would be kind of fun.

Rey
06-26-2003, 03:10 PM
Originally posted by panini
...64 bit dual BOXX they have right now as a base model start at $2344. and it has 1GB of RAM. Apple's top of the line has 1/2 that. That right there is $250 worth.

To reach $4000+ you need to use the fastest chips and those are 35% - 60% faster than the dual G5 across the board, in all Spec tests Apple used....

Okay, for sh!ts & giggles I'll config a G5 with 1GB of RAM.. let's see.. that's $3,249. That's still less than the $4100 of that BOXX config. I'll wait for REAL-world tests before I judge performance etc. I used the top-of-the-line Opterons (1.8GHz/800MHz bus) because they were the only ones that came close to the specs of the top-of-the-line G5s (2GHz/1GHz bus). Now if we use the time-tested method of rating chip performance by looking at GHz ratings, I would say the G5 is a better performer (how does it feel to be on the other side of the fence?).

Again, I must echo other posters' question: "why are you here?" Are things that stagnant on the PC board that you needed to go to where the "party" is at? If you like the Opteron BOXX's specs and pricing, then buy it and leave it at that. Start a thread in the PC forum about the config you plan on purchasing and rave about it all you want there. I'm sure you won't see any of us Mac users post there telling you to buy a G5 and that AMD CEO Hector Ruiz is a weenie. You'll find that we really don't care what box you use 'cause we'll be using a Mac.

Johnny
06-26-2003, 03:18 PM
Originally posted by Rey
Again, I must echo others on this boards question, "why are you here?" Are things that stagnant on the PC board that you needed to go to where the "party" is at?

yeah...this site is more about helping each other learn how to get better at Lightwave.

I visited the PC side a couple times to ask question, and checked my platform bias at the door. I think it would be nice if that convention could be observed.

J

luka
06-26-2003, 10:18 PM
this is a very interesting statement from Luxology

G5 Performance response (http://www.luxology.net/company/wwdc03followup.aspx)

policarpo
06-26-2003, 10:27 PM
Heheheh...nice how he didn't let slip what they were making huh?

Man..i wish Siggraph were tomorrow so we'd all know.

I hope some of you guys attending Siggraph will let us know what is what.

Hrmmm...NewTek....Play....PMG...Luxology...hrmmm....let's hope there's better luck at the end of this SF rainbow. :)


...if you guys don't know the story, some peeps from NT went and started Play (Trinity systems), some then left NT and started PMG, and now a bunch left and started Lux...i'm sure there are other stories of huge successes that i don't know about. :)

Lamont
06-26-2003, 10:33 PM
I'll be hitting every major booth (and a few of the little ones...) there to cover it for a cg web-site.

Siggraph is 10 mins from my house... this is sweet. I can get faded and stagger home. And I can unload the schwagg.:D

policarpo
06-26-2003, 10:38 PM
then definitely sign up for this and let us know what is what!

Secrets Revealed?? (http://www.desktopimages.com/ocf.shtml)

and revel in the schwagg!

Lamont
06-26-2003, 10:41 PM
Waaaayy ahead of you dude! I am up to my ears in registrations. And I have to man the booth for the company I work for for a little bit too...

I am sure I can weasel out of it. Weaseling out of things is what separates us from the animals... except the weasel.

policarpo
06-26-2003, 10:47 PM
I love weasels.

We own a ferret!

Weasel away!!!!! And be sure to go and bug Proton...take him a cool toy...he loves toys. :)

http://www.u.arizona.edu/~thummel/ferret.jpg

ferrets like to steal toys and tear off their arms and heads. :p

panini
06-27-2003, 04:52 AM
I'm here because I'm buying a new workstation by the end of this summer.

Apple is ( or looks more like it was ) an option because I'd heard good things about this IBM chip.

Just a couple of points regarding your last post:

1. 1.8 Ghz Opteron is NOT in any way similar to G5 as it is 35%-60% faster in ALL TESTS, those same tests Apple used.

2. By the time the G5 is available, 1.8 GHz Opterons will be mid- to low-end instead of top-end, so even those will be much cheaper than the G5. When a chip is released it usually retails for $700, and with the next release it drops to about $300; for a dual machine that is already $800 less right there by September.

Please DO NOT compare last month's Opterons and prices with future G5s. It only makes you look like you can't grasp the basics of this discussion. ( actually this chip has been available since April )

Current Opterons with similar performance to G5 are 1.4 Ghz and a dual machine with those is $300 cheaper than a Mac with same amount of RAM.
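For what it's worth, the dollar arithmetic in this post can be laid out explicitly. A quick sketch where every figure is the poster's own claim from this thread, not a verified price:

```python
# All figures below are claims from the post above, not verified prices.

boxx_dual_14 = 2344          # dual 1.4 GHz Opteron with 1 GB RAM, today
claimed_gap = 300            # "BOXX is $300 cheaper" than a dual G5 with 1 GB
implied_g5_price = boxx_dual_14 + claimed_gap
print(implied_g5_price)      # 2644

# Claimed chip-price pattern: ~$700 at launch, ~$300 once the next
# speed grade ships, so a dual machine sheds roughly $800 by September.
drop_per_chip = 700 - 300
dual_machine_savings = 2 * drop_per_chip
print(dual_machine_savings)  # 800
```

Whether those input numbers hold up is exactly what the thread is arguing about; the arithmetic itself is the easy part.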

rick_r
06-27-2003, 07:11 AM
Originally posted by serpicolugnut
The Graphics card situation will be the deciding factor on whether or not I will buy a G5 now or wait.

Jobs stated during the keynote that the G5 could use any of the AGP 8x "Pro" graphics cards on the market. To me, that sounds like the ATI FireGL or nVidia Quadro cards... I'd like clarification on this, as it's one of the key performance areas when using a 3D app like Lightwave.

Have you taken a look at those 3D Labs Wildcat cards with 512 meg of ram? Yeeowsah! Somebody please write a driver so we can use it in OSX! :D

mlinde
06-27-2003, 08:10 AM
Ok, I'm biting. Here's specs on a BOXX Opteron Workstation:

Dual Opteron 240
4 GB RAM (max)
nVidia GeForce 5600
120 GB HD
DVD-R Drive
3 FW Ports

$4054.00

And a comparable G5:
2x 1.8 GHz G5
4 GB RAM (from Apple)
nVidia GeForce 5200
160 GB HD
DVD-R Drive

$4,149.00

You figure the 40 GB is worth about $100? I do. Of course, both machines can go up in configuration and options. I went for what I hoped was similar configuration for price comparison. Oh, my bad though. The Opteron quoted is only 1.4 GHz.

Arnie Cachelin
06-27-2003, 10:36 AM
Do the BOXX opteron machines come with an OS that can handle more than 4gb of RAM, or actually use the 64-bit CPU, or is it just Windows 2k/XP?

mlinde
06-27-2003, 10:39 AM
Originally posted by Arnie Cachelin
Do the BOXX opteron machines come with an OS that can handle more than 4gb of RAM, or actually use the 64-bit CPU, or is it just Windows 2k/XP?
The hardware for the desktop machines caps out at 4 GB. However, the "Renderboxx," which is a 1U rack unit, can take up to 8 GB. Both of them can run a variant of Linux (they come with Red Hat) or Windows 2000/XP. Is it a customized version of Windows to allow 64-bit addressing? I don't know.

Lightwolf
06-27-2003, 10:39 AM
Originally posted by Arnie Cachelin
Do the BOXX opteron machines come with an OS that can handle more than 4gb of RAM, or actually use the 64-bit CPU, or is it just Windows 2k/XP?
W2K3 with 64bit AMD support is scheduled for September and in beta right now. I think a 64 bit XP for the single 64 bit Athlons is scheduled for December.
You could always load them with a 64bit Linux though :)
The BOXX machines support 8GB btw, but I assume 2GB DDR ECC Registered RAM modules are hard to get at the moment.
Cheers,
Mike

Johnny
06-27-2003, 12:45 PM
http://www.haxial.com/spls-soapbox/apple-powermac-G5/


I don't know about the technical ins and outs, but this article makes it sound as though Apple stacked the deck.

any thoughts?

J

panini
06-28-2003, 04:36 AM
Again, you guys are using the same deceitful tactics as Steve Jobs.

Prices are not the same.

A dual 1.4 GHz Opteron with 4GB of memory and a Quadro card ( why would you switch to the 5600? )

is about $3950

Dual G5 2Ghz ( you quoted a price for dual 1.8 ghz G5 it seems )
with same 4GB comes out to about $4750

So Mac is $800 more, by September same configurations will be more than $1000 apart.

I posted already 3 times that the top-of-the-line dual 2GHz G5 Mac is
most similar to the 1.4 GHz Opteron. Look up the scores for yourself on SPEC.org; you can find them on many other sites as well.

Even a single 1.8GHz Opteron beats the dual Mac in some of the tests,
as does any decent Pentium 4.
Only in Apple's own testing do these Pentiums score so low, and the Opteron for obvious reasons wasn't even tested.

Johnny
06-28-2003, 10:20 AM
http://www.macintouch.com/g5reader.html#jun27

marble_sheep
06-28-2003, 11:29 AM
Wow... this has to be close to some kind of thread length record, haha! Ok, I've watched this thread for a while and haven't said anything, so I guess I might as well jump in.

Panini-- you claim you are here because you are in the market for a new workstation. From the sounds of it, you have already made up your mind, and there is nothing that will convince you to go with a Mac. If that is the case, perhaps you should quit posting in the Mac forum. There are plenty of real-world tests that show how the G5 can at least hold its own against the Intel camp, even best it in some cases. Granted, there may be faster options from AMD... so was it deceitful of Jobs to omit that from the presentation? Yes. However... it's called MARKETING, my fellow Lightwaver. Every company does it. If the fact that Apple does it bothers you, as it appears it does, then don't buy a Mac, plain and simple. But, like I said, it looks like you've already decided that. So, buy your Opteron and be happy.

There's no way of knowing how the G5 really performs until it comes out, but I CAN guarantee that it will be faster than my current G4, and as a MAC USER, that's all that matters. I'm not going to switch over, so the only benchmarks I care about are how it stacks up against the current line of Macs. My current computer can hold its own against any PC of similar price, so I have no reason to think the G5 would be any worse. I know I don't have the fastest machine on the block, and I don't go around pretending I do, but at the end of the day I can get just as much done as the PC guy.

Am I bothered that Steve may have stacked the tests? Maybe. But it's not going to change the platform I use, and neither are your rants here. For me, and for most Mac users out there, it's the whole user experience that counts. If all of us only cared about raw speed, then we wouldn't use a Mac, but obviously there's something about the whole experience that keeps us coming back. If raw speed is what matters to you, then that's fine too. Different strokes for different folks in this case. Go with what makes you happy.

edit-- oops, take that bit about the length record back, i know i've seen some that are longer. didn't want to start sounding like steve jobs or anything. well people, let's post a whole lot more and make it happen, haha.

RonGC
06-28-2003, 12:23 PM
Actually, guys, some companies do have the G5 in their hands, along with the beta Panther OS. Pixar is one of them, and they publicly stated that the G5 was impressive and worthy of their RenderMan products. This simple statement speaks volumes. Who cares if the benchmarks aren't as fast as claimed? Remember, they were obtained with programs not yet optimized for a 64-bit operating system; as these programs are recompiled / rewritten, we should see some faster benchmarks.

Johnny
06-28-2003, 12:29 PM
I hate to be negative, but wouldn't Pixar statements about the G5 be a tad suspect, given that Steve Jobs is CEO of Pixar and Apple??

maybe it would have a larger impact if some other company, say, Alias, were to say that, or NewTek...?

J

Lamont
06-28-2003, 12:33 PM
Yeah, I also express concern about satellites...

and their beams...

RonGC
06-28-2003, 12:45 PM
Even if Steve Jobs is with Pixar, it still wouldn't make a lot of sense for them to go to the Mac at this time unless the tests were impressive enough to sway them. Finances rule the day :-)

Also, even if the machine gets beaten out by some future 64-bit Windows machine, I'm still staying with Mac. This machine is a big performance jump forward from the G4 and I have already preordered the dual 2GHz. I prefer the Mac experience over that of Windows any day of the week :-)

fxgeek
06-28-2003, 12:48 PM
Originally posted by Johnny
I hate to be negative, but wouldn't Pixar statements about the G5 be a tad suspect, given that Steve Jobs is CEO of Pixar and Apple??

maybe it would have a larger impact if some other company, say, Alias, were to say that, or NewTek...?

J

If Ed Catmull from Pixar says so, that's good enough for me. He is one of the most respected and leading scientists / artists in this industry, which probably wouldn't be where it is today if it wasn't for him and his colleagues. I don't think a man of his principles would lie about something like this, so don't read too much into the fact that Steve Jobs is his boss. Pixar has always been fiercely independent from Apple, otherwise we would have RenderMan by now.

And, sorry, but for most of the 3D industry Pixar holds more weight than anyone, and you have to assume that people who have been at the forefront of the 3d industry are beyond the childish crap that's gone on over these benchmarks.

mlinde
06-29-2003, 11:12 AM
Originally posted by panini
Again you guys are using the same deceitful tactics as Steve Jobs

Prices are not the same.

Hmm. You are very accusatory. I'm not going to get pissed off, but calling me deceitful is a bit heavy handed.

Maybe you should try "underinformed" or "misinformed" before you slander someone with a title like deceitful. Or maybe you should go crawl under a rock with your fancy, fast PC, and leave us alone.

Beamtracer
06-29-2003, 03:25 PM
It's quite bizarre that people so passionately want to "put down" the G5 at this early stage.

Speed comparisons with Windows boxes are irrelevant before the G5 hits the streets. But still people are ready to criticize the G5 regarding speed.

You'd have to wonder about the motives of people like this. What drives them to carry on like that before they've had any experience with the new Apple machine (or even seen one).

The Windows computer magazines and websites are full of stuff about how the G5 is not as fast as others. I've long thought there must be a financial incentive for many of these computer "journalists" to be so biased.

The G5 is a revolutionary machine because it heralds the new era of 64-bit computing for the masses. Despite the existence of 64-bit UNIX machines for many years (used for professional applications) the Microsoft Windows platform has not reached this milestone.

Could it be an issue of envy on the part of Windows-lovers?

Johnny
06-29-2003, 03:38 PM
Beam;

I can only speak for myself. I'm a Mac fan to the core, but it seems that for some time now, Apple has been a day late and a dollar short with respect to hardware.

The G3 was a high-water mark (didn't it get chip of the year way back then?) but the G4 seems like it was behind the 8-ball almost from the start..

even fans were saying that it wasn't quite what it was hoped to be.

I wonder about the remarks I hear from PC users, especially those who work in shops which use both PC and Mac...

Much as I love the Mac OS experience, I have to wonder why so many PC people swear by that platform. It can't be that Mac users are the only people on earth uniquely endowed with enough cerebral cortex to know that Mac is great..

From my reading, I can't remember a time when Mac hardware was CLEARLY out in front in terms of speed. There'd be the few Photoshop tests where the Mac would win, but not by huge margins.. PCs would win the rest, plus would vaporize Mac hardware in other arenas, namely 3D.

You are without question right in your statement that, until we have this G5 in our hands, we can't make a proper assessment.

I'm in the 'believe it when I see it' camp, with a toe in the Doubting Thomas camp.

but, I will belly up to the crow-eating table with relish if it turns out that the G5 is the killer it's claimed to be.

Johnny

js33
06-30-2003, 04:32 AM
I hope the G5 is all that, since Mac users have waited so long for a fast machine. The main reason Wintel machines are more popular is they are much cheaper and much more customizable than a Mac. You can buy a 2.5 GHz PC now for less than $500.
It would no doubt smoke the entry-level G5, which starts at $2000.
I'm on the fence also with the G5. I have an iMac which I rarely use and wouldn't want to invest $3000+ in a G5 to have it sit in the corner. I'm not saying either one is better than the other but I have most of my software on the PC already so it's not only an investment in hardware but the much more expensive investment in software that stops most people from switching. If more companies would do what Newtek did with 7.5 and give you the Mac and PC version the switch would not be as painful.

Plus the Wintel camp is gearing up for 64-bit as well so time will tell what to go with. Opteron is out now and Itanium 2 is supposed to be pretty good even though the first version didn't go anywhere.

Cheers,
JS

sketchyjay
06-30-2003, 04:44 AM
Originally posted by Johnny
Beam;

I can only speak for myself. I'm a Mac fan to the core, but it seems that for some time now, Apple has been a day late and a dollar short with respect to hardware.

The G3 was a high-water mark (didn't it get chip of the year way back then?) but the G4 seems like it was behind the 8-ball almost from the start..

even fans were saying that it wasn't quite what it was hoped to be.

I wonder about the remarks I hear from PC users, especially those who work in shops which use both PC and Mac...

Much as I love the Mac OS experience, I have to wonder why so many PC people swear by that platform. It can't be that Mac users are the only people on earth uniquely endowed with enough cerebral cortex to know that Mac is great..

From my reading, I can't remember a time when Mac hardware was CLEARLY out in front in terms of speed. There'd be the few Photoshop tests where the Mac would win, but not by huge margins.. PCs would win the rest, plus would vaporize Mac hardware in other arenas, namely 3D.

You are without question right in your statement that, until we have this G5 in our hands, we can't make a proper assessment.

I'm in the 'believe it when I see it' camp, with a toe in the Doubting Thomas camp.

but, I will belly up to the crow-eating table with relish if it turns out that the G5 is the killer it's claimed to be.

Johnny

The problem is one of competition. On the PC side you have Intel and AMD fighting tooth and nail for every ounce of power they can put out to beat the other. On the Mac side you have Motorola and IBM, but Apple will use one or the other, so the competition is not as fierce. If Apple made it so you could purchase upgrade CPUs and let IBM and Motorola duke it out, the Mac would stay neck and neck with the PC at all times.

A lot of PC users are speed freaks. High speeds and customization make the PC the machine to use if you love speed. If the fastest machine still isn't fast enough, you overclock it (the equivalent of pumping nitrous oxide into the CPU). They even have refrigerated and liquid-cooled cases for extreme overclocking, allowing some madmen to pump up their machines an extra 500MHz.

Flip through a Maximum PC mag and see the level of detailing and mods they do to the cases, and you'll see a lot of PC users are hot rodders who put their money into a computer instead of a car.



jay

Johnny
06-30-2003, 09:44 AM
I wonder why there've been no speed tests showing how the G5 and G4 compare.

I can think of three reasons:

1. the G5 would vaporize the G4, leaving nobody willing to buy remaining G4s still in the pipeline

2. the G5 isn't much faster than the G4

3. Apple thinks nobody's interested in that comparison

J

panini
06-30-2003, 02:26 PM
I don't think this is strange at all.

The amount of backlash is always equal to the amount of lies used to hype a new product.

AMD and Intel do not make big announcements like Apple.
Most of you didn't even know when AMD released the first 64 bit processor.

And sort of "I told you so". AMD just announced 2 additional lines of 64bit Opterons. The cheapest costing only $229.

Trust me, by the time the G5 comes out, equal performance will be available for over $1000 less on the PC platform.

About the Pixar guy. Isn't it obvious?

Johnny
06-30-2003, 02:44 PM
Originally posted by panini
The amount of backlash is always equal to the amount of lies used to hype a new product.

AMD and Intel do not make big announcements like Apple.
Most of you didn't even know when AMD released the first 64-bit processor.

Trust me, by the time G5 comes out equal performance will be available for over $1000 less on PC platform.

About the Pixar guy. Isn't it obvious?

well-said. that's the kind of thing that's been bugging me for awhile. LOTS of roaring, but then you open the cage, out comes a kitten instead of a Raptor. And c'mon: the BUCKS involved here!

I'd like to see less boasting, fewer fruity colors and optical plastic, less of a big deal about the industrial design of the BOX, and much, much more in the way of SERIOUS incontestable p-e-r-f-o-r-m-a-n-c-e.

J

toby
06-30-2003, 09:08 PM
"the amount of backlash is always equal to the amount of lies used to hype a new product"

So you're justifying slinging BS? Have you not known about Marketing BS since you were a kid?

Were you upset when you bought Budweiser, but you didn't get surrounded by a dozen chicks in string bikinis?

M$ is the King of Disgusting marketing, but when Apple uses Marketing, which may or may not be BS, everybody starts finger-pointing. What's up with that?

panini
07-01-2003, 04:15 AM
Hmm.... I don't see how you concluded from that that I justify slinging BS.

Marketing is usually lies and twisting of facts, but there is also a line which you can cross and you get into an area called false advertising ( which is also illegal )

I think all those people advertising quick weight loss systems and great abs in 2 minutes are obviously way beyond that line and should be in jail. But it seems nobody enforces those false advertising laws. Jobs falls in the same category. Backlash against this is not slinging BS.

There is enough anti-Microsoft sentiment to go around. I don't think you can say Apple is getting a raw deal here. The difference is that since Microsoft doesn't make hardware, they don't ever make these "even the slowest iMac is faster than the fastest PC" claims
(which Jobs did make when the iMac was released). Microsoft is more content to do things behind the scenes and buy out and destroy competition in different ways.

To me Microsoft is an evil empire and Apple is a wannabe evil empire. I don't see much difference other than the wannabe one is more desperate since it's behind.

I don't get upset if I take a sip of beer and models fail to appear around me, but I do get upset if it tastes like crap.

harlan
07-01-2003, 11:43 AM
Yeah, I much prefer taking a sip of a model and having beer suddenly appear around me. ;)

tallscot
07-01-2003, 02:31 PM
I haven't read this huge thread, so I apologize if this is redundant.

Comparing hardware, you use the same compiler. The same GCC compiler was used to show the G5 hardware is faster than the Xeon hardware.

GCC has been available on the Intel processor longer than on the PowerPC, so it is actually better optimized for Intel.

The argument that developers use the Intel compiler instead of GCC is countered by the fact that there are faster compilers on the Mac that are used instead of GCC too.

Seems like a lot of sour grapes here. Is Luxology in on this conspiracy too? They have a letter on their site addressing the criticism from PC users. Fact is, it's much faster on the G5.

toby
07-01-2003, 04:57 PM
panini, I'm sorry, it's just my opinion that you're slinging BS, I shouldn't have extrapolated further on that opinion.

What I mean is that you immediately believe that Apple's benchmarks are BS as soon as that claim is made, and dig for ways to show that it's more expensive per measure of performance - an argument that can go back and forth ad infinitum, participation in which shows bias, and fervent bias looks to me like BS.

Yes, Apple uses a different type of Marketing than M$, AMD and Intel, who only make part of a system. But it's no different than that of Ford or Chevy: "the most standard horsepower in its price range of domestic 4-cylinders in its category - on a cold weekday" - M$ has the opposite effect of competition on the market, leaving us with fewer choices, worse computers and worse software.

tallscot
07-01-2003, 05:26 PM
" What I mean is that you immediately believe that Apple's benchmarks are BS as soon as that claim is made"

I find it amusing you wrote that considering your sig.

http://www.newsmax.com/showinside.shtml?a=2003/3/13/112209

toby
07-01-2003, 05:36 PM
to be continued off-thread!

edit
well maybe he will turn private messaging or email back on...

Johnny
07-01-2003, 10:42 PM
http://maccentral.macworld.com/news/2003/07/01/reaction/

Red_Oddity
07-02-2003, 04:58 AM
Wow, I've been sick for half a week, I come back, and you guys are still on it?

Damn people, let's all wait till they actually hit the streets okay, then we can all go back to throwing mud again...

Shees...

Karl Hansson
07-07-2003, 09:41 AM
The opteron isn't such a big deal either:
http://www6.tomshardware.com/cpu/20030422/opteron-23.html#3drendering

Slower than a Pentium 4 2.6...

http://www6.tomshardware.com/cpu/20030422/images/chart_lightwave.gif

Darth Mole
07-07-2003, 10:47 AM
The G5 does reasonably well in these benchmarks, undertaken by some guy at NASA, who - presumably - doesn't bear any allegiance to any one system.

http://members.cox.net/craig.hunter/g5/

FWIW: I was down at an Apple press event in London last week, where I got a close look at the new G5. It is a beautifully architected piece of hardware. I know most pro LWers don't give a hoot about the box, but you certainly know that you've spent some serious cash here. Fantastic production values - and it looks much better 'in the flesh'.

Nice to see the IBM PowerPC roadmap goes up to 10-20GHz by 2008!

Lightwolf
07-07-2003, 12:21 PM
Originally posted by Karl Hansson
The opteron isn't such a big deal either:
http://www6.tomshardware.com/cpu/20030422/opteron-23.html#3drendering

Slower than a Pentium 4 2.6...

Well, that's Tomshardware to you :) They don't even tell you which scene they rendered, which can be crucial...
Actually, whether AMD or Intel leads depends a lot on the scene; http://www.blanos.com is still the definitive site on LW benches. You'll notice that AMD processors smoke Intel on volumetrics and basic renders, while the Intels catch up on GI and raytraced scenes...
Looking at Kite's report, the G5 would perform at about the same speed as a 2.6GHz Xeon as well...
Cheers,
Mike - waiting for the Opteron and the G5 on blanos.

toby
07-07-2003, 02:44 PM
"Well, that's Tomshardware to you They don't even tell you which scene they rendered, which can be crucial"

It comes in 3rd on 3 different programs! :rolleyes:

Lightwolf
07-07-2003, 03:25 PM
Originally posted by toby
"Well, that's Tomshardware to you They don't even tell you which scene they rendered, which can be crucial"

It comes in 3rd on 3 different programs! :rolleyes:
:rolleyes: As I said, it depends on the scene. Go over to http://www.blanos.com, check out the "variations" scene on all dual-processor machines, LW 7.5, and you'll get Athlons blowing Xeons out of the water. And the Opteron is basically a souped-up Athlon (+SSE2).
The Athlons have actually held up quite well in real life compared to Xeons; the Xeons only really started to take off once the 2.8GHz and faster CPUs came on the market...
As I said, give me _full_ LW benches, and I can judge the machines...
Cheers,
Mike

toby
07-07-2003, 03:36 PM
so you're going to ignore the C4D and 3DSMax benchmarks? One of the tests shows a single Opteron dead last. 3rd place, and not even a close 3rd, was its best score

Lightwolf
07-08-2003, 05:46 AM
Originally posted by toby
so you're going to ignore the C4D and 3DSMax benchmarks? One of the tests shows a single Opteron dead last. 3rd place, and not even a close 3rd, was its best score
I'm actually ignoring all of them, because I have no way of reproducing them or verifying them, and because Tom's has a pretty bad rep as far as benchmarks are concerned. All I really care about are LW and DF benchmarks (and there aren't any for DF or similar apps), so I stick to blanos.
Cheers,
Mike

toby
07-08-2003, 03:04 PM
The point was, it matters a lot less if they used the LW scene that Athlons do well on, when 2 other programs show a similar result. Anyhow, the Opteron came last after a slew of other Athlons.

All of us take benchmarks with several grains of salt, but we don't completely ignore them when they don't suit us - if you're waiting for benchmarks on blanos, why are you in this conversation? The rest of us are comparing machines based on what data we have, and not drawing any solid conclusions. So far all your posts are only meant to show the macs' weakness and/or pc's strengths, and only for the few things that you use them for:
"As I said, it depends on the scene" can and has been said for the mac also, but you don't give macs the same consideration.

Lightwolf
07-09-2003, 02:46 AM
Originally posted by toby
"As I said, it depends on the scene" can and has been said for the mac also, but you don't give macs the same consideration.
Actually I do. But there aren't any independent G5 benchmarks available either, except for Kite's small run of the raytracing scene, which would roughly equal a Xeon 2.6 / 2.8 GHz.
I could easily do it the Apple way, and quote SPEC which show the Opteron being far ahead of the Xeon, and in close range to the Power4, but that would be just as much BS :)
One thing that I miss with all those benchmarks being used is
a) documentation (Cinebench can be a great tool, but most sites only use the OpenGL part and post the results as "Cinebench Rendering", which is simply false), and
b) there are hardly any benchmarks valid for compositing or similar applications. Running Photoshop with a bunch of filters is not the same as running a 2K comp with 10 layers...
I am in this conversation because, since no other data is currently available, I think that the current G5 specs are exaggerated; that's the topic, right?
Cheers,
Mike ;)
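(To put a number on that "2K comp with 10 layers" point, here is a back-of-the-envelope sketch. The plate size, bit depth, and frame rate are my assumptions for illustration, not figures from the post.)

```python
# Rough data rate for a "2K comp with 10 layers" (assumptions:
# 2048x1556 film-res plates, RGBA, 16 bits per channel, 24 fps).
width, height = 2048, 1556
channels, bytes_per_channel = 4, 2
layers, fps = 10, 24

bytes_per_layer = width * height * channels * bytes_per_channel
mb_per_frame = layers * bytes_per_layer / 2**20
gb_per_second = layers * bytes_per_layer * fps / 2**30

print(round(mb_per_frame))      # 243 MB touched per frame
print(round(gb_per_second, 1))  # 5.7 GB/s just to stream the layers
```

Even before any filtering, that is far more data in flight than a Photoshop filter benchmark ever touches, which is why the two workloads stress a machine so differently.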

toby
07-09-2003, 03:30 AM
It's been established that the benchmarks were not lies or under-handed - they used the best looking benchmarks they could find, in the same way that you discredit benchmarks that you don't like - but you still believe that they were exaggerated.

Let's see, what do we have now...any benchmarks that show the mac as fast are lies, any benchmarks that show a pc as slow are lies, any benchmarks that don't use LW or the proper scene are lies or irrelevant for judging system performance,
the only benchmarks you'll consider are blanos, which don't exist yet, but you feel the need to tell us this again and again, and you claim to be impartial.

Lightwolf
07-09-2003, 04:19 AM
Originally posted by toby
Let's see, what do we have now...any benchmarks that show the mac as fast are lies, any benchmarks that show a pc as slow are lies, any benchmarks that don't use LW or the proper scene are lies or irrelevant for judging system performance,
as far as LW is concerned, yes. Just look at the shift from AMD to intel in performance during the last couple of LW releases on the PC. Since LW is for me the most demanding app, that's what I judge system performance with. And since Tom's hardware results don't correspond with real world experiences I can either blame the benchmarking or the real world.
As I said, if a G5 has the same price/performance ratio, I'd switch. I'm just careful on judging it.


the only benchmarks you'll consider are blanos, which don't exist yet, but you feel the need to tell us this again and again, and you claim to be impartial.
I'll never mention blanos again :) I'll never mention LW again either...
Howzabout SPEC then ? ;)
Cheers,
Mike

tallscot
07-09-2003, 10:22 AM
The dual 2 Ghz G5 is priced very competitively, in my opinion. It's cheaper compared to certain brands of dual Xeons. I don't know if you could build a dual Xeon for less. Maybe you could.

And IBM is not Motorola. The IBM 980 will ship this time next year at 3 GHz. Is the Xeon going to be 5.1 GHz then? That's how fast it would have to be.

IBM supposedly has a roadmap all the way to 20 GHz in 2010 with the IBM 990.

Johnny
07-09-2003, 10:39 AM
20 GHz!

how fast would a machine have to be for us to model at full render quality in realtime?

I guess it would depend on caustics, radiosity, etc...

I would expect such a computer to be tapping its toes waiting for me much of the time...

J

Teig
07-16-2003, 01:13 PM
I find this amusing!

I read a bunch of posts about how the G5 is gonna suck and such, lots of people from the PC camp bashing it. The PC crowd seems to have missed the really important point: the fact that they have been lied to by their beloved Intel. The G5's speed will be revealed for what it is when they ship. However, take a close look at the dual Xeon vs. the single P4: the P4 is about 75% of the speed of the dual Xeon machine. By the looks of it, a dual P4 would be about the same speed.

Red_Oddity
07-17-2003, 10:08 AM
I don't think P4s can be dual? (imagine that, me the PC geek not knowing this... shame on me... I've failed Overlord Gates...)

Also...Xeons have larger cache and are more stable overall...

But i wonder how long the P4 and Xeons are gonna be around...i mean...Intel will probably do something to kick AMD down a notch again...

tallscot
07-17-2003, 10:15 AM
There are two Xeons: a server Xeon and a workstation Xeon. The server Xeon has more cache than the Pentium 4. The workstation Xeon doesn't.

Both offer SMP ability. The Pentium 4 does not.

So if you want a dual Intel processor PC, you have to get a dual Xeon.

The Xeon's system bus is 533 MHz. The Pentium 4's is at 800 MHz now. The G5's is 1 GHz.

Lightwolf
07-17-2003, 10:30 AM
tallscot:
Almost - Intel has released dual Xeons with larger cache this week, so there are two types of dual Xeons out right now (although I assume the performance difference won't be too large when rendering...).
Bandwidth-wise you're right, although the theoretical king of the hill here is the Opteron with 24.5 GB/s (G5 7 GB/s, P4 6.4 GB/s).
One thing though: the G5 only does 7 GB/s in duplex mode, which is 3.5 GB/s per direction, while the P4 handles almost 6.4 GB/s in both directions, but needs small breaks when switching the direction of the data flow.
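(The arithmetic behind those figures, as a sketch: peak bus bandwidth is just the transfer rate times the bus width. The 64-bit / 800 MT/s P4 front-side-bus numbers are the commonly published ones, but treat this as an illustration rather than a measurement.)

```python
# Back-of-the-envelope peak bandwidth: transfers per second * bytes per transfer.
def peak_gb_s(mega_transfers_per_s, bus_width_bytes):
    return mega_transfers_per_s * bus_width_bytes / 1000.0

p4 = peak_gb_s(800, 8)        # 800 MT/s quad-pumped FSB, 8 bytes wide -> 6.4 GB/s
g5_per_direction = 7.0 / 2    # 7 GB/s full duplex -> 3.5 GB/s each way
print(p4, g5_per_direction)   # 6.4 3.5
```

The duplex caveat matters: a full-duplex 7 GB/s bus can only move 3.5 GB/s in any one direction, whereas a half-duplex bus can devote its whole peak to one direction at a time.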

Also, since this thread is coming back alive:
c't, a German computer magazine (actually, imho _the_ computer magazine), has posted a couple of SPEC results for the P4:
Same chipset, processor and RAM as the Dell used by Apple, same OS, same compiler, same flags.

Results:
Apple/Dell: int_2000base 889, fp_2000base 693
c't (HT off): int_2000base 961, fp_2000base 811
c't (HT on): int_2000base 943, fp_2000base 796

BTW, they don't slam the G5, but actually praise the architecture. The magazine isn't PC-biased either (and hasn't been for the last 15 years, since I bought my first copy).

Cheers,
Mike

tallscot
07-17-2003, 01:28 PM
Nice to see Apple was right that HT gave slower results and that is the reason they had it turned off.

If the only G5 benchmark available was a SPEC from Apple, I'd care about the SPEC numbers. Clearly, the SPEC numbers don't translate into the performance you will see running the OS and applications.

I don't know whether I should get a 2 GHz G5 next month, or wait for a 3 GHz next year. :)

Thanks for the info.

dfc
07-17-2003, 02:59 PM
Also, it's good to see apple doing more application tests than just PS.

After all the hoopla about their testing..etc..apple, at the macexpo in NY, reran a couple of the application tests using different apps.

They used Cubase for both mac and PC this time. (Cubase is a POS on the mac...buggy and poor performance).

And they used AE instead of PS for testing.

Results were the same...ie..G5 bested the Dells by a wide margin.

While this doesn't "prove" anything, it's at least a good sign that the G5's performance isn't all on paper... or just in PS. And there is some hope that this performance will translate to applications (even badly optimized ones)

And, also, I might mention that ALL of the app tests showed a much wider disparity in performance capability between the G5 and Dells than the SPEC tests did.

So, it looks promising. We'll see.

tallscot
07-17-2003, 03:45 PM
Good info. Is this first-hand experience?

Ed M.
07-17-2003, 06:35 PM
OK, my two cents...

First, why Lightwolf is crying "foul!" with respect to G5 performance is beyond me. The fact of the matter is that the benchmarks that have already been run were apps optimized and compiled for G4, not G5. Apparently, they show that the G5 can run current (i.e., non-G5-specific/G5-optimized) code pretty well. It's also obvious that the apps running on the Intel boxes HAVE been optimized for the Intel processors. Think that AE isn't optimized for the Intel machines? I mean I could bring Chris Cox back and he could explain it, but I think most people can see what I'm trying to point out.

Second, has anyone wondered why all the hoopla about the SPEC scores evaporated? Answer: The tests were conducted fairly and within SPEC rules. If there had been something disingenuous with the way the test was conducted, then SPEC would have come out and openly condemned the test and results. Nevertheless, for those interested in what SPEC is really about, check out this thread (it was beaten to death over at Ars):

http://arstechnica.infopop.net/OpenTopic/page?q=Y&a=tpc&s=50009562&f=48409524&m=7510952375&p=8


OK, now we can put all the SPEC BS to rest. It's a no brainer people. VeriTest followed the rules to a *T-ee* and since the test was fully documented... Yeah, you get the idea; besides, SPEC is synthetic anyway. Furthermore, it's infinitely abusable.

What's odd *now* is that a lot of the Wintelon folk were making huge accusations and false claims regarding this. It's now clear that VeriTest (and Apple) were indeed fair in their testing. How do the PC-folk respond? They don't. They don't even bring it up. I guess it's a form of security (in their choice of platforms) through obscurity (i.e., don't let people know about this). ;-)

Other benchmarks... Other non-G5-optimized apps have also shown similar results in outperforming the competition. Mathematica? JET3D? The most recent ones at the Creative Expo? I just know I've posted this on another board somewhere, I forget where (perhaps it was here), so I can't provide a simple link. Anyway, this is how I see it...

First, Apple developed the G5 in direct response to what the high-end workstation customers wanted. It's that simple. The userbase needed 64bit and Apple gave it to them. There is a relatively large userbase and the software support is already falling into place. We can't say that for the Wintelon crowd -- we really can't. Even though AMD-64 is around the corner [for desktops] (again), there are still a LOT of questions.

In a nutshell, AMD-64 looks rather peachy, but where is the trouble-free 64-bit Windows support? The Opteron really isn't meant for desktops.. it's a server chip. So, is the Opteron going to be for the desktop-user crowd or the server lot? It's not clear at all, and AMD still has to worry whether their processor will be widely adopted or remain obscure.. Similarly, where are the 64-bit apps? Who's producing them? Who is planning on supporting them? How long will it take them to get their wares to market? And more importantly (as I've also noted in earlier threads), where is the userbase for these things? Who's going to buy them with all these lingering questions? There are many other questions that we've already been over time and again. What about compatibility issues? Anyway... the truth is that Apple is delivering. App developers are starting to deliver. The OS is here *now* and working fine, and there is a nice userbase all ready and waiting for these systems. Just think about that. I'll lay odds that more G5s were sold in the last week alone than the total number of Opteron boxes sold since they became available.

Again, to hammer the point home regarding the G5 evaluations and the vector performance (astronomical) and other benchmarks... It's worth remembering that these preliminary evaluations were done with completely un-optimized code for the G5. When the G5/Panther-specific optimizations come along, expect even better performance. Yes, it's that simple.

It's a wonderful thing that a processor like the G5 can scale with frequency, especially given the new PPC architecture of the G5. The G5 scales nicely, as does the AltiVec unit with respect to performance. If you care to recall, this degree of scaling DID NOT occur with the transition from Pentium 3 to Pentium 4. Lower-clocked P3s were actually faster than P4s. And the simple truth is that the Pentium's vector performance can't hold a candle to AltiVec. The bus in the G5 is simply MONSTROUS. Any results will likely show unbelievable wins for the G5.

When the G5 is released it will lay to rest any skepticism regarding real-world performance. People will see.. and this is exactly what PC people *won't* be talking about. Synthetic benchmarks (SPEC) won't mean as much as they used to. I think a lot of Wintelon folk will be surprised. They're already howling at the top of their lungs. Will they be in denial? Who really cares?

It's likely that there will be many obscure, sneaky, underhanded tests being cooked-up by PC/Wintelon-centric sites and individuals attempting to counter any performance advantage that the G5 will show. I guess we'll have to live with it. Currently, there is no trickery, no magic spells, no illusions. The performance of the G5 is already evident with un-optimized code. It's up to the developers to make it *really* shine. On a similar note, there is a pattern that many people reading these threads should zero-in on. And I hope this sticks.

Why does everything always seem to come down to comparisons of scalar performance? Many people choose to ignore the glaring answer. The fact is that the performance crown for SIMD/vector was handed to the *G4* long ago. There is simply nothing in the Wintelon world that can compete with AltiVec -- not Intel, not AMD; no one. Yes, it's that good, and the G5 just extends that lead even further.

Regarding scalar, people should keep in mind that the PPC 970 was derived directly from the Power4. People with a clue already know that the Power4 is world-renowned for its scalar performance. What makes people think it's going to be any less effective in the 970/G5? What specific part of the 970 isn't spectacular compared to the G4, or even the Intel line for that matter? Can anyone think of something? Doubt it.

--
Ed

Lightwolf
07-18-2003, 03:37 AM
Ed:
Well, if the SPEC marks aren't "foul", why aren't they on the official SPEC list, right next to the other vendors?
Why are results from third parties using an almost identical set-up different?

As for the compiler: the gcc used by Apple is optimized for the G5, as can be seen by some of the flags they used.

And while it is true that gcc is well optimized for the intel architecture, this isn't quite true for the P4, since the architecture changed dramatically with that chip (as you acknowledge), and gcc is still trying to catch up.

SSE2 vs. Altivec: SSE2 calculates in double precision, AltiVec doesn't (even though there are tricks to use AltiVec for dp), in turn though the G5 has 2 fp units, where the P4 has only one.
If the code is properly vectorized (something gcc doesn't do, but icc does), fp will be much faster on a P4.
If you compare GFlops, the G5 wins using the fp units, but if you take into account SSE2 on the P4 and vectorized calculations the P4 wins (again, Altivec doesn't do double precision floats).

The Opteron is meant for workstations and servers, and clearly the G5 is aimed at the workstation market as well. AMD will also soon release the 64bit Athlon, boards are available now. As for the OS: W2K3 Server 64bit is supposed to come in September, WinXP 64bit in December, 64bit Linux has been available for some time.

As for the Power4, have a look at SPEC and the comparable performance of the Itanium2 and the Opteron (even though it plays in a different league). Bandwidth-wise the Opteron seems to be king of the hill currently, since memory access isn't penalized as much in multi-processing situations.

Still waiting for LW renders on the G5...
cheers,
Mike

tallscot
07-18-2003, 08:29 AM
I can't quite understand how someone can accept one set of benchmarks at face value, and deny another set of benchmarks.

There is, obviously, a lot of activity on Web forums right now about the G5. It's amazing how many people, PC users I'm guessing, say that all of the benchmarks produced on the G5 are bogus, but the SPEC numbers from AMD are accurate.

Charlie White, on Digital Video Editing Magazine, has an editorial where he scolds Apple for lying about the G5 being the first 64 bit personal computer and scolds them for their SPEC benchmarks. He then turns around and provides SPEC numbers provided by AMD. Huh? So Veritest is suspect, but AMD is not?

Charlie points to the BOXX dual Opteron as the first "shipping" 64 bit personal computer. Never mind that the BOXX comes with a 32 bit OS and has a limit of 4 gigs of RAM (32 bit), Charlie claims that it is the first 64 bit personal computer.

Something tells me that if the G5 shipped with a completely 32 bit OS (Jaguar and Panther are altered for 64 bit) and had a RAM limit of 4 gigs, Charlie would be calling foul about calling it a "64 bit personal computer".

So the Luxology, Mathematica, BLAST, eMagic, Photoshop, Quake III benchmarks from Apple are all bogus?

The posts by developers at WWDC and CreativePro Expo (whatever it's called today) that confirm the G5's speeds are also bogus, like the one with After Effects?

In any case, it's amazing how quickly the roles have switched - Mac users used to be the ones saying, "Just wait for...". Now it's the PC users. :)

You guys read the rumors about a "Dark Star" super server from Apple with up to 64 G5 processors?

Lightwolf
07-18-2003, 08:40 AM
Originally posted by tallscot
It's amazing how many people, PC users I'm guessing, say that all of the benchmarks produced on the G5 are bogus, but the SPEC numbers from AMD are accurate.
Actually, that's not what they're saying. But, official SPEC marks are created by vendors using optimized systems and compilers and should be regarded as such.
SPEC can tell you how a machine can perform in a very tight, optimized situation.
Seen from that point of view, the Veritest SPEC results are completely bogus, especially considering that other independents have gotten higher SPEC marks on almost identical P4 systems.
Of course, one can use the SPEC suite un-officially to compare systems, but you can't really claim to have the fastest system then either, only within the parameters that the systems were tested in. If these are bogus, so is the claim.

As for the 64bit part. Actually the DEC Alpha was the first 64bit PC (even sold as such for some time way back when), it never had a 64bit consumer OS though.
The Opteron can actually use more than 4GB in 32bit mode, but you need a quite expensive Windows version (the Xeons handle that as well). There is a 64bit Linux available though, and has been for some time.

Cheers,
Mike - still waiting for decent benches...

tallscot
07-18-2003, 09:13 AM
Actually, that's not what they're saying.

Yes, it is. Charlie White is just one example.

But, official SPEC marks are created by vendors using optimized systems and compilers and should be regarded as such...the Veritest SPEC results are completely bogus...especially considering that other independents have gotten higher SPEC marks on almost identical P4 systems.

Your implication is that Veritest's testing methods are a mystery, and that we can't duplicate what they did. The opposite is true. We know exactly what Veritest did. Anyone can download the PDF and duplicate the test.

You would have a point if the methods were undisclosed and we had to guess how they got those numbers.

But I give up - the Veritest benchmarks are bogus, the Tom's Hardware dual Opteron benchmarks are bogus, Apple's benchmarks are bogus...any benchmark that doesn't tell you what you want to hear is bogus.

Actually the DEC Alpha was the first 64bit PC...it never had a 64bit consumer OS though.

My definition of "personal computer" requires consumer applications.

Charlie White didn't say the DEC Alpha was the first 64 bit personal computer. He said the BOXX was.

Scot - ordered a G5 that is faster and less expensive than a dual Xeon.

Lightwolf
07-18-2003, 09:25 AM
Hi tallscot,

Originally posted by tallscot
Your implication is that Veritest's testing methods are a mystery, and that we can't duplicate what they did.
Actually, no. All official SPECs are well documented, that is part of the rules.
They are a mystery on the G5 side, and we can't duplicate that, true.


The opposite is true. We know exactly what Veritest did. Anyone can download the PDF and duplicate the test.
...and get different results, see my post above.

But I give up - the Veritest benchmarks are bogus, the Tom's Hardware dual Opteron benchmarks are bogus, Apple's benchmarks are bogus...any benchmark that doesn't tell you what you want to hear is bogus.
No, but every benchmark that doesn't give me all the details is bogus. You know, statistics and all.

My definition of "personal computer" requires consumer applications.
Hm, all right, the Alpha wasn't exactly a gaming platform in 95/96, but the Mac really isn't right now either.
And you could actually purchase those boxes in the largest computer retail store in Germany...

Charlie White didn't say the DEC Alpha was the first 64 bit personal computer. He said the BOXX was.
Well, see, he doesn't know everything either... :D Actually, I just found out a couple of days ago myself...

Cheers,
Mike - who'll buy the fastest machine he can afford once he has the dough left...

tallscot
07-18-2003, 09:38 AM
OK, who duplicated Veritest's SPEC? Do you have a link? I don't want to see a link that is "almost the same" or "close to what Veritest used".

No, but every benchmark that doesn't give me all the details is bogus.

What details are missing from the Veritest, Tom's Hardware, Mathematica, BLAST, Luxology, Photoshop, eMagic, Quake III benchmarks?

Hm, all right, the Alpha wasn't exactly a gaming platform in 95/96, but the Mac really isn't right now either.

The Mac isn't a gaming platform? That's news to me.

http://www.macgamer.com/
http://www.insidemacgames.com/
http://www.macgamefiles.com/

Did you read about how the Radeon 9800 gaming video card is available in retail now for Macs? You should tell ATI that the Mac isn't a gaming platform. :)

tallscot
07-18-2003, 09:53 AM
I just noticed that even with your German magazine's figures, the G5 is faster at FP. Obviously, the G5 will be pulling away with IBM ramping it up to 3 Ghz next year. The Xeon would have to be at 5.1 Ghz next year to keep the same separation.

Got a URL with the same Dell PC being used, duplicating the Veritest benchmarks? Got any detailed information for that? It's funny how you say you think any benchmark that doesn't give you all the details is bogus, yet you gave us benchmarks without any details. :)

Lightwolf
07-18-2003, 09:59 AM
Originally posted by tallscot
OK, who duplicated Veritest's SPEC? Do you have a link? I don't want to see a link that is "almost the same" or "close to what Veritest used".
http://www.heise.de/ct/ Issue 15/2003, alas not online and in German only.
And in terms of close, read my post above, the only difference is the brand of the motherboard.

What details are missing from the Veritest, Tom's Hardware, Mathematica, BLAST, Luxology, Photoshop, eMagic, Quake III benchmarks?
Veritest: they used compiler flags for the G5 that the current gcc release doesn't have, also, an optimized memory allocation library on the Mac only. It is hard to tell how that affects the system performance.
Tom's: Which LW Scene did they render? How many threads and what memory segment settings? (If you check out blanos you'll see that AMD vs. Intel differ from scene to scene).
Luxology used no AltiVec or SSE2 (in a case where SSE2 would probably really shine).
Photoshop has always had good benchmarks on Macs; however, most Mac people I know are amazed by how snappy it is on a PC compared to their Mac, so...
eMagic was vs. Steinberg (I think). Hm, let's compare Lightwave to mental ray.... There are too many unknowns as to the internals.
I haven't looked at Mathematica, QuakeIII nor BLAST (wtf is BLAST?)


The Mac isn't a gaming platform?
I was just wondering on how you define consumer software, gaming seems to be the first that comes to mind.
You're right, a few games actually do get ported to the Mac, some even close to the PC release date...

Cheers,
Mike ;)

tallscot
07-18-2003, 10:30 AM
And in terms of close, read my post above, the only difference is the brand of the motherboard.

The only difference? That's a huge difference. Check out benchmarks of various PCs with the same processor, chip set, and RAM, with different motherboards. The speed difference can be quite substantial.

Again, show me a single URL that duplicates Veritest's benchmarking and gets different results. It's not like they didn't use a PC from the #1 selling PC maker, readily available to anyone who wants to test it.

Veritest: they used compiler flags for the G5 that the current gcc release doesn't have, also, an optimized memory allocation library on the Mac only. It is hard to tell how that affects the system performance.

And GCC has Intel-only optimizations too, does it not? We are comparing GCC on an Intel processor that has been out for years compared to GCC on a brand new architecture, and you are complaining that GCC is more optimized for the brand new architecture? I don't accept that argument.

Tom's: Which LW Scene did they render?

The same one with the same settings.

Luxology used no AltiVec or SSE2 (in a case where SSE2 would probably really shine).

Show me a single URL that shows SSE2 outperforming AltiVec. There are many charts on the Web that show AltiVec optimization improving the Mac speed more than SSE optimization. Lightwave 7 is one example. Don't you remember Newtek's charts? I do. AltiVec made huge improvements, but SSE didn't do as well.

Maybe I'm missing something here.

Photoshop has always had good benchmarks on Macs

Why is that? Are you implying that Adobe is neglecting their Windows applications? Doesn't Photoshop have the same SSE optimization that you say would make the difference in Luxology?

I don't accept the "Photoshop has always been faster" position. How can a computer be slower, but Photoshop is always faster?

however most Mac people I know are amazed by how snappy it is on a PC compared to their Mac

I know a lot of Mac users with old Macs complaining about how "snappy" OS X is, who then get on new PCs and are amazed at how snappy the PC is. :)

eMagic was vs. Steinberg (I think). Hm, let's compare Lightwave to mental ray.... There are too many unknowns as to the internals.

Now apply that reasoning of yours to using different computers and compilers with a SPEC test.

Logic Platinum and Cubase were the two audio applications used. Your analogy with rendering is a bit flawed, no offense. How many simultaneous tracks can be played back is a basic statistic music professionals look at. The same goes for video editors - how many tracks of uncompressed FX and video layers can platform X or Y play back?

This statistic is important to music professionals, and looking at the G5's results would compel them to order a G5.

But Mike, we were talking about missing details that make a benchmark bogus. Veritest doesn't have any missing details.

I haven't looked at Mathematica, QuakeIII nor BLAST (wtf is BLAST?)

BLAST is DNA sequence matching software:
http://www.apple.com/powermac/performance/

Quake III is simple - play the same demo map on two systems at the same settings with the same video card. The G5 got 317 FPS, the PC got 313.8 FPS.

I was just wondering on how you define consumer software, gaming seems to be the first that comes to mind.
You're right, a few games actually do get ported to the Mac, some even close to the PC release date...

And many come out at the same time.

There are more games for the Mac than the X Box. Did you know that? Go to one of those Mac gaming sites and count the number of titles reviewed by them over the last three years. Now count how many titles are on xbox.com. It's easy enough - copy paste into Excel.

Lightwolf
07-18-2003, 10:37 AM
Originally posted by tallscot
I just noticed that even with your German magazine's figures, the G5 is faster at FP.
Yup, mind you though, gcc hardly uses SSE2 (as I mentioned), so in this case an Athlon should be even faster using gcc (one thing I'd like to see). The P4 can only beat the Athlon in fp number crunching due to SSE2 optimizations, if you just look at the fp unit without SSE2, the Athlon is faster (...and so is the G5 and the Opteron).

Obviously, the G5 will be pulling away with IBM ramping it up to 3 Ghz next year. The Xeon would have to be at 5.1 Ghz next year to keep the same separation.
Nope, intel will be at the next generation then. Don't forget, the G5 is 2003 technology, the P4 is from 2000.

Got a URL with the same Dell PC being used, duplicating the Veritest benchmarks? Got any detailed information for that?
Nope, I don't, otherwise I'd post it. But I guess the same chipset, processor and RAM as well as the same software set-up doesn't count...


It's funny how you say you think any benchmark that doesn't give you all the details is bogus, yet you gave us benchmarks without any details. :)
Go and buy the magazine, or do you expect me to translate a three page article? c't is almost the only source I trust for that kind of information, and this is due to reading the mag for 15 years, I know how carefully they test, and how open they are with their testing methods. Once the G5 is released, it will get an unbiased review there, and I'll keep you informed.

Cheers,
Mike

Lightwolf
07-18-2003, 10:54 AM
Hi tallscot,
before we go on flaming the sh*t out of each other ;)
- yep, the only difference. They used an Epox Motherboard with the same chipset, available for roughly 150$ at your friendly dealer. Nothing special.
- the gcc has great i386 optimizations, but the P4 optimizations suck big time, especially since it doesn't vectorize fp to use SSE2. The P4 does have a different architecture compared to its predecessors, so many of the old-school gcc optimizations don't help at all.
- as for the LW scenes: Check blanos and you'll see that it makes a difference.
- I'll get an URL for you as far as SSE2 is concerned. Show me a single URL that shows AltiVec doing dp floating point calcs natively.
- eMagic vs. Steinberg: how do they work internally? Audio mixing isn't audio mixing, just like video layering isn't video layering. How about a comparison of the VideoToaster vs. FCP? The Toaster does 5 or more layers in full quality including wipes in realtime. That does tell you a lot about the app, but little about the platform.

Cheers,
Mike

tallscot
07-18-2003, 10:58 AM
so in this case an Athlon should be even faster using gcc

Should, could, would...be consistent, Mike, and don't be happy with theories.

Nope, intel will be at the next generation then.

Which is? Prescott? That's not next generation, is it? It seems Intel is now adding cache since they are hitting the MHz wall. Correct me if I'm wrong.

So what will we see next year from Intel that will be a new architecture, and what are its specs, speeds, etc.?

Nope, I don't, otherwise I'd post it. But I guess the same chipset, process and RAM as well as the same software set-up doesn't count...

They count less than Veritest's SPEC numbers, so I guess they don't count much at all. :)

Go and buy the magazine, or do you expect me to translate a three page article? c't is almost the only source I trust for that kind of information, and this is due to reading the mag for 15 years, I know how carefully they test, and how open they are with their testing methods. Once the G5 is released, it will get an unbiased review there, and I'll keep you informed.

Fine, but their test is worthless because it's not the same computer. The motherboard makes a huge difference. You like Blanos, right? Go there and see how the same Pentium 4 with the same OS, the same memory, the same Lightwave version, the same Lightwave scene gets very different times.

So your criticism of the Veritest is unfounded, in my opinion.

Lightwolf
07-18-2003, 11:18 AM
Originally posted by tallscot
Should, could, would...be consistent, Mike, and don't be happy with theories.
You're right, unfortunately there aren't too many hard facts around. Actually, none. And there won't be unless somebody else besides Apple gets a chance to benchmark the machine (and no, Veritest doesn't really count, just look at how Microsoft "pays" for Windows vs. Linux benchmarks in other areas...).

Which is? Prescott? That's not next generation, is it? It seems Intel is now adding cache since they are hitting the Mhz wall. Correct me if I'm wrong.
So, are IBM hitting a wall too because they add more cache? Prescott is scheduled for Christmas, and that's what I meant in this context. It is next gen since many of the internals (like the branch prediction unit) will be revamped.

They count less than Veritest's SPEC numbers, so I guess they don't count much at all. :)
Well, unlike Veritest c't at least managed to post official SPEC results (for a different platform though, but they at least know the rules), something both Apple and Veritest didn't do.


Fine, but their test is worthless because it's not the same computer.
Now you start to argue like me :) But comparing two different softwares isn't worthless?


The motherboard makes a huge difference. You like Blanos, right? Go there and see how the same Pentium 4 with the same OS, the same memory, the same Lightwave version, the same Lightwave scene gets very different times.
Big difference here. Those are user machines, under real life working conditions.
Veritest did tune the G5 for the benchmark, didn't tune Linux for it, and are meant to be professionals.
I would get different scores on blanos too, probably penalized for running a CD player in the background, or reading my mails every now and then. Normally, the difference between mainboards with the same chipset is at most around 5% of the total score, usually closer to 2%.

So your criticism of the Veritest is unfounded, in my opinion.
It isn't in mine. So, at least we exchanged the arguments.
Have a nice weekend, I'm off!
cheers,
Mike

Ed M.
07-18-2003, 02:36 PM
Well, if the SPEC marks aren't "foul", why aren't they on the official SPEC list, right next to the other vendors?

You didn't read the Ars thread and SPEC has no problem with how the test was conducted. SPEC is NOT about MAXIMUM scores as people would think or as you so narrowly believe. Read the entire Ars thread.


the gcc used by Apple is optimized for the G5

No, it isn't. If you think about it, much more development has gone into gcc for Intel machines. Period. Read the entire Ars thread that's been through this already.


SSE2 vs. Altivec: SSE2 calculates in double precision, AltiVec doesn't (even though there are tricks to use AltiVec for dp), in turn though the G5 has 2 fp units, where the P4 has only one.


You don't want double precision in a vector unit. That's why it was better to add more FP units. Read the 11th post down here:

http://vbulletin.newtek.com/showthread.php?s=&threadid=2494


If the code is properly vectorized (something gcc doesn't do, but icc does),


Right, but how many shrink-wrap developers are using icc? Not many. And we all know much more time and optimization has gone into the Lightwave packages for Intel machines than has gone into the Mac version. Same with Photoshop. I can bring Chris Cox here to verify the latter if you wish.



fp will be much faster on a P4.

No, FP on the P4 will NOT be as fast as FP on the G5. This has been discussed on Ars already and it's been acknowledged.

As for the Power4, have a look at SPEC and the comparable performance of the Itanium2 and the Opteron (even though it plays in a different league).

Then look how well each system performs in the real world. The Power4 is meant to have continuous up-times of thousands of years (i.e., reliability). Server CPUs aren't really concerned with speed. Then there is the Power5 to think about. Its performance will be at least 4 times that of the Power4. There is nothing from AMD or Intel -- nothing on the radar.


Actually, that's not what they're saying. But, official SPEC marks are created by vendors using optimized systems and compilers and should be regarded as such.


That's wrong. It's a complete MYTH. SPEC doesn't only provide for "most optimum configurations." You're lying. Show me the official SPEC rule that says that vendors MUST use the most optimal configs for SPEC marks.

You obviously didn't read the thread over at Ars where people discussed this in GREAT GREAT detail -- people who have been in the industry using SPEC far longer than anyone on this thread and who are by far more familiar with the rules and code. Apple and VeriTest are 100% compliant. How about you ask *SPEC* if they have any trouble with the test?

That's why all the noise has died down. If the test was bogus then there would still be news all over the Web, and SPEC would have come out and debunked the claim. And Apple didn't base their claim of the "world's fastest personal computer" on SPEC alone. There are numerous other benchmarks, but you'd have us believe that they are ALL bogus?



SPEC can tell you how a machine can perform in a very tight, optimized situation.

And that's the norm? Are those the situations that most people are in while running their apps and hardware? People running Intel boxes are running in the most optimal config all the time? The apps are optimized to the fullest? Everyone is using Intel's reference compiler? Yep, you are wrong again. Go and read the Ars article. There is no need to regurgitate your bogus claims on this forum.


Seen from that point of view, the Veritest SPEC results are completely bogus, especially considering that other independents have gotten higher SPEC marks on almost identical P4 systems.

But theirs isn't subject to suspicion? And again, I'll ask you... How many shrink-wrap software developers use Intel's reference compiler? Let's be honest. Once you come back with a long and verifiable list, then we'll talk. The only people that SPEC is important to are people who sit around and run SPEC all day. And as anyone with a clue knows, SPEC IS NOT the main benchmark used to determine which system is the fastest. Nor does it imply that it will be the case 100% of the time.



Of course, one can use the SPEC suite un-officially to compare systems, but you can't really claim to have the fastest system then either, only within the parameters that the systems were tested in. If these are bogus, so is the claim.

<sigh> SPEC is not designed to be a definitive benchmark that shows "which system is fastest". Go READ the entire Ars thread. Your claims are completely bogus. If you don't believe me, e-mail the SPEC group yourself.



As for the 64bit part. Actually the DEC Alpha was the first 64bit PC (even sold as such for some time way back when), it never had a 64bit consumer OS though. The Opteron can actually use more than 4GB in 32bit mode, but you need a quite expensive Windows version (the Xeons handle that as well). There is a 64bit Linux available though, and has been for some time.

That's right, and none of those systems were marketed towards everyday consumers like mom and dad and auntie Carolyn. No one is going to be running Linux, and as soon as Apple offers Linux-app compatibility within OS X (if it's not already there) the question of a desktop Linux will erode even further... going from "how?" to "why even bother..."

Lightwolf, you keep telling us that there is a 64-bit version of Windows. Where? After you answer that, show me the price and then show me the price of the systems that have it preinstalled. After that, show me the apps. After that, show me the millions of units sold in that configuration.

OK, you keep telling us that there is a 64bit Windows version, but what you fail to mention is that it's for Servers and it's only aimed at Opteron. Is it more than just a beta release for Opteron? Link?

You also tell us that there is a 64bit version of *desktop* XP. May we see the link? Is it geared towards consumers? What about the price? What apps are available for it? How seamlessly will it run existing code? Link? I'll tell you what I think about the 64-bit debacle that the Wintelon world seems to be in.. You can read some past comments I posted about it here (12 posts down from the top) :

http://vbulletin.newtek.com/showthread.php?s=&threadid=1091

I'd like some feedback from others on this forum about the comments I made in the links I posted. tallscot, Beam?

--
Ed

toby
07-18-2003, 10:05 PM
I think you've got it pretty well covered Ed! If no one else is chiming in it's because we can't think of anything to add :)

very informative as usual

dfc
07-19-2003, 01:22 AM
I'm still reading all the links he posted...owwwww. My head hurts

I'll come back in a year or 2 after I digest it all.

Johnny
07-20-2003, 08:51 AM
Originally posted by toby
I think you've got it pretty well covered Ed!

yeah...that was a thorough liquification of the pro-PC argument..

J

iandavis
07-20-2003, 09:45 PM
Having a fast system is important. I mean, it is, up to a point.

I'm running a G4/733 for graphics and an Athlon1200 for LW.

When I try to use Photoshop on my Windows box I find it very frustrating. It seems to slow down at every corner. How could my G4/733 so OBVIOUSLY outperform this Athlon? Or how could LW on the Athlon perform SO MUCH BETTER than on the G4/733?

I believe that what you purchase when you buy a Mac is the SYSTEM. You have one of the most stable operating systems coupled with hardware manufactured FOR the OS. Raw rendering power has always been won by the custom-built Windows boxes. I don't see that changing. Right now, for raw speed, I could build a dual Athlon box that would kill any existing G4... but what about general use, stability, the hours and hours you spend installing crap on a Windows box, the constant updating, etc. etc.? My Mac is GLORIOUSLY independent. I can beat it with a big stick and it just keeps running! Sure, my Athlon runs LW better, but if I had to do everything on it... if I didn't own a Mac... FREAK, I would be frustrated.

he he

it's like arguing which breed of horse is better as a pet....

tallscot
07-21-2003, 10:59 AM
Ed,

Nice post!

silvergun
07-24-2003, 02:03 PM
Nice work my good man

DrVideo2
07-29-2003, 05:12 PM
They (Adobe, Luxology, Alias, etc.) can't all be liars.

Here are their quotes from "Developers Applaud Power Mac G5 Performance":




Adobe
“Thanks to the hard work of Adobe’s engineers, Photoshop performs twice as fast on Power Mac G5s, when compared to any other system we've seen from Apple,” said Greg Gilley, vice president, Graphics Applications Development at Adobe. “The future of Photoshop on the Macintosh platform is being geared around exploiting the power of Mac OS X and tapping the outstanding hardware performance of a new generation of Power Mac G5s.”




Luxology
“We are simply blown away with the performance we are seeing out of the chip and the incredibly wide pipes on the motherboard which allows our 3D technology to do more in real time than we ever thought possible,” said Brad Peebler, Luxology’s president and co-founder.




Pixar
“After running our RenderMan benchmarks, we can now say that the Power Mac G5 is the fastest desktop in the world,” said Ed Catmull, Pixar’s president.




Macromedia
“The combination of Macromedia products and the new Power Mac G5 from Apple is an ideal platform for developers to create great experiences,” said Norm Meyrowitz, Macromedia’s president of Macromedia Products.




Digidesign
“Processor power is one of the prime things that makes the magic of digital audio happen. The new Power Mac G5 is going to give our customers just what they want—loads of power and a great OS that really delivers,” said Dave Lebolt, Digidesign’s general manager and Avid Technology vice president.




Alias|Wavefront
“Alias|Wavefront is extremely pleased with the performance we’re seeing from our initial tests of Maya on the Power Mac G5,” says Kevin Tureski, general manager, Maya Engineering at Alias|Wavefront. “From dynamics to rendering, we’re seeing twice the performance with our application. Our customers will be thrilled with the Power Mac G5.”




Wolfram
“The Power Mac G5 is a scientific dream machine,” said Wolfram Research co-founder Theodore Gray. “Mathematica Version 5’s enhanced support for large-scale numerical linear algebra, linear programming, and PDEs, is a perfect example of what you need a machine as powerful as the Power Mac G5 for: Everyone who uses Mathematica, or should, would do well to look at the Power Mac G5 very seriously.”

Amadeus0
07-29-2003, 07:13 PM
Originally posted by mlinde
Ok, I'm biting. Here's specs on a BOXX Opteron Workstation:

Dual Opteron 240
4 GB RAM (max)
nVidia GeForce 5600
120 GB HD
DVD-R Drive
3 FW Ports

$4054.00

And a comparable G5:
2x 1.8 GHz G5
4 GB RAM (from Apple)
nVidia GeForce 5200
160 GB HD
DVD-R Drive

$4,149.00

You figure the 40 GB is worth about $100? I do. Of course, both machines can go up in configuration and options. I went for what I hoped was similar configuration for price comparison. Oh, my bad though. The Opteron quoted is only 1.4 GHz.

Ummm, the GFX card on the Opteron is about 20-40% faster than the one on the Mac. That would account for some of the price difference.

ALSO what is this about "AGP Pro 8x"?
Last time I checked (which was about 6 months ago), the current AGP spec (3.0) allows only for 4X speeds on any Pro card. Also, the "Pro" specs allow for 50-watt and 110-watt cards. What type of AGP slot is on the G5 motherboard? If it's not the Pro110-type slot, then a Wildcat won't do you any good.

Amadeus0
07-29-2003, 07:24 PM
Originally posted by Beamtracer


The G5 is a revolutionary machine because it heralds the new era of 64-bit computing for the masses. Despite the existence of 64-bit UNIX machines for many years (used for professional applications) the Microsoft Windows platform has not reached this milestone.



The problem I see about this talk of no Windows for 64-bit Opterons is...

1. Panther won't be out when the G5s hit the market. (If I'm wrong please let me know, but the last thing I saw in writing FROM Apple was that: G5 = Sept., Panther = Dec.)

2. It appears (according to Microsoft's timetable) that 64-bit Windows will be out at about the same time (if not a LITTLE earlier).

3. As for "64-bit for the masses", AMD will be there hardware-wise at the same time, and Windows will be there software/OS-wise at about the same time as well.

Amadeus0
07-29-2003, 07:28 PM
Originally posted by harlan
Yeah, I much prefer taking a sip of a model and having beer suddenly appear around me. ;)

Wait a sec.

Are you the "Harlan" of the VT group?

;)

How is your Film going?

policarpo
07-29-2003, 07:42 PM
Panther isn't going to be 100% 64-bit... it will just have hooks to 64-bit operations for specific applications to take advantage of... just so you know.

we're still a few years out from a complete 64bit OS environment...

Amadeus0
07-29-2003, 07:54 PM
Originally posted by policarpo
Panther isn't going to be 100% 64-bit... it will just have hooks to 64-bit operations for specific applications to take advantage of... just so you know.

we're still a few years out from a complete 64bit OS environment...

Yeah I know... :rolleyes:

Anyway, when will a 64-bit OPTIMIZED version of Lightwave come out (for Apple and AMD)?

That's the first question; "how fast is it?" will be the next one (and really the one for most people on this list).

policarpo
07-29-2003, 07:58 PM
true true...sorry...i didn't catch up on the thread...sorry....

:)

Ed M.
07-30-2003, 03:49 AM
You guys are forgetting... A full 64bit OS can be SLOWER a lot of the time. What Apple will provide in the meantime is a hybrid 32/64bit OS.

I look at it as "64-bit (speed) as you need it and only where you need it", because there have been examples of 64-bit versions of the same 32-bit app that actually run slower than the 32-bit version.

Now, the nice thing is that OS X will still provide access to the full amount of available RAM. Watch, it will be the smoothest transition yet. We've discussed the 32/64-bit speed thing on these boards a while ago. All you have to do is revisit those threads.

Regarding Windows-64bit... I'm still not convinced. That means that Microsoft will have to support "yet another version of Windows". And then there is the fact that Longhorn is due mid 2005. Are we to believe that Microsoft will be supporting a 32bit AND a 64bit version of Longhorn?

Regarding the BOXX systems... I'm having a hard time finding benchmarks and application performance evaluations or even simple reviews of these systems. You would figure that if the systems had been *shipping* since early June then *someone* must have tested it with a battery of benchmarks and applications.

I'm even having a hard time locating institutions that might be using these things. It's as if the company is saying "Boy, they're flying off the assembly line faster than we can keep up..." but there is nothing in the mainstream (or otherwise) that even suggests that this thing is all it's cracked up to be.

I'm really not trying to spark up a flamewar, but doesn't it seem just a little unusual that none of the mainstream (PC) review sites or technology/PC-based media sites have had anything to say about the "hands-on" experience with one of these things? If I've missed something, please post the links with the reviews and scores etc.

--
Ed

Ade
07-30-2003, 04:46 AM
http://home.comcast.net/~zeio/sig/3.jpg

Lightwolf
07-30-2003, 05:00 AM
Hi Ade,
now this is interesting. If you extrapolate the Xeon performance, the G5 is pretty much on par with a Xeon 3.2 (3:33 Min), and faster than a Xeon 3.06 (3:42 Min.).
I'd guess that extrapolating on the same processor is valid for rendering performance (at least judging from comparable benchmarks I've looked at).
So, not really the speed that Apple promised, but very fast, especially for the price. If they get a decent optimizing compiler to tickle the G5 a bit more (which can buy you another 10%-20%), we have a nice contender here...

Ade
07-30-2003, 08:39 AM
Put it this way: RenderMan isn't even optimized for the G5 yet. That's not to say we'll see another 20% (maybe we will?), but say to yourself: 3 months ago we would have been 1 minute slower!


the G5 steamroller has begun....!
And will newtek be there to steer?

Lightwolf
07-30-2003, 08:42 AM
Originally posted by Ade
...but say to yourself 3 months ago we would have been 1 minute slower!
I hope more than that, I'd expect the fastest G4 to render that image in 7 Minutes, if not more...

Cheers,
Mike

Johnny
07-30-2003, 08:45 AM
It'd be interesting to know whether both of those machines are dual-processor

sketchyjay
07-30-2003, 08:47 AM
Hasn't that always been the case? It seems that MHz for MHz, the Motorola and IBM processors have always been a little faster than Intel's. Even AMD's processors aren't quite matched with Intel's MHz for MHz.

The problem is Intel just goes "oh, they caught up" and then releases the next processor. As I mentioned before, and as they have hinted, they have nice powerful processors to whip out as they feel like it. I wonder if all the processor companies do this?


In any case a month to go before I get our G5. Can't wait to test it against the G4s which is really the only thing that matters.

Jay

Lightwolf
07-30-2003, 08:59 AM
Originally posted by Johnny
It'd be interesting to know whether both of those machines are dual-processor
Well, the Xeon is a dual processor, and the 2GHz G5 will AFAIK only be available in a dual configuration.

tallscot
07-30-2003, 10:36 AM
According to MS's PR about 64-bit Windows for AMD, it's supposed to be in beta in "mid 2003". No one is expecting it to go gold until 2004.

Also, Lightwave doesn't have to be 64 bit to take advantage of 64 bit addressing and data paths in OS X 10.2.7 (the OS that ships with the first G5s) or 10.3. Adobe has announced officially that they will have a free plug-in for Photoshop that optimizes it for the G5. If all Adobe has to do is issue a plug-in, it doesn't sound like a huge deal to optimize your app for the G5, but I'm speculating.

Regarding the BOXX systems... I'm having a hard time finding benchmarks and application performance evaluations or even simple reviews of these systems. You would figure that if the systems had been *shipping* since early June then *someone* must have tested it with a battery of benchmarks and applications.

Here ya go:
http://www6.tomshardware.com/cpu/20030422/opteron-23.html#3drendering

Just a reminder - the G5 will be 2.5 Ghz in January and 3 Ghz this time next year. Is the Xeon going to be 5.1 Ghz this time next year?

Ade, thanks for the image!

Ade
07-30-2003, 10:43 AM
I posted Opteron lightwave benchmarks somewhere in this forum...

As for NT making a G5 plug, I wouldn't hold my breath. They'd rather make a NewTek mouse pad for their BOXX buddies.

Lightwolf
07-30-2003, 10:49 AM
Originally posted by tallscot
According to MS's PR about AMD Windows 64 bit, it's suppose to be in beta in "mid 2003". No one is expecting it to be gold until 2004.
That's XP though, not W2K3 (But that is a server OS) .

Also, Lightwave doesn't have to be 64 bit to take advantage of 64 bit addressing and data paths in OS X 10.2.7 (the OS that ships with the first G5s) or 10.3. Adobe has announced officially that they will have a free plug-in for Photoshop that optimizes it for the G5. If all Adobe has to do is issue a plug-in, it doesn't sound like a huge deal to optimize your app for the G5, but I'm speculating.
Actually, you can't change an app's addressing to 64 bits just with a bunch of plug-ins. Adobe may have optimized plug-ins, but Photoshop won't be able to address more memory because of them (except if they use plug-ins for their memory management, which wouldn't surprise me looking at the abysmal memory management Photoshop has...).


Just a reminder - the G5 will be 2.5 Ghz in January and 3 Ghz this time next year. Is the Xeon going to be 5.1 Ghz this time next year?
That's Prescott (or the Dual version of that), not Xeon...
4.5 GHz would be enough, at least looking at the current renderman performance...
Doesn't intel have a launch at the end of the year?

tallscot
07-30-2003, 11:22 AM
Well, we were discussing which is the first 64 bit personal computer and the BOXX was cited by the PC advocates as the first. Obviously, 64 bit servers have been around for some time now.

Adobe was able to make Photoshop SMP with just a plug-in. They were able to make Photoshop AltiVec optimized with a single plug-in. According to their own press releases, they will have a plug-in that "optimizes Photoshop for the G5". The catch there, I suppose, is what they are referring to as "G5 optimized". But the rumor sites that had Photoshop 8 screenshots say that it is 64 bit. Hmmm.

The roadmaps that I see on the Web have the Prescott maxing out at 5 Ghz. It debuts at 3.4 Ghz in Q4 2003. But then we have to wait for the Xeons to be updated.

IMHO, if we see a 3 Ghz IBM 980-based G5 in June of next year, the gap will be widening between the two to the G5's benefit.

Lightwolf
07-30-2003, 11:44 AM
tallscott:
The same hardware, different OS, it depends on how you define personal computer...
As for the Adobe plug-ins: they only accelerated parts of the app; SMP in Photoshop is still a myth except for some accelerated filters and conversions...
They can probably accelerate filters for the G5, but I assume that PS 8 will really take advantage of the G5 (and might have some decent SMP for a change... at least they'll have pixel aspect ratios...).
As for the Prescott: it will have internal optimizations; after all, the P4/Xeon core is from 2000, so I'd expect a couple of speed-ups.
So far the G5 seems to be on par with the Xeon, let's wait and see what happens next....
Cheers,
Mike

redlum
08-02-2003, 01:23 PM
Originally posted by DrVideo2
They (Adobe, Luxology, Alias, etc.)

can't all be liars.

Here are their quotes: "Developers Applaud Power Mac G5 Performance

Adobe
“Thanks to the hard work of Adobe’s engineers, Photoshop performs twice as fast on Power Mac G5s, when compared to any other system we've seen from Apple,” said Greg Gilley, vice president, Graphics Applications Development at Adobe. “The future of Photoshop on the Macintosh platform is being geared around exploiting the power of Mac OS X and tapping the outstanding hardware performance of a new generation of Power Mac G5s.”

Luxology
“We are simply blown away with the performance we are seeing out of the chip and the incredibly wide pipes on the motherboard which allows our 3D technology to do more in real time than we ever thought possible,” said Brad Peebler, Luxology’s president and co-founder.

Pixar
“After running our RenderMan benchmarks, we can now say that the Power Mac G5 is the fastest desktop in the world,” said Ed Catmull, Pixar’s president.

Macromedia
“The combination of Macromedia products and the new Power Mac G5 from Apple is an ideal platform for developers to create great experiences,” said Norm Meyrowitz, Macromedia’s president of Macromedia Products.

Digidesign
“Processor power is one of the prime things that makes the magic of digital audio happen. The new Power Mac G5 is going to give our customers just what they want—loads of power and a great OS that really delivers,” said Dave Lebolt, Digidesign’s general manager and Avid Technology vice president.

Alias|Wavefront
“Alias|Wavefront is extremely pleased with the performance we’re seeing from our initial tests of Maya on the Power Mac G5,” says Kevin Tureski, general manager, Maya Engineering at Alias|Wavefront. “From dynamics to rendering, we’re seeing twice the performance with our application. Our customers will be thrilled with the Power Mac G5.”

Wolfram
“The Power Mac G5 is a scientific dream machine,” said Wolfram Research co-founder Theodore Gray. “Mathematica Version 5’s enhanced support for large-scale numerical linear algebra, linear programming, and PDEs, is a perfect example of what you need a machine as powerful as the Power Mac G5 for: Everyone who uses Mathematica, or should, would do well to look at the Power Mac G5 very seriously.”

I spoke to one of the NewTek guys (Deuce) at the Siggraph convention in San Diego last week, and he said Apple hasn't sent them a machine to test on yet. Let's hope they do this soon. I also brought my Lightwave Discovery CD to see if someone at the expo would let me try LW 7.5 on the G5. Nobody had the passwords to install it. So we will have to wait and see. I love the way they looked, and Photoshop was a blur as far as speed goes.

panini
08-03-2003, 01:30 AM
And anybody with experience in marketing could tell you that those comments are all paid for ( either cash or favors in exchange for a comment like that )

Just like all questions Jay Leno or Letterman ask guests are agreed upon and rehearsed before the show ( Yes, I worked on one of these a few years back ).

Next thing, you are going to tell me you believe that Survivor, Bachelor and all these shows are really for real?

The best we can do is take figures from Apple about G5 and ignore what they say about PCs, then take figures from Intel and AMD for their processors and ignore what the PC side says about Macs.

According to Apple's official figures and AMD's official figures ( spec ):

Dual 1.8ghz Opteron is 30-50% faster than a dual 2Ghz G5

And Opteron came out as the first 64 bit workstation in April, G5 is still nowhere to be found.

Ed M.
08-03-2003, 08:08 AM
OK people... Repeat after me (then keep repeating it until it sticks): "Since SPEC is infinitely abusable, it shows VERY LITTLE in terms of the system performance that will be witnessed in the real world with apps that everyone uses in a production environment." End of story.

The whole SPEC issue was combed over and over at ArsTech. SPEC isn't meant to show *just* the highest scores. IBM didn't post its benchmarks using their VisualAge compiler, which would have shown significantly higher scores. As far as the whole GCC argument goes, we've got a quote directly from one of the chief architects of the 970 as well as an architect working on GCC... (and yes, the links to the quotes have been provided in this forum before). The bottom line is that GCC on x86 has had YEARS more lead time and contains vastly more optimization than GCC on PPC does. The kicker is that GCC is hobbled EVEN FURTHER on the 970... And I quote (again):

From what Hannibal learned during his interview:

[[One interesting fact that I learned about gcc is that the Power4 and 970's peculiar group dispatch scheme and issue queue structure doesn't quite fit with gcc's internal model of what a processor should look like. As a result, the L1 cache latency number (and I think some of the other numbers, as well) in the machine description file had to be altered from their true value in order to get the best performance out of gcc. ]]]

OK, now for David Edelsohn's direct comment:

[[[David Edelsohn: ...and gcc is modeling this [i.e. the L1 cache latency] with one more additional cycle because we've seen a benefit... The gcc scheduler is not really designed ideally for a processor like the 970 and the Power4 and others, and that's a lot of what the IBM and Apple teams have worked on, due to the complexity of the processor with the dispatch groups and the whole way it dispatches and issues and what parts are in-order and what parts are out-of-order, trying to better instruct the compiler how to arrange code to match that. So there are certain places where we give it [i.e. gcc] information that's more ideal for what it needs to generate than for exactly describing the processor. So again that's a lot of what IBM and Apple have been working on... that was one of the things that we're continuing to work on to try to get the best performance out of the processor. ]]]

In other words, they've only just begun to fix GCC so it will work better with a new processor like the 970. PC people sure jumped to conclusions a little too quickly that time... So the whole crap regarding SPEC has quieted down. At least this guy has it *partially* correct:

The Smell of Fear (http://www.applelust.com/oped/amc/archives/amc030718.shtml)


Now regarding the Opteron systems... I'll ask these questions (again) <sigh>

Where are they? Has anyone come across any of these BOXX systems? Which companies are using them? What are they using them for? What OS are they using? Hell, what apps are they running? Sheesh...

What's more, I'm having a hell of a time finding benchmarks, application performance evaluations, or even simple reviews of the BOXX systems (which supposedly shipped as *desktops* from this company). Also keep in mind that AMD did NOT intend the Opteron to be used in a desktop setting. I wonder why? BOXX simply ignored what AMD said and attempted to make it into a desktop solution, and it doesn't seem to be faring that well against the Intel competition.

Now I know the test over at Tom's Hardware (or was it Ace's?) showed the dual Opteron getting HAMMERED by the Intel machine, but it still wasn't one of these BOXX systems.

For anyone with a brain, this whole "Opteron for the desktop was first" crap seems highly suspect. Not only is the Opteron aimed at a different market (according to AMD), but it was aimed at higher-performing CPUs than the 970 (Power4? Itanium?). So the fact that the 970 will hold its own against AMD's *best*, or even trounce it, says a LOT. Hasn't ANYONE from the PC-mainstream review sites had hands-on time with one of these systems? Perhaps they have and the results don't appear to be that good? I don't know; can someone out there provide me with some links to reviews of these systems? I'm *still* waiting.

Anyway, at least the G5s have been out in circulation for a while, and institutions like NASA have had a chance to tinker with prototypes. That's a lot more than I can say for the BOXX systems.

--
Ed

Lynx3d
08-03-2003, 01:24 PM
A few things...

Opteron being a server CPU à la Itanium and Power4... not really.
The "Hammer" design is simply what AMD sees as the future for desktops, workstations, and (entry-level) servers. And the G5 is proof that AMD is not the only one thinking 64-bit is good for desktops too.
And if you go to the AMD website you will find the words "Opteron" and "Workstation" side by side on the same line...

Opteron benches... I can't present you a BOXX PC with rendering benches, but here's an Opteron 144 system tested by xbit-labs (http://www.xbitlabs.com/articles/cpu/display/opteron-1_18.html)
As you can see, performance varies a lot... Lightwave pretty good, Max and Cinema not great. The question is whether they will use their SSE2 code at all on K8 CPUs; I haven't read an official statement about that yet, though LW quite obviously does.
If you search a bit on 2cpu.com you'll find various benches of dual and quad Opterons, but that's not really the place for graphics-related stuff.
Whether BOXX will actually be able to ship a dual Opteron workstation immediately, I don't know, since there are no retail dual boards with AGP yet; you'll have to ask BOXX Technologies.

As for the OS, currently you'll have to use what everyone uses: 32-bit Windows 2k/XP. (Or Linux, 64-bit but without 3D software, AFAIK.)
I don't know how far along Microsoft's x86-64 Windows is; most likely they will finish the server version first, just like the server boards popped up first, but that somehow happens with all high-end PC hardware, be it Athlon MP or Xeon boards...

And that VeriTest and SSE2 talk... AFAIK GCC still only uses SSE2 for scalar math; it does not automatically pack instructions/data into vector format. So enabling SSE2 won't really make a difference. (At least I couldn't notice any difference under Windows with SSE on my Athlon. I wanted to process bitmaps in single-precision FP, and after some research I found that with GCC I'd have to modify the object files containing the vector math at the assembly level and use those for linking.)

Whatever; as you said, currently both Opteron workstations and G5 systems are "paper tigers".

js33
08-03-2003, 05:05 PM
Originally posted by Ed M.
Hasn't ANYONE from the PC-mainstream review sites had a hand-on with one of these systems? Perhaps they have and the results don't appear to be that good? I don't know, can someone out there provide me with some links to reviews of these systems? I'm *still* waiting.
--
Ed

Really Ed,

We are all *still* waiting for G5 reviews and real benchmarks. :D :D

I don't think you looked very hard for Opteron reviews. ;) Here's a few I found.

Tom's hardware review of the Opteron vs. Xeon (http://www6.tomshardware.com/cpu/20030422/index.html)

The Opteron at Siggraph 2003 (http://www6.tomshardware.com/business/20030730/index.html)

Launch press release (http://www.dvformat.com/2003/06_jun/news/boxx063.htm)

AMD review site (http://www.amdzone.com/)

IBM displaces POWER chips for Opterons. What gives? (http://www.theinquirer.net/?article=10786)


Anyway just search google for more. It's all out there.

Cheers,
JS

panini
08-03-2003, 05:31 PM
As far as I understand BOXX was SHIPPING Opteron workstations on June 2nd

I believe Lightwave was demoed on one of these at Siggraph; everything ran in real time, as opposed to the crawling 10 FPS demo Luxology did with their app when the G5 was announced.

Here you go:

http://www.amdboard.com/opteron_board_arima_hdamb.html

BOXX systems supposedly ships with that board

Here is another system:

http://www.armari.com/system.asp?SysID=183


If you know where to look, it's not difficult to set up one of these. I've got one running in the other room. For me, since I upgraded from 1GHz, everything is much faster. I upgraded the MB/CPUs and memory myself. I mostly use Vegas right now, and previews don't slow down even with 5-6 different effects/layers.

Believe it . They are out there working.

G5 is the one nowhere to be found.

Ed M.
08-03-2003, 05:55 PM
Js33, panini... Perhaps you guys misunderstood. I'm looking for REAL concrete benchmarks running applications and software. I've seen NONE. I know about the links above. They aren't BOXX systems. You would think if the performance was all that, it would be posted EVERYWHERE on the web. The news of these totally awesome systems should be everywhere... There isn't any.

Regarding NewTek ... if they've run any tests / apps, then it should be no problem having someone from their dev group post something to this forum backing up your claims. I'd like someone from NewTek to come here so we can discuss the Opteron performance... More specifically, the BOXX systems. And remember, if they are running Windoze, they aren't 64bit and they are not taking advantage of any of the 64bit features.. and I'm not talking about memory addressing. So, again, I'll ask, where are the benchmarks?

Now, I might know a few people that are actually doing development with dual Opteron boards, so I can get the skinny on them, but I'll wait to see what you guys have to say/post first so I can check it against my other sources. BTW, you guys are creaming over these Opteron systems so much you fail to see that these AMD systems have one glaring flaw. I know the Achilles heel and I'm guessing that developers know what it is too. Do you?

--
Ed M.

js33
08-03-2003, 06:03 PM
Ed,

I think Newtek was using an Opteron system at the Sig demos.
I'm not 100% sure so you will have to ask Deuce or Proton but they were Boxx systems.

Also I'm not really creaming over the Opterons. I just pointed out where you could find out more. That's all. Also the Opteron vs. Xeon link had plenty of application benchmarks even a LW one.

I am waiting for the Itanium 2 as I think it will mop the floor with both the Opteron and the G5 when the speeds get up to the 2 Ghz level. They are at 1.5 Ghz now and climbing.

I know they may take a little longer to get all the 64 bit stuff ironed out but I can wait. :D :D

Cheers,
JS

panini
08-03-2003, 06:19 PM
Those benchmarks are real, there is even Lightwave there.

The bottom line for me were rigged demos by Apple

One of the demos shown was with G5 running a 22 track music piece against a 3Ghz + PC running 22 tracks + 5 plug-ins in Cubase.

PC crashed while mac performed flawlessly.

This is one of the absolute proofs that the PCs were crippled and the demonstrations were rigged.

Anybody who has ever used Cubase knows that a 3GHz+ Pentium 4 can easily run 3-4 times the number of tracks and plug-ins that were supposedly too much to handle at that demonstration by Jobs. I was using more than 22 tracks on my old 1GHz Pentium 3.

js33
08-03-2003, 06:39 PM
Yeah I was very skeptical of the Cubase vs. Logic test as well. :D

Cheers,
JS

js33
08-03-2003, 07:19 PM
I want one of these.:D

SGI Altix (http://www.sgi.com/servers/altix/configs.html)

That would make a sweet render machine now that LW has a Linux render module. But typical with SGI they are very expensive.
:D

Cheers,
JS

Lamont
08-03-2003, 07:23 PM
Great kugga moonga!!

I wanna see some benchmarks with that thing...

js33
08-03-2003, 07:39 PM
Lamont,

Can you imagine almost real time rendering of an entire movie?
This thing will scale up to 2048 Itanium2 processors.:D :cool:
Oh, if only I had Bill Gates' money, I would buy an SGI running 64-bit Linux.:D :D

Cheers,
JS

Lamont
08-03-2003, 07:42 PM
Someone has to have bought one of those...

Ed M.
08-03-2003, 07:57 PM
Skeptical or in denial? I'm telling you guys.. You're gonna be sadly disappointed. The G5s are going to perform a lot better than people expect, and this will be on existing code written and optimized for G4, not G5. Tweaking it for G5 won't take too much effort and besides, the G5 completely lacks the major flaw in the Opteron.

Anyway, about a year ago I was in an e-mail discussion with a cross-platform developer. In one of our exchanges I mentioned my disappointment because at the time it was covered by the media that Hollywood seems to be going completely to Linux on Intel...

The reply to me was:

[[[No, they're not,and not completely. They're *using* Linux on Intel for now, but not as much as you think or as much as they'll have you believe. It's just that the Linux and Intel people keep making press releases to give you that impression.

I have quite a bit of inside information on what's really happening in that space.
But it's covered under NDAs. Believe me, things are happening and many people are going to be surprised. They will become evident in time.]]]

I'm guessing that this person must have known about G5 and Pixar and what their plans were. The person also has contacts deep inside ILM and other 3D houses. This person was always so vague when I brought up future Apple development plans. I was reassured on countless occasions that Apple was working on addressing all the issues regarding speed once and for all. Obviously the person was indirectly referring to the G5 and Apple's relationship with IBM. And keep in mind that the G5 is just the beginning.

The smell of fear indeed.

--
Ed

js33
08-03-2003, 07:57 PM
They just came out in May, and yes, a lot of manufacturing and oil and gas companies have bought them. I don't know if any media companies have bought one yet, but it's only a matter of time. :D

Cheers,
JS

js33
08-03-2003, 08:01 PM
Originally posted by Ed M.

Anyway, about a year ago I was in an e-mail discussion with a cross-platform developer. In one of our exchanges I mentioned my disappointment because at the time it was covered by the media that Hollywood seems to be going completely to Linux on Intel...

The smell of fear indeed.
--
Ed

HHHHHHHHehehehehehehe
I guess you haven't seen this yet Ed.:D :D :D

SGI Altix (http://www.sgi.com/servers/altix/)

The smell of fear INDEED.

Cheers,
JS

Lamont
08-03-2003, 08:02 PM
Originally posted by Ed M.
The smell of fear indeed.Smell of fear? I would love to buy another Mac.

I just wish Apple would let other companies make mobos and other bric-a-brac for the Mac architecture; maybe then people could make their own for half the cost.

js33
08-03-2003, 08:10 PM
Yeah, opening up the hardware more would boost their profits, and allow users more flexibility and choice, like in the PC world.
Who wants a workstation-class machine with a crappy ATI card?

Cheers,
JS

Beamtracer
08-03-2003, 08:15 PM
Originally posted by Ed M.
AMD systems have one glaring flaw. I know the Achilles heel and I'm guessing that developers know what it is too. Do you? What is it, Ed? What is the flaw with AMD 64? (Hey that rhymes!)

I'm surprised people are drooling over Itanium systems. The Itanium is Intel's Achilles heel. It must be an embarrassment to them. All your 32-bit apps will have to run on an emulated 32-bit processor, which will be as slow as an old dog.

js33
08-03-2003, 08:20 PM
Originally posted by Beamtracer
What is it, Ed? What is the flaw with AMD 64? (Hey that rhymes!)

I'm surprised people are drooling over Itanium systems. The Itanium is Intel's Achilles heel. It must be an embarrassment to them. All your 32-bit apps will have to run on an emulated 32-bit processor, which will be as slow as an old dog.

Well the Itanium 2 is making fast progress. Supposedly they have a 32bit emulation now that runs almost as fast as native 32bit.

Look at this Beam. :D

SGI Altix (http://www.sgi.com/servers/altix/)

Talk about a cluster. :D :D

Cheers,
JS

panini
08-03-2003, 09:38 PM
The only fear I smell is the fear of the person who wrote that article.

It's so full of lies and misinformation that it couldn't be more obvious it was written by an Apple fanatic who had just %$^# his pants after realizing how lame the G5 really was. He's in complete denial and spewing nonsense.

Opteron has no serious flaws and Opteron is just the beginning.

If you want to compare G5 which still isn't available to something then we can also speculate on upcoming Athlon 64 and Pentium 5.

Intel can dump a faster chip at any time, it just doesn't make any sense for them to do so until others catch up.

G5 is dumbed down IBM server chip and AMD technology, so in essence it's yesterday's news PC even before it's out.

G5 will follow the traditional Apple curve. When iMac came out it sold very well, then as time went on, faster and faster ( and cheaper ) PCs put it in its place.

Opteron is here.

It's up to 50% faster at same mhz

Apple's bus is 1GHz; the Opteron doesn't need a bus, since its CPU communicates with memory at the CPU speed. This means 2GHz, twice as fast as the G5 (2GHz Opterons are coming out this week).

Beamtracer
08-03-2003, 09:43 PM
Amusing post (above)



Originally posted by panini
If you want to compare G5 which still isn't available to something then we can also speculate on upcoming Athlon 64 and Pentium 5
Well, as the world migrates to 64-bit computing, the Pentium 5 should be left out of the discussion as it's hobbled by the RAM limitations of 32-bit processors.

Apple and AMD will be producing systems that can access 8 gigs of RAM. As applications draw on this RAM, Intel will be left in the cold.

Originally posted by panini
Intel can dump a faster chip at any time, it just doesn't make any sense for them to do so until others catch up.
So you recognize that Intel is a monopolist and will only release better products if they see some competition?


Originally posted by panini
G5 is dumbed down IBM server chip and AMD technology, so in essence it's yesterday's news PC even before it's out.
That's funny. Show me a 64-bit Windows that can allow an application to address more than 4 gigs of RAM. Intel's 32-bit processors (complete with antique x86 functions) are yesterday's technology.

You're forgetting that for most of its history Intel has produced the slowest processors in the personal computing world.



Originally posted by panini
2Ghz Opterons are coming out this week
Great. But wait... where's Microsoft? Where's your 64-bit OS?

Sorry, but Microsoft controls this game and they'll bring out their OS when they feel like it. Current timeline is next year some time, but when did MS ever release a product on time?


JS: I had a look at SGI's box. Whatever happened to their MIPS chips?

At least UNIX based OSes are able to handle multiprocessing more efficiently than MS Windows.

So... Intel says their 32-bit emulation is "almost as fast" as a real processor. I think Intel will be forced to add native 32-bit processing onto the chip. Either somehow dock it onto Itanium, or redesign a completely new 32/64 processor. Nobody's going to go for this emulation stuff.

You say that Intel is "fast making progress" with the Itanium? I see no evidence of this. Also, Hewlett-Packard has "bet the farm" on Itanium. They'll be in trouble too.

If AMD makes some inroads into Intel's market share that'll be a good thing. It's not good for anyone if there is a monopolist dominating the scene.

js33
08-03-2003, 10:10 PM
Hi Beam,

JS: I had a look at SGI's box. Whatever happened to their MIPS chips?

They still use them. :D
SGI Mips (http://www.sgi.com/workstations/)
Mips server (http://www.sgi.com/origin/3000/overview.html)

At least UNIX based OSes are able to handle multiprocessing more efficiently than MS Windows.

I have to agree with you there at least for now. :D


So... Intel says their 32-bit emulation is "almost as fast" as a real processor. I think Intel will be forced to add native 32-bit processing onto the chip. Either somehow dock it onto Itanium, or redesign a completely new 32/64 processor. Nobody's going to go for this emulation stuff.

You say that Intel is "fast making progress" with the Itanium? I see no evidence of this. Also, Hewlett-Packard has "bet the farm" on Itanium. They'll be in trouble too.

If AMD makes some inroads into Intel's market share that'll be a good thing. It's not good for anyone if there is a monopolist dominating the scene.

Well I didn't know until recently that anyone was even making Itanium 2 workstations or clusters. :D
It came out at 900 MHz and it's already up to 1.5 GHz. I don't know much about them yet myself, but I've started looking into them more now. Also, you can't count out a company that could hold all of Apple in its pinky. :D
I agree they seem to be caught with their pants down at the moment. :p But I expect that will change very soon.

Cheers,
JS

Lynx3d
08-03-2003, 11:10 PM
Well, HP has been making dual Itanium 2 workstations since the beginning of this year...
http://www.hp.com/workstations/itanium/
I read an article about it in iX (a German magazine).

Well, that 32-bit emulation being as fast as native code... that's wishful thinking, from all I've read so far. It's just there to have some argument to buy Itanium at all. It's fast enough to work with old apps, but won't nearly match any up-to-date PC.

And there is virtually no content-creation software for IA64, no matter which OS, at least at the time the article was written.
I really wouldn't buy an Itanium system for 3D, unless you're using POV-Ray, because it seems to perform great on Itaniums.

I still wonder what that major flaw of Opteron is...

maksei
08-03-2003, 11:13 PM
Windows XP 64-Bit Edition supports up to 16 GB of RAM...

http://www.microsoft.com/WindowsXP/64bit/default.asp

Beamtracer
08-04-2003, 12:39 AM
Originally posted by maksei
Windows XP 64-Bit Edition supports up to 16 GB of RAM...

http://www.microsoft.com/WindowsXP/64bit/default.asp
That version of Windows won't run on any of the new AMD processors.

Those who use the new AMD64 processors won't have any 64-bit Windows to run on them. Microsoft is currently doing a go-slow, probably to help its buddy Intel.

Apple will have at least a six month lead with their 64-bit systems, possibly more if Microsoft moves even slower.

panini
08-04-2003, 02:27 AM
Excuse me, but Apple is already behind.

I find it ridiculous that a Mac user is asking me, "Where is your 64-bit Windows?", when you guys have no processor or OS yet, and won't have a 64-bit OS for a while.

I have 64-bit Opterons in the other room, and a beta of 64-bit Windows has already been demonstrated at various shows and that million-man LAN party. It runs just fine on Opteron systems.
What I don't hear about is G5 running any 64 bit OS.

If you really want a beta of 64-bit Windows, it isn't too difficult to find it somewhere.

The Opteron is supposed to speed up 32-bit applications as well. That is one of the advantages it has over the Itanium.

And Intel wasn't the one making the slowest processors. That honor goes to Motorola and Apple.

Of course, every corporation is a monopolist at heart and will only release better products if it sees some competition. But why buy slow, overpriced junk from wannabe monopolists like Apple when you can buy fast machines from already established monopolists?
(I did actually buy from another wannabe monopolist, AMD, but at least it's really fast.)

Darth Mole
08-04-2003, 03:01 AM
The big problem with ALL of these frankly tedious arguments about Intel, Itanium, Opteron, AMD et al is that they ignore one serious factor: you still have to use that piece of crap OS from Microsoft.

If AMD/Intel/whoever made a super-fast machine that could run OS X and all my Mac apps, I'd buy it. But they don't. So I buy the fastest Mac I can and - hey! - I don't care if it's faster than the fastest PC. It's a pointless, facile argument - the fastest machine is probably some sort of secret alien-technology super-cluster buried in NORAD's mountain HQ.

Personally, I don't buy Steve Jobs' 'world's fastest desktop machine' pitch. But I couldn't care less: the PC world exists over there somewhere, in my peripheral vision. I've bought a dual 2GHz G5, 'cos it'll be about three times faster than my old G4. And it still runs OS X and it still runs all my stuff - including LightWave. Which is what this forum is supposed to be about...

Beamtracer
08-04-2003, 03:22 AM
The G3 processor was faster than any of Intel's offerings at the time, both in clock speed and the amount of work done in each clock cycle.

Same with earlier processors that appeared in Mac systems. The 604, the 601. Faster than Intel in every aspect.

Only in the G4 era did Apple slip behind Intel in clock speed.

Now it's only days before G5 systems hit the shelves, along with an OS that can use 64-bit registers to address 8+ gigs of RAM.

No point defending Microsoft about being late with their OS. Sure, MS have displayed a buggy beta version of Windows for AMD64 processors, but it has not been released to the public. MS themselves said it won't be before next year. You'd think a company with such a massive amount of cash could do a bit better.

AMD has been let down badly. Windows users have been let down. Microsoft is the only one to blame.

js33
08-04-2003, 04:09 AM
Originally posted by Darth Mole
The big problem with ALL of these frankly tedious arguments about Intel, Itanium, Opteron, AMD et al is that they ignore one serious factor: you still have to use that piece of crap OS from Microsoft.

If AMD/Intel/whoever made a super-fast machine that could run OS X and all my Mac apps, I'd buy it. But they don't. So I buy the fastest Mac I can and - hey! - I don't care if it's faster than the fastest PC. It's a pointless, facile argument - the fastest machine is probably some sort of secret alien-technology super-cluster buried in NORAD's mountain HQ.



Yeah here it is...:D
SGI Altix (http://www.sgi.com/servers/altix/)

Oh and you can run any variant of Linux on AMD or Intel if you want. :D

Cluster baby! :D

panini
08-04-2003, 08:23 AM
After playing around with my friends' Macs I swore never again to complain about Windows.

Win 2000 Pro hasn't crashed on me in about a year. Vegas, LightWave, and Flash run fine; I can't ask for anything more.

Mac users here constantly complain about NewTek not giving them enough attention, Modeler freeze bugs, and so on. On the Macromedia board, Flash users whine about the Mac version being neglected too.

I think it's time for you people to understand that it's not NewTek or Macromedia: it's your overhyped crap Mac OS that causes these problems. You didn't get some of the most basic features until OS X, memory management and parallel processing, things that Windows had 6-7 years ago. There is a reason Macs are used in schools more than PCs. They are computers for children.

It's very easy to solve all these problems. Dump Steve's piece of crap and buy a real computer.

Karl Hansson
08-04-2003, 08:42 AM
I run LW on Mac OS X every day and I can't remember the last time it crashed on me, much less froze. I haven't had a single freeze in any application since I installed Mac OS X almost two years ago.

What is up with all your hate? The only children around here are the ones who act like children, no names...

mlinde
08-04-2003, 09:10 AM
Gee, would one of you geniuses with the inside scoop on everything let me know a couple of things?

Does Bill Gates wear boxers or briefs?

Does Steve Jobs wear socks made of cotton or wool?

Does anyone here worry more about their art than their hardware stiffy?

Intel processors can... <blah blah blah>
AMD processors do... <blah blah blah>
G5 processors will... <blah blah blah>
IBM knows... <blah blah blah>

From what I see here, you guys all have your wife or husband, or your best friend's mother's second cousin's sister's best friend's uncle, working in R&D at all these chip manufacturers, so when can I buy a computer that will do everything for me while I sit back and sip a cup of espresso?

I can't believe that any of you spend all this time arguing about whose dad can beat up the other dad. I thought this sort of thing went out of style in high school.

Oh, and I use my Mac when it suits me and my PC when I need to. Whichever happens to be the best tool for me at the moment.

Does that mean I have two dads who can beat up all of your single dads?

Karl Hansson
08-04-2003, 09:21 AM
AMEN to that!

Red_Oddity
08-04-2003, 09:55 AM
Whooohooo...and here we go AGAIN!!!

Let 'm rip guys....

redlum
08-04-2003, 03:10 PM
Originally posted by panini And anybody with experience in marketing could tell you that those comments are all paid for ( either cash or favors in exchange for a comment like that ) . . . .

Oh brother. :rolleyes:

redlum
08-04-2003, 03:19 PM
Originally posted by Beamtracer That's funny. Show me a 64-bit Windows that can allow an application to address more than 4 gigs of RAM. Intel's 32-bit processors (complete with antique x86 functions) are yesterday's technology.

You're forgetting that for most of its history Intel has produced the slowest processors in the personal computing world.

Great. But wait... where's Microsoft? Where's your 64-bit OS?

A 64-bit Windows box will crash and freeze faster than a 32-bit Windows box. 'n dat's a fact, jack! :)

Ed M.
08-04-2003, 03:37 PM
To tell you the truth, I think Apple should stick with a 32/64-bit hybrid OS for as long as possible. Remember, most common, everyday apps will run SLOWER as 64-bit versions. By offering an OS of this type, Apple gives developers the option of using 64-bit when and where it makes sense for a performance boost.

Keep in mind that every G5 test so far has been performed using code compiled for the G4, and using a compiler (gcc) that still needs massive attention before it can handle the G5 properly. That it runs unoptimized code so well tells me that what lies ahead can only extend the G5's lead that much further. And I'll lay odds that we'll see a whole slew of G5s being benchmarked when they start to arrive in people's hands, before we see a ton of BOXX systems being tested.

--
Ed

PS: How about all of you that own a BOXX rig post some benchmarks so I can pass them around to a few developers for review and confirmation?

Beamtracer
08-05-2003, 01:10 AM
Hi Ed... you didn't tell us what the flaw is with AMD.

ecjc97
08-05-2003, 06:45 AM
http://www.theinquirer.net/?article=10853

rick_r
08-05-2003, 02:01 PM
Originally posted by panini
After playing around with my friends' Macs I swore never again to complain about Windows.

Win 2000 Pro hasn't crashed on me in about a year. Vegas, LightWave, and Flash run fine; I can't ask for anything more.

Mac users here constantly complain . . .blah blah blah . . .It's very easy to solve all these problems. Dump Steve's piece of crap and buy a real computer.

I think you've lost your way young man. You must have clicked on LW-MAC by mistake. Try this link to the LW-PC forum. And have a nice day.

http://vbulletin.newtek.com/forumdisplay.php?s=&forumid=26

"Don't go away mad, just go away." Blondie

panini
08-06-2003, 11:25 AM
No, actually I am trying to show you the way.

The way to the truth and reason. Follow and you will forever be freed from the lunacy that is Apple ( Barbie PCs at server prices )

Karl Hansson
08-06-2003, 11:49 AM
<<< Follow and you will forever be freed from the lunacy that is Apple ( Barbie PCs at server prices )>>>

The only truly free people are those with an open mind.

Johnny
08-06-2003, 11:50 AM
Originally posted by panini
No, actually I am trying to show you the way.

The way to the truth and reason. Follow and you will forever be freed from the lunacy that is Apple ( Barbie PCs at server prices )

OK, hold on a minute..

If you're going to quote hard-core data, and cite examples of actual experience you've had, accompanied by a more professional dissenting approach, you have my eyes.

But the second your approach decays into the dogma of the Superior, Enlightened One, I automatically think: "Wanker."

You may as well just say: "Mac users are doodieheads"

C'mon, pal! Grow up! Read the thread title! I'm sure there are p00p-fests elsewhere on the web.

sheesh!

Johnny

dfc
08-06-2003, 12:27 PM
Ars Technica has a pretty good overview of both the IBM 970 and the AMD Opteron.

After reading through both of them... it would seem the Opteron is also fully capable of handling legacy 32-bit apps (without even a recompile) and runs them just fine.

I think, from reading that, he also says that not only will it run legacy 32-bit apps, but in some ways will run them better, faster, than a standard 32-bit chip.

That was my reading of it anyway..simplified as that is.

As far as I can see, as the chips themselves go, the Opteron has similar attributes to the IBM 970 as far as legacy software goes. You can run full 64-bit apps... or you can recompile 32-bit apps to just take advantage of more memory... or you can run legacy 32-bit apps with no recompile, just as they are, with no performance overhead loss.

It's the same 32/64-bit capability as far as apps go, as far as I can see, whether it's Windows/Opteron or OSX/IBM 970.

I think the questions might have to do more with the OS than with the chips themselves.

I guess I'm missing the point, Ed, of where this is all different. Since the AMD Opteron is fully capable of running 32-bit legacy apps with no recompile, just as the G5 and OSX can, I fail to see how that presents a different roadmap for them that's full of bumps. What's different, as far as running legacy 32, hybrid 32/64, or full-on 64-bit apps goes, between them? As far as I can see, from the info presented on Ars Technica, unless I'm misreading it somehow, both chips are fully capable of this.

And it has the same options, i.e. it runs legacy apps (non-recompiled) with little to no performance loss (unlike the Itanium), it can have 32-bit apps slightly recompiled to take advantage of more memory addressing (just as G5 OSX will), OR it can run full-on 64-bit apps (just as the G5 and OSX will).

Can you expound on that a little more? I.e., as to where the difference is that will cause bumps in the road for 64-bit Windows under the Opteron/Athlon?
And how will the new Windows versions play into this?


Thanks
d

Ed M.
08-06-2003, 03:51 PM
Anyone catch this (http://translate.google.com/translate?u=http%3A%2F%2Fwww.macup.com%2Fnews%2Fnews.php%3Fup%3D1060075085&langpair=de%7Cen&hl=en&ie=ISO-8859-1&prev=%2Flanguage_tools) ?

--
Ed

Johnny
08-06-2003, 04:04 PM
Originally posted by Ed M.
Anyone catch this (http://translate.google.com/translate?u=http%3A%2F%2Fwww.macup.com%2Fnews%2Fnews.php%3Fup%3D1060075085&langpair=de%7Cen&hl=en&ie=ISO-8859-1&prev=%2Flanguage_tools) ?

--
Ed

Interesting... OK, what I'm getting through the murky translation is the bit about shoving more data through the pipe; developers have choices as to how to handle the G5: do nothing/reap benefits, do something/reap more benefits.

Finally, once software is optimized for the G5, we're looking at a 2x performance gain. Is this about right?

J

Beamtracer
08-06-2003, 05:23 PM
That German article seems to suggest that if application developers do nothing they will get some speed improvements with the G5. If they completely recompile their apps they'll get a lot more speed improvements.

Ed M.
08-06-2003, 05:27 PM
Anyone notice the price of DVD Studio Pro 2?

:D

--
Ed

Beamtracer
08-06-2003, 08:15 PM
Originally posted by Ed M.
Anyone notice the price of DVD Studio Pro 2?
Yes, I noticed the price, and that it has been rewritten in Cocoa. Now if you speak to any programmer who has written an application in Carbon, they'll tell you that there's no use rewriting it in Cocoa. They say the APIs are much the same now, and there'd be no speed advantage.

But here we have Apple doing a rewrite of DVDsp from Carbon to Cocoa. Interesting.

Ed... did I miss it somewhere... did you say what AMD's deficiency is? I'm still intrigued.

dfc
08-06-2003, 10:14 PM
Ed...Beam,
Here's an interesting test from France between opteron 140 and 146 vs PIV 3.2 and Athlon.

Not sure if you've seen it or not.

http://babelfish.altavista.com/babelfish/urltrurl?url=http%3A%2F%2Fwww.x86-secret.com%2Fpopups%2Farticleswindow.php%3Fid%3D89&lp=fr_en&tt=url


If that link works, it should pull up the translated page.

Ed M.
08-07-2003, 02:27 PM
This article: http://zdnet.com.com/2100-1103_2-5060856.html says a lot of what I've already been saying.

Key points to zero-in on are:

- Microsoft will release a version of Windows optimized for the chip in the fourth quarter of this year or the first quarter of 2004

- The beta version of the operating system will become available in "late Q3," he added. That's slightly later than the midyear release of the beta Microsoft had promised earlier

- Microsoft declined to give a timetable for the final release of the product, but it reiterated that the beta would come out "toward the middle of the year."

- Developers will also have to tweak their products. Epic's first game designed specifically for 64-bit desktop computing won't come out for nearly two years

This gives Apple plenty of lead-time.

--
Ed

Beamtracer
08-07-2003, 03:45 PM
Hi Ed... I guess your earlier cryptic post meant that AMD's Achilles' heel is Microsoft.

MS is always the gatekeeper of the industry. Even for Apple.

MS decides what technologies will be allowed and what ones won't. Sometimes MS will just create a delay to send a message out that their preferred technology is something else. In this case, Intel.

I think it is true that MS could easily have wiped Apple off the face of the planet. They need Apple there to keep the Department Of Justice off their tail.

MS paid their "political donation" fees to certain political parties and bought their way out of the previous DoJ trial.

The "justice" in "DoJ" means that political parties are able to influence the decisions of a court of law, after the defendant on trial pays the political party money.

By the way, both political parties in the US duopoly readily accept "donations" from companies that they make decisions about. It's called corporate government. Most people seem to accept that this is OK.

Over in Europe, the EU may still impose penalties on Microsoft.

mlinde
08-07-2003, 03:56 PM
Originally posted by Beamtracer
Now if you speak to any programmer who has written an application in Carbon, they'll tell you that there's no use rewriting it in Cocoa. They say the APIs are much the same now, and there'd be no speed advantage.

But here we have Apple doing a rewrite of DVDsp from Carbon to Cocoa. Interesting.


Just pondering here, but isn't it possible that Apple re-wrote DVD Studio Pro in Cocoa because they want to shed any outdated legacy code from the previous application?

A complete re-write has two distinct effects on software code:
1) It introduces all new bugs
2) It (usually) improves application speed due to optimization in re-writing the code.

There may be other effects of a re-write in Cocoa that we don't know about as well.

Ed M.
08-07-2003, 05:04 PM
No, Beam, Microsoft is only *part* of it. Go rehash some of my other posts... I laid it all out in great detail before and hinted at what I (and a few others at this point) believe to be a significant setback. I'm holding off on any further speculation until I get some more detailed information on it. I could be completely wrong, you know.

However, if you'd like a clue as to where I'm nosing around, remember first that...

The Opteron was never meant to be the driving force that would bring X86-64 to the masses. The Athlon 64 is meant for that, and it will be an arguably lesser-performing chip (and it isn't even here yet). It really doesn't matter if it is or isn't, though. AMD is hurting. Have you noticed that they seem to be living from chip introduction to chip introduction? I mean, they *must* push the envelope; if they don't, then they wither and die.

Look how long it took them to get 64-bit to market... and they were pushing hard -- double-timing it, even triple-timing it, to get it here. They nearly collapsed in a heap. They seem exhausted. What if it doesn't take off like they hope? Is there a plan "B"? And it's still X86... an Intel design. Remember, Intel must have run the numbers and didn't like what they saw; otherwise, why even bother with a clean break away from it with IA64?

Apple, IBM and Motorola were smart. They did better because they actually *planned* a transition/migration to 64bit. Intel was obviously looking to drop X86 all together one day in the near future, but AMD will likely keep them strapped to that architecture for quite some time, I'm afraid. Considering that AMD's 64bit architecture is an extension of X86 *and* that it will likely be incompatible with whatever 64bit desktop offering Intel has planned (and they said that they weren't planning any), you can kinda guess where things are headed.

Still, that's not the clue I was meaning to provide. That much is obvious by now.

Consider this...

Apple has done EXCEPTIONALLY WELL with Panther and OS X in general, in that they were able to hide/camouflage enough hardware detail behind their kernel APIs... and... well, I'll leave it at that. Like I said, I'm really not *certain* (i.e., only speculating) about the AMD rigs and the OSes that are being planned for them at this point, but as information trickles in, I'll make it available to the folks on this forum.

Oh, something else you might want to know.... I do know for certain that no one at Adobe (the Photoshop team anyway) has had any contact with any dual-64bit Opteron test-rigs.

--
Ed

toby
08-07-2003, 05:25 PM
Originally posted by Beamtracer
The "justice" in "DoJ" means that political parties are able to influence the decisions of a court of law, after the defendant on trial pays the political party money.
More specifically, more money can buy you 'more' law; more/better lawyers always result in a stronger case, regardless of facts, a la the O.J. trial.


By the way, both political parties in the US duopoly readily accept "donations" from companies that they make decisions about. It's called corporate government. Most people seem to accept that this is OK.
sad but true



Over in Europe, the EU may still impose penalties on Microsoft.
"The (European) commission also said it wanted to force Microsoft to disclose the software coding that competitors would need in order to make their server systems compatible with Windows."
http://www.nytimes.com/2003/08/07/business/worldbusiness/07SOFT.html

panini
08-07-2003, 07:20 PM
Ed M. is a very good example of the illogical and laughable reasoning a lot of Mac people use.


Stating that : "just look how long it took AMD to get to 64bit"

then saying how Apple and IBM were smart and planned well.

Since AMD is 6 months ahead of Apple, and major G5 components were designed by AMD, you can surely see how stupid and illogical this argument is. (There would be no G5 without AMD.)

Bottom line: There is still no G5 and still no 64 bit OSX.

Saying that Apple is ahead while I have a dual Opteron rendering in the other room is really, really idiotic.