
View Full Version : Another test on G5




kite
08-15-2003, 05:33 AM
http://www.soundtracklounge.com/article.php?story=20030812073633362

:)

js33
08-15-2003, 08:40 PM
Doesn't seem too impressive yet. The dual 2 GHz G5 is barely twice as fast, if even that, as a single 933 MHz G4. And with the crappy video cards, shipping with only 512 MB of RAM for $3,000 is starting to seem like a bad deal. The only advantage this machine seems to have is that it can use up to 8 GB of RAM. But is that the limit? It's really only 4 GB per processor, so it would seem it hasn't really outgrown the 32-bit barrier.
I wouldn't buy one just for LightWave use. There are faster/cheaper alternatives. For 3 grand I could buy two dual 3.2 GHz Xeon machines. Or I could buy six to eight 3 GHz P4s and build a fast render farm.
I don't know... I want something orders of magnitude faster and this just isn't it. I don't care what platform or who makes it, as long as it is many times faster than what we have today. I guess they won't make anything in my lifetime as fast as what I want, unless you can afford a million-dollar machine. :D

As one poster said ...

"Dual 2 gig G5
Authored by: Anonymous on Friday, August 15 2003 @ 05:48 PM PDT
Not really very impressive.... Please take the time to read what the 2 GHz dual-processor G5 machine was compared to... a single-processor 933 MHz G4 with less memory and an ancient architecture.

I can only hope that when the machines are finally released they are a lot more impressive than only twice the speed of an older single processor G4.

And as for the hope that the speed will improve with a "fully 64-bit OS", I suggest some further research on the 32/64-bit debate and how the G5 (970) processor does not really care. Remember this: only applications that need to do 64-bit operations are going to benefit from the G5, provided the application has been re-written to accommodate this.
The OS itself does not need to be fully 64-bit to run 64-bit-aware apps."

Cheers,
JS

Ade
08-15-2003, 08:57 PM
It's a beta system running unoptimised PS7.

js33
08-15-2003, 09:06 PM
But you would think it'd be close to done, given that the G5s are supposed to ship in about two weeks. I will reserve final judgement until the machines are actually released, however.

Cheers,
JS

Ade
08-15-2003, 09:10 PM
JS, I never buy the first generation of anything. They are fast, I know that firsthand, but Adobe apps need optimising before they can show the new power.

js33
08-15-2003, 09:40 PM
Forget Adobe. I mean, of course Photoshop is a good and useful app, but I'm never sitting around waiting for PS to finish rendering. So why people insist on using it as a benchmark app is just silly. I want to see how LightWave performs on it, not Photoshop.

Cheers,
JS

prospector
08-15-2003, 10:57 PM
js33,
there are.
For that 3 grand you could get 6 or 8 3 GHz P4s?

Then you can get an 18 to 24 GHz compy:

A CLUSTER

From a cluster website

Clusters have become important in the film industry for rendering quality graphics and animation. Beowulf clusters are used in science, engineering and finance to work on projects in fields such as protein folding, neural networks, quantum mechanics, fluid dynamics, astrophysics, evolutionary biology, molecular dynamics, genetic analysis, statistics and economics. Researchers, organizations and companies use clusters because they provide increased scalability, manageability, availability or supercomputer-level processing power at an affordable price.

I think I saw or read that Sony did a 300-unit PS2 cluster, for supercomputer processing speed of something like 400 GHz.

That should get you a 100,000-HV-particle rainstorm frame rendered in about, what... 3 sec? :D

js33
08-15-2003, 11:15 PM
Yes, I would much rather have a 24 GHz machine than a dual 2 GHz (4 GHz) one. Who wouldn't?

Clustering is common using the Linux OS, but only the ScreamerNet module exists for Linux at the moment.

Look at this for a monster cluster.
SGI Altix (http://www.sgi.com/servers/altix/)

I think I will just build my own new PC. I already have 3 content-creation machines and don't need another. I need render power.
After all, time is money. The less time you spend rendering, the more money you can make. :D

Cheers,
JS

toby
08-16-2003, 01:46 AM
Don't exaggerate just to make the Mac look worse than it is.

I notice the PC guys never compare price and stability in the same breath. "PCs are cheaper!" (when you use cheap hardware) "PCs don't crash more than Macs" (when you spend just as much money to get good hardware)

I priced the cheapest dual Xeon 2.8 PC I could find, before I heard about the G5: $2,400 with the bare necessities - GeForce2, no burner, small HD. Not to mention having to order all the parts separately and build it myself.

Why don't we hear from you guys when a test shows the G5 faster than a dual Xeon? You immediately believe the slow test, and assume that's the overall speed.

"It's really only 4 GB per processor, so it would seem it hasn't really outgrown the 32-bit barrier."
If you look at it per-processor, then 32-bit systems have a 1 GIG PER PROCESSOR LIMIT. I can't stand this kind of argument.
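
(For the record, the "per processor" framing doesn't really work either way: a 32-bit address space is 4 GB per *process*, and physical RAM in a shared-memory box isn't divided up per CPU; the G5's 8 GB ceiling is presumably a chipset/DIMM limit, not a per-chip quota. A quick C check of the raw pointer math, for the curious:)

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* Address space is set by pointer width, per process; it is not a
     * quota that gets split between CPUs in a shared-memory machine. */
    uint64_t gib32 = ((uint64_t)1 << 32) >> 30;  /* 2^32 bytes in GiB */
    uint64_t gib64 = (uint64_t)1 << (64 - 30);   /* 2^64 bytes in GiB */

    printf("32-bit pointers reach %llu GiB\n", (unsigned long long)gib32);
    printf("64-bit pointers reach %llu GiB\n", (unsigned long long)gib64);
    return 0;
}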

js33
08-16-2003, 02:33 AM
I just did a check on Pricewatch and I can get a 3 GHz P4 system without an OS for $500. I can install my copy of Win2000 Pro on all of them and have them render away. I could get 6 of these for the price of one dual G5 and have an 18 GHz render farm. I know it wouldn't be that pretty to have 6 boxes, but 18 GHz compared to 4 GHz for the same money is hard to beat. Yeah, the Xeon boxes are more expensive than P4s because they run dual, quad, or higher, but I don't think they are any faster than a P4 at the same clock. I'm not knocking the G5, but I'm looking for the most speed for the buck, not the prettiest machine.

Cheers,
JS

Ed M.
08-16-2003, 06:49 AM
:rolleyes:
If anyone believes anything js33 says, they need their head examined. He's starting to sound more and more like a troll -- I think anyone with a chunk of gray matter can see that. Remember, it's all been discussed before in this other link found here:

http://vbulletin.newtek.com/showthread.php?s=&threadid=6856&perpage=15&pagenumber=23


Someone should lock this thread because it's pretty much redundant. I've laid it out for everyone there, and there has been some good discussion. Regarding what js33 has stated... it's a classic case of what's known in the racing industry as "bench racing".

What kids like to do is thumb through all those nifty performance-parts catalogues and ADD UP all the advertised gains in horsepower those individual parts are stated to produce, and write that number at the bottom of their wish-list. Before you know it, they think they have this 650-horsepower motor (after all, the numbers add up, right?) because they just bolted on $800 worth of "performance-boosters", and they think they are going to race and beat the people with a clue about engineering and performance. Kids and adolescents do this kind of stuff. And as anyone knows, any *real* production company that has critical deadlines to meet is NOT going to rely on a mishmash of cheapie parts in their production environment when work needs to get done. That $500 figure he's throwing around doesn't get you a lot of things; reliability is just one. There are others (I've gone over all this before), so pay no mind to what he says. He's only trolling and bench-racing.

--
Ed

PS: Lock this thread it isn't going anywhere and it's all been discussed in the thread posted above. :rolleyes:

Antimatter
08-16-2003, 09:28 AM
I have a question: why do you guys think PCs crash more often?

IMHO Macs are great for a few things, and graphics is among them, but I have a question: how often do you see a Mac acting as a server for a large ISP?

I haven't seen an ISP use a Mac for a server yet; they all use Athlons, Pentiums, Xeons, and so on. I have seen uptimes into the months with a well-maintained server. Anyway, Intel Xeons aren't the end of all ends; there's the Athlon MP too.

A while ago I was seriously considering dual-CPU systems, but I ended up with my current machine, which is:

For $2,500 I got this machine:
- Pentium 4 2.8C overclocked to 3.2 GHz
- 2 sticks of Corsair RAM, 1 GB total, and I've got 2 more slots to stick in more RAM, so I can go to 2 GB no problem; and if I do need more I can just move up to 1 GB sticks, no problem.
- Radeon 9700 Pro <- would have liked one of those specialized rendering cards, but I also do some gaming too.
- 2x 10K Raptor SATA drives
- 2x 200 GB IDE Western Digital drives <- room for about 4 more hard drives in my case.
- custom-built watercooling loop, to keep the video card and the CPU cool; in the future, if I end up needing other cooling, I might add it in.
- Lian-Li 75 case; that thing is a pretty nice case. It's expensive, but I wanted the LARGE case so I would have plenty of room for the radiator and a few other items.
- 52x32x52x CD burner
- 16x DVD-ROM

It's going to be dual-booting Windows 2000 Pro and Gentoo Linux.

Anyway, about PC reliability, put it this way: I bought an ancient 233 MHz PC about 5 years ago and it has been running 24/7 for the last 5 years as my server/router with no stability problems or anything.

Now, speaking of clustering: I'll admit that I do not know that much about clustering, but I did a little bit of price researching.

Anyway, I can build one computer for as cheap as $430 or less:

motherboard with onboard LAN - $48
Athlon XP 2500+ (Barton) CPU - $85
2 sticks of 512 MB for a total of 1 GB - $171
80 GB Western Digital hard drive - $86
Antec 300 W power supply unit - $40

Anyway, I believe that's enough for one cheap computer, not including the case; you can get a cheap case for about $25, then add maybe another $25 for cooling and you're pretty much done with one cluster node.

Anyway, I could have gone cheaper, such as going with an Athlon 1800, but we wanted a decent cluster, correct?

Anyway, a Barton 2500+ is clocked at 1.8 GHz and you can probably overclock it to at least 2 GHz easily. But we're not here to talk about overclocking, so let's leave it at 1.8 GHz.

Let's say we've got a Joe Doe with $3,000, enough cash for about 6 of those cheap clustered computers, plus other stuff such as cases, wiring, etc.

CPU speed: 10.8 GHz combined
memory: 6 GB total
hard drive: 480 GB of storage

Anyway, with that setup he's got a decent little render farm. Run Gentoo Linux or some other version of Linux with ScreamerNet.

Anyway, I went over to the Apple website and tried to configure one Mac G5 with specs as close as possible to the totals above for that cluster, and it came out to about $5,645. So for that cash I could buy a cluster with the following specs:

13-computer cluster
CPU: 23.4 GHz combined
memory: 13 GB total
hard drive: 1.04 terabytes

Anyway, someone on the other forum here posted this: "Intel does 4, AMD does 7, and Macs do about 20 ins". That's instructions per clock cycle, so let's take our nice $5,600 dual 2 GHz Mac G5 and compare it with our 13-computer cluster.

4 GHz figures out to about 81,920 million instructions per second.

Now for the cluster:

23.4 GHz totals out to about 167,731.2 million instructions per second.

Anyway, just look at that number: for the same price the cluster beats out that Mac G5, despite the superior instructions per clock cycle of Apple's G5 CPU.

Anyway, on the lower end of the spectrum, at about $3,000 for the cheap-cheap version of the Mac dual 2 GHz G5, we would have about a 6-Athlon cluster, which would be equal to 77,414.4 million instructions per second. Here the Mac G5 is the winner, but by only about 4,500 million instructions per second.

So for the lower end, around $3,000, the Mac G5 is the winner, but toward the higher end, say $5,000 and up, the cluster wins out. Anyway, if one CPU or one computer in the cluster fails, you can replace it for about $400-500; however, if one Apple Mac G5 fails, you're out about $3,000 and up.


Note: when I talk about pricing, I'm talking about the Apple G5 dual 2 GHz setup. I know there are cheaper ones, such as the single-CPU model, but I chose the dual 2 GHz setup.

And when I did the pricing, I dumped all the unnecessary stuff such as the screen, etc.... We're only comparing the bare bones required to run a cluster.
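
(For anyone checking the math above, here it is as a tiny C program. Two caveats: the 20 and 7 instructions-per-clock figures are just the guesses quoted from the other forum, not measured data, and the totals only come out as stated if a GHz is counted as 1024 MHz, which makes the results millions of instructions per second rather than instructions per second.)

#include <stdio.h>

/* Back-of-the-envelope throughput from the post above. Assumptions,
 * both from the thread rather than from real benchmarks: instructions
 * per clock of 20 (G5) and 7 (Athlon), and 1 GHz counted as 1024 MHz,
 * which is the only way the quoted totals fall out. Result is in
 * millions of instructions per second (MIPS). */
static double mips(double combined_ghz, double instr_per_clock)
{
    return combined_ghz * 1024.0 * instr_per_clock;
}

int main(void)
{
    printf("dual 2 GHz G5:       %9.1f MIPS\n", mips( 4.0, 20.0)); /*  81920.0 */
    printf("13-node Athlon farm: %9.1f MIPS\n", mips(23.4,  7.0)); /* 167731.2 */
    printf("6-node Athlon farm:  %9.1f MIPS\n", mips(10.8,  7.0)); /*  77414.4 */
    return 0;
}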

Johnny
08-16-2003, 10:54 AM
Back to the original subject of the thread... I would have expected the G5 to whip the snot out of ANY G4 configuration... isn't the G5 supposed to overcome even apps that aren't optimized for it? Or does that hold only when it's running a version of OS X that IS optimized for the G5?

J

Antimatter
08-16-2003, 12:13 PM
Yeah, it might, but you can get other things cheaper: you can buy 13 $430 PCs for the price of one G5. Heck, I could probably buy two near-bleeding-edge systems for the price of one top-of-the-line G5.

My only question is: why do you buy the Apple G4 and G5 if they're so overpriced? IMHO, anything over $2,500 to $3,000 for a PC is overpriced unless you are building a render farm.

Hell, for about $2,500 I bought a near-bleeding-edge PC and overclocked it to a speed equal to the fastest Intel chip released today, and for about $500-600 cheaper to boot :p I also have half a terabyte of hard drive (about 25 GB below half a terabyte, but close enough), and so on. Anyway, the base bus speed of my 2.8C chip is, what, 800 MHz; when I overclocked it to 3.2 GHz I ended up with a bus speed of 914 MHz, and once I finally get the watercooling core all set up I will see if I can't push the front-side bus speed up to about 1 GHz.

Karl Hansson
08-16-2003, 12:17 PM
NOT ANOTHER ONE... these threads must be cloning themselves... :o

kite
08-16-2003, 12:59 PM
Phew, why can't we Mac people celebrate the new G5 in peace?
I just wanted to share my excitement over this fresh hardware... and maybe get some more fuel for the buy-fire I'm trying to start in my wallet :)

Ed M.
08-16-2003, 01:52 PM
Funny.. I just ran into a couple of gems... Notice what's said about Macs.


Cringely: Macs scare IT departments
http://maccentral.macworld.com/news/2003/08/16/cringely/

I, Cringely | The Pulpit
http://www.pbs.org/cringely/pulpit/pulpit20030814.html

--
Ed

Johnny
08-16-2003, 02:08 PM
I have first-hand experience of IT people hating/fearing Macs and Mac-related departments...

I was in a design group within a corporate consulting firm.

One day we had some minor printer problems with a Xante laser printer... a normally excellent machine. The Mac-hating IT guys paid a visit.

When they left, the problem wasn't cured, but the Xante was totally crippled. A Mac consultant came in to look at things, and discovered that a card inside the Xante had been broken... pulled out, and shoved back in without regard to getting it into its socket properly.

When IT guys break stuff, they MUST be scared p00pless at the thought of Mac users who don't really need them around.

Even in Mac-dominant environments, the Macs never work a third as well as my setup at home does... problems are never quite resolved, even though I know in my heart that a wipe-and-install would get the job done... and that I could do it myself.

IT guys (I'm sure there are good ones) seem to cultivate their own job security by use of FUD.

J

prospector
08-16-2003, 03:07 PM
So what have we learned...

Macs are overpriced

NewTek NEEDS to go Linux

I want a cluster for realtime rendering :D

toby
08-16-2003, 03:50 PM
Originally posted by toby
I notice the PC guys never compare price and stability in the same breath. "PCs are cheaper!" (when you use cheap hardware) "PCs don't crash more than Macs" (when you spend just as much money to get good hardware)

Why don't we hear from you guys when a test shows the G5 faster than a dual Xeon? You immediately believe the slow test, and assume that's the overall speed.


And now we're comparing a fully equipped desktop to a render farm, and to the stability of a server. Enough of this hardware-bragging geek speak.

No disrespect to JS, you're (almost always) rational and respectable in your disagreement (even if you're... wrong! :D)

Antimatter
08-16-2003, 05:13 PM
Originally posted by prospector
So what have we learned...

Macs are overpriced

NewTek NEEDS to go Linux

I want a cluster for realtime rendering :D

HOT DAMN! Another fan of the Linux platform!

I am still waiting for NewTek to port LightWave to Linux :) I got it working in Wine, but it just ain't the same, you know.

I mean, they already have LightWave out for those overpriced Macs, why not free Linux? :)


Originally posted by toby
And now we're comparing a fully equipped desktop to a render farm, and to the stability of a server. Enough of this hardware-bragging geek speak.

No disrespect to JS, you're (almost always) rational and respectable in your disagreement (even if you're... wrong!)

And by the way, toby, I wasn't comparing the speed of a Mac versus a render farm, I was comparing the PRICE. I'm just saying the Mac G5, yes, it's a great computer if you're willing to pay the price to get a decent one. For the price of a $5,000 G5 I could buy something like 11 cheap $430 computers and build myself a little render farm that will out-render that single Mac. Then if one of the cheap computers dies I can just replace it for about $430 or so; if a G5 dies, you've got to shell out $5,000 (here I'm assuming both die of something horrible, such as the CPU just exploding and melting into the motherboard, so you have to replace the whole unit).

And about stability: I wonder why there are always Windows NT servers or Linux servers but never Mac servers? Is there any good reason why there's not a Mac server? I'm guessing the number one reason is price.

js33
08-16-2003, 05:18 PM
Originally posted by toby
And now we're comparing a fully equipped desktop to a render farm, and to the stability of a server. Enough of this hardware-bragging geek speak.

No disrespect to JS, you're (almost always) rational and respectable in your disagreement (even if you're... wrong! :D)

Hi Toby,

Yeah I mean no disrespect to the Mac platform no matter what Ed thinks or says.:D

After all I actually own a Mac. :D

I'm simply comparing what I can get for $3,000 in the PC world vs. the Mac world. All I want is the most speed for the buck.

What's wrong with that?

Cheers,
JS

Antimatter
08-16-2003, 05:20 PM
Originally posted by js33
Hi Toby,

Yeah I mean no disrespect to the Mac platform no matter what Ed thinks or says.:D

After all I actually own a Mac. :D

I'm simply comparing what I can get for $3,000 in the PC world vs. the Mac world. All I want is the most speed for the buck.

What's wrong with that?

Cheers,
JS

I hate Windows so much that if I had a choice between Windows or a Mac, I would buy a Mac.

A question: how good are those PowerBooks? I heard they were good. And if they're a reasonable price, not like $5,000, I might buy myself one for college, 'cos Linux isn't as friendly with laptop hardware as with desktops, and I just don't have the time right now to set up a Linux installation on a laptop.

js33
08-16-2003, 05:26 PM
Originally posted by Ed M.
:rolleyes:
If anyone believes anything js33 says, they need their head examined. He's starting to sound more and more like a troll -- I think anyone with a chunk of gray matter can see that. Remember, it's all been discussed before in this other link found here:

http://vbulletin.newtek.com/showthread.php?s=&threadid=6856&perpage=15&pagenumber=23


Someone should lock this thread because it's pretty much redundant. I've laid it out for everyone there, and there has been some good discussion. Regarding what js33 has stated... it's a classic case of what's known in the racing industry as "bench racing".

What kids like to do is thumb through all those nifty performance-parts catalogues and ADD UP all the advertised gains in horsepower those individual parts are stated to produce, and write that number at the bottom of their wish-list. Before you know it, they think they have this 650-horsepower motor (after all, the numbers add up, right?) because they just bolted on $800 worth of "performance-boosters", and they think they are going to race and beat the people with a clue about engineering and performance. Kids and adolescents do this kind of stuff. And as anyone knows, any *real* production company that has critical deadlines to meet is NOT going to rely on a mishmash of cheapie parts in their production environment when work needs to get done. That $500 figure he's throwing around doesn't get you a lot of things; reliability is just one. There are others (I've gone over all this before), so pay no mind to what he says. He's only trolling and bench-racing.

--
Ed

PS: Lock this thread it isn't going anywhere and it's all been discussed in the thread posted above. :rolleyes:


Ed,

Why should anyone who doesn't kiss your *** be considered a troll? After all, you're the one who doesn't use LightWave, so you should be considered the troll. Also, you didn't even use a Mac until your sister gave you back the gift you gave her.

So you're not only a troll but a hypocrite as well. Why should anyone listen to the crap you spew? You don't even use the products this forum is about.

Nothing I have said is wrong. Maybe it doesn't fit into your narrow-minded view of things, but that's not my problem.

I'm simply comparing what you can get for $3,000. I like the idea behind the G5, but I feel it is overpriced. As I said, you can get a 2.8-3 GHz P4 machine right now for under $600.

The thing you fail to realize is that PC hardware comes down in price where the Mac never does, making it a better deal, and even better over time.

Cheers,
JS

toby
08-16-2003, 05:30 PM
Originally posted by js33
I'm simply comparing what I can get for $3000 in the PC world Vs. the Mac world. All I want is the most speed for the buck.

What's wrong with that.

Cheers,
JS

Nothing, just don't compare 'Apples' to oranges. It doesn't make sense to buy G5s, with 3 FireWire ports, PCI-X, a graphics card, a SuperDrive, Ethernet, an AirPort Extreme card (and other things that NO PC has, like widescreen capability), etc., to use as a render slave. That doesn't make them overpriced.

If I already had a dual 2 GHz Xeon I wouldn't be planning to get a G5 either, but that doesn't mean the G5 is worthless.

js33
08-16-2003, 05:50 PM
Originally posted by toby
Nothing, just don't compare 'Apples' to oranges. It doesn't make sense to buy G5s, with 3 FireWire ports, PCI-X, a graphics card, a SuperDrive, Ethernet, an AirPort Extreme card (and other things that NO PC has, like widescreen capability), etc., to use as a render slave. That doesn't make them overpriced.

If I already had a dual 2 GHz Xeon I wouldn't be planning to get a G5 either, but that doesn't mean the G5 is worthless.

No, I never said the G5 was worthless. But if you want to build a render farm on the Mac, you have no choice but to buy a complete machine, as Apple doesn't allow you to build your own beyond their restrictive build-to-order option.

Also, my PC has 4 USB 2.0 ports, 2 FireWire ports, a GeForce4 Ti4200, a "SuperDrive" (hehehe, it's just a Pioneer A04 DVD burner, the same drive that's in my PC), a DVD-ROM, built-in Ethernet, built-in sound (though I also have a 10-channel M-Audio sound card), 512 MB RAM and a P4 2.53 GHz, all for only $1,100 about 6 months ago. Now similar machines without the DVD burner can be had for about $500-600. That's my entire point: I can assemble a bare-bones, stripped-down machine if I want to build a render farm, but there is no option to do that with Apple. That's the problem. I guess you could go scoop up a bunch of old G4 machines when people upgrade, but they aren't fast enough to even bother with.

Cheers,
JS

toby
08-16-2003, 06:05 PM
Originally posted by Antimatter

And by the way, toby, I wasn't comparing the speed of a Mac versus a render farm, I was comparing the PRICE.

Like I said,
"And now we're comparing a fully equipped desktop to a renderfarm"
The G5 is not a renderfarm! Why should it cost the same per ghz???



And about stability: I wonder why there are always Windows NT servers or Linux servers but never Mac servers?

Do you think McDonald's makes excellent hamburgers? They sell the most, so they must be great. "300 Billion served"!

toby
08-16-2003, 06:08 PM
Originally posted by js33
if you want to build a render farm on the Mac, you have no choice but to buy a complete machine, as Apple doesn't allow you to build your own beyond their restrictive build-to-order option.

This is the same old, old, old argument: "I can build a PC the way I want." I don't think it bears rehashing.

Ed M.
08-16-2003, 06:12 PM
First thing, js... If your idea is sooooooo great, point us all to some honest-to-God production houses that take the "cheap-PC" route. No one is buying these sub-$700 PCs and doing this stuff. Sure, you make it *sound* good, buuuuut...

And what should be obvious to everyone here is that it's NOT the sub-$1000 PCs that are the #1 sellers.

The price Apple is asking for a G5 is cheap in comparison to what other OEMs offer. Just look at the technology that's in there, and the refinement.

Hey, here's a great idea... Why don't we start comparing all these nifty sub-$700 PCs and see if we can't configure a system that's a better "bang-for-the-buck" than the $4,000 PCs like the ones Dell, HP, BOXX and other leech OEMs offer, and see if we can't do a lot better there too.

Then we can contact those OEMs and ask why they are charging so much when we can build our own (or get one cheaper elsewhere) and get the same performance (and everything else along with it) for lots, lots cheaper. I'd like to hear what those companies have to say. Can anyone guess what they might say about their price ranges, and what they offer the customer that warrants an increase to those ungodly amounts of $$$?

--
Ed

Ed M.
08-16-2003, 06:16 PM
McDonald's... Now there's a *server* :D :rolleyes:

Cheap, fast-food gives me the S%^&s! :mad:

--
Ed

js33
08-16-2003, 07:03 PM
Originally posted by Ed M.
First thing, js... If your idea is sooooooo great, point us all to some honest-to-God production houses that take the "cheap-PC" route. No one is buying these sub-$700 PCs and doing this stuff. Sure, you make it *sound* good, buuuuut...

And what should be obvious to everyone here is that it's NOT the sub-$1000 PCs that are the #1 sellers.

The price Apple is asking for a G5 is cheap in comparison to what other OEMs offer. Just look at the technology that's in there, and the refinement.

Hey, here's a great idea... Why don't we start comparing all these nifty sub-$700 PCs and see if we can't configure a system that's a better "bang-for-the-buck" than the $4,000 PCs like the ones Dell, HP, BOXX and other leech OEMs offer, and see if we can't do a lot better there too.

Then we can contact those OEMs and ask why they are charging so much when we can build our own (or get one cheaper elsewhere) and get the same performance (and everything else along with it) for lots, lots cheaper. I'd like to hear what those companies have to say. Can anyone guess what they might say about their price ranges, and what they offer the customer that warrants an increase to those ungodly amounts of $$$?

--
Ed

Ed,

I agree there are overpriced PC builders as well. But these so-called cheap PCs are actually pretty good now, much better than when cheap PCs first came out. There simply is no reason to have to spend $3,000 for a computer anymore.

Cheers,
JS

prospector
08-16-2003, 07:07 PM
I think there might be a little something wrong here.

A render farm is usually used so you can send a certain number of frames to each machine, thereby lightening the load and decreasing total rendering time, but each compy does its own frames.

Whereas a cluster uses ALL CPUs on one problem. So for rendering, it would be as if you had one computer rendering the animation in sequence, but at the total CPU speed; you would have one computer to work on, going at 24 GHz (8 compys at 3 GHz each).

So we aren't really comparing a Mac to a render farm.
We compare a G5 to a small supercomputer.
For the price of just one Mac.

Which is THE point of clusters.
Not render farms.
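
(To make the distinction concrete, here's a toy C sketch of the two scheduling models, using the 8-machine example above. It's not ScreamerNet or real clustering code, just the idea: a farm deals whole frames out round-robin, while a cluster puts every node onto the same frame.)

#include <stdio.h>

enum { NODES = 8, FRAMES = 4 };  /* 8 compys, a short 4-frame animation */

int main(void)
{
    /* Render farm: each node renders complete frames on its own. */
    for (int f = 0; f < FRAMES; f++)
        printf("farm:    frame %d -> node %d\n", f + 1, f % NODES);

    /* Cluster: all nodes cooperate on one frame at a time, e.g. each
     * node takes one horizontal strip of the image. */
    for (int f = 0; f < FRAMES; f++)
        for (int n = 0; n < NODES; n++)
            printf("cluster: frame %d, strip %d -> node %d\n", f + 1, n, n);
    return 0;
}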

Antimatter
08-16-2003, 07:09 PM
I know a render farm and a desktop computer are different, but look at this:

I specced up a top-of-the-line Apple system with 2 LCD screens and everything. Guess what it came up to? About $15,000.

Then I went to Newegg and specced up a pretty damn good PC,

dual Xeon and so on; the price was about $4,500.

I basically bought the same things for the PC as what was in the Apple G5.

Granted, there'll be some differences. I think they're decent, but I'm not going to blow 5-15k on a desktop system. I mean, come on. Most of my PC purchases are $2,500 or below, and I still get a pretty good system. It might not be bleeding edge, aka a P4 3.2 GHz, but it's still pretty damn good.

Antimatter
08-16-2003, 07:17 PM
Originally posted by prospector
I think there might be a little something wrong here.

A render farm is usually used so you can send a certain number of frames to each machine, thereby lightening the load and decreasing total rendering time, but each compy does its own frames.

Whereas a cluster uses ALL CPUs on one problem. So for rendering, it would be as if you had one computer rendering the animation in sequence, but at the total CPU speed; you would have one computer to work on, going at 24 GHz (8 compys at 3 GHz each).

So we aren't really comparing a Mac to a render farm.
We compare a G5 to a small supercomputer.
For the price of just one Mac.

Which is THE point of clusters.
Not render farms.

I'm not that familiar with clustering/render farms, so sorry about that. But heh, I think you hit the nail on the head pretty well there; I mean, for the price of one Apple G5 we could probably build a small supercomputer :p or, failing that, a decent cheap render farm.

OnePerson
08-16-2003, 07:22 PM
The next time you go to a gas station or restaurant and use your credit card, think about what checks the validity of that card. And not just your card, all credit cards.

Yep, RiskWise is the company that does this behind-the-scenes checking.

Also, they are using the Xserve and OS X Server to do all this. But what's really cool is that they were using AppleShare IP in the earlier days to do this.

So the reason you don't see a lot of Xserves out there might be that security needs to be guarded!

toby
08-16-2003, 07:39 PM
Originally posted by Antimatter
I know a render farm and a desktop computer are different, but look at this:

I specced up a top-of-the-line Apple system with 2 LCD screens and everything. Guess what it came up to? About $15,000.

Then I went to Newegg and specced up a pretty damn good PC,

dual Xeon and so on; the price was about $4,500.

I basically bought the same things for the PC as what was in the Apple G5.

Granted, there'll be some differences. I think they're decent, but I'm not going to blow 5-15k on a desktop system. I mean, come on. Most of my PC purchases are $2,500 or below, and I still get a pretty good system. It might not be bleeding edge, aka a P4 3.2 GHz, but it's still pretty damn good.

When are you guys going to get over this stupid argument? It's been raging for years now, no matter what we point out that the Mac has that PCs don't. Macs can use the same cheap monitors and RAM that PCs can, so stop exaggerating.

This is no less than if Mac fans took all the benchmarks that show the G5 faster than Xeons and went over to the PC forum just to rub your noses in it.

"So we aren't really comparing a Mac to a render farm. We compare a G5 to a small supercomputer"

I'm sure that if this were feasible, one of you motor-mouths would no doubt have bragged to us about having one by now. This is what Ed meant by childish bench racing. All you guys care about is techno-bragging.

js33
08-16-2003, 07:42 PM
Originally posted by prospector
I think there might be a little something wrong here.

A render farm is usually used so you can send a certain number of frames to each machine, thereby lightening the load and decreasing total rendering time, but each compy does its own frames.

Whereas a cluster uses ALL CPUs on one problem. So for rendering, it would be as if you had one computer rendering the animation in sequence, but at the total CPU speed; you would have one computer to work on, going at 24 GHz (8 compys at 3 GHz each).

So we aren't really comparing a Mac to a render farm.
We compare a G5 to a small supercomputer.
For the price of just one Mac.

Which is THE point of clusters.
Not render farms.

Yes, it's not about which system is best anymore, it's about bang for the buck. I would rather buy 6 or 8 3 GHz P4s than one G5.
I'm not saying one is better than the other, it's just a better bang for the buck. If you had 8 3 GHz PCs it wouldn't be a cluster, though, just a speedy little render farm, as each machine would render a frame on its own. Clustering is more associated with Linux or with servers. I guess you could get an 8-processor server machine, but for LightWave purposes each processor would still render a frame each.

Cheers,
JS

toby
08-16-2003, 08:00 PM
Originally posted by toby
And now we're comparing a fully equipped desktop to a render farm

js33
08-16-2003, 08:11 PM
Yes, on price only. Speed-wise there will be no comparison, unfortunately. I hoped the G5 would be a speed demon.

Cheers,
JS

tallscot
08-16-2003, 08:22 PM
I haven't read this entire thread, so I apologize if someone else made this point: the A test is on the hard drive. The B test is in RAM, mostly. The differences on the B test are incredible, IMHO.

What the A test shows is that the hard drives on the G5 are about twice as fast, which is also impressive.

Maybe I'm missing something here. Correct me if I'm wrong.

CORRECTION - saying the hard drive is twice as fast is inaccurate, after looking at the tests. But still, test B is in RAM and test A really isn't, so look at test B, not A.

tallscot
08-16-2003, 08:40 PM
I'd rather work on the G5 as my workstation. I don't have a render farm, but if I did, I'd wait for the G5 Xserve and compare that to getting a bunch of PCs for rendering final animations. You don't really have to use the same platform for your render farm as for your workstation.

I really like Final Cut Pro 4, and I prefer OS X over Windows.

I'm still thinking about waiting for the dual 3 Ghz G5 next year. :)

toby
08-16-2003, 08:47 PM
Originally posted by toby
Like I said,
"And now we're comparing a fully equipped desktop to a renderfarm"
The G5 is not a renderfarm! Why should it cost the same per ghz???


"I haven't read this entire thread, so I apologize if someone else made this point - the A test is on the hard drive. The B test is in RAM, mostly."

Good point, we hadn't discussed that

Ed M.
08-16-2003, 11:23 PM
Hey guys, check this out:

http://www.infoworld.com/reports/SRapple.html

--
Ed

Antimatter
08-16-2003, 11:31 PM
I have a question for you Mac guys.

Why are you guys so willing to shell out 3 to 15 thousand dollars for a brand-new Mac G5 when you can get a pretty good PC for, say, $500 to $1,500? Sure, it might not be bleeding edge, but it's still passable and a hell of a lot less straining on the wallet.

I'm just wondering, that's all.

If Macs were cheaper I might actually consider them a viable option, but as it is they're too expensive for what they offer, IMHO.

Meshbuilder
08-17-2003, 12:25 AM
Originally posted by Antimatter
I have a question for you Mac guys.

Why are you guys so willing to shell out 3 to 15 thousand dollars for a brand-new Mac G5 when you can get a pretty good PC for, say, $500 to $1,500? Sure, it might not be bleeding edge, but it's still passable and a hell of a lot less straining on the wallet.

I'm just wondering, that's all.

If Macs were cheaper I might actually consider them a viable option, but as it is they're too expensive for what they offer, IMHO.

Because we like our OS and the way it is designed and works.

Why would someone buy a really nicely designed car for $10,000 more when a cheap car will also take you to work?
Why would someone buy expensive B&B sound equipment when you can buy less expensive equipment with almost as good or even better sound?
Why would someone like to have a nicely designed office to be in, when you just work there?

I don't know any other way to explain why we like our Macs.

Ade
08-17-2003, 03:50 AM
Originally posted by Antimatter
I have a question for you Mac guys.

Why are you guys so willing to shell out 3 to 15 thousand dollars for a brand-new Mac G5 when you can get a pretty good PC for, say, $500 to $1,500? Sure, it might not be bleeding edge, but it's still passable and a hell of a lot less straining on the wallet.

I'm just wondering, that's all.

If Macs were cheaper I might actually consider them a viable option, but as it is they're too expensive for what they offer, IMHO.

There's something to be said for the advantage of the hardware maker being the same guy making the OS.
There WILL always be people out there with a little more money to spend on something extra nice; these people know quality and innovation.

The argument about Macs being more expensive is tiresome,
just like saying Macs don't have games and Windows is still reliant on DOS.
No one ever takes into account the Mac's superior build/cooling quality, innovation and resale value.
I just sold my B&W G3 + monitor and 640 MB of RAM yesterday for AU$1,000; that's a '99 machine and I got that much!

Windows doesn't have any iApps, and Windows, with all their money, is still susceptible to viruses and security breaches...
OS X's app extension technology is far ahead; Windows still shares DLL files between apps... This affects the stability of a machine, which is usually why it's common for PC people to just reinstall when they get a system or app problem.

I myself never try to convert PC people; they obviously have a different mindset.

toby
08-17-2003, 04:08 AM
well put ade

Jimzip
08-17-2003, 07:01 AM
Agreed. :cool:

Plus, I reckon the OS is worth paying for. Ugh, I couldn't imagine life without OS X..

Jimzip :D

Nakia
08-17-2003, 07:42 AM
Out of the box, an Apple ROCKS. The iApps are top-notch. A user can do so much without having to run to CompUSA and pick up some cheap movie-making, DVD-authoring and photo tools or whatever. The only complaint I had when I bought my G4 933, which I paid $2,500 for back then, was that I got no real office tools, i.e. at least AppleWorks. Good thing AppleWorks is cheap. I figured that would be included in a $2,500 box. Besides that, I never had to buy anything for the Apple except LW, Photoshop (I should have stuck with GraphicConverter, GIMP or ImageMagick) and Illustrator.
The only other system I bought that had rocking apps out of the box was my SGI Indigo2 MaxImpact: it had a movie maker, an audio synth, a buttload of 3D converter tools, Inventor, Showcase...
I added to it, free of charge, GIMP, Film Gimp, BMRT and Radiance.
SGI is another case where the OS and hardware are made by the same company.

tallscot
08-17-2003, 09:24 AM
I just priced a PC workstation that I would buy if I didn't buy a dual 2 Ghz G5. It cost about $1K more than my dual 2 Ghz G5.

I don't have to buy a $3K-$15K Mac. I could buy a $799 one.

Lightwolf
08-17-2003, 09:48 AM
Originally posted by tallscot
I just priced a PC workstation that I would buy if I didn't buy a dual 2 Ghz G5. It cost about $1K more than my dual 2 Ghz G5.
:) That's something I do every now and then when I'm dreaming...
Currently I'm at 4160,- Euros for a Dell Xeon vs. 4370,- Euros for an Apple G5
(both with 2 GB RAM and a 128 MB graphics board).
Mind you, the Dell does have dual-channel SCSI 320 as well as a Quadro FX 500; the G5, on the other hand, has FW 800...
If I were to tune the Dell down to 2.8 GHz processors I'd end up at roughly the same speed as the G5 (at least according to the SIGGRAPH RenderMan pre-release benches), and at 3650,- Euros.
Apple prices are a bit steeper in Europe too; one more reason not to switch...
Cheers,
Mike

Nakia
08-17-2003, 09:55 AM
I bought an HP xw6000, which has dual Ultra320 SCSI, which helps me out. I connected a Sun D130 with three 18 GB 10K RPM drives externally for video, and an internal SCSI160 10K RPM 36 GB drive for apps, i.e. LW and the rest.
The SCSI was a big selling point for me, even though it can get $$$ to add to later on.
But finding a 320 SCSI drive will be $$$$.

tallscot
08-17-2003, 10:11 AM
If I were to tune the Dell down to 2.8 GHz processors I'd end up at roughly the same speed as the G5

Like you said, when you are dreaming.

Lightwolf
08-17-2003, 10:20 AM
tallscot:

Originally posted by tallscot
Like you said, when you are dreaming.
Maybe you should finish reading my post before you reply... :rolleyes: :confused:

tallscot
08-17-2003, 10:23 AM
I did.

Lightwolf
08-17-2003, 10:27 AM
Originally posted by tallscot
I did.
Practice ;)

Ed M.
08-17-2003, 11:38 AM
I think a lot of people are missing the entire point of the G5. I've been over it time and again, explaining why I (me, personally) find the G5 more than impressive, but the PC people seem to be ignoring it, or are in complete denial (judging from their posts, that is). People have to realize that any scores or tests we see from the G5 are likely to be on prerelease testbeds running non-tuned code (that is 90% crappy for the G4 too) and not running under Panther. In short, *any* speed increase is significant. What the G5 has shown thus far is simply incredible!

I'm waiting for developers to get with the program and start tuning their apps to take advantage of the new features the G5 has to offer. I explained this in a few posts as well. And if what I'm hearing is true, it's a breeze to tweak the apps to take advantage of these new features. I hear that even something quick-and-dirty could yield even more impressive results than what we've already seen. This tells me that if developers decide to take *full* advantage of the technology Apple has placed before them, then it's likely we will see some downright AWESOME performance. The fact that it runs current hobbled code so well was a wonderful surprise, to say the least. What's more, the G5 is *only the first* in a new line of PowerPC processors coming from IBM. Things can only get better. And if IBM decides to license OS X from Apple to run on *their* hardware, then look out. IMO, it's only a matter of time before we see this between Apple and IBM. I can't say I'm as confident or impressed with what the competition has planned after their current top-of-the-line offerings.

And it's *still* oh so quiet in the PC mainstream press and review sites. Sure, there are a number of die-hard disbelievers wandering aimlessly around the message forums in complete denial of what the G5 means. Apple isn't stupid, and Steve Jobs was obviously fed up with all the B/S surrounding past benchmarks, so he went to IBM for the solution.

Everything became quiet very quickly -- as if nothing ever happened. Many in the PC realm had their doubts about the performance claims, but after further review and deeper inspection of the test documentation (in every case), it's highly likely that these skeptics touched upon something they didn't expect, something they didn't like or believe. It's almost as if they were all coming to the conclusion that they were wrong in their assumptions. In their feverish attempt to debunk these claims, they failed to consider that the claims could very well be true.

As they dug deeper to find *some* flaw, they only ended up finding more facts and tidbits that left them bewildered and more surprised. What were they going to do? In their initial haste they drew a LOT of attention to themselves and the G5. However, for them, drawing attention to the G5 only served them well when it showed the G5 in a poor, sour light. They wanted to get an early jump, but their plan backfired. It's as simple as that. At this point they couldn't possibly draw any further attention out of sheer embarrassment. At every step of the way they found that in all the tests conducted to this point, the G5 was the chip at the disadvantage. That's right, everyone, it's all obvious... To rehash once again: we are talking about test rigs running G4 code (crappy code to begin with) on a non-Panther OS X. And the compiler that was used for other benches doesn't even recognize the PPC 970 correctly -- *yet*. They had no choice but to keep things quiet so as to not draw any more attention, because if they did, others might find out these facts and then they would really look foolish. It's almost as if Apple and IBM knew that the system was going to be highly scrutinized and the tests highly doubted, thus drawing a lot of attention, while at the same time being meticulously open and forthcoming about the testing methodologies and providing all the detailed information to go along with it. Nice trap. Do we see this from other OEMs? What about from Intel or AMD? Does anyone ever call *them* on their results? How do we know *they* aren't cooking the books? I think the skeptics realized this too; that's why they clammed up. Right now I'm willing to bet that a lot of them are just waiting to get hold of a few of these G5s so they can attempt one last shot at debunking the initial claims... and until tuned apps, Panther and GCC all get the attention required to make them G5-savvy, the previous analysis still holds. So, these people had no choice but to remain silent, because now that the truth is out they want the G5 to get the least amount of coverage possible.

Remember, the 970 has IBM's reputation riding on it as well. Just look at IBM's POWER line. Is there any real competition from the likes of Intel or AMD? Not really, and the POWER5 will increase performance 4-fold over the current POWER4s. Rumor also has it that it will contain a VMX/AltiVec unit. That alone makes me wonder... why would they add such functionality? Again, the future looks bright.

Regarding Photoshop tweaks for the G5... I've talked to Chris Cox and he assures me that the plugs will be ready by the time the G5 ships, and yes, the G5 runs circles around everything else currently on the market. His words, not mine.

Regarding LightWave performance... well, we haven't seen any benches released by NewTek yet. I wonder why? We haven't seen anything official for the BOXX systems either, or anything current for the Intel rigs. Why? Where are the benches of the BOXX systems? I want to see them from NewTek. Haven't they benched them yet? Oh well...

Regarding Windows... has anyone really looked at where Microsoft wants to take everyone? Security issues abound. Every time they claim to release an OS that's more secure than the last, the viruses seem to get more ferocious, costing businesses, corporations and users a lot of lost time and $$$. Their answer? A locked-down OS (read: Longhorn/Palladium) with hardware and software DRM, security enhancements, a ton of restrictions and greatly increased complexity. Forced upgrades and more and more confusion for users and developers alike, and all this is just the tip of the iceberg.

--
Ed

Antimatter
08-17-2003, 01:32 PM
I'm not trying to diss the Mac, I'm just trying to find the best bang for the buck. My reason: I'm a poor college student, so I don't have that much income right now.

Anyway, I do not use Windows; as a matter of fact, if the college told me that I HAD to choose between a Windows or a Mac machine, I would go with the Mac any day. I'm running my machine on Gentoo Linux and doing great with it.


Now, about the OS, I'm not that sure. It's just that in my experience it tends to be a little unstable and crash a lot. I'm not talking about the new ones, that's why I'm not sure about them. My school had some Power Macs, lots of them, and I think they ran OS 6 to 7 or something, I'm not sure of the version, but I noticed that the apps on them tended to crash a lot, and when an app crashed it usually took the system down with it.

Now, I'm sort of assuming here, but Apple has probably fixed that particular problem, or is it still there? I'm guessing the new OS X, being based off BSD, is much more stable.

And one thing I love about PCs is that they enable you to get your hands dirty. For my high-end PC I bought all of my components separately and put them together, then built my own custom watercooling system to keep the CPU and a few other items cool, because I'm going to try to overclock the CPU to about 3.2+ GHz, and that yields me about a 930-ish FSB. Not quite 1 GHz, but getting close.

Anyway, I'm just wondering why Apple doesn't give you the option to, say, buy from a few other suppliers and then assemble your own custom machine. That would help drive down the prices, plus it would give the user more choices on what he wants, etc...

Stewpot
08-17-2003, 03:30 PM
I just thought I would diverge a little from the current theme and point out that Macs might just be good for another reason.

Back in December of last year, Arnie Cachelin mentioned in a thread about 64-bit computing that LightWave is 64-bit clean, and that they did demo a 64-bit version of LW 6 running on an Itanium.

Now if that also applies to the Mac code (I know nothing about coding), or if the 64-bit code can be recompiled(?) for the Mac (also assuming that NT have continued doing 64-bit versions since then), the G5 could be a big winner for the Mac LW community. :D

Ed M.
08-17-2003, 04:32 PM
Stewpot, I remember that thread well, but who really knows what NewTek is up to these days? They seem to have axed their Mac dev team. Anyone know any of the current Mac programmers at NewTek? Is NewTek farming out development overseas? I don't think anyone knows. For the type of high-end application NewTek is producing, it would be wise for them to bring some Mac-centric code junkies aboard. I listed some of the specialists they'll need in order to produce a topnotch app that's worthy of the price tag. I'll lay odds that most of the programmers at NewTek are Windows/x86-centric. The same app for the same money with EQUAL percentages of sales should have EQUAL staff for each platform. Period.

NewTek needs UNIX people: networking experts, programmers and software engineers.

They also need more people who are fluent in their understanding of CURRENT Mac technology -- hardware AND OS X. In other words, NewTek has to build a topnotch, well-rounded Mac dev team and get on with the platform-specific optimizations.

For networking on a variety of platforms (including Mac OS X), John C. Welch comes to mind. He's also an AppleScript master, and he's willing to help NewTek if they so choose.

For distributed computing, render farms and clustered computing, Dr. Dean Dauger said he'd LOVE to help develop a solution.

For code optimizations I really don't know anyone NT could hire permanently. I do know that Chris Cox has yet to hear from any of the guys at NewTek, and Chris is still confident that he can iron out a lot of their code. One thing he suspects is that NewTek's code needs a LOT more parallelism (doing more things at one time). On an outsourcing basis, someone like the Omni Group could possibly help if NewTek feels the workload is too heavy for anyone on their dev team. It's really up to NewTek at this point. The question is: where do they want to take the Mac users that account for 50% of their LightWave sales?

They also need people scouting for new and upcoming technologies, like the robust single-precision algorithms that are rumored to be out. As it stands now, the two FPUs on each of the 970s can handle NT's double-precision code with ease (no matter how good or crappy it might be), but that isn't the point.

As I stated in an earlier post, the SPEED gained from AltiVec alone vastly outweighs any of the drawbacks single precision once had. I'm sure that if the NewTek guys contacted the Advanced Computation Group (ACG) at Apple, the group that carries out research and development in algorithms and high-performance issues relevant to Apple's technology, they would know exactly which single-precision 3D algorithms are being developed and tested.

I mean, what would people say if there was a single-precision renderer that produced output quality rivaling the current double-precision algorithms in use? The answer is simple... it's likely that we'd have older G4s running rings around current top-of-the-line Athlon and Pentium systems in 3D rendering.

I know you all might think it's hogwash and B/S, but I'm telling you, what I'm hearing is that there are some really sweet single-precision algorithms out there for 3D rendering that are either as good as or better than the double-precision routines being used.

The faithful Mac LW users deserve nothing but the best.

Another thing people need to understand about 64-bit OSes and computing: it's known that 64-bit versions of apps tend to run SLOWER if the app in question can't specifically benefit from 64-bit math. There will be a LOT more data being thrown around, and that includes 64-bit OSes, so it's likely that a 64-bit version of Windows might end up being slower than its 32-bit counterpart. This is why I like Apple's hybrid approach.

Panther blurs the line, because developers will be able to keep the apps that won't benefit from going to 64-bit extremely fast (and in 32-bit), while at the same time allowing the parts of other apps that *will* benefit from all the 64-bit goodies the G5 offers to take full advantage of them. In other words, developers can speed up the parts of the apps where it makes sense.

--
Ed
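
(One concrete way to see the pointer cost Ed describes: compile this little C program once as a 32-bit binary and once as 64-bit, and the pointer-heavy struct roughly doubles in size. Nothing G5-specific, and the struct is a made-up example, not anyone's real scene node; it's just the general ILP32-vs-LP64 effect.)

#include <stdio.h>

/* A pointer-heavy node, the shape you find all over scene graphs
 * and linked structures. */
struct node {
    struct node *next;
    struct node *child;
    void        *payload;
    int          id;
};

int main(void)
{
    /* ILP32: pointers are 4 bytes, so the struct is 16 bytes.
     * LP64:  pointers are 8 bytes, so it pads out to 32 bytes,
     * doubling the memory, cache and bus traffic per node. */
    printf("sizeof(void *)      = %zu\n", sizeof(void *));
    printf("sizeof(struct node) = %zu\n", sizeof(struct node));
    return 0;
}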

Lightwolf
08-17-2003, 04:49 PM
Hi Ed,
Yep, I'm still in the wrong forum ;)
The single-precision renderer sounds interesting; I'm not sure it's as flexible as a dp renderer in production use though (i.e. without tweaked scenes).
Even with dp, LW has its problems (for example the "let's zoom way into a part of my object that's _far_ away from the origin and work on some minuscule detail there..." problem in Modeler). Developers do prefer dp because it makes the code so much safer in tight situations...
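
(A minimal sketch of that far-from-origin problem, nothing LW-specific: put a 1 mm detail on something 10 km from the origin and single precision can't hold both scales at once, while double precision shrugs it off.)

#include <stdio.h>

int main(void)
{
    /* A vertex 10 km from the origin, nudged by 1 mm (units: metres). */
    float  sp = 10000.0f;
    double dp = 10000.0;

    sp += 0.001f;
    dp += 0.001;

    /* float carries ~7 significant digits, so the millimetre gets
     * rounded to the nearest representable step (~0.98 mm here). */
    printf("float  moved by: %.6f\n", sp - 10000.0f);  /* 0.000977 */
    printf("double moved by: %.6f\n", dp - 10000.0);   /* 0.001000 */
    return 0;
}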

As for the slower 64-bit apps, you're a bit misinformed here, the Opteron (hybrid like the G5) being an example... 64-bit code is roughly 15% larger than 32-bit code, yet 64-bit OSes are actually faster than their 32-bit counterparts... Even 32-bit code on a 64-bit OS runs faster than 32-bit code on a 32-bit OS (this is all Linux; I expect Win2K3 and XP 64-bit to behave similarly).
On the Itanium the situation is different, however; here the code bloats more (hence the larger caches...).

Just one more thing that seems to be a misconception: 50% in sales does not mean a user base of 50%!!! At the time many of the faithful bought their LW, a Mac version wasn't even available... I assume the numbers look a bit different if you're talking user base.

Nevertheless, I would like to see those improvements, especially since I assume most would make it over the border anyhow...

Cheers,
Mike

Ed M.
08-17-2003, 05:07 PM
[[[Just one more thing that seems to be a misconception, 50% in sales does not mean a user base of 50% !!! ]]]

This is another example of a PC-user picking and choosing how they want to interpret *percentage*. Funny how things are different when you want to look at the number of total machines sold in a given quarter or year and say that Apple only has 3% of the market. By your logic, it can be implied that the Mac market has a significantly higher *installed base*, no? Which interpretation do you want to use?

[[I assume numbers look a bit different if you're talking user base. ]]]

Wouldn't that be something ... lol

You're wrong on your 64-bit speed points though. I suggest you hop on over to Ars Technica and do a little research.

--
Ed

Lightwolf
08-17-2003, 05:21 PM
Originally posted by Ed M.
[[[Just one more thing that seems to be a misconception, 50% in sales does not mean a user base of 50% !!! ]]]

This is another example of a PC-user picking and choosing how they want to interpret *percentage*. Funny how things are different when you want to look at the number of total machines sold in a given quarter or year and say that Apple only has 3% of the market. By your logic, it can be implied that the Mac market has a significantly higher *installed base*, no? Which interpretation do you want to use?

Lol, we're talking software here!!!
Of course the Mac has a higher installed base, but LW has a higher _active_ base, since software gets updated (I don't think you buy every version of LW anew, do you?).
I haven't accounted for _any_ new sales since 5.6; I have accounted for a grand total of 12 updates though, and do have 6 LW licenses here. This is not new sales, but user base. And user base is more important for a software company than for a hardware/box company.
If you go check the statement by NT, they're talking about sales, not updates (the same goes for the 25% Maya, btw...).
And why do I have to be a PC user to get my facts straight? Do all Mac users bend them? Would you be more biased if I told you I was female, black, gay or whatever? :rolleyes:


You're wrong on your 64bit speed points though. I suggest that you hop on over to ArsTech and do a little research.

Go check some SUSE benchmarks on the Opteron...
A nice system to try out all 3 options: 32 bit OS / 32 bit code, 64 bit OS / 32 bit code, 64 bit OS / 64 bit code...
But maybe you have some direct links; the last time I checked the messy ArsTech message board, I just found most of my arguments being validated... My research did, btw, lead to the above facts...
I might need to improve on my English though...

Cheers,
Mike

Ed M.
08-17-2003, 05:47 PM
Look Lightwolf, it's like this... Native 64bit systems may actually perform more slowly in many situations, since you're sending more data across the bus and using more cache for each variable and the like. I'm certain that you still don't believe me though... <sigh>

--
Ed

Ed M.
08-17-2003, 06:05 PM
OK, here is an edited version of a message I posted to some message board a while ago (might have even been here). It makes the point quickly and clearly... It was also posted far in advance of the introduction of the G5...

<snip> You are completely wrong to assume that simply going to 64-bit automatically translates into a speedier, faster or more efficient app/code. You are wrong.

In many cases migrating to 64-bit could actually be *slower*. Why? It's fairly simple if you take the time to look at it and understand what's really going on. It isn't so obvious to people who have fallen for the "numbers game" that Intel (with processors) and Microsoft (with version numbers) have mastered in their marketing campaigns. Here is where all the assumptions (by less informed people) started to arise:

During the transition from 16 to 32 bits, applications and systems *did* gain speed. This was because many of the 16 bit machines often had to work with 32 bit numbers. A range from 0 to 65535 was simply not enough for common data. Therefore, in *that* case there is a *significant* difference between processing two 16 bit chunks and a single 32 bit chunk.
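Just to picture that difference, here's a minimal C sketch (a hypothetical helper, not from any real app) of what a single 32 bit addition costs on a 16 bit machine:

#include <stdint.h>

/* On a 16 bit machine, adding two 32 bit numbers means two 16 bit
   adds plus carry propagation; on a 32 bit machine it is a single
   instruction. */
uint32_t add32_via_16(uint16_t alo, uint16_t ahi,
                      uint16_t blo, uint16_t bhi)
{
    uint16_t lo    = (uint16_t)(alo + blo);
    uint16_t carry = (uint16_t)(lo < alo);   /* did the low half wrap? */
    uint16_t hi    = (uint16_t)(ahi + bhi + carry);
    return ((uint32_t)hi << 16) | lo;
}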

However, here is the kicker (and something many people don't realize): there are very few common applications that really need more than 32 bit wide data, or more than 32 bits of address space. Period. So in these cases, moving an app to 64-bit code would be a huge waste of time with little return; in many cases it would even hurt performance.

What this means is that there are *very few* applications at work today that have to process data wider than 32 bits in multiple chunks. This, in turn, means that going to 64 bit will only speed up *those* very few apps, while every pointer variable everywhere will consume *DOUBLE* as much memory!
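To put a rough number on that pointer cost, here's a minimal C sketch (a made-up, pointer-heavy struct, not from any real app) that you can build 32 bit and 64 bit and compare:

#include <stdio.h>

/* Hypothetical pointer-heavy node, the kind a scene graph or linked
   list is full of. */
struct node {
    struct node *parent;
    struct node *next;
    void        *payload;
    int          id;
};

int main(void)
{
    /* Pointers are 4 bytes in a 32 bit build and 8 in a 64 bit build;
       the int stays 4 bytes, so this struct goes from 16 bytes to 32
       (with alignment padding). Multiply by millions of nodes and you
       have double the RAM and cache pressure. */
    printf("sizeof(void *)      = %zu\n", sizeof(void *));
    printf("sizeof(struct node) = %zu\n", sizeof(struct node));
    return 0;
}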

In Apple's case, having a single 64-bit CPU that can execute 32-bit code natively is a much smarter approach, since it gives the developers of the *common* apps ample time to migrate their 32-bit apps to 64-bit, or to move select parts of an app to 64-bit. There isn't going to be a performance difference after migrating those apps to 64-bit anyway. We'll look at why Apple is moving to 64-bit, and it probably isn't completely for the reasons you believe. More on that later...

As for the developers who could use the extra horsepower: guess what? The 64-bit CPU that runs 32-bit code natively is right there, just as happy to run 64-bit code or hybrid code. This means that developers will have vastly more flexibility when coding their apps. The key areas of the big "number-cruncher" apps can be moved to 64-bit, where 64-bit addressing would be an enormous boost. Here is how a trusted friend and PPC/AltiVec programmer explains it:

He expanded on this comment from a Darwin board:

PPC uses a 16 bit offset from a register to determine the load/store address. The instructions don't change, nor do their operands when you go from 32 to 64 bit. Only the data in the registers differs, which would be controlled by whether you loaded a pointer using a load word or load double word. .... cont.


Yes, the instruction format is identical. Yet there is a price for going to 64 bit: immediate values (constants embedded in machine instructions) are 16 bits wide. Filling a 32 bit register takes two instructions; filling a 64 bit register with an arbitrary bit pattern requires five instructions (synthesize two 32 bit values in four instructions, then combine them with a fifth).
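To see what that means in practice, here's a tiny C sketch (made-up values and names; the instruction sequences in the comments are the standard PowerPC idioms for constants too wide for a 16 bit immediate):

#include <stdint.h>

uint64_t make_constants(void)
{
    /* When the compiler has to materialize these in registers, the
       usual PowerPC sequences are: */
    uint32_t wide = 0x12345678u;            /* lis + ori : 2 instructions */
    uint64_t huge = 0x123456789ABCDEF0ULL;  /* lis + ori + rldicr + oris + ori
                                               : 5 instructions */
    return huge ^ wide;
}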

I'm sure by now you understand why going from 32-bit to 64-bit doesn't automatically translate into more performance.

Now, the *real* reason for Apple moving to 64-bit is the fact that there will be lots of machines in the very near future with more than 4Gig of RAM. Here is how a PPC programmer explains Apple's reasons for migrating to 64-bit:



Going to 64 bits today is more of a forward-looking move. There are a few applications in existence that really benefit from migrating to a 64 bit machine, but those are server-type, "big iron" stuff. For example, The AltaVista search engine was at one time running its database on an Alpha machine with 16 gigs of RAM, so it could service almost all requests without ever going to disk. However, the sheer size of problems/datasets being processed keeps increasing steadily. Some people claim that "typical" applications occupy 1.0 to 1.5 more address bits per year. If you just look at the 'default' amount of memory in entry-level PCs, you can draw similar conclusions. Current off-the-shelf offerings range from 128MB to 512MB. This means at the high end, only three doublings are left before we reach the 4GB limit of 32 bit machines.

On the desktop, video processing could be *the* application that drives the transition to 64 bit, not because it would make a huge difference in speed, but because it is much more convenient to handle files above 4GB on a 64 bit machine. <end snip>

You might want to file this information for future reference...

--
Ed

Beamtracer
08-17-2003, 08:08 PM
Originally posted by Antimatter
I'm not trying to diss the mac, i'm just trying to find the best bang for the buck

Sure, Windows gives you the most bangs for the buck, or maybe I should say the most BLASTS for the buck.

I'm surprised the Windows users have enough time to comment on this thread. I thought they'd all be busy trying to defend themselves against the 'Windows Blast Worm' that is infecting them all. It really is a dirty OS, isn't it!

Yep, more blasts for the buck with Windows.

js33
08-17-2003, 09:20 PM
Hehehehe. Well Beam you seem to forget there are things called firewalls and virus scanners that keep that stuff out. I've never had a virus infection on my machines. My virus scanner has caught several attempts but nothing ever gets through. The ones that get infected must not have any active protection so they get infected. I don't have my Mac on long enough for it to get a cold much less a virus. Hehehe.

Cheers,
JS

Antimatter
08-17-2003, 09:46 PM
Originally posted by Beamtracer
Sure, Windows gives you the most bangs for the buck, or maybe I should say the most BLASTS for the buck.

I'm surprised the Windows users have enough time to comment on this thread. I thought they'd all be busy trying to defend themselves against the 'Windows Blast Worm' that is infecting them all. It really is a dirty OS, isn't it!

Yep, more blasts for the buck with Windows.

you're correct about windows, i'm right now laughing at windows, and i do not use windows.

remember my post above, on the previous page i believe; it stated quite clearly imho that i ran my system on linux, not windows.

i wasn't talking about software, i was talking about hardware with my comment on the most bang for the buck.

linux, how costly is it? zero, its free :p

js33
08-17-2003, 10:13 PM
Linux is cool I guess but what do you run Lightwave on? WINE?
Also can Linux run Photoshop, After Effects, Director, Flash, DFX, etc...the list is so big.

Cheers,
JS

toby
08-17-2003, 11:13 PM
Originally posted by js33
Hehehehe. Well Beam you seem to forget there are things called firewalls and virus scanners that keep that stuff out.

That's cool, but I've never lifted a finger worrying about viruses. I open every piece of mail and attachment that I get; 5 years on the web and nary a sniffle.

js33
08-17-2003, 11:45 PM
The Mac typically doesn't have as many viruses because there are fewer people using them. I'm sure BSD and Linux have just as many security holes as Windows, well maybe not as many hehehe, but not as big a deal is made out of them.

Cheers,
JS

toby
08-18-2003, 12:15 AM
I think it's because more PC programmers are sociopaths :p

js33
08-18-2003, 12:29 AM
I think you should rephrase that as "Virus writers are sociopathic" which I would agree with. They want to cause the most damage possible so they go where the largest installed base is.

Cheers,
JS

Antimatter
08-18-2003, 12:40 AM
Originally posted by js33
The Mac typically doesn't have as many viruses because there are fewer people using them. I'm sure BSD and Linux have just as many security holes as Windows, well maybe not as many hehehe, but not as big a deal is made out of them.

Cheers,
JS

well if it's true that linux/BSD have as many security holes as windows, then the mac must have as many security holes too.

don't forget that OS X is based on BSD, you know.

toby
08-18-2003, 12:44 AM
it was a joke - no need to get defensive

js33
08-18-2003, 01:22 AM
Originally posted by Antimatter
well if it's true that linux/BSD have as many security holes as windows, then the mac must have as many security holes too.

don't forget that OS X is based on BSD, you know.

Yeah BSD=OSX. That's what I meant. :D

Cheers,
JS

js33
08-18-2003, 01:24 AM
Originally posted by toby
it was a joke - no need to get defensive

Not defensive, just pointing out that virus writers are indeed the sociopathic ones and should all be arrested. Some of them have been, but it seems like even when they catch them they don't advertise it anymore.

Cheers,
JS

Lightwolf
08-18-2003, 02:01 AM
Originally posted by Ed M.
OK, here is an edited version of a message I posted to some message board a while ago...
Well, let's talk x86-64 then...
x86-64 is an extension of the current x86 (i.e. intel) CPU instruction set that revamps it for 64 bit... It is purely an extension, so native x86 runs just as well on the same processor, with _no_ speed penalty.
On the Opteron (and this is _not_ true for the G5 or any other 64 bit processor) you do get a speed boost in 64bit mode, mainly because there are more registers available in x86-64 code (16 vs. 8 integer, for example), which alone can lead to a speed boost of 20%. This is not an issue with the G5, since it already has a nice bunch of multi-purpose registers...
So it is _not_ the 64 bits on the Opteron that account for the extra speed, but the additions that come with 64 bit mode. Again, Opteron, not G5. I wouldn't expect a speed increase for 64 bit apps on the G5.
Again, looking at SUSE Linux, you can run both 64 bit and 32 bit apps on the same OS; the only thing that is not possible (or only with a lot of work) would be hybrid 32/64 bit apps. Then again, you'd want 64bit mode on the Opteron for the extra registers anyhow.
Other than the extensions, 64-bit on the Opteron is pretty much the same as plain x86, so you don't suffer speed penalties when working with fewer bits, and you're not forced to use 64-bit data exclusively...
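To make the register point concrete, a quick C sketch (an invented little checksum loop, nothing from any real app): with eight accumulators plus the pointer and counters live at once, a 32 bit x86 build (8 named integer registers) has to spill some of them to the stack every iteration, while the very same source compiled as x86-64 (16 integer registers) keeps them all in registers.

#include <stddef.h>
#include <stdint.h>

uint32_t mix8(const uint32_t *p, size_t n)
{
    /* 8 accumulators + p + i + n = 11 values live in the loop: more
       than x86-32's 8 named registers, comfortably within x86-64's 16. */
    uint32_t h0 = 1, h1 = 2, h2 = 3, h3 = 5,
             h4 = 7, h5 = 11, h6 = 13, h7 = 17;
    size_t i;
    for (i = 0; i + 8 <= n; i += 8) {
        h0 += p[i];     h1 ^= p[i + 1];
        h2 += p[i + 2]; h3 ^= p[i + 3];
        h4 += p[i + 4]; h5 ^= p[i + 5];
        h6 += p[i + 6]; h7 ^= p[i + 7];
    }
    return h0 ^ h1 ^ h2 ^ h3 ^ h4 ^ h5 ^ h6 ^ h7;
}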
So, your post is nothing new, and doesn't back up your argument at all, since it seems to be only concerned with the G5 way of doing things (which I didn't dispute...).

On a side note (not directed at you at all, I've read this many times), I find tons of references everywhere pointing to video editing as the first "consumer" app where 64 bit makes sense... _only_ because video files can be huge. Currently video files are addressed using 64 bit offsets anyhow (the VT[3] even uses floats), and to actually _work_ with video (i.e. render, compose...) one doesn't need 64 bit at all.
The first people I've witnessed hitting the 32bit barrier were actually the compositing people (Digital Fusion users hitting the 3 GB / app wall on WinXP during heavy 2K compositing). So, Shake users rejoice !

Cheers,
Mike

Antimatter
08-18-2003, 02:20 AM
Originally posted by toby
it was a joke - no need to get defensive

if that was directed at me, i was just pointing that out, that's all :)


anyway, regarding security holes, i would say that the majority of them are there because the "user" doesn't know how to secure his system properly and disable/uninstall things that he/she doesn't need. now keep in mind not all of those holes are the user's fault, just the majority. anyway you could probably secure a mac/windows/linux box as well as you want, or as ****ty as you want.

but the difference with windows is that most of the security options are turned off by default; on the other hand, with linux most of the distros i've run into turn the security options on by default. i don't know about macs.

Ed M.
08-18-2003, 08:21 AM
[[[The Mac typically doesn't have as many viruses because there are fewer people using them. ]]]

This statement is pure bull$%^t. Windows servers are the MINORITY out there, yet they are the ones most affected. Guess what types of machines are attached to those Windows servers? Thaaaat's right, I thought so. Well, admittedly, I'm no big fan of LINUX, but it's FAR more secure than Windows.

The argument is flawed.

<sigh>.. I see that this thread has decayed into another "my platform-is-better-than-yours" discussion.

--
Ed

Lightwolf
08-18-2003, 08:37 AM
Ed:
(seriously OT now...)
How many script kidz have access to Sun servers, or even know how to boot up Linux? Linux, like all the other *ix variants, has its exploits and security problems, just like Windows does; the widespread use of the PC just makes it a lot easier to develop for.

I also think the argument was Mac vs. PC viruses; in that case I don't think PC servers are in the minority (compared to Mac servers, that is, not to Linux).

Cheers,
Mike

Nakia
08-18-2003, 09:06 AM
One of the main reasons why viruses don't affect UNIX-based boxes is the way UNIX/Linux deals with permissions and user rights. There are viruses that will wipe out your $HOME, which is one of the reasons why you should not log in as root, download as root, or use a plain su to get there. The UNIX separation of user space is powerful: in order for a virus to really work it needs to gain root access, and that is not a simple task. Features like RBAC, which restrict access to certain commands and daemons, make any unix-style virus hard to pull off. Well, put it this way: UNIX has been around since 1969, a lot longer than Windows; most of the SERVERS out there are UNIX-based, most of the websites run on UNIX-based servers, and we have yet to be hit with a major virus/worm attack that took out UNIX-based servers on the level that Windows servers are getting hit. A lot of these UNIX-based servers are not single-CPU boxes either; a lot are 20+ CPUs running in a farm environment, and 64bit at that (just to keep this 64bit-related). In my book, 34 years of UNIX is plenty of time to write at least one good virus. I'm only 28 :-)
All this is coming from someone who works for Sun Microsystems, so I may be a bit one-sided.
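For anyone who hasn't lived on a UNIX box, the permission wall Nakia describes is easy to see in a few lines of C (a sketch, assuming a stock setup where system binaries are root-owned and not world-writable):

#include <errno.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    /* An unprivileged process can trash its own $HOME, but the kernel
       refuses writes to system files: no root, no system-wide damage,
       which is exactly why a UNIX virus has a much harder job. */
    FILE *f = fopen("/usr/bin/login", "r+b");
    if (f == NULL)
        printf("denied: %s\n", strerror(errno));  /* expect EACCES */
    else
        fclose(f);  /* only reachable when running as root */
    return 0;
}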

Lightwolf
08-18-2003, 09:08 AM
Nakia...
And I also assume that most of the sysops on those U*ix systems are more computer-literate than your average IT specialist setting up small windows servers for companies.

Nakia
08-18-2003, 09:56 AM
I think the windows folks are just as computer-literate. Those Sun servers are not that hard to deal with if you pay attention. I work on over 3,000 Sun servers.
I don't think one OS's sysops are generally better than another's. A tech is a tech.
I say use whatever works. I do think the UNIX-based boxes are more stable for long, heavy work, but I believe there are other OSes that rock too, e.g. BeOS, Plan 9, QNX.
My home office is WinXP, Linux, 2xW2K, Powermac G4, SGI O2, Sun Ultra5. I'm not into "this OS is king" or "this machine is best"; that's the reason I can't have just one type of computer at home. But I do say an SGI workstation comes pretty close to being a catch-all workstation if you've got $$.

Lightwolf
08-18-2003, 09:58 AM
Traitor!!! You don't have an Amiga? ;) :p

Nakia
08-18-2003, 10:08 AM
I've been looking at Amiga, just waiting for the wife to turn her back so I can get one. :-)

tallscot
08-18-2003, 10:30 AM
I had an Amiga 1000 without a Commodore logo on it. Paid $1,800 with the monitor. Anyone want to touch me?

We could go on and point to the list of known vulnerabilities for different platforms, etc., but it's all irrelevant. The reasons I have more hassles with Windows are irrelevant. The fact remains that these hassles exist, and they suck.

My good friend uses Discreet Edit and 3DS Max on PCs. He loves Max and he loves Edit (discontinued, BTW). Last weekend, two of his three PCs stopped working. Neither would boot. One gave him a blue screen and a message about a .sys file, or something to that effect, and the other gave him a black screen with a message about a corrupted DLL, or something to that effect. On one, he had to completely reformat because nothing would work. On the other, he had to reinstall Windows.

Because of the registry, he had to reinstall all his applications and he had a hell of a time getting Edit to work again with his video board.

I'm amazed that you can't just back up your registry and then restore it after a reinstall of Windows.

I own an XP Pro system here and run it next to my Mac, but I'm admittedly not on the level of trying to back up my registry. I use reg cleaner to delete references from applications that I've uninstalled but are still listed, for whatever reason. System Restore is a good feature in XP, but he uses 2K because the applications he uses don't support XP yet, according to him.

My friend wasted the entire weekend and he had a lot of work to do. This isn't the first time this happened to him either.

He's now switching to the Mac. He's going with FCP 4 and Maya 5 with a G5.

I've had similar experiences with my two PCs. I had a DOS #6 error on boot-up once, after installing a Nortel NIC. I couldn't get the thing past the black screen on boot-up, and tech support on the phone couldn't either. I took the Nortel NIC back and purchased a 3COM NIC and it worked.

My XP Pro system has been stable, though I get a lot of error messages and other mysterious messages sometimes. I don't have the blaster worm on my system yet, but I know what to do. When I uninstall something, which I try to avoid, I have to answer questions about deleting DLL X or Y. Why does the user have to make these decisions? I just delete them and hope that it doesn't hose some other application. I used to have WinAmp on my system, but it kept taking over the file associations, even though I told it not to open files. I finally got tired of it taking over file associations, so I uninstalled it. After that, XP would ask me to locate WinAmp every time I wanted to run a .MPG.

OS X has none of these hassles. I've been using OS X since Public Beta, upgrading it every time and never doing a fresh install, and it just works. I've never had to reinstall it. I don't have file associations issues. I don't have DLLs to worry about. I don't have a registry. I don't have viruses. I don't have adware or spyware. I have a superior UI, IMHO. I love scrubbing video and music in the Finder.

I can get a cheap PC. I can build my own PC. I don't really want to build my own PC, nor do I want to build my own camcorder or automobile. I can afford a Mac and I gladly pay for it, because I think it's worth it. Frankly, I think Windows sucks. The OS is very important to me, because the experience I have on a Mac and PC is greatly different specifically because of the OSs.

Some people prefer Windows. Some people want to build their own computer. The PC is perfect for that. Some people can't afford or can't justify the cost of a Mac. PCs are cheap.

Nakia
08-18-2003, 10:31 AM
With the new G5, I will really keep an eye on Amiga now, hoping that it will get the new CPU also. Hmmm.

Antimatter
08-18-2003, 01:06 PM
tallscot: i've had the same problems with windows, so after a while i just said screw this, wiped my windows install and installed linux; it's been running for a few years now with almost zero problems other than user-error type things.

now if i had the cash i would probably buy a g5 just to experiment with :) i like learning about new technology, i like learning new OSes, i like learning, well, stuff :)

my main problem with the mac platform right now is the price, it's way too expensive in my opinion, and another one is that you can't build it yourself. i love building pcs, it's always fun to select pieces and then put them together so you have an entirely customized pc, and you know what is in there. you control the quality of your pc. for example i could build a workstation-type pc, a gaming pc, a server pc.

now that got me thinking, is there any release of OSX out for x86? if there is i would like to get my hands on it to experiment with OSX. anyway, i'm guessing no :(

tallscot
08-18-2003, 01:23 PM
I can appreciate the ability to choose every component in your computer, but I think the tradeoff for less choice is having an experience with less hassles.

Another user of Linux prefers running it on a Mac. Why? He says that with the Mac you don't have to define a driver or specification for every single component like you do on the PC. The installation is incredibly easy because they have every model of Mac listed and you just choose the one you have. And there's a Linux utility called MOL (Mac On Linux) that allows you to run OS X, OS 9, OS 8, System 7, etc., within Linux. It's not emulation, and there is supposedly no speed penalty because you are running it on a Mac.

At least we both agree Windows sucks. :)

I'm excited about the G5 system. It would be nice if you could buy one with no HD, no RAM, no video card, no optical drive. Apple used to sell Macs like that (but with a floppy drive), but times have changed. Given the choice between a G5 with a Radeon 9800 and OS X, and an x86 PC with Windows or Linux, I'll choose the Mac.

If Windows ever turns into what I think is the best OS, I'll go with it. Considering Longhorn isn't due till 2005/6, I doubt I'll be using my PC exclusively any time soon. :)

eblu
08-18-2003, 01:33 PM
anti-
the price thing is a myth. Any brand-name PC configured with the drives, cards, memory, and LCD monitors that apple uses shows up in the same price range.
As for building it yourself, well, I can empathize with that, but I also know for a fact that I am not nearly as good an engineer as one of Apple's.
I tried, at the advent of cloning, to put together my own system, and the price (on paper) was way higher than going with apple's offerings. Bottom line: I can't get volume discounts by myself either.

mac os X for x86 is not a product. Apple, according to rumors, keeps it around, but only to have options for the future, and leverage w/ IBM/Motorola.

my suggestion: buy a cube, and mod out the case.

http://cgi.ebay.com/ws/eBayISAPI.dll?ViewItem&item=2747676327&category=14912

Antimatter
08-18-2003, 04:21 PM
I agree with you that having a single standardized design makes it simpler for software engineers to design an application and OS.

In my opinion the extreme customizability of the PC is both an advantage and a disadvantage.

the advantage is you can buy almost anything you want and most of the time it will work together with some tweaking. and if you're buying stuff for a company you can select what you want on the computer; like if you're going to bulk-buy 100 pcs, you could dump the sound card, video card and several other things to drive down the price.

after all, most corporate workers are doing word processing, spreadsheets and that sort of stuff, so they shouldn't need the sound card/video card. and so on.

now the disadvantage is that, depending on what you are trying to do, it can make installation much more complex; a good example is the linux installation.

tallscot: i agree with you. when i first began to build my computer i had no clue what the hell all of the gibberish for the hardware meant, such as IDE, etc... anyway, after a few weeks i was starting to sort it all out, and in the end i ended up having fun. i used to always look down on hardware and let other people build it for me, and i would just focus on the software part, but when i decided to build my own computer myself i found out that i did like working with hardware; it presents a different sort of challenge to learn about, and i love learning :)

now about getting a barebones mac: if they had some sort of offer like that i would snap up a mac and experiment with it and stick in some of the older hardware that i've got laying around.

i would like it if apple offered something like this:

a mac g5 with only the motherboard, processor, case and a few other proprietary items. then i could equip it with the harddrive of my choice, the cdrom/dvd of my choice, select what sort of memory i want, etc...

anyway, a question: what brand and kind of memory does apple use? because when i was speccing up a mac g5 i noticed the price of the memory. for 8 gigs of ram apple wanted $5,000, and i thought that was way too expensive imho.

tallscot
08-18-2003, 04:38 PM
You can customize Macs. For example, I can swap out my old DVDRAM drive in my G4 for the latest DVDR drive. I can swap out my HD or add another, and they are the standard IDE/SCSI/etc. drives. I can add audio cards and bypass the integrated audio. I can swap out the video card. Same goes with the memory.

The problem is you are forced to buy a video card, RAM, HD, and optical drive from Apple. So what we do is order the minimum from Apple and then add to that after we get it.

You could buy a Mac from eBay and then customize it.

I don't know the brand of memory Apple uses. $4,950 for eight 1 gig DDR400 PC3200 sticks is a fair price, IMHO. You have a better price? I'd love to bookmark it for my purchase! :)

Nakia
08-18-2003, 06:07 PM
In my circle we say "$3000 will get you what you need most of the time." We budget ourselves to spend up to $3000, but that $3000 should get you 3 years of production use with minimum upgrading.
Folks talk about the ability to tinker with PCs, but what's the diff between PC and Mac, really? Both use IDE HDs, DDR memory, Zip drives, USB devices, Firewire devices, PCI slots, and nVidia and ATI gfx cards; the only diffs are the CPU, chipset and case, plus PCs have some high-end $$$$$ video cards. I swap parts between my PC and Mac, i.e. memory, firewire card and harddrives. Now an SGI O2 is what you call different: I can't run to the local PC man and get parts. What is it that you really do that's so different on your custom-built PCs? I own an HP XW6000; I did not build it, but I know everything in it. It's not like a PC is that complicated. A Sun Fire 4800 is a piece of work, but no fully loaded PC or Mac is.

tallscot
08-18-2003, 06:14 PM
There are studios in San Francisco that have old G3s with G4 upgrade cards in video suites.

I can upgrade my 400 Mhz G3 PowerBook to a 900 Mhz G3 PowerBook for $399 with a PowerLogix upgrade.

Antimatter
08-18-2003, 06:38 PM
So you're trying to tell me that to get a dual-proc g5 all i need to do is dump all of the stuff that i can to get it as cheap as possible, then use some stuff that i've got laying around.

so i can take almost anything from a pc and stick it into the mac? such as ddr400 ram, a radeon 9700, etc...

am i correct? if that is true then that sounds cool to me :)

now on the price of the memory let's see if i can't dig it up.

i use price watch to help me find a place to purchase stuff cheaply, most of the time i use newegg.

Newegg (http://www.newegg.com)
Price Watch (http://www.pricewatch.com)

most of the 1 gig stuff on newegg is twinpacks, aka 2x512meg.

ah yes, i believe i have found a site that sells pc3200 1 gig sticks of ram: 1gig stick of pc3200 memory (http://stlpcsales.site.yahoo.net/sa10pcddnoun.html), and it's on sale for $230. now add about $35 for professional-quality memory with a heatsink on it, which brings the price up to $265.

now take 8 * $265, which equals $2,120, which is under half the price that apple charges for 8 gigs of pc3200 memory.

now when you search for this memory, the majority of the hits will be dual-channel memory, which is 2x512MB (i've got one set of them in my computer right now), but with a little bit of searching you can find single sticks of 1024mb.

i have found another site that sells single-stick 1024mb memory; it's a little more expensive at $375 a stick.

oem pc world (http://www.oempcworld.com/item.html?PRID=1373784)

hope i've helped you a little in the memory department :)

Nakia
08-18-2003, 07:07 PM
only thing to watch out for is video cards, which are mac built.
On my old Powermac G4 I use plain ol' PC133 RAM taken from my old Athlon box, along with the harddrives. I took an old internal Zip drive from a P1 233Mhz box, and also dropped an old SCSI card from a PC into it.
It's hooked up to two 21-inch PC monitors.
I would call that somewhat custom.

Beamtracer
08-18-2003, 07:08 PM
Assembling your own computer can be fun, but there is a very big downside. Who takes responsibility when something goes wrong? That's when the smile comes off your face.

Sometimes one component can destroy another component. A bad processor can destroy RAM and other components.

If you bought your box from one manufacturer you can say to them "fix it", and it'll be fixed under warranty.

With build-your-own boxes you are going to have a lot of trouble convincing the manufacturer of one component to replace a device from another manufacturer.

This can also be difficult with Apple systems when you install 3rd party RAM or devices yourself.

Originally posted by Beamtracer
I'm surprised the Windows users have enough time to comment on this thread. I thought they'd all be busy trying to defend themselves against the 'Windows Blast Worm' that is infecting them all.

Originally posted by js33
Well Beam you seem to forget there are things called firewalls and virus scanners that keep that stuff out.

Don't forget to add the price of these virus protection devices to the cost of buying a Windows computer. These are expenses that you must pay for, and expenses Mac users don't require.

js33
08-18-2003, 07:40 PM
Originally posted by Beamtracer
Assembling your own computer can be fun, but there is a very big downside. Who takes responsibility when something goes wrong? That's when the smile comes off your face.

Sometimes one component can destroy another component. A bad processor can destroy RAM and other components.

If you bought your box from one manufacturer you can say to them "fix it", and it'll be fixed under warranty.

With build-your-own boxes you are going to have a lot of trouble convincing the manufacturer of one component to replace a device from another manufacturer.

This can also be difficult with Apple systems when you install 3rd party RAM or devices yourself.

Don't forget to add the price of these virus protection devices to the cost of buying a Windows computer. These are expenses that you must pay for, and expenses Mac users don't require.

I have built most of my computers except for the last PC and the iMac. It is nice to have the flexibility to add the exact components you want rather than take the watered-down hardware compromises that are made in most prebuilt systems, Macs included. Apple forces you to buy stuff you may wish to buy elsewhere, like video cards. So you have to buy the video card from Apple, then throw it away to buy a better one that hopefully has ROM support for the Mac. Also, why does Apple charge so much for RAM? Antimatter pointed out the great ripoff Apple is for memory. Our economy is based on free markets and choice, something Apple never learned. Apple charges $650 for the "Superdrive", which is just a Pioneer A06 (or the A04 that was in the G4s), which can be had for about $250 anywhere but Apple.

Yeah, I guess free really adds a lot to the expense. XP comes with a firewall, and ZoneAlarm has a free version, which is what I use.
There are plenty of free virus scanners out there, but I am using Norton, which costs what, a whopping $30 or something.
Most routers and switches have built-in firewalls as well these days.

Cheers,
JS

tallscot
08-19-2003, 12:55 AM
The SuperDrive option is $200 over the CDRW. That's hardly the figure you gave.

There is no better video card than the Radeon 9800, which Apple offers. There is no Wildcat, Fire GL, Quadro...

The video editing cards are PCI.

OS X has a firewall for free. So what? I missed your point there.

Your gripe is with pre-built systems, not just Apple. Dell doesn't give me the choice to just get a box with no RAM, no video card, no HD, etc.

js33
08-19-2003, 01:39 AM
Originally posted by tallscot
The SuperDrive option is $200 over the CDRW. That's hardly the figure you gave.

There is no better video card than the Radeon 9800, which Apple offers. There is no Wildcat, Fire GL, Quadro...

The video editing cards are PCI.

OS X has a firewall for free. So what? I missed your point there.

Your gripe is with pre-built systems, not just Apple. Dell doesn't give me the choice to just get a box with no RAM, no video card, no HD, etc.

I read in Macworld they said the "Superdrive" was a $650 option.
So how much is the CDRW from Apple? Probably still a rip. Keep in mind that the same brand of CDRW Apple probably sells for $200 can be had for less than $50 now in the real world.

So Apple users are still going to be stuck with a crappy ATI card with poor OpenGL performance. Not good for LW work. I thought Steve was listening to customers.

The point about the firewall is that Beam was talking about the added expense of a firewall and virus scanner for Windows, and I explained that XP comes with a built-in firewall.

My gripe is that Apple isn't very flexible in their options to allow you to put in or leave off things you may or may not want.

I don't know what Dell offers but you can get a barebones system and put whatever you want in it anywhere.

Cheers,
JS

Beamtracer
08-19-2003, 04:06 AM
Intel and Microsoft have "horizontally" integrated markets. Apple has a "vertically" integrated market.

Apple's structure is called vertical because they supply everything in the system... that is, the OS, the hardware, and in a lot of cases the software.

Which is better?

Well, there are advantages to both. The advantage of Apple's way is that there are fewer combinations to go wrong, or to troubleshoot.

Musicians usually use a Mac because they can be more assured that things will work together. In the Windows world there is an infinite number of combinations of build-your-own boxes, so not every combination can be tested for.

js33
08-19-2003, 04:08 AM
Well, that's true I guess, but hardware is pretty standard these days. It's hard to build a system and have it not work.

Cheers,
JS

Nakia
08-19-2003, 06:41 AM
Any mac person knows you can go get a Powermac without the Superdrive and drop a Pioneer A06 into it. The thing is, you get a kick-butt DVD authoring tool when you get it with the drive; but if you wait, you can get the Pioneer A06 later and drop $45 for iLife. PC folks are so funny when they say Apple folks are locked into the machine. I went to Best Buy and bought some cheap 512meg stick and stuck it in my Mac. I know I would have issues if I took it to Apple to get fixed, but the same goes for Dell, Gateway, HP: you must use what they recommend to keep the warranty. When it comes to a custom-built PC, where do you go when it breaks? Which is no different from having voided your warranty: nowhere. Going by the prices in the Mac mags is not the best way to get prices; when you build your PC, do you go by the price of the new GeForce FX in CPU mag or Maximum PC????
I bought an HP XW6000, and 5 computer stores turned me down when I was looking for memory. HP recommends only Kingston, and on top of that I needed two 512meg ECC registered DDR PC2100 sticks only. The computer stores freaked out once I mentioned I had an HP workstation, then ECC. Finally I found one store (my sixth store) that took the time to research it and got me Kingston so my warranty would still be good. That sounds like the Apple stories PC people give.
What prebuilt PC company gives you flexible choices?
The choices are really bare or pricey. A 3DBoxx is no cheap PC, and neither is an AlienWare. I dropped $1500 for an XW6000 workstation that came with no CDRW, no monitor, one HD (40 gig at that) and a Quadro 4 NVS 200 card. The only thing it has going for it is that it's a Xeon with a second socket; I wanted to add another CPU.
The truth is, to have a reliable box that is really good you are going to spend $$$. No matter what. Unless you shop at a crapload of PC stores online, chase every computer show that comes to town, and buy a lot of OEM stuff, you cannot build a box below $1000: the whole box, from mouse to monitor, with the software in it. Most of us who build PCs do borrow from others. Not too many of us dish out another $300 for Windows each time; I still have a 5-year-old Compaq keyboard being used on one of my last built boxes.

Ed M.
08-19-2003, 07:31 AM
ALL OEMs CHARGE EXTREMELY HIGH PRICES FOR RAM.

That's why I buy a system with the minimum, then hunt through http://www.ramseeker.com

--
Ed

eblu
08-19-2003, 09:49 AM
Originally posted by js33
Well, that's true I guess, but hardware is pretty standard these days. It's hard to build a system and have it not work.

Cheers,
JS

On this I disagree. Watching countless PCs being assembled by very experienced people, I've noticed one thing: they are expected to have problems from the start. Anything from parts that don't fit right, to software driver conflicts, or (I love this one) IRQ conflicts (just last week, btw).
PC parts all seem to come from the bazaar in the City of Babel. None of the parts are designed to work together, and by the time you get them, it's already too late to design them to work cohesively. The benefit of buying a pre-assembled system is supposed to be that it is designed, by a REAL hardware engineer, to work without conflicts inside the case. Microsoft's biggest nightmare right now is the device driver situation; they can't possibly support them all properly. So even though they do a heroic amount of work on device support, there is always something that falls through the cracks. And nobody tests for these conflicts: with that many devices on the market, they can't test them all against each other. So you have hundreds of devices to choose from? So what? Only a handful are worth their price, and they are the components that the name-brand manufacturers already use.

Apple has less drivers to write, less stuff to support, complete control of hardware and software design. design done right.

It's easy to build a PC that works: just copy what Dell, Gateway (sometimes), Compaq, and Apple do... you'll find you spend more time and money in the long run, with no guarantees about your ability. But it's also just as easy to put noname brands in a noname box and have a permanent headache because two or more components disagree with each other.

I buy Apple because I want the best experience, and after hearing the horror stories from the people who are stuck with the frankenstein, lowest-price-available, ugly beige boxes, I know that I am having the best experience.

Nakia
08-19-2003, 11:10 AM
If things are going to go the way of 64bit and PCI-X technologies, the pre-built boxes are going to be best.
PCI-X stuff is going to pump out a lot of data for the system to manage. Also, the prices of 64-bit CPUs like AMD's and Intel's will only be affordable for those who can buy them in bulk. So it will be companies like Boxx, AlienWare, HP, Dell, SGI and Apple who folks will first run to to get them. (I will sell all my stuff to get an SGI 64-bit Intel workstation.)
Companies that build both the machine and the OS make a good system. SGI, Sun, Apple, Amiga are good boxes. They might not be the speediest things, but when you pop in any hardware they make for it, it works.
There is a lot of stuff for PCs and Macs out there, but how much of it is worth it?
How many are willing to buy video cards that are not made by either nVidia or ATI?
It's either AMD XP or Intel P4 mostly; not too many Celery or Duran hardcore workstations around. At one time Asus was the board of choice for AMD; now it's nVidia. For Intel I stick with Xeons, which leaves me with the E7505 chipset. Sound cards, well, most motherboards come with good sound now. The type of memory you're stuck with is whatever the motherboard deals with; I will never go with cheap memory, only memory with a lifetime warranty like Kingston. Harddrives are mostly IDE, and they are all cheap. When it comes to good parts, there is no real variety amongst PC users, whatever they claim. There are a lot of cheap no-good parts, but who will use them or even trust them?
It's like 3D apps: there are a lot of off-the-shelf cheap 3D apps ranging from $30 and up. How many of them are useful?

tallscot
08-19-2003, 11:20 AM
Originally posted by js33
My gripe is that Apple isn't very flexible in their options to allow you to put in or leave off things you may or may not want.

I don't know what Dell offers but you can get a barebones system and put whatever you want in it anywhere.

Yep, and it's worth it. I find that I would rather have limited hardware choice, like the G5, and have OS X versus more choice and Windows. Everyone has that choice. Some people put more priority on the hardware than the OS experience, and that's cool too. Some people put more priority on the price. That's fine.

Personally, I don't know a single 3D animator (friends of mine) who uses anything other than a high-end gaming card, like a GeForce or Radeon. The Mac definitely needs pro 3D cards, though. I wouldn't buy one because I'm not at the level of 3D animation that requires that kind of card, nor could I justify the expense, but there are definitely going to be more high-end animators coming to the Mac, with more high-end applications coming on board, the G5, etc., so they will be looking for pro cards. Rumor is Panther has drivers for some high-end cards, but I haven't confirmed that at all. The cards have to be specifically Mac because of firmware, so just a driver won't do it. I'd be surprised if we didn't see something by this time next year.

Again, I'd order a custom G5 with a single 512 meg stick and then put in seven more 3rd-party 512 meg sticks. I'd get the smallest drive from Apple and then stick in the best 3rd-party drive. The SuperDrive is $200, and I don't have a problem with that.

It's worth it, easily.

dfc
08-19-2003, 11:32 AM
You have to install RAM in the G5 in matched pairs. If you order it with 512 megs of RAM (the default), it will come with 2x256 in it: one in bank A, the other in bank B.

Plan accordingly.

It's like the old 8500s that used interleaved ram.

If you fill it up with matched small sticks, then when you want to go to more RAM, you have to yank the smaller ones and replace them with larger ones, in pairs.

You are better off...using larger ram sticks in pairs. That way...you only pay for them once.

tallscot
08-19-2003, 11:44 AM
Ah, yes, I forgot about that. I meant to say I would order 1 gig: two 512 meg sticks.

Thanks for the correction. :)

Johnny
08-19-2003, 11:45 AM
Originally posted by dfc
You have to install ram in the G5 is matched pairs. If you order it with 512 megs of ram (default) it will come with 2x256 in it. One in bank A, the other in bank B.

When I ordered my Dual 2 ghz, the Apple person said that G5s do not require matched pairs of RAM...

wonder what the scoop there is..

J

Nakia
08-19-2003, 11:45 AM
To get back to somewhere near the 64bit G5: Apple users need to take a look at SGI. http://www.sgi.com/ They are the ones that have been running the show on the high end. SGI has been mates with 64bit for a long time, and a lot of the high-end apps are only on those boxes, i.e. Flame, Flint, Inferno; some even got their start on them: Alias PowerAnimator, Softimage 3D. These boxes should be the guideline for us future Apple 64bit-heads.
SGI was such a powerhouse because they build proprietary systems. SGI is able to make sure things run well on their systems. They are not the fastest out there, but apps do run smooth on them. Also, they are UNIX-based.
When we argue that PCs are better than Apple because of the ability to custom-build them, let's step back and look at SGI, whose boxes cost $$$ and you can't do jack to 'em. But they are still kings of the hill (slowly losing to Linux render farms, but who they'll lose to on the workstation front, who knows?). The reason PCs and Macs can do 3D at all was the spin-off of nVidia from SGI. It was the GeForce 256 card that finally made the PC able to do what my Indigo2 MaxIMPACT can. That box was from 1996.
I own a few older SGI boxes and can tell you Photoshop and Illustrator run real smooth on my 185Mhz SGI O2. GL apps, no problem; built-in GL support on the main board is a sweet deal.
I think Apple should have a line of machines that are very proprietary. But make it serious. And unlike SGI, make it somewhat affordable.
Like some of these babies:
http://www.sgi.com/workstations/
Boxes like those can only come from a company who makes both the OS and the hardware; that way the thing will kick @$$. This is 64bit computing, I think, and serious graphics pushing.

Ed M.
08-19-2003, 11:46 AM
Ted (eblu), you're absolutely right, and one of the brightest guys I've met on the NewTek forums.

Not so long ago I mentioned that I had a suspicion about the Opteron's "Achilles heel", and you pretty much hit directly upon what I wanted others to at least think about. I'm not even certain if you've given it any thought. I've mentioned it briefly, but no one picked up on it. This latest excerpt from your post speaks volumes:


Microsoft's biggest nightmare right now is the device driver situation, they cant possibly support them all properly.

This only becomes more of a nightmare with new software AND the new AMD hardware. I haven't witnessed too much thought being given to it anywhere on the web. I was hoping some tech sites like Ars would have picked up on it and delved into the thought a little more.

A good friend and colleague of mine did in fact bring this up at Ars, where it was discussed briefly but faded into the background. Some of the things he mentioned seemed to point to AMD (and Microsoft) having great difficulty in this particular area. We'll just have to wait and see. Remember, Windows is *already* a MISHMASH of components trying to work together in some cohesive state. There is just waaaaay too much variation and combination, and almost ZERO QA.

Another thing that PC fans don't like to mention (or perhaps they aren't aware of) is the fact that Microsoft even states on their site that even though each individual component might in fact be "Windows certified" (and how many *really* are, after all?), that does NOT mean that any combination of those items assembled together will work as advertised, or even at all.

It gets back to the "benchracing" that many PC hobbyists fall prey to. I see it all the time with amateur automobile enthusiasts when they try to "hotrod" their rides. They thumb frantically through the catalogue, ordering and installing all the flashiest, biggest performance-boosting parts and accessories they see. They add up all the advertised (claimed) performance gains and think they've "got the bestest ride out there". After all, they ordered all the "best" components... Same here. All benchracing... It NEVER adds up like that in the real world without some REAL knowledge of how to make it work PROPERLY.

Apple is already ahead of the game...

THE OS AND THE SOFTWARE ARE ALREADY BEING RELEASED TO TAKE ADVANTAGE OF AT LEAST SOME OF THE G5's 64-BIT GOODIES.

Where is AMD? The Opteron? Yeah yeah, I hear everyone singing the praises, but where is the support from the developers? Where are the breathtaking benchmarks? NewTek and BOXX are playing connect the caboose. Where are the Lightwave benches if this is the *preferred* system? We have nothing from BOXX and nothing from NewTek. As a matter of fact, we have nothing from anyone.
What scores and tests we do have seem lame. Oh well, I guess we'll find out soon enough. IMNSHO, NewTek should have teamed with Apple... Not BOXX. How long do you really think BOXX is going to be around anyway? Companies like that come and go.

BTW, I spoke with Chris Cox via e-mail yesterday. The G5 enhancements will be available shortly, if not already. Over 200% speed boosts on some operations. In short, he said these new machines scream and that they saturate the bandwidth pretty well. Also of interest to me (and I suspect Lightwavers) was that, according to him, nothing they've tested in the Wintelon camp even comes close. What's more, he said developers should have ZERO difficulty tweaking their ware to take advantage of the G5's enhancements. He did have a concern for NewTek though. He mentioned that if they used dcbz (data cache block zero) instructions a lot, then their code could be slower on the G5 than it was on the G4. But that it was a well-documented "gotcha" and something that they should have already corrected anyway.
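For the curious, the gotcha he's describing looks something like this; a hedged sketch (PowerPC-only GCC inline assembly, a hypothetical buffer-clearing routine, nothing from LW's actual code):

#include <stddef.h>

/* dcbz zeroes the cache block holding the address without first
   reading it from RAM: a classic G4 trick, where the block is 32
   bytes. On the G5 the real cache line is 128 bytes and the old
   32-byte dcbz form reportedly runs microcoded and slow, so a loop
   tuned like this can end up slower on a G5 than on the G4 it was
   written for. */
void clear_g4_style(char *buf, size_t len)
{
    size_t off;
    for (off = 0; off + 32 <= len; off += 32)
        __asm__ volatile ("dcbz 0,%0" : : "r"(buf + off) : "memory");
}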

In short, what he is saying is that NewTek should be able to take great advantage of what the G5 has to offer and allow Mac-Lightwavers to do things that would make other systems vomit. It's all up to NewTek. However, it seems quite sad when no one from NewTek even bothered to take him up on his offer. Keep in mind that Chris Cox is one of the leading code optimizers on the planet. I invited him to these forums because I felt he could make a big difference. Same as when I brought Dean and John over to the discussions.

For those that remember, when Chris paid a visit to the forums here, he voiced concern about how NewTek was handling Mac optimizations. Keep in mind that he's not the type of guy who goes around pissing in other people's pools; he has no time for that nor the desire to be a p#&@k. He expressed genuine concern because he knows that Mac-Lightwavers are expecting the same type of quality, performance and effort that Lightwavers in the x86 camp enjoy. They aren't getting it. Even more disheartening is that it looks like things aren't going to *begin* to change until LW9, which means Mac users will probably have to wait until version 10 (if it makes it that far). By that time, who knows what the competition will have out? You guys get the picture.

Oh, here is a page some of you might be interested in:

http://developer.apple.com/technotes/tn/tn2087.html

Ted, you hit it squarely on the head. Is NewTek even paying attention to this thread? Probably not.

--
Ed

tallscot
08-19-2003, 11:53 AM
Great info, Ed. Thanks.

Nakia
08-19-2003, 12:07 PM
Thanks for the info Ed!!
Hot stuff.

dfc
08-19-2003, 12:08 PM
Originally posted by Johnny
When I ordered my Dual 2 ghz, the Apple person said that G5s do not require matched pairs of RAM...

wonder what the scoop there is..

J

From Apple's site..G5 under tech specs

Memory
* 128-bit data paths for up to 6.4-GBps memory throughput
* 1.6GHz model:
  * 256MB of PC2700 (333MHz) DDR SDRAM
  * Four DIMM slots supporting up to 4GB of main memory
* 1.8GHz and 2GHz systems:
  * 512MB of PC3200 (400MHz) DDR SDRAM
  * Eight DIMM slots supporting up to 8GB of main memory
  * Support for the following DIMMs (in pairs):
    * 128MB DIMMs (64-bit-wide, 128- or 256-Mbit)
    * 256MB DIMMs (64-bit-wide, 128- or 256-Mbit)
    * 512MB DIMMs (64-bit-wide, 256-Mbit)
    * 1GB DIMMs (64-bit-wide, 256-Mbit)

It would appear that the 1.6 ghz G5 doesn't require installing ram in pairs.

But, the 1.8 and 2ghz do.

Ed M.
08-19-2003, 12:19 PM
OK, to add to what I was talking about when I was attempting to explain to Lightwolf that simply moving to 64bit gives you nothing in the way of *speed*: I talked to a hardware/programmer fellow I know. He's very up to snuff on both the Opteron and 970 architectures. I thought the information was valuable, so I wanted to post it here (I hope he doesn't mind). Here is what he had to say:


There are other architectural enhancements which AMD has added to 'x86-64 beyond 64-bitness, most notably the larger register files for integer and SSE.

The PPC970 'only' widens registers to 64 bits, but does not increase the number of architected registers (there's already a LOT).

There is indeed rarely a win for the move to 64 bits, but 'x86-64 encompasses more programmer-visible improvements than merely 64 bit wide addressing.

Any potential performance improvement one might see with Opteron and Athlon 64 (when its available) will most likely come from the increased number of visible registers, and *not* from their being 64 bits wide.

However, in AMD's case, programs have to be specifically compiled for the "64 bit mode" to gain [any] access to those other architectural enhancements. Programs for the G5 do not, or not to the same extent.

The G5's enhancements are mostly not programmer-visible, but act behind the curtain, so to speak. The benefits of deeper out-of-order execution and wider superscalar issue are tremendous, and they are available to all existing 32 bit executables as well. Those architectural enhancements are not *turned off* in the G5's 32 bit mode (just to say this explicitly). Programs will benefit as they are.

BTW, the penalty for going to 64-bitness is actually slightly greater for 'x86-64 than for PowerPC64 (970), because all 'x86 machine instructions that access any of the new registers are [one byte larger] than they used to be in 'x86-32.

With PPC64, only some bitwise shifts and rotates may cause increased program size (because a single 32 bit opcode cannot describe all desired 64-bit wide boolean masks for PowerPC-style bitfield instructions).

That means 'x86-64 will, on average, store fewer machine instructions in an instruction cache of a given size than 'x86-32 would, and that percentage is estimated to be as high as 20% [perhaps more]. As compilers get smarter, that penalty can probably be reduced (just do what you can to favour the lower half of the register file). Furthermore, thanks to the larger register file, there will be fewer memory accesses specified in the instruction stream; this works to counter the increase in program size. The net 'code bloat' effect will probably be around 10% though.

In any case, the point is hard to settle, because in both cases (PPC970, K8) the capability to do 64 bit processing cannot be separated from other enhancements made to both architectures.

AMD can be credited for improving 'x86 in substantial and sensible ways (much more so than Intel), and all of IBM, Motorola, and Apple can be credited for devising an architecture that had most these goodies right from the very start and yet was extended wholeheartedly as their 'new' ideas became practical (AltiVec! :-). - anonymous authority
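A concrete instance of that one-byte penalty, for anyone who wants to check it with an assembler: add eax, ebx encodes in 2 bytes (01 D8), while the same add using the new registers, add r8d, r9d, needs a REX prefix and takes 3 bytes (45 01 C8).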

The irony is that x86 will be around a loooooong time. How much further can it be taken? We can only wonder.

--
Ed

Nakia
08-19-2003, 12:36 PM
To throw something into the mix: how do these compare to MIPS and UltraSPARC? Those two have been around a long time. The UltraSPARCs are hitting their 4th-generation CPUs, and SGI's MIPS chips are up there too. These companies solved a lot of the SMP issues and freaky ways of multithreading, along with hardware-level clustering.
With Apple pushing into the 64bit world along with AMD, there should be some comparison to the current 64bit systems. I can't find any. I would rather see how the newbie 64bit CPUs improved over, or compare to, the current ones. And the whole system matters, since the mainboard and IO devices are important too.

Any links with that info will be nice.

Ed M.
08-19-2003, 02:09 PM
Well, guys, I hope the info I'm posting helps answer a few questions regarding the comparisons between the AMD chips and the 970s, as well as what 64bit *really* means and what it means for each platform. I figure Scot and Ted will chime in to correct anything I might have fouled up lol. In any case, I hope someone (are you listening, Beam?) is bookmarking or compiling these threads, because I feel they contain a lot of good information.

As usual, the *highly vocal* PC enthusiasts have little to say at this point (I hate to provoke them though). Anyway, the future for the Mac has never looked brighter. The best part is that this is only the beginning, and we're already ahead of the competition. Developers are already offering tweaked software. Apple has a partially tweaked OS and Panther coming shortly.

The only dolt I'm waiting on is that jerk over at Digital Video Editing. He really has it in for Apple. He always seems to find some fault -- something to ***** about. And just so everyone knows, Mr. White has YET to respond to any of Adobe's official e-mails requesting his benchmark methodology regarding past Photoshop tests pitting Apple's machines against those of the competition. He hasn't replied to Chris Cox's personal e-mail requests either. You would think that since he's posting these results on the web for everyone to see, he'd have the balls to discuss things in detail with Adobe. He hasn't. You would also figure that if the lead programmer and optimization guru for Photoshop contacted him and "called" him on the obtained results, he'd be man enough to discuss it. Again, he didn't.

I guess I just wanted to bring this out in the open so everyone knows how BOGUS this character is. Someone should send that guy a link to this very post and let him know that certain people are watching -- very closely. I have a sneaking suspicion that it's going to be increasingly difficult for him to cripple these new G5 machines, but rest assured folks, he'll likely go out of his way to do what he can in that regard and post more bogus performance claims to the net. He's a Wintelon lackey plain and simple and no one should believe his worthless drivel. Just preparing you guys for any crap that might happen to spew from his pie-hole in the future. (gosh I'm an angry fellow :-)

--
Ed

mlinde
08-19-2003, 02:09 PM
Originally posted by dfc
It would appear that the 1.6 GHz G5 doesn't require installing RAM in pairs.

But, the 1.8 and 2 GHz do.

The 1.6 GHz G5 isn't built on the same architecture as the others in a number of ways. However, to access the RAM most effectively in the G5 machines, you need to install the memory in pairs.


Greg Joswiak, VP Hardware Product Marketing, Apple Computer
Performance is king in this machine -- even in memory. The two sets of memory banks must be fed in pairs, from the inside out, because the system sees each 64-bit RAM pair as one superfast, massive 128-bit chunk of memory.

Based on this (and additional details I didn't quote) it seems that the machines that take up to 8GB of RAM have to have the memory installed in pairs.
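
(Quick back-of-the-envelope math, assuming the stock DDR400/PC3200 DIMMs: one 64-bit DIMM at 400 MT/s moves 8 bytes x 400M = about 3.2 GB/s; a matched pair driven in lockstep as one 128-bit bank moves about 6.4 GB/s. A lone unmatched stick would leave half that bandwidth on the table.)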

Ed M.
08-19-2003, 02:12 PM
Looks like you beat me to the start of page 9 mlinde lol. I hope no one misses the last post of page 8 though ;)

--
Ed

dfc
08-19-2003, 02:18 PM
Thanks Ed for the info.

I couldn't find the thread on Ars about the issues with AMD/MS on drivers for 64 bit.

Could you expound on that a little more?

Or..just post a few links?

Thanks

dfc
08-19-2003, 02:45 PM
Yes, Ed,
I'm anxious too to see Mr White's tests.

He'll have his work cut out for him this time :)

After everybody complained about the PS and Logic tests at the initial G5 showing...Apple reran the tests using AE and Cubase.

So he'll have to come up with something different than AE and Adobe, I think, to show the G5's failure.

Maybe he'll end up having to use Lightwave..hahahha.

But, I can save you the trouble...this will be the summary: while the new G5s show marked improvement over the older G4s...they still come up short of the latest Wintel offerings. (He'll have graphs that max out at 1 or 2 seconds...showing the Intel .07 seconds ahead, but it will be the HUGE long bar...that looks like it's double the Mac's score.)

Then he'll surmise that given the lack of software and options, and the cost of the G5 being so much more (500 dollars)...it would be a good choice only for Mac users looking to move up...but x86 users are still better off with more choices and less cost..etc.

In other words..he's gonna be left with "compatibility, choice, price" arguments.

He'll talk about how the new Windows-only Premiere has more export options...and how using FCP on the Mac requires you to buy DVD Pro for 499 to get the same capabilities..etc.

I'm sure he'll find some app that runs like crap on the G5 vs the Intel or AMD.
I just hope it's not Lightwave (even though I have a bad feeling that's gonna be one of those apps). That would be a real bummer here.

tallscot
08-19-2003, 03:04 PM
No, you are spot on about Charlie White. His editorials are bogus.

The one about the G5 was hilarious, but in a bad way- the BOXX dual Opteron with the 32 bit Windows XP Pro is the first 64 bit personal computer, didn't you know? And Veritest's SPEC numbers are false, but AMD's SPEC numbers are true...

And Premiere Professional is just as good as Final Cut Pro 4, LOL.

Ed M.
08-19-2003, 03:38 PM
dfc, a few things...

First, I can't seem to find the driver discussion over at Ars either. I have a sneaking suspicion that it was in MacAch though, but I want to make something clear... Given what's already known about the state of drivers on the Wintelon side and the fact that a lot of apps have to be recompiled and debugged to take any advantage of the Opteron's 64-bitness, it looks like developers are going to have their hands full -- with compatibility issues, API issues, conflicts and the like. We'll have to see and I'm not 100% sure -- just a hunch.

As stated in one of my other posts, Apple did a MUCH better job insulating developers from those nasties. Remember the clues I was dropping? Just go back and reread those posts regarding APIs and the like. ;-)

You can start here if you wish:

http://vbulletin.newtek.com/showthread.php?s=&threadid=6856&perpage=15&highlight=Achilles&pagenumber=17

What gives Apple the edge? Remember this little clue:

Apple has done EXCEPTIONALLY WELL with Panther and OS X in general, in that they were able to hide/camouflage enough hardware detail behind their kernel APIs... and.. well, I'll leave it at that.

Again, it's all speculation though... And if Intel wants in on the 64bit desktop front, AMD will have a very hard fight, because it's likely that developers aren't going to choose both. Go back and consider what I've been posting while thinking about the chaos that's looming in the 64bit Wintelon desktop space. I try to keep my posts down to a bare minimum so it will be easy for people to search for the more interesting tidbits ;-)

Regarding Mr. White...

(Scot, I'm laughing as hard as you thinking about Mr. White :-)

Someone should e-mail Mr. White here: [email protected] and point him in the direction of these posts. I'm sure he'll find any of these threads (the most recent ones anyway) quite interesting. And that is the information he should be made aware of.

For starters, someone should point him toward this thread and the other two recent threads that discuss G5 performance. I e-mailed him, but I'm guessing he's ignoring it. Those threads also question the validity of AMD's and Intel's claims with their scores -- something no one has yet done. And where are BOXX and NewTek with their Opteron scores? And expect the desktop versions to be slower than their big brothers' when they ship. I mean, how does anyone know whether they're cooking the books or not? PC people LOVE to take the claims that show their systems (AMD, Intel etc.) in the brightest light as absolutely true, without question, and that is not only biased, but sad.

Every test that was questioned by the PC skeptics was answered -- just not with the answer they would have liked to find. Every test of the G5 to date is 100% accurate, and the methodology explicitly documented for ANYONE to scrutinize. Can AMD, Intel or any other PC OEMs back up their claims? Have they? Has anyone questioned them? You guys get the picture.

--
Ed

Ed M.
08-19-2003, 03:40 PM
Oh, and let's not forget about the suspicions that Chris Cox has about NewTek's code... Chris posted to the old forums. I'm betting you could do a search and reread what he posted. It would be interesting to see how hosed-up NewTek's code is... If it runs SLOWER on the G5 we know it's hosed and will likely need a lot of work.

--
Ed

dfc
08-19-2003, 03:42 PM
Thanks Ed,
checking the link now.

Antimatter
08-19-2003, 03:50 PM
elbu:


It's easy to build a PC that works: just copy what Dell, Gateway (sometimes), Compaq, and Apple do... you'll find you spend more time and money in the long run, with no guarantees about your ability. But it's also just as easy to put no-name brands in a no-name box, and have a permanent headache because two or more components disagree with each other.

That is true for people who just pick out pieces at random and stick 'em together, but for people who actually take the time to research their PC and find what combination of hardware will work together, I expect little to no problem with the actual assembly and setup. Where I probably will have a little bit of trouble is on the driver side under Linux, because of those damn proprietary drivers. But other than that I don't expect to have that much of a problem with my PC.


Nakia:


Hard drives are mostly IDE; they are all cheap. When it comes to good parts there is no real variety amongst PC users, whatever they claim. There are a lot of cheap, no-good parts, but who will use 'em or even trust them?
It's like 3D apps: there are a lot of off-the-shelf cheap 3D apps ranging from $30 and up. How many of them are useful?

It's not about buying cheap components; the trick is finding a good store that buys the components cheaply in bulk and sells them cheaply. That's the trick. I use all high-quality products, such as a Western Digital hard drive, Intel Pentium 4, Corsair memory, Abit motherboard, Sapphire Radeon 9700 Pro and so on. I get my goodies cheap because I'm willing to put out the effort to search for a good deal, wait a little while for a sale to come up, then grab the parts I want.

Ed M:


It gets back to the "benchracing" that many PC hobbyists fall prey to. I see it all the time with amateur automobile enthusiasts when they try to "hotrod" their rides. They thumb frantically through the catalogue, ordering and installing all the flashy, biggest, basest performance-boosting parts and accessories they see. They add up all the advertised (claimed) performance gains and think they "got the bestest ride out there". After all, they ordered all the "best" components... Same here... All benchracing... It NEVER adds up like that in the real world without some REAL knowledge of how to make it work PROPERLY.

I'll grant you that; lots of the people I know spend thousands of dollars on liquid-nitrogen cooling and stuff to cool down their computers, then overclock them. Now, I am interested in having good performance, but also at a reasonable price. I could have bought that Radeon 9800 Pro with 256 megs of memory, but I instead chose to buy a 9700 Pro because it was only 5% slower than the 9800 Pro and a hell of a lot cheaper. Same with the CPU: I could have bought a 3.2GHz but instead I settled for a 2.8GHz, which was about 1/4 the price of the 3.2GHz.


Ed M: Whoa, who is this Mr. White character? I have never HEARD of him.


Anyway, regarding the Mac: if, in my opinion, it provides better performance for a cheaper price than most PC systems, then I would be more than happy to switch. As a matter of fact I don't really care if I run on a Mac system, an x86 system, or some other system, as long as I can load up my favorite OS, Linux :) They've got Linux builds out for Macs, so that's a good thing I think :)

I've already bought my newest system and I have no cash, but in the future, if the Macs prove to be more powerful for the dollar, I might head that way.

Beamtracer
08-19-2003, 04:24 PM
Charlie White is a "journalist" who hates the Mac. He is famous (infamous) for writing put-downs of the Mac.

The only thing that amazes me is how far Charlie White's comments go. They get repeated in every other magazine. If he kept his arguments in a logical format he would be OK, but instead he cooks his results to suit his agenda.

I'd suggest that Mac users should not click on his website. You'll see his stuff repeated everywhere else, so you don't need to give his site "hits", which will encourage him to write more drivel.

ED... thanks for all the previous info, there's lots of interesting stuff there.

I notice on the Apple page that shows programmers how to optimize for the G5 they say that some of these optimizations can increase the size of the code.

I guess that some of the G5 optimizations could increase the RAM footprint of an application. This is combined with the fact that going 64-bit in itself increases the RAM required.

Not saying this is a bad thing, either. RAM requirements have always increased over time. This is why complex applications will need to migrate to 64-bits immediately, and "home" applications will need to migrate in the next two or three years.

In personal computers, 32-bit processors have a limited life in this world. The new 64-bit contenders are the G5(IBM970), the Opteron and the Itanic. The Pentium/Xeon is out of the race and destined for the scrap heap.

Beamtracer
08-19-2003, 04:58 PM
Quote from the E-Commerce Times:
"For a new chip, this is about as good as it gets."
"The performance improvements are so evident that only a few minutes in the store demo area will convince many buyers that it's time to upgrade.
http://www.ecommercetimes.com/perl/story/31363.html (http://www.ecommercetimes.com/perl/story/31363.html)


Sorry Pentium lovers. You miss out this time!

js33
08-19-2003, 06:51 PM
I just hope all these G5 improvements help LW run better. How's the ScreamerNet situation on the Mac these days? I configured SN on 3 PCs in five minutes and was network rendering without a hitch. You will have to use SN or a 3rd-party util to render efficiently on the dual G5. Are there any 3rd-party utils that let you render on a mix of Macs/PCs? I guess if you have a dual machine you could just run two instances of Lightwave and render that way.

Cheers,
JS

Triple G
08-19-2003, 09:14 PM
Originally posted by js33
Are there any 3rd party utils that let you render on a mix of macs/pcs?

This is probably your best bet for now, for mixed-platform Screamernet rendering:
http://www.catalystproductions.cc/screamernet/

Though I'm not sure how good of an idea this would be to actually do. I know that in the past, there were issues at one place I worked where there was a mixed rendering environment of Pentiums and DEC Alphas...something to do with the way that procedural textures are generated differently by the different processors (I'm not too well-versed in the specifics of this area, all I know is that there was a noticeable difference.) I would assume the same to still hold true for a mixed Windows/Mac rendering environment. I think the ideal situation would be to have two seats of Lightwave and render one entire sequence on one machine, and another sequence on the other machine. Probably setting up SN to run in mode 3 and batch rendering on each machine would be the most efficient way and the least likely to cause problems when you need to bring all the frames together for compositing/editing.
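
If anyone wants to try the mode-3 route, it's just one command line per machine -- something like the following, going from memory of the LWSN docs, so double-check the flags and substitute your own (hypothetical here) paths:

    LWSN -3 -c"C:\Lightwave\Config" -d"D:\Content" Scenes\shot_a.lws 1 450 1

That should render frames 1 through 450 of shot_a.lws in steps of 1, using the given config and content directories. Run a different scene or frame range on each box and you've got a hand-rolled batch split.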

Ade
08-19-2003, 09:42 PM
I never got Screamernet to work on the macs..I gave up trying, it would always crash when initialising a scene.. Poor quality.

js33
08-19-2003, 10:30 PM
Well that's unfortunate. This is by far the biggest difference between PC and Mac Lightwave. SN on the PC is easy to set up and works very well. I never tried it on the Mac since I only have one Mac. I hope they have made improvements in 8.

Cheers,
JS

Triple G
08-19-2003, 11:24 PM
Even though you only have one Mac, you can still use Screamernet to batch render your scenes. There's even a freeware app called LWSN Controller which acts as a front-end for Screamernet and makes setup a snap. Though, I have to admit...even still, sometimes Screamernet works for me, sometimes it doesn't. I can't seem to find a reason for its inconsistency. Sometimes I can get it to work by manually editing the scene files in a text editor (getting rid of Spreadsheet Editor references), other times I think maybe there are just gremlins in my machine. :confused:

dfc
08-19-2003, 11:39 PM
Here's another one we can add to the list.

This one is actually old from xlr8yourmac site from June.

"Hey everyone, I've been at Apple's developer conference and had a chance to install and try out After Effects on a new G5.
I ran the Night Flight file that has come to be the standard for AE benchmarking. Since I didn't want to sit there and watch it render for hours, I ran just the first 10 interlaced frames from the project's pre-set render queue...
http://www.aefreemart.com/tutorials/3DinAE/nightflight/nightflight.html
Here are my results for this test on the three computers I have available to me:

1 x 1.0 GHz G4 PowerBook 17" - ~30 minutes (3 min/frame)
2 x 2.66 GHz Pentium Xeon from Boxx - 11 min, 39 sec (1.2 min/frame) (actually 1.165 min/frame)
2 x 2.0 GHz PowerMac G5 - 6 min, 1 sec (0.6 min/frame)

js33
08-20-2003, 12:03 AM
I'm waiting for Lightwave benchmarks. As it's the most time-intensive application, as any 3D app will be, I don't care about Photoshop or even After Effects tests.

Cheers,
JS

toby
08-20-2003, 12:33 AM
( Slow benchmarks, Photoshop tests )

Originally posted by js33
Doesn't seem to impressive yet. The Dual 2Ghz G5 is barely twice, if even that, as fast as a single 933Mhz G4

( Fast benchmarks, After Effects )

Originally posted by js33
I'm waiting for Lightwave benchmarks. As it's the most time-intensive application, as any 3D app will be, I don't care about Photoshop or even After Effects tests.


This is exactly what I mean. I don't care if you don't like Macs, but I hate hearing it over and over again, and hearing slanted opinions and exaggerations.

js33
08-20-2003, 12:42 AM
Toby,

I never said I didn't like Macs. I own an iMac w/superdrive.:D

I have always been waiting for LW benchmarks. I'm just never waiting much for Photoshop or After Effects to render, no matter which platform. But LW rendering, on the other hand, can greatly affect your budgets and deadlines for projects, so it is of a bit more concern to me.

Cheers,
JS

toby
08-20-2003, 12:59 AM
Yes, I know you own a Mac; you tell us how much better your PC is every chance you get.

It's obvious you prefer PCs, and that's fine, but be fair - my example above shows that you don't want to believe that Macs are just as good unless they prove to be twice as fast as the fastest PC in every test, for less money -

Basically you, and others, are letting your preference influence your judgement, something I wish everyone would avoid.

Antimatter
08-20-2003, 01:09 AM
Now I myself am interested in seeing a Lightwave benchmark. Wonder when one will come out?

The reason is that I almost never do any sort of Photoshop/GIMP or video-editing type of stuff; what I spend most of my time in is Modeler or Layout, working on stuff. So what I'm interested in is the performance of Lightwave on the G5.

Is there any out yet? From what I've seen so far I'm guessing there isn't a bench for Lightwave yet. :(

js33
08-20-2003, 03:04 AM
Originally posted by toby
Yes, I know you own a Mac; you tell us how much better your PC is every chance you get.

It's obvious you prefer PCs, and that's fine, but be fair - my example above shows that you don't want to believe that Macs are just as good unless they prove to be twice as fast as the fastest PC in every test, for less money -

Basically you, and others, are letting your preference influence your judgement, something I wish everyone would avoid.

OK, let's say the Mac is twice as fast as even the most expensive PC you could buy. There would still be "issues" with the Mac version of Lightwave that no amount of speed will fix.
1. Network rendering problems
2. Lack of plugins - situation getting a little better
3. Crappy video cards with poor OpenGL support, i.e. ATI

Those are the main problems. I assume you can swap out the ATI card for a GeForce, which would help with problem 3.

Only time will tell if these issues get resolved.

Cheers,
JS

kite
08-20-2003, 03:52 AM
Originally posted by Antimatter
Now I myself am interested in seeing a Lightwave benchmark. Wonder when one will come out?

The reason is that I almost never do any sort of Photoshop/GIMP or video-editing type of stuff; what I spend most of my time in is Modeler or Layout, working on stuff. So what I'm interested in is the performance of Lightwave on the G5.

Is there any out yet? From what I've seen so far I'm guessing there isn't a bench for Lightwave yet. :(

On 07-01 I did one LW 7.5 benchfile rendering:
http://vbulletin.newtek.com/showthread.php?s=&threadid=7542

Ed M.
08-20-2003, 07:20 AM
In personal computers, 32-bit processors have a limited life in this world.

Not necessarily, Beam... Unlike what coders have to go through to get at the Opteron goodies, no such effort needs to be made with 32bit apps running on Panther (10.3) or 10.2.7. All the goodies are available to the 32bit apps too, hence the term "hybrid" when referring to apps as well as the OS (X).

The Opteron has no such nicety. To get at the 64bit mode they'll have to do a lot of extra work it would seem. No advantage there IMNSHO.

Keep in mind that while running a 32bit app on Opteron on a non-hybrid OS like Windows, it will be just like running the code on any other plain-Jane 32bit hardware. I'd also venture to say that even if you were running that same 32bit code on an Opteron box with a 64bit OS, you'd still be limited in that it would still *not* allow any hooks into the 64bit features (but maybe some of the other enhancements that come along with it), thus still making it the same'ol 32 bit system.

As my friend pointed out, to take advantage of the 64bit goodies of Opteron, both the OS and the apps will have to be recoded and recompiled. How hard that is to do (for developers) is unclear. The same goes for *any* type of driver (and we know what a nightmare this is already for M$ and developers). The G5 has no such limitation.

Apple and IBM (and Motorola) have done a FAR BETTER JOB with the PowerPC ISA. And as stated earlier, the design allows developers much better access to the 64bit features that the G5 has to offer. All of that is available to 32bit apps that have been tweaked a little bit, and they even enjoy *some* of the advantages just as they are (crappy, non-tweaked G4 code and all!). Then there is the question of demand for apps for these Opteron rigs...

Have we seen anything specifically written or coded to take advantage of the Opteron? No.

Why is that?
Could it be that developers see little advantage?

Are the initial benchmarks *that* disheartening compared to the current 32bit Intel offerings?

How much further will AMD be able to push the envelope?

They are already living from chip-intro-to-chip-intro. Considering the benchmarks that we have seen of the Opterons tested against Intel's current and best 32bit gear, I'll venture to say that Intel will likely keep up with respect to speed, thus diminishing the demand for the Opteron desktops even further. Developers will notice as well.

Developers will likely see little advantage and likely remain content where they are churning out 32bit code for x86. On the other hand, we have heard all the same arguments being raised in regard to the G5. Well, lookie what's happening already.

Developers are JUMPING on the advantages the G5 brings to the table. They are taking advantage of the hybrid features of the OS and the hardware. The PC people will yell and point and sing "Hallelujah!!" when they say the Opterons shipped in June.

Where are the Apps?
Where is the OS?
Where are the Benchmarks?

I'm still waiting for NewTek to release theirs. I'm still waiting for BOXX to release theirs. We've seen NOTHING, people! Personally, I think NewTek looks silly because they seem to have hitched their wagon to a dead horse (BOXX).

Let's face it, who is BOXX but yet another leech-OEM-wanna-be-Dell? I'll lay odds that Apple sells more G5s in a single quarter than BOXX sells all year. What I'm saying is that without developer interest or support, the Opteron as a desktop solution (something it wasn't meant for) will evaporate. The Athlon64 is supposed to be designed for the desktop, but think about it: the Athlon64 is supposed to be the Opteron's little brother. What will its performance be like compared to the Opteron? We'll just have to wait and see.

I still have a hunch that Intel came up with a similar plan to migrate x86 to 64bit. Who knows x86 better than Intel? They ran the numbers, probably looked at the same approach that AMD took, and figured it would still lead to a dead end. Why do you think they aren't bothering to pursue it? I think AMD is in some serious trouble if Opteron and Athlon64 don't take off. Just try and put it all into perspective.

--
Ed

Ed M.
08-20-2003, 07:22 AM
I just hope all these G5 improvements help LW run better. How's the ScreamerNet situation on the Mac these days? I configured SN on 3 PCs in five minutes and was network rendering without a hitch.

I don't know js... we'll have to wait and see just how hosed-up NewTek's code is (see my earlier post).

SneakerNet on the Mac? It bites. So what?

That's not Apple's fault.
That's not the users fault.
That's not the OS's fault.
That's not the hardware's fault.

That's just lamo-NewTek being lazy as hell, not giving customers the quality they deserve for their $$.

And I haven't seen anyone from NewTek say anything about it lately. Personally I think the company is hurting.

Why don't you do us all a favor and go read all those past SneakerNet discussions when I brought John C. Welch and Dr. Dean Dauger over to give NewTek serious options with regard to networking and distributed (clustered) computing.

And SneakerNet is *NOT* network rendering, so get over it.

Ted (eblu) has been bitching about SneakerNet longer than anyone if I recall. As a matter of fact, it was his initial complaints that prompted me to contact Dean and John. The same goes for when I brought Chris Cox over to help them out with their Mac optimizations. Mac users should be EXTREMELY skeptical of anything NewTek says regarding the Mac. They've had plenty of chances to change things for Mac users and on every occasion they turned down the offers. For those that remember the discussions, you know what I'm talking about.

Ade hit it right on the head... POOR QUALITY for the same $$$!

js33 has no other argument, but like all loyal Windows users, he'll tout the weakest aspect of the Mac as compared to the PC and sing loud and clear. Hallelujah, brother!


I'm waiting for Lightwave benchmarks. As it is the most time intensive, as any 3D app will be, application I don't care about photoshop or even after effects tests.

Bull$&*t. Once RenderMan is out for the Mac, *it* will be the finest renderer on the market.

Have you even considered how hosed-up NewTek's LW code could be? It could be highly PC-centric no matter what NT is telling its customers. Chris Cox was here; perhaps you should go reread what he posted. You can even e-mail him yourself.

js... you can't say "let's compare this highly optimized code for the x86 platform to the God-awful code written for the PPC." It's just silly.

--
Ed

js33
08-20-2003, 07:58 AM
Ed,


js33 has no other argument, but like all loyal Windows users, he'll tout the weakest aspect of the Mac as compared to the PC and sing loud and clear. Hallelujah, brother!

There are a lot of arguments that could be raised, but I know most people in here are as tired of hearing them as I and others are of hearing you spew the same crap over and over and over again.

I will reserve my final judgement on the G5 until we get some good Lightwave benchmarks.



js... you can't say "let's compare this highly optimized code for the x86 platform to the God-awful code written for the PPC." It's just silly.

As I understand it, the code for LW is almost 100% platform-agnostic, with the only exception being the file requesters and color pickers, which use the native platform. Remember, it used to run on SGI, Sun, DEC Alpha, and Amiga (which is where it all started). Not to mention the fact that all of those but the Amiga were 64-bit platforms. Also, SGI and Sun of course were/are UNIX systems. I don't have any direct experience with the SGI or Sun versions, but the Amiga and DEC Alpha never had any problems using ScreamerNet. So why does only the Mac seem to have a problem with it? I think it's more Apple's fault than you want to admit. Also, OSX is relatively new as an OS (even though it is based on BSD). Look how long it took Quark to get to OSX. I guess it was all Quark's fault? Maybe LW 8 will see some improvements on the Mac. We'll just have to wait and see.

Cheers,
JS

tallscot
08-20-2003, 11:03 AM
It's not Apple's fault because Maya 5 doesn't have the same issues, nor does Cinema 4DXL, nor does After Effects, nor does Shake, nor does Final Cut Pro, nor does Combustion 2.1...rendering over the network is done flawlessly on the Mac with many applications.

I use After Effects a lot. The speed of AE is important to me, and it's great to see that benchmark. Thanks.

Curious, why is the Radeon 9800 crappy but a GeForce is not? The GeForce is an older generation card than the Radeon 9800. js, what kind of video card are you using?

For me, my platform preference is more important to me than my 3D app preference. Looking at Luxology's application screaming on a G5 right next to a dual Xeon that was glaringly slower, it's obvious the G5 is a very fast system. PC advocates have no answer for that. None. Luxology's president even posted a letter on their site about how the code is the same on all platforms and the G5 is much faster.

If Lightwave 8 turns out to be a downer on the Mac, change applications. Seriously, I'm not going to keep giving hundreds of dollars to a company that is putting out a product that is inferior in quality to the competitor's offerings.

When you look at the response the 3D industry has had with the G5 coupled with OS X, it's obvious the Mac is on an upswing.

http://www.kaydara.com/press/index.php?filename=current/2003/20030723

“Based on feedback from customers and our current industry projections, we do not believe the 3D market warrants the development and support of two different Unix-based platforms,” said Michel Besner, president of Kaydara. “Mac OS X is a much more powerful and reliable solution for graphic artists and 3D professionals. With the release of the new Power Mac G5, we believe Mac OS X is the platform of the future, which is why we have decided to concentrate our Unix development efforts solely for this market.”


http://www.soundtracklounge.com/article.php?story=20030812073633362

In Final Cut Pro 4 the dual G5 recognized that the machine was a "PowerMac G5" and the most amazing thing to me was the fact that of all the video filters and transitions, only a couple filters were NOT real-time (no rendering needed) filters. I think only 1 or 2 transitions were not real-time. Scrubbing through the video files was extremely responsive and fast, even on non-rendered, layered video tracks. Again, in trying to slow down the machine, doing rapid scrubbing continuously on these layered tracks only used about 65% of the CPUs

http://www.architosh.com/news/2003-07/2003c1-0730-richardkerris.phtml

Richard Kerris said Pixar announced at the Siggraph show that PR RenderMan on Mac OS X will be released as a beta in September and will have a final ship date later in the year.

I have an Apple Store about 3 miles away, so when they get the first ones, I'll go over there with my LW 7.5 disc and dongle and do some benchmarking. We'll have to wait on the dual 2 Ghz for a few weeks. But make no mistake, I'll be doing some benchmarking soon and supplying the numbers to Chris' Lightwave Benchmarks, and here. :)

panini
08-20-2003, 11:52 AM
Man............., I come here to the Mac forum after a couple of weeks and this nonsense is still going on.

Ed M., you need to find a good psychiatrist, 'cause I haven't seen anybody in so much denial, or coming up with this imaginary "Apple/IBM collaboration" vs. "AMD is finished" nonsense.

IBM doesn't care about Apple and there is no collaboration going on. G5 is an old IBM chip reworked with AMD technology (HyperTransport or whatever and a few other things) and dumped on Apple, who were desperate for anything after the G4 fiasco.

IBM just announced that their new server line will feature AMD Opterons (read that line a few times until it sinks in). If the G5 were all that, why would they choose AMD and Opterons? Think about it.

In all real-world applications, even those mentioned by Apple, Pentiums or Opterons performed better. For example, a 2.6GHz Pentium, which is a year-old chip, scored higher than a brand-new G5 at 2GHz.
This is according to that NASA guy.

In all SPEC benchmarks available on the net Opterons score between 30 and 50% better than the fastest G5.

The only place where the G5 performed better were the rigged tests at the Apple show, where Apple had who knows what running under the hood and the PCs were severely crippled (as can easily be verified; I can't get such low scores from my PC even if I hammer it to death).

Pixar and Luxology guys are just saying what they were told to say. In any case, Luxology's apps looked very slow on both G5 and PC, while Lightwave demos on BOXX (at Siggraph) flew without any slowdown, no matter what they were showing.

And if I'm correct (haven't checked that one lately), Maya does have severe problems on a Mac, as in: it's not even available (only the "cheap" version is available on the Mac).

tallscot
08-20-2003, 12:17 PM
IBM doesn't care about Apple and there is no collaboration going on.

Wrong.

G5 is an old IBM chip reworked with AMD technology (HyperTransport or whatever and a few other things) and dumped on Apple, who were desperate for anything after the G4 fiasco.

This is really hilarious. Its core is based on the current Power4. The 980 is in prototype right now.

Hypertransport was invented by AMD, and perfected by the Hypertransport consortium. So what?

IBM just announced that their new server line will feature AMD Opterons (read that line a few times until it sinks in). If the G5 were all that, why would they choose AMD and Opterons? Think about it.

Why don't you think about how the PPC isn't an X86 processor for a second, mmmm kay?

In all real-world applications, even those mentioned by Apple, Pentiums or Opterons performed better.

Show me one.

Here's one that shows the dual Opteron getting its butt kicked by a dual Xeon, which got its butt kicked by a dual G5:
http://www6.tomshardware.com/cpu/20030422/opteron-23.html#3drendering

In all SPEC benchmarks available on the net Opterons score between 30 and 50% better than the fastest G5.

SPEC numbers from AMD are better than SPEC numbers from Veritest? I'll take Veritest's numbers, thank you.

The only place where the G5 performed better were the rigged tests at the Apple show, where Apple had who knows what running under the hood and the PCs were severely crippled (as can easily be verified; I can't get such low scores from my PC even if I hammer it to death).

The only thing that has been verified is that the Veritest benchmarks were completely fair:
http://www.applelust.com/oped/amc/archives/amc030718.shtml

And if I'm correct (haven't checked that one lately), Maya does have severe problems on a Mac, as in: it's not even available (only the "cheap" version is available on the Mac).

Maya 5 Complete is a great application, with features I wish Lightwave had - like a free vector renderer and a hardware renderer. What's funny is that the features in Unlimited that Complete doesn't have are also not available in Lightwave. Correct me if I'm wrong.

But the point was that network rendering works fine on the Mac. The idea that the Mac has some inability to do a basic thing like render over a network is a huge laugh.

toby
08-20-2003, 01:35 PM
Originally posted by panini
Man............., I come here to the Mac forum after a couple of weeks and this nonsense is still going on.

Talk about nonsense!

You make up the most ridiculous statements, half of which have already been stomped into the ground, accuse companies of lying just because their tests show the G5 faster, and don't back up a single thing.

You can stop spewing your LIES because even PC people here don't believe you - in fact I don't think anyone's stupid enough to believe you!

Nakia
08-20-2003, 01:54 PM
Originally posted by panini

IBM just announced that their new server line will feature AMD Opterons (read that line a few times until it sinks in). If the G5 were all that, why would they choose AMD and Opterons? Think about it.



It will be their eServer line, basically replacing the Intel boxes. Who really uses the eServer anyway? That's why they are so cheap. Now, IBM's big boys are the mainframes; not sure if IBM is willing to throw AMD into those. Those boys run IBM's OS and Linux. I work in comm rooms loaded with servers. I alone see over 3,000 Sun servers and countless Dell Linux and SGI servers. I see only a handful of eServers, though I do see S390s. IBM talking up AMD is no biggie.

Intel, AMD and Apple have got to prove themselves when compared to Sun and SGI. When folks can afford real servers again, then we will see what 64bit SMP computing is about. So far neither Intel, AMD, nor Apple has made anything that compares to Sun, SGI or IBM mainframes.
And the comparison to either Sun or SGI is always avoided. Why not? These companies have been around a long time.

Ed M.
08-20-2003, 03:06 PM
Well, guys, anyone else other than Panini and js33 have a problem with what I've said or the information I posted? Let me know if you think it's made up, far-fetched or inaccurate. Please. Personally I think js (and definitely panini) are full of... Well, you can add your own expletive.

Again, can someone correct anything I've said that might have been incorrect regarding the Opteron? And no, not you panini, you haven't a clue what you are talking about.

Regarding the Lightwave tests: Chris Cox called it. He suspected Lightwave's codebase was hosed from the beginning, before the G5 was even talked about. Would anyone doubt Chris Cox if I invited him to this forum to debunk whatever claims are made in favor of Opteron or Xeon? Js? Panini? Remember the gotchas he mentioned? Well, it looks like the NewTek guys have a lot of work to do.

However, Mac-users should be extremely wary at this point. This will be the app that is going to be used to beat Mac-users over the head time and again, because it seems to be a rat's nest of spaghetti code. Just look at all the other apps that run exceptionally well on the G5. It's likely that this will be the one test most covered in the news headlines reporting that the "G5 Is Slower". What it should show everyone is that the app totally blows chunks, and customers should start seriously bitching.

Until NewTek fixes Lightwave's crap, this is going to be the app that will be used to shine a dull light on the G5. I don't expect it to ever be corrected. Oh, and don't expect Maya or PR RenderMan to have any such performance issues. NewTek must have been asleep the last 4 years.

http://developer.apple.com/technotes/tn/tn2087.html

Note to Mac-users: It doesn't make sense for an app to run slower on the G5 unless the app developer was completely incompetent. So be on your guard.

--
Ed

Lightwolf
08-20-2003, 03:29 PM
Originally posted by Ed M.
Again, can someone correct anything I've said that might have been incorrect regarding the Opteron?
Hi Ed...
If you insist:

Have we seen anything specifically written or coded to take advantage of the Opteron? No.
I would look at DB2, IBM's database, for example, which runs on x86-64 in 64 bit mode. Not to mention gcc and Linux, of course. There are a couple of other big-iron apps (mainly in the database department) that have been or are being ported, as well as a commitment by Autodesk/Discreet.

Are the initial benchmarks *that* disheartening compared to the current 32bit Intel offerings?
Not really; where the Opteron really shines though is SMP, especially once you get beyond 2 processors, mainly due to the efficient memory architecture. That's why there have been so many announcements lately of Opterons being used in massively parallel computing. It currently seems to be the only CPU that bridges the gap between "home" use and massive computing (i.e. more than 1000 processors in a computer).

Where are the Apps?
Where is the OS?
Where are the Benchmarks?
I think we've had that one a thousand times.

I still have a hunch that Intel came up with a similar plan to migrate x86 to 64bit.
Actually, Intel does have the rights to use the same extensions, and it is rumoured that Prescott has them hidden already. If x86-64 takes off, Intel will be able to jump ship in no time.

To get at the 64bit mode they'll have to do a lot of extra work it would seem
If the code is clean, a recompile is all that is needed for an app.
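
Just to illustrate what "clean" means -- a made-up snippet, not from any real codebase: the code must never assume a pointer fits in an int. Something like this recompiles for 64 bit with nothing but a flag change:

    #include <stdio.h>
    #include <stdint.h>   /* C99: intptr_t, an integer wide enough for a pointer */

    int main(void)
    {
        int x = 42;
        /* NOT 64-bit clean: int a = (int)&x; truncates on LP64 targets */
        intptr_t a = (intptr_t)&x;   /* clean: always pointer-sized */

        printf("int=%u long=%u ptr=%u addr=%lx\n",
               (unsigned)sizeof(int), (unsigned)sizeof(long),
               (unsigned)sizeof(void *), (unsigned long)a);
        return 0;
    }

With gcc -m32 that prints int=4 long=4 ptr=4; with gcc -m64 (on an LP64 system) it prints int=4 long=8 ptr=8. Apps that baked the first set of numbers into their file formats or pointer tricks are the ones that need real porting work.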

That's just from the past 4 pages, I didn't bother going through the other threads...

Cheers,
Mike

Rey
08-20-2003, 03:41 PM
Hey guys, remember: "do not feed the trolls." Ed M., Beam and tallscot, you guys are doing a great job refuting all the FUD that has hit this forum recently. I agree that I am more of a Mac user than a LightWave user. While I do most of my 3D work in LightWave, if a better alternative at a competitive price comes along then I'll switch. Right now, at this instant though, I think LightWave gives you more bang for the buck (although our Mac dollars don't seem to buy as much as PC dollars in NewTek Land). That's why I'll hold onto my Mac dollars until I get a suitable opportunity to see what LW8 has to offer. If the Mac version isn't up to par with the PC version then I'll spend my Mac dollars elsewhere.

Beamtracer
08-20-2003, 03:42 PM
Originally posted by Ed M.
Keep in mind that while running a 32bit app on Opteron on a non-hybrid OS like Windows, it will be just like running the code on any other plain-Jane 32bit hardware.
Wouldn't that be an incentive for Windows developers to optimize for AMD64?

The G5 gives stellar performance on both 32-bit and 64-bit code. The concern is that software developers might be lazy and not take advantage of 64-bit features if their apps work OK without.


Originally posted by Ed M.
Have we seen anything specifically written or coded to take advantage of the Opteron? No.

Why is that?
Nobody in the Windows world does anything until Microsoft gives them permission to do so. It's a mafia-like situation.

That reminds me... when Apple announced its 64-bit platform at the Worldwide Developers' Conference it also released compilers to assist software developers migrating their apps to OSX64.

Who will be writing AMD's compiler software? AMD? They're not a software company. Don't say they'll be leaving this up to Microsoft?


Originally posted by Ed M.
I still have a hunch that Intel came up with a similar plan to migrate x86 to 64bit. Who knows x86 better than Intel?
There's been a lot of speculation that Intel has a secret skunkworks project to develop an Opteron-like processor, in case its failing Itanic processor bites the dust.

Ed M.
08-20-2003, 04:04 PM
Wouldn't that be an incentive for Windows developers to optimize for AMD64?

Nope.. see the reasoning I provide in other posts a few pages back.

Lightwolf: You've completely missed the point. Go back and read what I've posted again.

Note to others reading this forum: Can anyone help this poor fellow out and possibly explain it in simpler terms?

In the end, we have one app that runs even crappier on the G5 than it does on the G4. Anyone see the *real* problem here?

I think I'm going to point this thread in the direction of a few people over at Apple. I think they'd be interested to know what's going on with Lightwave. It should come as no surprise though... Mac-users were always treated like the redheaded stepchildren, and we knew there were problems with Lightwave's code long ago, and we knew there were problems with SneakerNet long ago.

Just remember, loyal Mac-Lightwavers: offers to help resolve both of these issues were made numerous times by leading experts. NewTek declined every single time.

--
Ed

Beamtracer
08-20-2003, 04:10 PM
I expect that a single-processor G5 would be better value for money than a dual-processor one when it comes to Lightwave. This is because there are large amounts of code in Lightwave that are not multiprocessor-aware.

Photoshop will probably do much better because it is fully multithreaded, and the G5 optimization plug-in (by Adobe's Chris Cox) has just been released.

Lightwolf
08-20-2003, 04:17 PM
Hi Ed.
As to the ScreamerNet rendering: If you go back to the old forum (I think you were part of the discussion), you'd notice that NT chose a way for the nodes to communicate that is so simple that it should work on any OS (it actually did, at least on any supported platform from the Amiga onward: via MIPS, NT on Alphas, SGIs and even OS9 Macs, as well as older OSX versions).
Now you're blaming NT for not optimizing their network rendering even though an OS update broke it? You can blame them for not reacting to it, but it is hardly their fault...
As for optimizing their code, I honestly doubt that there is very much to optimize in a CPU-specific way. Since at least 99% of LW is written in C, it is more a matter of choosing the right compiler and options. On Intel they used icc to great effect (at least for the P4; on the Athlon LW actually got a tad slower), and I assume they'll use Apple's compiler and the right switches...
I don't really know why you, or Chris Cox, assume that there is something wrong with the code; have any of you looked at it? The current LW performance seems to be pretty much on par with, for example, Adobe AE performance when compared to a PC (and who does Chris Cox work for?).

Cheers,
Mike :rolleyes:

Ed M.
08-20-2003, 04:32 PM
Lightwolf... regarding SneakerNet... There were several really excellent threads covering the situation. PLEASE, PLEASE, go back and reread all the posts authored by John C. Welch, Ted Devlin and Dr. Dean Dauger. Then come back. It isn't a network renderer.

As for the code... Why hasn't it gotten any better? Why must it take so long? As for who's seen it: well, no, I have not seen it... However, Chris Cox has made the offer to help on several occasions. NewTek has yet to take him up on the offer. Chris is one of the best CPU optimizers on the planet (x86 or PPC), if not *the* best. Period. He's been at this stuff for an extremely long time. Give the guy some credit. I'm sure he knows what he's talking about.

--
Ed

Lightwolf
08-20-2003, 04:38 PM
Originally posted by Ed M.
Lightwolf... regarding SneakerNet... There were several really excellent threads covering the situation. PLEASE, PLEASE, go back and re read all the posts authored by John C. Welch, Ted Devlin and Dr. Dean Dauger. Then come back. It isn't a Network renderer.

I did, I even added to it, it renders across a network here, so it is a network renderer afaik. You probably don't call accessing files on another machine networking, I do.


As for the code... Why hasn't it gotten any better? Why must it take so long? As for who seen it, well, no, I have not seen it.. However, Chris Cox has made the offer to help on several occasions. NewTek has yet to take him up on the offer. Chris is one of the best CPU optimizers on the planet (x86 or PPC), if not *the* best. Period. He's been at this stuff for an extremely long time. Give the guy some credit. I'm sure he knows what he's talking about.
I don't doubt it; then again, I have to defend NT here. Some of the guys there know the P4 better than most people at Intel do (-> VT crew), and I'm sure someone like Andrew Cross knows how to write optimized code... Ever seen something like a VT perform on a Mac? Not even FCP 4 comes close.
But tell me, what exactly is wrong with the code, except for the usual bugs (which the intel version has as well) ?

Ed M.
08-20-2003, 04:45 PM
Well, I'm not up on video-cards for the Mac. That's Beam's specialty. I'm sure the Mac has plenty of options though -- instead of the VideoToaster, that is.

--
Ed

Beamtracer
08-20-2003, 04:51 PM
Originally posted by Lightwolf
But tell me, what exactly is wrong with the code, except for the usual bugs (which the intel version has as well) ?
Lightwave still sports an archaic command line based interface

Most of it is not multithreaded, and is therefore slow.


Originally posted by Lightwolf
I have to defend NT here, some of the guys there know the P4 better than most people at intel do (-> VT crew)
I'm sure they love their Pentium processors. I wish them well. However this is no help to Newtek's Mac customers

Lightwolf
08-20-2003, 04:52 PM
Originally posted by Ed M.
Well, I'm not up on video-cards for the Mac.
Fine enough. What makes the VT special is that it delivers amazing performance and still does _everything_ (except for I/O) in software, including 3D DVEs, blurs etc., all the heavy stuff. You can only achieve this performance through heavy optimization, and there is no other vendor even close to the VT as far as software performance is concerned...


That's Beam's specialty. I'm sure the Mac has plenty of options though. Instead of the VideoToaster however.

You're actually not sure, you're guessing, as your following comment on the VT shows. I'm not going to list all the PC video options here though ...
And I'm not saying the Mac doesn't have any either, it actually has some pretty nice ones, but that goes for both platforms (jic) :rolleyes:
Cheers,
Mike

Lightwolf
08-20-2003, 04:56 PM
Originally posted by Beamtracer
Lightwave still sports an archaic command line based interface

Which is good. So does Shake for example, Renderman, oooh, I could name a few others. Even OSX has an "archaic" command line interface (and you even have to use it sometimes, since you can't get at all the options via the gui).


Most of it is not multithreaded, and is therefore slow.

That is absolutely true, but that goes for both platforms. I was asking more for Mac specific problems with LWs code...


I'm sure they love their Pentium processors. I wish them well. However this is no help to Newtek's Mac customers
...but it shows that they know how to optimize for a processor... Once you know _how_ it works, it is quite easy to adapt to other platforms...
Cheers,
Mike

Beamtracer
08-20-2003, 04:57 PM
Newtek's baby, the Video Toaster (for Windows) has a slightly different market to Apple's Final Cut Pro.

FCP is used by national TV networks for broadcast, HDTV, and it is also used for feature films (growing quickly in this area). FCP's main competitor is Avid.

I'm sure people will argue about this, but if you believe Newtek's recent marketing you'd get the impression that Toaster (and its derivative skins) is being aimed at local TV stations, and church videos.

Ed M.
08-20-2003, 04:58 PM
So, the VideoToaster is the best thing since sliced bread, is that it? And no other combination can compare, is that also correct? OK, fine. I'll buy that; until someone states otherwise. Wonder why they don't bring it over to the Mac though. Again, that should also be a concern for LOYAL Mac-customers.

--
Ed

Lightwolf
08-20-2003, 05:01 PM
Beamtracer, I wasn't talking market :rolleyes: I was talking performance. VTEdit isn't my editing app of choice either; that is not the point. But the amount of performance the software squeezes out of those PCs is just amazing. Heck, Media 100 needs three PCI boards to achieve the same in realtime, and it allows for fewer layers as well.
Hm, maybe they'll squeeze LW a bit too :)

Beamtracer
08-20-2003, 05:02 PM
It's interesting that the Aura video paint program was also Windows-only. As soon as it left NewTek (it is now known as 'Mirage' and distributed by a company called Bauhaus) it gained Mac OS X compatibility.

Lightwolf
08-20-2003, 05:02 PM
Originally posted by Ed M.
So, the VideoToaster is the best thing since sliced bread, is that it?
I never said that, all I was commenting on was the performance of NTs code.

Beamtracer
08-20-2003, 05:10 PM
Regarding Apple's Final Cut Pro running on a G5, may I repeat a review that tallscot kindly posted earlier in this thread...

In Final Cut Pro 4 the dual G5 recognized that the machine was a "PowerMac G5" and the most amazing thing to me was the fact that of all the video filters and transitions, only a couple filters were NOT real-time (no rendering needed) filters. I think only 1 or 2 transitions were not real-time. Scrubbing through the video files was extremely responsive and fast, even on non-rendered, layered video tracks. Again, in trying to slow down the machine, doing rapid scrubbing continuously on these layered tracks only used about 65% of the CPUs
http://www.soundtracklounge.com/article.php?story=20030812073633362 (http://www.soundtracklounge.com/article.php?story=20030812073633362)

Beamtracer
08-20-2003, 05:14 PM
Originally posted by Lightwolf
I have to defend NT here. Some of the guys there know the P4 better than most people at Intel do (-> VT crew), and I'm sure someone like Andrew Cross knows how to write optimized code
I was thinking about that comment again. All those experts at Newtek who know the Pentium inside out.

I wonder how many G5 experts they have working away at NewTek?

Lightwolf
08-20-2003, 05:15 PM
Originally posted by Beamtracer
I was thinking about that comment again. All those experts at Newtek who know the Pentium inside out.

I wonder how many G5 experts they have working away at NewTek?
I dunno, how many of the VT crew work on a G5?

Ed M.
08-20-2003, 06:01 PM
I dunno, how many of the VT crew work on a G5?


What's your point? *My* point is that NT's code is a pile of crap, considering other apps that have been *untouched* for the G5 seem to run orders of magnitude faster. NewTek's is just the opposite. Can anyone dig up the reasons NewTek gave for explaining why the code on the G4 was slow? After that, go back and see if any of those shortcomings still apply to the G5. To put it in simple terms for you: if NewTek had been properly coding for the PPC all along, their effort to bring it up to snuff for the G5 would be minimal. Period. Why can't you just say that they weren't prepared from the start? And why should Mac users pay the same price for an app that doesn't have parity with its PC counterpart? Will Apple have to release an open-source 3D render engine to make this point clear? I wouldn't be surprised if they aren't working on one already. And why haven't we heard any commitment by NT to bring LW up to snuff for the G5? Yeah, I think I have an idea.

--
Ed

Ed M.
08-20-2003, 06:06 PM
As a matter of fact, we've heard nothing whatsoever from NewTek regarding the G5.

Is anyone from NewTek even aware that it's out? I'd be curious as to what they have to say.

Hey, Beam, do you have any links or excerpts from NewTek engineers explaining why the G4 was slow at Lightwave? I want to see if the G5 still falls into that same category.

--
Ed

Lightwolf
08-20-2003, 06:27 PM
Ed,
Let me try to get this right: NT is to blame for Apple's relatively low-performance G4? Will you blame Adobe too (After Effects)?
Which 3D app on the G4 performs on par with a P4 / Xeon? How does, for example, Cinema4D perform, or Maya for that matter...
How come it takes a G5 to put RenderMan on par with a Xeon 2.8 / 3.06 GHz? (At least according to the benchmark from Pixar at Siggraph someone posted...)
At the same time you complain that LW doesn't perform as it does on an Intel box... What do you want NT to do? Send you an extra processor for Christmas? :rolleyes:
I'm getting extremely confused here.... :confused:
Cheers,
Mike

panini
08-20-2003, 06:34 PM
Simply a load of crap. The Opteron already runs some 32bit apps faster.
Ed M. lives in a fantasy land.


The NASA guy's tests:

2.6GHz Pentium scored 255, 2GHz G5 scored 254.

That is embarrassing, since that Pentium is about a year-old chip.

The SPEC scores I posted here (look them up) show that across the board the Opteron is 30-50% faster. That is a huge margin.

Nowhere is the G5 faster. Apple was scared to even mention Opterons and ignored the fact that these were on the market almost 6 months before Apple's crap.

These are all facts that only morons would dispute.
Only morons who believe that 254 is higher than 255 or that September comes before April. And obviously some here are actually claiming exactly that.

NewTek's code is fine, by the way. And what about Maya? Can you buy it (the real version) for a Mac???
Only morons do not know that Mac OSes were years behind Windows in simple multithreading and memory management. That is part of the problem, and that is why Macs couldn't even run things like Quake properly until a couple of years back. Maybe it looks pretty to you guys, but until OSX all Mac OSes were garbage, so don't blame NewTek, blame your lying a$$hole CEO.

Ed M.
08-20-2003, 06:42 PM
No, Lightwolf... Had you read what I've been posting, you'd know that I'm looking for the excuses they gave for why the G4 was such a poor performer (i.e., *their* reasons). Is it clear now? OK, good.

After that, I want to see if any of those excuses hold water when I check whether those reasons apply to the G5. Then I'd like to run those reasons (excuses) past a few other cross-platform developers I speak with to see if any of it jibes. Is *that* clear now? OK, good.

Regarding the use of dual processors (SMP): how come NT hasn't got a solution for *either* platform at this point? Don't you think they're a little behind in that regard? I mean, SMP machines have only been around for... you get the picture. And I want someone from NewTek to weigh in here, be it for ScreamerNet or code optimizing. I'd also like their preliminary analysis of the G5. What do they think about it?

--
Ed

Ed M.
08-20-2003, 06:55 PM
Panini, you're an idiot. I've come to that conclusion. I'm sorry, everyone, I don't mean to be harsh. I'm really a nice guy just looking to counter any of the FUD these jokers are slinging around.

Everyone: I've talked with Dr. Craig Hunter. I was the one who suggested he start the thread over at Ars. Anyone who reads the thread will notice the true conclusions.

Has anyone checked out the preliminary SIMD scores yet?? :D

Panini is full of $h%T, I'm sorry. Any of you can probably e-mail Dr. Hunter directly if you'd like further clarification of the tests (i.e., testbed system, unoptimized gcc compiler, etc.). I stand by my posts and the information contained in them. My sources are top-notch. Does anyone think otherwise? In the meantime we should just ignore anything that fool posts, because it's only a waste of bandwidth. At the same time I'll be doing my best to continue to bring interesting tidbits and knowledge to this forum (if that's OK with everyone?) ;)

--
Ed

Nakia
08-20-2003, 07:10 PM
I say if LW doesn't work out for Apple in the future, Apple might just buy themselves a top 3D app. So far Apple owns: a top-of-the-line compositing app (Shake), one of the best well-rounded and expandable video editing apps (Final Cut Pro), a kick-@$$ audio tool (Logic Audio), and one of the best DVD authoring tools (DVD Studio Pro). What's left? A paint program and a 3D app. Also don't forget the ability to make apps on Apple; they give you all the tools you need. (I just wish they'd chosen better than GCC.) With Apple's aggressive attitude toward eating up everything, I don't worry too much. Also, SGI is losing ground in areas that Apple can take.

js33
08-20-2003, 09:51 PM
Originally posted by Ed M.
Well, guys, anyone else other than Panini and js33 have a problem with what I've said or the information I posted? Let me know if you think it's made up, far-fetched or inaccurate. Please. Personally I think js (and definitely panini) are full of... Well, you can add your own expletive.--
Ed

Ed,

What have I ever said that was wrong or inaccurate? Also, I don't speak for Panini. His words are his and not mine.
Also, you never answered the fact that Lightwave has been on Unix platforms and the DEC Alpha, all of which were 64-bit a long time ago, and they never had any problem using ScreamerNet.
Maybe OSX is being updated so often that it breaks code that worked fine when LW or updates were released. Also, if Windows is such a piece of s#*t, how come it can run Lightwave so well?
Lightwave was ported to the PC from the Amiga version. It was also ported to SGI, Sun, DEC Alpha and the Mac. Why is the Mac version so different compared to all these other varied platforms?
I think Lightwave's code is fine or it would not have been able to run on all these other platforms. I guess the Mac version needs special treatment to perform the same as any other platform does straight away. Also, I'm sure Newtek doesn't want a competitor sneaking around in their code. OK, Adobe may not be a direct competitor, but what does a 2D paint program guy know about 3D programming? They are very different. Now that all the Lightwave programmers left to join Lux they don't post here anymore, which is indeed unfortunate. If you have any real ideas for improving Lightwave on the Mac, I would like to hear them.
I don't mean posting references to obscure technical processor architecture minutia.

Cheers,
JS

Ed M.
08-20-2003, 10:26 PM
Also I'm sure Newtek doesn't want a competitor sneaking around in their code. OK Adobe may not be a direct competitor but what does a 2D paint program guy know about 3D programming?


Go back to the old forums and look up his posts; Chris answered that already. Would you like his e-mail address? You could ask him yourself. You'd be extremely surprised.

--
Ed

js33
08-20-2003, 10:37 PM
I could probably guess that it's something like [email protected]. I would ask him if I thought it would make any difference. I'm sure the original programmers of Lightwave knew what they were doing. I don't know much about the new guys they have. Maybe Chris could apply for one of the 3D engineering jobs that are open and save the Mac version.
Maybe he would be interested in doing that?

Cheers,
JS

toby
08-20-2003, 10:45 PM
That's a great idea. NT needs to replace the guys who left for Lux, and Mr. Cox could certainly improve the Mac version as well as the PC version.

Ed M.
08-20-2003, 10:53 PM
I'm not sure if you guys are being sarcastic or poking fun at Chris (I hope not). His offer was to sit down and help NewTek iron out some of their Mac-optimization issues. Was there something wrong with that? Something humorous perhaps? NewTek would have been wise to take him up on his offer when he first posted it here. I really believe he could have helped.

--
Ed

toby
08-20-2003, 11:15 PM
of course I'm not being sarcastic ( I remember those posts ). I think it would be awesome if he worked for NT - we'd get real Mac optimization.

js33
08-20-2003, 11:51 PM
Originally posted by Ed M.
I'm not sure if you guys are being sarcastic or poking fun at Chris (I hope not). His offer was to sit down and help NewTek iron out some of their Mac-optimization issues. Was there something wrong with that? Something humorous perhaps? NewTek would have been wise to take him up on his offer when he first posted it here. I really believe he could have helped.

--
Ed

No, I'm not being sarcastic. I have the utmost respect for any programmer in Chris Cox's or NewTek's league. I have dabbled with the LW SDK and compiled example code. I know how deep it can get, and if Chris wants to help Newtek I say go for it. Also, with the likely sparse 3D team they have now, I think they need all the help they can get.

Cheers,
JS

Beamtracer
08-21-2003, 01:25 AM
Originally posted by panini
Only morons do not know that Mac OSes were years behind Windows in simple multithreading
"Only morons?"

"Only morons?"

Ahhhh, Ha Ha Ha!!!! :D You are obviously completely unaware that Apple had dual processor boxes out on the market running MacOS years before Intel or Microsoft.

Who was it you were calling a "moron" anyway??? :p

Lightwolf
08-21-2003, 01:36 AM
Originally posted by Beamtracer
You are obviously completely unaware that Apple had dual processor boxes out on the market running MacOS years before Intel or Microsoft.
Hm, that must have been during the 680x0 era, since I switched to PCs when the first PowerPCs came out, and I had a nice NT 3.51 box (which did allow for SMP even back then). BTW, MacOS didn't have SMP until OSX; all the multiprocessor support before that was basically a bad hack...

To panini:
Please try to be a bit more polite. Our discussion is heated enough as it is without calling each other names. This goes for Ed too, to a much lesser degree. Thanks.

Cheers,
Mike

Beamtracer
08-21-2003, 01:53 AM
Hi Lightwolf. I think the first dual processor Macs used the PowerPC 604 processor. They were released around the time of Mac OS7.5 or 8 (approximately). Long before OSX.


Originally posted by Lightwolf
To panini:
Please try to be a bit more polite.
That was a doozy, wasn't it! After throwing verbal cream tarts around, one lands right in his face.

Lightwolf
08-21-2003, 01:58 AM
Originally posted by Beamtracer
Hi Lightwolf. I think the first dual processor Macs used the PowerPC 604 processor. They were released around the time of Mac OS7.5 or 8 (approximately). Long before OSX.
Well, there were SMP Pentium boards, maybe even i486 boards as well (if I remember correctly), so that was before Apple, just to make my point ;)
On the SMP front Apple has only caught up with OSX, but then again, calling what they had before that an OS is a bit like calling DOS an OS ;)

Beamtracer
08-21-2003, 02:01 AM
Here it is.

The Apple 9600

http://mirror.theboxchildren.com/images/models/9600.gif

Released February 1997

Cost: $4700

One of the processor options with this machine was

2 x 200MHz PowerPC 604e processors

What year was it that Windows became multithreaded?

js33
08-21-2003, 02:11 AM
Beam,

Microsoft Windows NT 3.1 was released August, 1993. This was the first Windows that offered SMP support. It even ran on PowerPCs in 1995.

Cheers,
JS

Lightwolf
08-21-2003, 02:14 AM
Originally posted by Beamtracer
Here it is.

The Apple 9600
<snip>

What year was it that Windows became multithreaded?
Well, I switched to PCs and NT 3.51 in early 1996; by autumn that year I ran a Dual Pentium Pro 200 (cost me around $2000 back then, with 128 MB of RAM). NT 3.51 as well.
We did have a 9600 back then when we started gadget (single proc only); man, that machine was slow as hell even then. I ran circles around it. It was also the last Mac we ever had...

Cheers,
Mike

Nakia
08-21-2003, 06:21 AM
Has anyone here run LW on another platform besides PC or Mac?
I ran it on SGI when it was on that platform, and it ran great, even on a 185Mhz O2. It ran on their 64-bit systems like the Indigo2, Octane and everything else. All Sun boxes past the Ultra 2 are 64-bit, and LW ran on Solaris. The Octane, which is a dual-CPU box, might be slow compared to PC CPU speeds, but its architecture as a whole smokes anything the PC makers put out as far as design and technology (freaking Octanes have crossbars). I guess that's why these things are real workstations, not boxes with the latest hardware of the moment, which is a PC thing; today's workstations are tomorrow's desktops.
What is the max number of Intel CPUs in one box that Windows can handle? Also, what is the max amount of threads Windows can handle compared to Mac OS X?

Nakia
08-21-2003, 06:30 AM
A quick question: can someone confirm whether this is true or false for Mac and PC?

<snip>
Importantly, unlike many dual processor systems (especially PCs), Octane's design permits maximum exploitation of both CPUs at the same time, ie. two processes running in parallel will not interfere with each other due to the enormous memory, subsystem and I/O bandwidth that is available in the system. Both can be running flat out without having to compete for resources.
<snip>

I got it from a SGI Octane resource page
heres the link to it
http://futuretech.mirror.vuurwerk.net/octnarch.htm

Lightwolf
08-21-2003, 06:38 AM
Nakia:
Limited: Xeon and Intel 7505 chipset (1 memory subsystem for both processors, 266 Mhz DDR Dual channel);

Fewer limitations: G5 (1 very fast memory subsystem for both processors, 400Mhz DDR Dual Channel, 1 x Hypertransport, fast access to PCI-X and i/o);

No limitations: Opteron (fast direct memory bus on every processor, 333Mhz DDR Dual Channel per processor, 4 x Hypertransport).

As far as sheer bandwidth goes, SGIs do still rock though; too bad the MIPS processors are so (relatively) slow. No point in building a MIPS renderfarm...
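
If you want to sanity-check those numbers, the back-of-the-envelope math is simple enough. A rough sketch (theoretical peak figures only; real-world throughput is always lower):

# Theoretical peak bandwidth: transfers/s x 8 bytes per 64-bit channel x channels
def peak_gb_s(mt_per_s, channels=2):
    return mt_per_s * 1e6 * 8 * channels / 1e9

print(peak_gb_s(266))   # Xeon / i7505, DDR266 dual channel: ~4.3 GB/s, shared
print(peak_gb_s(400))   # G5, DDR400 dual channel: ~6.4 GB/s, shared by both CPUs
print(peak_gb_s(333))   # Opteron, DDR333 dual channel: ~5.3 GB/s per processor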

Cheers,
Mike

Ed M.
08-21-2003, 06:39 AM
Has NewTek said anything official about the G5? Have they mentioned what their plans are? What are they doing to alleviate their lack of Mac-familiar programmers and engineers? Does anyone here think that NT should farm out their work to a company like the OmniGroup or at least work closely with them? Imagine what Lightwave would be like if Apple scooped them up ;-)

--
Ed

Ed M.
08-21-2003, 06:43 AM
It's per-processor for the G5 as well, Lightwolf. But the Opteron has other limitations that I mentioned earlier. What's more, I don't think we'll be talking about the Opteron when its little brother, the Athlon64, ships. That's going to be the true AMD offering for the desktop. I'd have to say that developers are in a better position supporting the G5 than they are the Opteron. Refer to my other posts that compare them.

--
Ed

Nakia
08-21-2003, 06:55 AM
What about the mainboards? nVidia nForce2 boards gave the AMD XP a big boost. We all know that all motherboards are not the same. nForce3 will be the Opteron board of choice. To me this is a major issue, since the board makers usually are not the ones making the CPU. Yeah, they might use the chipset made by Intel to support Intel CPUs, but the rest of the board can suck.
We hear talk about the specs of the CPU but not the whole system. Most motherboards do not take full advantage of the CPU.
Is the architecture of the G5 mainboard behind, or even ahead of, AMD/Intel 64-bit system mainboards as far as desktops/workstations?

Ed M.
08-21-2003, 07:04 AM
When Panther ships, things will be even better. I like the idea of the hybrid OS approach. Apple and IBM did a superb job of insulating developers with better APIs. Developers have full access to all the G5's 64-bit features if they want them. Not all APIs are converted over yet, but developers who require the features can still get at them. This blurs the line when talking about a 32-bit OS or a 32-bit app; everything sort of migrates toward a hybrid form. And companies are already taking advantage of these features. Some of the features come along with the OS, and developers don't have to do a thing. I mentioned all this before, though.

--
Ed

Ed M.
08-21-2003, 07:06 AM
AMD/Intel 64-bit system mainboards as far as desktops/workstations?

Many over at ArsTech would say YES ;)

--
Ed

Lightwolf
08-21-2003, 07:09 AM
Originally posted by Ed M.
It's per-processor for G5 as well Lightwolf.
I wouldn't be surprised if your research on the Opteron is as well founded as this statement. :confused:
Check this again:
http://a448.g.akamai.net/7/448/51/df8683ae13dd56/www.apple.com/powermac/pdf/PowerMacG5_TO_072903.pdf Page 9.

You have per processor channel going to the system controller, but only one channel going from the system controller (which acts as the memory controller) to the RAM (which is DDR400 Dual channel, at least on the Dual G5). So the two CPUs share the same access path to the RAM.
Not the case on the Opteron, where every CPU has an integrated memory controller, so you hook up the memory directly to the CPU. And since it has 4 Hypertransport channels connecting the processors, even the memory access bandwidth to memory hosted by another CPU is exceptionally good.
The nice thing about this concept is that the per-processor memory bandwidth doesn't degrade much as you add more processors.
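
To put rough numbers on that (theoretical peaks again, and ignoring the extra Hypertransport hop for memory hosted by another CPU):

# Per-CPU share of peak memory bandwidth as processors are added
shared_path = 6.4  # GB/s, G5-style: one DDR400 dual-channel path for all CPUs
per_cpu = 5.3      # GB/s, Opteron-style: DDR333 dual channel on each CPU

for n in (1, 2, 4):
    print(f"{n} CPUs: shared {shared_path / n:.1f} GB/s each, "
          f"integrated {per_cpu:.1f} GB/s each")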


Refer to my other posts that compare them.

As you know I've read all of them, but I begin to doubt them more and more. Maybe you should get your facts on the G5 right first... ;)
Cheers,
Mike

Lightwolf
08-21-2003, 07:14 AM
Originally posted by Nakia
Is the architecture of the G5 main board behind, even of ahead of AMD/Intel 64bit systems mainboards as far as the desktops/workstations?
Well, compared to Xeons it is far ahead; I don't know about the Itanium architecture though.
The Opteron architecture is definitely on par, and scales much better with multiple processors than the G5 does.
If you want a 4-way or more workstation, the Opteron is a decent choice.
As for motherboards, Tyan has just released a Dual Opteron board (based on the AMD 8000 chipset) with all the goods (including U320), and a dual nForce3 is supposed to come Q1 2004.
Due to the CPU-integrated memory controller though, the Opteron is a bit less dependent on the chipset as far as performance is concerned. Hook up the i/o via Hypertransport and off you go.
Cheers,
Mike

js33
08-21-2003, 07:24 AM
Originally posted by Nakia
Has anyone here run LW on another platform besides PC or Mac?
<snip>
What is the max number of Intel CPUs in one box that Windows can handle? Also, what is the max amount of threads Windows can handle compared to Mac OS X?

I have run Lightwave on Amiga, DEC Alpha, MIPS, Mac and Intel. The MIPS was a dual network rendering machine for the Amiga back in the day, and it ran Windows NT. It's cool that you bring the SGI perspective here. I have always envied SGI machines, but they were just too expensive to buy for personal use or even small business use. I think the most CPUs in one box that Windows can use is 32, with the Enterprise edition. Beam me up, Scotty.
As far as max threads, I don't know. Do you mean the OS as a whole or just running Lightwave?

If you really want a render machine and stay with SGI check out the Altix cluster running Linux on Itanium 2 processors.

SGI Altix (http://www.sgi.com/servers/altix/)


Cheers,
JS

js33
08-21-2003, 07:33 AM
Originally posted by Lightwolf
Well, compared to Xeons it is far ahead; I don't know about the Itanium architecture though.
The Opteron architecture is definitely on par, and scales much better with multiple processors than the G5 does.
<snip>

Hi Lightwolf,

I've wasted so many replies arguing with Ed that I never replied to you. I am in agreement with you on all your posts. As far as the G5 goes, it's still very much an "I'll believe it when I see it" thing.
Especially after all the trumped-up performance of the G4, which turned out to be mostly bu||$h!t.

The Opteron is starting to sound pretty sweet. So is anyone making 4 processor motherboards yet?

Cheers,
JS

Ed M.
08-21-2003, 07:38 AM
Well, judging from the tests we've seen with the Opteron thus far, you would expect those integrated memory controllers to be making quite a noticeable difference. All the Opteron tests to this point are extremely underwhelming. I suggest you spend more time over at Ars.

http://arstechnica.infopop.net/OpenTopic/page?q=Y&a=tpc&s=50009562&f=8300945231&m=9080959175&p=9

There's a lot of pages discussing the G5.

--
Ed

Ed M.
08-21-2003, 07:49 AM
Lightwolf, Js... you guys must be high. We've seen ZERO in the way of anything from anyone regarding Opteron tests. You guys are just slinging bull$hit. Where are these tests showing the Opteron to be soooooo superior? The thing looks like a pile of $hi^ compared to what Intel is offering if you ask me. You guys are doing a lot of wishing and hoping for all this Opteron support, but the fact is that the G5 is ALREADY HERE and DEVELOPERS ARE SUPPORTING IT OUT OF THE DOOR. We can't say any of this about Opteron (a server CPU).

And Where is Athlon64 (Opteron's little brother)?
Where is the OS support?
Where is the developer support?

These Opteron rigs have been out for a while now, and as I predicted you clowns are basing G5 performance on ****ty Lightwave code?? LOL. Is that how things are done in Bizarro World? And we haven't seen a single decent benchmark showing off the superiority of the Opteron. Again, it looks rather weak compared to the dual Xeon rigs.

You expect people reading this thread to take you guys seriously? You guys have shown us nothing. Typical. Does anyone here take these guys seriously? Be honest.

--
Ed

eblu
08-21-2003, 07:57 AM
Originally posted by js33
Ed,

Also, you never answered the fact that Lightwave has been on Unix platforms and the DEC Alpha, all of which were 64-bit a long time ago, and they never had any problem using ScreamerNet.
Maybe OSX is being updated so often that it breaks code that worked fine when LW or updates were released. Also, if Windows is such a piece of s#*t, how come it can run Lightwave so well?
Lightwave was ported to the PC from the Amiga version. It was also ported to SGI, Sun, DEC Alpha and the Mac. Why is the Mac version so different compared to all these other varied platforms?
... If you have any real ideas for improving Lightwave on the Mac, I would like to hear them.


JS33,
I'm not completely following this thread anymore, but I think I can speak to this a bit. The DEC Alpha was a nice machine, terribly limited because it was tied to a particular version of NT, but nice nonetheless. The 64-bitness of the Alpha was a big selling point for it, btw. With the G5, though, it's the 2 GHz clock and the two floating-point monsters that make the huge difference; it's apparent that this is a floating-point performer, exactly what we need in 3D.
The other Unix variants of LW were always shaky, and had every problem that plagues Mac users today: poor implementation, no plugins, broken features swept under the rug. It was another feature to put on the box, "we're cross-platform" (remember Mac LW 5.0? yeah, it was like that). As for ScreamerNet: the Mac version breaks too many rules set by Apple. It uses a completely outdated command-line-interface emulation program called SIOUX, which is a farce if you think about it. SIOUX is a redundancy, a waste of system resources, completely reliant on Carbon, and an extraneous layer of uselessness on an already poorly designed application. When you want to quit you are forced to use "force quit", a mark of real quality there. It relies on completely end-of-lifed routines and functions, things that Apple axed many years ago due to instability and only supports because sloppy, lazy programmers will not update their code. Believe me, if ScreamerNet followed all the rules, and used technology that has been around for at least as long as there has been a LW for Mac OS X, then it would work, even if Apple put out 30 more updates.
This dovetails perfectly with my suggestions for NT's dev team. Mac OS X has rules, rules meant to make the development process more cost-effective and more effective in the long run. Newtek obviously doesn't follow them at all. So either they don't know these rules, or they don't care about them. Neither one of those situations makes for a good product. My suggestion is for Newtek's developers to step up to the plate and get interested in Mac OS X. It's a rock-solid OS (in my novice developer's opinion) with incredible tools, like the Unix layer, which is a better solution for ScreamerNet than SIOUX in its current incarnation, or things like AppleScript, JavaScript, Cocoa and Rendezvous for, say, a real network rendering tool. If the developers aren't interested in Mac OS X then they won't make a good product, because they won't understand the OS or its benefits.
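
To make the ScreamerNet point concrete, here is a rough sketch in Python of what launching render nodes straight from the Unix layer could look like. The lwsn path and the -2/-c/-d arguments below are assumptions based on the documented mode-2 usage, so check them against the manual:

# Hypothetical sketch: spawn two ScreamerNet render nodes from the OS X
# Unix layer, no SIOUX involved. Paths and the -2/-c/-d flags are
# assumptions; verify them against the LightWave documentation.
import subprocess

LWSN = "/Applications/LightWave/Programs/lwsn"  # assumed install path
CONFIG = "/Users/render/lw-configs"             # assumed config directory
CONTENT = "/Users/render/content"               # assumed content directory

nodes = [
    subprocess.Popen([LWSN, "-2", "-c" + CONFIG, "-d" + CONTENT,
                      f"{CONTENT}/command/job{i}", f"{CONTENT}/command/ack{i}"])
    for i in (1, 2)
]
for node in nodes:
    node.wait()
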
I am suggesting that the current crop of programmers working on Lightwave are platform-specific... PC-specific. And that means that the platform-agnostic code is not exactly agnostic: the OpenGL implementation is in fact optimized for PCs... et cetera ad nauseam. This is unavoidable. Mac OS X is incredibly new; nobody learns how to program OpenGL on OS X in gaming school, they learn on PCs, so they learn the PC-optimized way of doing things whether they know it or not.
Remember, LW on SGI wasn't the cat's meow, and OpenGL is owned by SGI.

Apple is seriously aware of this, and they seem to think that developers are aware of the disparity and that the answer lies in code, not hardware. They are giving away everything they learn about how to make OpenGL kick *** in OS X. Just check out the Apple dev site; a large portion is dedicated to OpenGL and integrating it with OS X. The problem is... I am a novice programmer, and I am apparently more aware of the problems and solutions than Newtek is... not a good sign.

I like LW and I can live with many of the problems that come with their approach to the Mac Port of LW, but that does not mean that I am blind to the fact that LW is first a PC app, and second a Mac app.

Nakia
08-21-2003, 08:00 AM
js33 - I dig SGI boxes. Most PC folks do not understand how long it took for PCs to do 3D on a good level, or the history of nVidia and SGI. A lot of SGI technology is so far ahead of the PC world and the Mac. Apple has a chance to do what SGI does, for one main reason: they make the OS and the hardware. What SGI is doing with the Intel 64-bit is wild and crazy; they basically slam the CPU into their current MIPS setup. If SGI puts out an Intel or AMD 64-bit IRIX (not Windows or Linux) workstation it will be the one to beat; they will put everything into it, unless HP pulls it off first.
js33 - By how many threads I meant the OS, not LW.

Having the memory controller on the CPU is a hot deal. All the new Sun servers I deal with have this feature. It makes stacking memory up a lot faster, as long as you have a CPU on that bank of memory. Troubleshooting works best that way too.
But this is a feature Sun has had for a while.
We talk of 64-bit CPUs, but what about the rest? What type of filesystem will be introduced with these new 64-bit OSes and CPUs? The filesystem can make a big difference. Right now SGI IRIX has a 64-bit filesystem, XFS, which so far is untouched by any other fs. Sun has a 128-bit filesystem in the works. We as Mac and PC users get so lost in our own world we forget that what we are getting is not new.
Where I do contract work they are moving some of the UNIX ops to work on the new Intel 64-bit servers; my Sun team makes a big joke out of it. I guess that's because Sun has been shipping 64-bit for years.

Lightwolf
08-21-2003, 08:01 AM
Originally posted by Ed M.
You expect people reading this thread to take you guys seriously? You guys have shown us nothing. Typical. Does anyone here take these guys seriously? Be honest.
I dunno, I don't go around posting BS (...sorry) about the G5, or making claims which even Apple doesn't make (just look at the previous post about the memory controller).
I mean, come on, you claim a per-CPU memory controller for the G5, even though the tech docs clearly spell out a different case.
Who's been smoking what then?
All you seem to do here is bash NewTek's code for not running faster on a slow processor than on a fast one, and you now even have an excuse ready if it doesn't run well on the G5... Hm, could I use the same excuse for the Opteron? Should I? What counts for me is the performance I get in the end from the apps I choose to work with. If Digital Fusion screams on a quad Opteron, so be it. If LW screams on a G5, fine by me. That's what I base my choice on; everything else is (imho) just fundamentalism.
Cheers,
Mike

js33
08-21-2003, 08:04 AM
Originally posted by Ed M.
Well, judging from the tests we've seen with the Opteron thus far, you would expect those integrated memory controllers to be making quite a noticeable difference.
<snip>

http://arstechnica.infopop.net/OpenTopic/page?q=Y&a=tpc&s=50009562&f=8300945231&m=9080959175&p=9

The forums are OK, but I can't read web sites with a black background and white text for long because it hurts my eyes. Same problem with
flay.com

Cheers,
JS

js33
08-21-2003, 08:15 AM
Originally posted by eblu
JS33,
I'm not completely following this thread anymore, but I think I can speak to this a bit.
<snip>

Wow Eblu, Long post.
Well you sound more knowledgeable about the Mac LW problems than anyone else in here. Why don't you contact the Newtek team and offer your help and suggestions.
I have an iMac and have run Lightwave on it a few times. It seemed to work alright but of course I have been using the PC version for about 6 years.

Cheers,
JS

Nakia
08-21-2003, 08:17 AM
The funny thing is the G5 is not even advertised by Apple as a workstation. It's considered a personal computer. Hmmm.
Makes you wonder what's up Apple's sleeve. And we are comparing it to other 64-bit "workstations".
I'm pretty sure if it was a workstation Apple would say it was.

js33
08-21-2003, 08:23 AM
Originally posted by Nakia
js33 - I dig SGI boxes. Most PC folks do not understand how long it took for PCs to do 3D on a good level, or the history of nVidia and SGI.
<snip>

Oh, I know it took the PC a long time to do 3D, and the Mac even longer. That's why I used Amigas for 10 years. :D

I think SGI could have owned the 3D market if they weren't so far out of the ballpark on pricing. If they had realized that about 10 years ago, they would be in a much better position today.

SGI and Sun have been 64-bit for a long time, but their machines have never been priced for the desktop. SGI still makes MIPS processors, but how fast are they compared to, say, a 3Ghz Xeon or a G5? Also, they seem to be big supporters of Itanium 2 and Linux rather than MIPS and IRIX these days.

Cheers,
JS

Ed M.
08-21-2003, 08:32 AM
I mean, come on, you claim a per cpu memory controller for the G5, even if the techdocs clearly spell out a different case.

I never claimed the G5 had a dual memory controller (show me where I said that). Be careful though.

I thought you were talking about the bus (I presume). The dual G5s have independent busses. This is what I thought you were referring to.

Still, as I mentioned earlier, you would have thought that the tests displaying Opteron performance would have shown the advantages of the integrated memory controllers. The tests show the Opterons in a dull light. Just ask anyone. Completely underwhelming. Perhaps AMD hosed something up? I'm all for on-chip memory controllers. I thought that Motorola would bring that to the Mac desktops. They kept it for their own 85xx cores though.

Oh, and just for the record, if you want to get technical, we probably should be comparing the Opteron to the likes of the Power4 and Power5, no? Just a thought.


What counts for me is the performance I get in the end from the apps I choose to work with.

Well, judging from what we've seen, that can't be too good for you if you're looking toward an Opteron solution.


If Digital Fusion screams on a quad Opteron.... (climax)

LOL This is what I mean by dreaming. We haven't seen any good tests for the duals yet.

If you remember IBM's initial introduction, the PPC 970 will START at 4-way configs. That's for IBM. I think you'll be surprised at how well the 970 scales. What's more, it's only the beginning.

Opteron looks like the end result of a long, drawn-out battle with Intel... and I think it's a battle they are losing, and something that they bet the farm on. How much more do you think AMD can push the envelope? The battle with Intel has taken a lot out of them.

And still no tests showing Opteron performance to be stellar. No software. No OS. Nothing. Probably because it's a SERVER CPU. Opteron's little brother is supposed to be for the desktop.

Where is it? The questions keep mounting. But go ahead, hitch your wagon to a dead horse. I'll have my MacOS X running on a G5 while you're still wishing and hoping for all this great stuff to come to the wizzz-bang! Opteron platform. Once again... No code. No OS. No drivers. No benchmarks with software that means anything. All a lot of talk and very little substance.

--
Ed

Lightwolf
08-21-2003, 08:35 AM
Originally posted by Nakia
The funny thing is the G5 is not even advertised by Apple to be workstation. Its considered a Personal Computer. Hmmm
True, but has Apple ever considered any Mac a workstation, or sold one as such? I remember the term being used only for un*x machines, and anything that was x86 (even if it was on par performance-wise) was considered a PC. Times have changed; the majority of workstations are x86-based nowadays (just remember how they pushed SGI out of the 3D animation market).
I would however consider the G4s and G5s workstations, as opposed to the iMac PCs, at least as far as the currently used naming convention seems to go.

Cheers,
Mike

js33
08-21-2003, 08:39 AM
Originally posted by Ed M.
Lightwolf, Js... you guys must be high. We've seen ZERO in the way of anything from anyone regarding Opteron tests. You guys are just slinging bull$hit. Where are these tests showing the Opteron to be soooooo superior? The thing looks like a pile of $hi^ compared to what Intel is offering if you ask me. You guys are doing a lot of wishing and hoping for all this Opteron support, but the fact is that the G5 is ALREADY HERE and DEVELOPERS ARE SUPPORTING IT OUT OF THE DOOR. We can't say any of this about Opteron (a server CPU).
--
Ed

Good God Ed get a hold of yourself. I shudder to think what will happen to you if the G5 turns out to be a dud. :confused:

All I said was that the Opteron, based on what Lightwolf was saying, is starting to sound pretty sweet. I'm also waiting for benchmarks, just like we are waiting on benchmarks for the G5. We've seen nothing to praise the G5 about yet except for a ... uh... test showing ;) at the WWDC.
Other than that, and a lot of hype and speculation and arcane processor-architecture minutiae, we know nothing about how the G5 will perform in the real world using real apps. You're already blaming NewTek's code when you know absolutely nothing about it. Have you even looked at the SDK? Do you even have a clue about programming? Just because Chris Cox said something doesn't mean that it applies to Lightwave. I'm not dissing Chris, but unless he has actually seen the code, how can he comment on it?
I've been working all night so I'm off to sleep now. Have a good day all, and Ed, take a chill pill man, it's only computers. Better ones will come after these, and so on and so on. Just be happy that we all have something to run LW on and go create something.

Cheers,
JS

Lightwolf
08-21-2003, 08:51 AM
Originally posted by Ed M.
I never claimed the G5 had a dual memory controller (show me where I said that) Be careful though.

You: "It's per-processor for the G5 as well, Lightwolf."
in response to:
Me: "Opteron (fast direct memory bus on every processor, 333Mhz DDR Dual Channel per processor, 4 x Hypertransport)."
Page 5 of this thread, towards the bottom.

I thought you were talking about the bus (I presume). The dual G5s have independent busses. This is what I thought you were referring to.
It is pretty obvious that I was talking about the bus; you've seen the specs and are just nitpicking. If you prefer, we can continue in German; my language does tend to get more precise then ;)


The tests show the Opterons in a dull light. Just ask anyone.
Not really, it does smoke the competition on a bunch of benches, not only SPEC but also TPC et al. Mainly big-iron benches though, where it directly competes against the likes of the Power4+ in a different environment.


Oh, and just for the record, if you want to get technical, we probably should be comparing the Opteron to the likes of the Power4 and Power5, no? Just a thought.

Nope, not really. You'll be able to run 64-bit Windows on it by the end of the year, and 32-bit Windows currently, and it runs all of my favourite software (as does the Mac, swap Shake for DF). The Power5 isn't out yet, btw.
Does the Power4+ run it? No. Why bother? I prefer to compare CPUs that make sense for my work.


Well, judging from what we've seen, that can't be too good for you if you're looking toward an Opteron solution.

You mean from what you've seen. From what we've seen, the G5 is currently quite nice on paper too, but this is all a matter of perspective, as we both know.
It will depend on my needs, as well as on my cash. I do have the choice though, which is nice. Xeon or Opteron? Depends...


LOL This is what I mean by dreaming. We haven't seen any good tests for the duals yet.

You haven't.


If you remember IBM's initial introduction, the PPC 970 will START at 4-way configs. That's for IBM. I think you'll be surprised at how well the 970 scales. What's more, it's only the beginning.

So, who's building 1000+ processor systems with Opterons then? And why should they? And, why _do_ they?


Once again... No code. No OS. No drivers.

Actually I'm getting sick of always having to answer your same denials all the time, so I'll skip that part for now, I think I've answered to that often enough.

Cheers,
Mike

eblu
08-21-2003, 08:53 AM
Nakia,
I think apple doesn't really have designs on making a "workstation" they seem to want to be "approachable" to the general population, so its desktops. But that doesnt mean that the g5 cant be viewed as a workstation... its all marketing.
And I think you are right about Workstations in general, You can't compare an SGI to a personal computer based on MHZ, or Memory speed or any other single number, its a waste of time. Using an SGI you can clearly see the difference (in school I worked on a Personal Iris, and an Indy with Alias Animator v.3.2 to v 7.5). I'd like to suggest that this is also true for comparison with the g5 and processor x. Real world performance is really the only thing that will tell us anything. I'm glad to see that Apple has reversed their opinion of Integer performance over Floating point, and its obvious that the g5 is a vastly superior chip than the g4, but its the whole machine, hardware and software that will tell the tale.

btw: since you guys were comparing sgis... we have 3 onyx ahem "towers" an o2, an indy, an indigo ex, and an octane. great stuff, one trick ponies tho, we use em for discreet logic products (flame/ fire etc...) we would have put screamernet nodes on them (co-worker is a PC LWer, coming from amiga by way of a Dec Alpha, he has a small pc render farm... compaq xeons) but it would cost us A whole lotta dough, bc NT doesn't give you screamernet for other platforms, we'd have to buy LW for SGI. I also have one the original Colored SGIs, inherited from school, I had to gut it bc it didnt work anymore... the graphics board cracked. so its a nice addition to my desk at home holding my g4 933 on top of the main box, and a printer on the power supply box (that sgi was freakin HUGE, had a 33mhz processor but it did 3-d better than any pentiumII).

Nakia
08-21-2003, 09:07 AM
The freaky thing with old SGIs was the old GLQuake game that ran on SGI and not PC back then. Now it's no biggie.
Compare the MaxImpact gfx card from back then to the PC cards of the time; it shows how long it took PCs and Macs to get there.

Lightwolf
08-21-2003, 09:15 AM
True, but they made it. Now SGI buys graphics chips from ATI to replace the InfiniteReality (I think...) series.
The times have changed...
Cheers,
Mike

tallscot
08-21-2003, 11:19 AM
I'm not a programmer, but I know that Apple sets up rules to follow for programming for their OS, as does every OS maker.

If you follow the rules, your applications should work well when updates are made to the OS. If you don't follow the rules, your application could break with an OS update.

This is true with all operating systems.

I have a handful of applications in Windows XP that broke with SP 1. Is that Microsoft's fault?

Again, Alias is capable of making Maya Complete work just as well on the Mac as the PC. I've specifically asked Maya Mac users if they were happy with the parity of Maya Complete and if they were happy with Alias's support. They were all happy.

Basically, it comes down to market share. If the Mac application has a significant market share, like Photoshop does, you will see great support by the developer. If the Mac version has a much smaller market share, the developer has a harder time justifying the time and cost to support it equally. This, of course, makes it hard for them to increase their market share, and ends up lowering the Mac market share because customers start leaving.

Alias made a strong commitment to the Mac and made it clear they were going to support it fully and give it the same attention as the other platforms. They have followed through on that, and the U.S. Maya Complete sales are 25% Mac, and will, no doubt, increase tremendously with the G5 and Renderman.

I'm not being a Maya troll here, I'm just pointing out that when people start blaming Apple instead of NT for poor Mac support, I think they should look at other products and they will see that other developers have no problem with Apple or the Mac OS.

silvergun
08-21-2003, 11:23 AM
On a different note

Let's hope Newtek makes a good version of Lightwave 8 on the Mac... if not, I'll wait for modo and the whisper sweet and jump ship. Newtek are useless with the Mac, and not being arsed to make a good version really tells us where their development of Mac Lightwave is going.

Antimatter
08-21-2003, 11:25 AM
Since some people brought up the SGI workstation, I would like to see some sort of benchmark comparing them to desktop computers nowadays.

And Sun is now making a 128-bit filesystem? It's funny that Sun is already moving to 128-bit while the PC/Mac are just now moving to 64-bit, heh.

Anyway, is there any good benchmark/information on the SGI systems? I'm interested in learning a bit about them :) I believe my college has a few workstations that I would like to get my hands on and experiment with a little bit.

mlinde
08-21-2003, 11:39 AM
Originally posted by Antimatter
Since some people brought up the SGI workstation, I would like to see some sort of benchmark comparing them to desktop computers nowadays.

I would also like to see some benchmarks, utilizing something like an O2 and Lightwave (I know, 5.6 is probably as far as it goes, but I'd still like to see) since SGI used to rave about their system design, and they do, after all, design and maintain a little graphics language I like to call OpenGL...

Lightwolf
08-21-2003, 11:45 AM
Originally posted by Antimatter
And Sun is now making a 128-bit filesystem? It's funny that Sun is already moving to 128-bit while the PC/Mac are just now moving to 64-bit, heh.
Well, please note: the filesystem doesn't really have anything to do with what the processor is capable of. You could even code a 256-bit file system for the C64; a 64/128-bit processor just makes it easier to handle. Even now most file systems are 64-bit (actually they broke the barrier a long time ago; just think of all the 2GB / 4GB barriers we used to have with hard disks, file sizes (QT), etc...).
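
The arithmetic behind those barriers is simple enough (raw offset limits only; real filesystems spend a few bits elsewhere):

# Maximum size addressable by an n-bit file offset
print(2**31 // 2**30, "GB")  # signed 32-bit: the old 2 GB barrier
print(2**32 // 2**30, "GB")  # unsigned 32-bit: the 4 GB barrier
print(2**64 // 2**60, "EB")  # 64-bit offsets: 16 exabytes, plenty for a while
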
Cheers,
Mike

dfc
08-21-2003, 12:09 PM
Some bench results of the G5 1.6 single were posted today.


http://www.chaosmint.com/powermac-g5-16/

Not very impressive for this model, I'd say. Especially considering that the base model is now $1999.

Karl Hansson
08-21-2003, 12:21 PM
The first MP Macs that I know of were Daystar's Genesis MP (non-Apple) machines, with four PPC 604 processors. Those machines could take up to 1536MB of RAM back in 1995, every Mac user's dream at that time. :cool:

Ed M.
08-21-2003, 12:21 PM
Why don't you contact the Newtek team and offer your help and suggestions.
He's tried. Like everyone else he was ignored. And yes, Ted is VERY knowledgeable about Mac Lightwave. More than anyone I've met so far. That's why I told you to read his previous posts.

Lightwolf. I want you to go to Ars MacAch forums (as well as anyone else who's interested) and read the posts by people like BadAndy, Hobold, MrNSX, programmer, gcc, and the MANY others. They know quite a bit about the G5 *AND* the Opteron.

Ars is pretty much where I get a lot of my info regarding the G5. If I'm wrong, then they must be wrong, and if that's the case it might be wise to read their posts and perhaps correct them, because Lord knows we wouldn't want that misinformation being spread around. I suggest that you start from page one though. It's a very, very loooooong thread. I'm not even close to being in their league, so I just sit and read (lurk).

And Scot, well said, my good man. I've been losing my patience. I just had a nice bottle of "chill", so I'm a lot better now. :D

--
Ed

Lightwolf
08-21-2003, 12:32 PM
Originally posted by Ed M.
Ars is pretty much where I get a lot of my info regarding the G5. If I'm wrong then they must be wrong and if that's the case it might be wise to read their posts and perhaps correct them because Lord knows we wouldn't want that misinformation being spread around. I suggest that you start from page one though. It's a very very loooooong thread.
I did when you first pointed it out to me, and, as I said before, I didn't find anything there that invalidated my points, not at all.
I didn't find it interesting enough to keep following, though; that thread is extremely messy...
I'll tell you what though: I expect c't (http://www.heise.de/ct - try babelfish) to test the first G5s and run an article in the next issue; I'll tell you more about it then. And no, they're not Intel / AMD biased either.
Cheers,
Mike

Lightwolf
08-21-2003, 12:39 PM
Originally posted by dfc
Some bench results of the G5 1.6 single were posted today.
http://www.chaosmint.com/powermac-g5-16/

Just for laughs I quickly installed Cinebench on our two-year-old Athlon 1.4 GHz (not XP), 768 MB RAM, W2K SP3, not really a clean install either...

Single CPU 159.9 Seconds.

It looks like somebody has some heavy optimizing to do... (I mean Maxon...)

Cheers,
Mike

Lightwolf
08-21-2003, 12:42 PM
Actually, I think the 158.2 sec for the G5 is a typo or something. I don't think it is _that_ slow... I would expect it to be more in the range of 90 - 120 seconds...

tallscot
08-21-2003, 12:48 PM
Some bench results of the G5 1.6 single were posted today.

Ouch! That's a slow Cinebench score.

It looks like software really needs to be updated for the G5, because notice how the Photoshop plug-in for the G5 (available now) is supposed to make it "twice as fast", according to Adobe?

Is that twice as fast as the fastest G4, or twice as fast as Photoshop without the plug-in, which might be very slow, like this Cinebench score?

What's strange is the benchmark for After Effects, if it is real, shows the G5 screaming with a non-optimized After Effects.

The Apple developer page that was posted about how to tune your applications for the G5 had a list of "don'ts" for developers. Does this mean that if an application is written with "don'ts", it won't perform well on the G5?

Even in the worst-case scenario, it's clear that applications don't have to be recompiled. If a free 1.4 meg update to Photoshop brings that kind of performance, I'm sure we'll see updates by all the major players. The question remains whether an update is required for all software to run fast.

Still need more time.

So if you take the 156 score on that 1.6 Ghz G5 and extrapolate it to a dual 2 Ghz, I'm getting around a 70 score, which is still slower than the dual 2.4 Ghz Xeon score of 51 posted on barefeats.com.

One thing to consider, too, is that the 1.6 Ghz G5 has an 800 Mhz bus and DDR333 RAM, versus the dual 2 Ghz's 1 Ghz bus and DDR400.
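
For what it's worth, here's that extrapolation spelled out. Pure guesswork on the scaling: it assumes render time drops linearly with clock speed and that the second CPU gives roughly a 1.8x speedup.

# Cinebench score here is render time in seconds, so lower is faster
single_1_6 = 156                     # reported 1.6 Ghz G5 score
single_2_0 = single_1_6 * 1.6 / 2.0  # scaled linearly to 2 Ghz: ~125
dual_2_0 = single_2_0 / 1.8          # assumed dual-CPU speedup: ~69
print(round(single_2_0), round(dual_2_0))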

tallscot
08-21-2003, 12:54 PM
I'm inclined to think something is amiss too, because according to XLR8yourmac.com, a dual G4/533 gets a score of 109, and a single 933 Mhz G4 with OS 9.2.2 gets 87!

Clean install? Oh yeah, Windows gets slower over time. :)

eblu
08-21-2003, 01:55 PM
Originally posted by Ed M.
He's tried. Like everyone else he was ignored. And yes, Ted is VERY knowledgeable about Mac Lightwave. More than anyone I've met so far. That's why I told you to read his previous posts.


I'm a LW user. I know some stuff about programming. I understand the importance of good design, and have had some hands-on experience designing UI/workflow for applications. I've put in my share of bug reports, had the discussions through email and on the phone with the support crew, and I've had some discussions in this forum (well, the old one) with Arnie Cachelin, who is not with NT anymore, and a smattering of other interested developers. I'm all for following procedure; I fully expect, though, that NT holds up their end of the bargain. This forum is a user-to-user forum and also a place for the users to vent to the developers in a free-form way. What I have found is that Newtek has slowly distanced itself from the Mac forum (maybe others as well, but I don't frequent them as much). The message I get from Newtek for the most part is that the support staff has no idea what a Mac is, even the "Mac" guys. Many of the UI flaws are regarded as quality design (the Hub? why exactly do I need an extra application to make sure two applications behave themselves?). And the many gotchas/workarounds/bad coding examples are not going to get fixed, because "nobody notices/uses that anyway" (paraphrase). I have found that the attitude of the developers is probably very bad: they are either helpless to fix/change something, or they think that they are infallible so my input is worthless, or, even worse, they just don't know anything about the problem. Why don't I help the developers? I am. I have tried educating them, opening their eyes to things they might not know, getting them excited by the neato whiz-bang features. But they aren't in this forum anymore, so what can I do? Call NT's president? Nope. Not my thing.
I intend to incite customer demand. Many of the problems that paintboy, mlinde, and many many others repeat are flaws that affect the usability of LW across platforms. Many of these users have been accused of whining. I have personally been accused of whining. If Newtek believes that I like making a stink just because I like the smell, then they should get their heads examined. Newtek should take a closer look at its Mac clients, because they know good UI almost instinctively (not to say Windows users can't know good UI design, but for the most part Windows users are assaulted by bad design every day and tend not to notice it), and they have real gripes. Anything done by Newtek to make LW better on the Mac is good for all versions, especially if they start to pay attention to workflow issues and UI issues, and start doing research into things they don't know. Then maybe they would stop with the spaghetti code. It's true that Mac users make up a minority; together we can't possibly build up enough customer demand to show up on the Newtek radar. But frankly, this ain't my job, and I don't feel particularly strongly about these issues. I will have no difficulty surviving if Newtek cancelled the Mac version of LW. So like I said before, I like LW and I want to see it become the best 3D package, so I help out where I can.

ED,
I skim right past most of the things you say about hardware (I do it to everything about hardware in these forums, actually). I use a Mac, I know why, and that's all that matters to me about the hardware. But you do seem to get a little "over-stimulated" by promised performance, and when the delivery isn't exactly what was advertised, you get just as incensed about the software. But if that's the problem, then no new hardware can fix it, and like I said before, Newtek ain't really involved in this forum these days, so they ain't hearing you. I agree with you for the most part about many of the things you say; I can't confirm or deny anything you say about hardware, or for that matter anything anybody says about hardware. The problem is, though, you tend to alienate people with comments that seem to be scathing assessments of technology, not of Apple. The reality is that people are different, they like different things, and they don't like to be told that their opinion is wrong. IN ED'S DEFENSE, I have been listening to crap like that for almost 15 years from HUNDREDS of PC elitists, and ya know what, there's a part of me that feels that, through Ed, justice is done. Apple makes dynamite products, and has advantages no other company has.

I personally believe that the G5 is evolution, not revolution, and that the Opteron, Xeon, Itanium, or any other hunk of silicon is the same thing. Companies like competition, so they work for evolution, slowly making a slightly better product with each iteration. They don't want to turn into SGI, which is dying because it made a vastly superior product and found little or no demand for it.
But the Mac is the best platform. Heh.

Lightwolf
08-21-2003, 02:13 PM
Originally posted by tallscot
One thing to consider, too, is that the 1.6 GHz G5 has an 800 MHz bus and DDR333 RAM, versus the dual 2 GHz's 1 GHz bus and DDR400.
Then again, my Athlon has a 133 MHz bus and DDR266 RAM, and runs at only 1.4 GHz. That G5 score is very sad; I truly hope this is not the rendering performance one will get out of the G5 (the RenderMan demo shown at SIGGRAPH did put the 2 GHz G5 in the league of a 2.8/3.0 GHz Xeon).
Maybe the G5 doesn't handle some legacy code too well, so it's up to the developers, as is always the case when a new generation comes along (PPro, P4, Opteron, PowerPC, 68040, etc.).
Cheers,
Mike

Lightwolf
08-21-2003, 02:14 PM
Originally posted by tallscot
Clean install? Oh yeah, Windows gets slower over time. :)
Actually, I was just trying to state that I tested this on a workhorse machine, you know, one that has been running 24/7 for the past two years...
Cheers,
Mike

tallscot
08-21-2003, 02:24 PM
Actually, I was just trying to state that I tested this on a workhorse machine, you know, one that has been running 24/7 for the past two years...

Like most of ours...

But you are right to point out that a Windows machine that has been running for a long time is probably going to be slower than a system with a fresh install of Windows.

OS X doesn't have this issue.

Lightwolf
08-21-2003, 02:39 PM
Originally posted by tallscot
But you are right to point out that a Windows machine that has been running for a long time is probably going to be slower than a system with a fresh install of Windows.

OS X doesn't have this issue.
Sheez, you guys never give up, do you? :rolleyes:

At least it's still as fast as a spanking-new G5, which hasn't been running for the past 2 years ;) Oh, and it cost half as much back then as the G5 does now...

mlinde
08-21-2003, 03:03 PM
Originally posted by tallscot
But you are right to point out that a Windows machine that has been running for a long time is probably going to be slower than a system with a fresh install of Windows.

OS X doesn't have this issue.

Well, I hate to disagree, but I find my OS X runs a bit nicer after a clean install, so much so that I do clean installations every 9-12 months. I wipe the drive, reinstall from the ground up, and go forward from there. I just recently did this with my dual 800; some applications feel a bit peppier, and I'm comfortable knowing I've got a good backup from August 2003.

Nakia
08-21-2003, 03:09 PM
My Mac OS X box seems to run well when I never shut it off. I just leave it running 24/7, plugged into a UPS. Well, I kind of run all my computers 24/7. I even leave my cell phone, pager, Sharp Zaurus, and DVD player on 24/7.

Lightwolf
08-21-2003, 03:14 PM
Originally posted by Nakia
Well I kind of run all my computers 24/7.
Same over here, even though it's kind of stupid and a waste of energy. I do switch off our workstations every now and then, but I don't bother with the renderfarm (in another room across the building...), and the server is 24/7 anyhow (unless somebody pulls the plug by accident, which happened a month ago; that's about the only reason for a UPS over here).

Beamtracer
08-21-2003, 04:52 PM
Many companies leave their gear on all day and night. Think of it like a light globe.

Light globes usually fail at the time they are switched on, and are less likely to fail after they've been running for some time. This is because they heat up or cool down when they are switched on or off, causing stress.

Computer processors are the same. When you switch them on they heat up and expand, when you switch them off they cool and contract. This stresses their innards more than if they were left on all the time.

Regarding Newtek & Mac: Newtek has a strange relationship with the Mac. Newtek's star product is the Video Toaster for Windows. The main competitor and biggest threat to Toaster is Apple with Final Cut Pro.

So you've already got this competitive atmosphere between the two companies, even though their products are on different platforms. It's possible this could influence the mindset of competitive people within Newtek.

Regarding the Opteron: Believe it or not, I actually hope AMD does OK with its processors. Before AMD, Intel was really lazy, producing the worst and slowest processors in the business. AMD forced Intel to get moving.

Three strong desktop processor manufacturers (IBM, Intel, AMD) are a good thing for everyone, whether Windows or Mac. It's bad if Intel becomes too much of a monopoly; it would just go back to its lazy ways again.

Ed M.
08-21-2003, 04:52 PM
OK, I was just in an e-mail exchange with Chris Cox from Adobe. I asked him about the latest scores. It seems to go back to one of Apple's Dev pages that lists the things *not* to do when optimizing for the G5 -- the very same page Scot made reference to. Anyway, Chris sums it up this way...

Apparently XBench has known problems on the G5, and this has been known for quite some time. To put it bluntly, it needs to be redone for the PPC970 (G5). Chris and some Apple engineers noticed this back in March and knew it was going to be an issue when the benchmarks started to roll in. They obviously examined the code.

Chris mentioned that his STREAM code on the G5 gets over 2400 MB/s and that's without AltiVec being involved.
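
For anyone who hasn't seen it: STREAM just times simple array loops over data too big for the caches and reports sustained memory bandwidth. Here's a minimal sketch of the "triad" kernel in C -- my own illustration, not Chris's actual code:

    #define N (1 << 20)              /* ~1M doubles per array: too big to cache */
    static double a[N], b[N], c[N];

    void stream_triad(double scalar)
    {
        long i;
        /* three doubles of traffic (24 bytes) per iteration */
        for (i = 0; i < N; i++)
            a[i] = b[i] + scalar * c[i];
    }

MB/s is just bytes touched divided by loop time, so a figure like 2400 MB/s tells you the memory subsystem, not the FPU, is doing the limiting.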

He explained that XBench's STREAM code uses dcbz, and uses vec_dst (very badly at that). If you remember what I posted a few pages back, and what Chris said he had a hunch about, you'll notice the data cache block zero issue I mentioned. Refer to the Apple Dev page that discusses it.
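
To make the dcbz gotcha concrete, here's a minimal sketch of my own (not Chris's code, and assuming a GNU-style toolchain whose assembler knows the 970's mnemonics):

    #include <stddef.h>

    /* G4-style fast zeroing: the data cache line is 32 bytes there, so
       dcbz clears a whole line without reading the memory first. */
    static void zero_g4(char *buf, size_t n)      /* buf 32-byte aligned */
    {
        size_t i;
        for (i = 0; i < n; i += 32)
            __asm__ volatile ("dcbz 0,%0" : : "r"(buf + i) : "memory");
    }

    /* On the 970 a cache line is 128 bytes, but plain dcbz still clears
       only 32 of them and is microcoded, so the loop above crawls.  A
       G5-aware build steps by 128 and uses dcbzl instead: */
    static void zero_g5(char *buf, size_t n)      /* buf 128-byte aligned */
    {
        size_t i;
        for (i = 0; i < n; i += 128)
            __asm__ volatile ("dcbzl 0,%0" : : "r"(buf + i) : "memory");
    }

vec_dst is the same story in prefetch form: a hint worth issuing on a G4, but largely wasted effort on the 970, whose hardware prefetcher handles it on its own. Code that does either of these on a G5 is leaving performance on the table through no fault of the CPU.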

The thing to keep in mind is that Lightwave needs to be reoptimized for the G5. It's common sense given the results, especially when compared to other applications that run much faster on the G5 without any tweaking. Similarly, other programs that fall victim to the same "gotchas" are likely to show similar results.

Chris stated that the 970 is different enough that most optimized applications will need to do a little bit of tweaking to get best performance on the G5. I invited him to this discussion, but I know he's been busy. Maybe he'll pay us a visit.

Regarding Photoshop speed... Chris said that *all* of Photoshop will be sped up on the Mac and feel much faster. Some operations will show a 200% speedup.

And if a 100% speedup is *twice as fast*, then it follows... :D
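
(Checking the math for anyone keeping score: a filter that took 9 seconds drops to 4.5 seconds with a 100% speedup, double the speed, and to 3 seconds with a 200% speedup, triple the speed.)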

--
Ed

Lightwolf
08-21-2003, 05:03 PM
Originally posted by Ed M.
Chris stated that the 970 is different enough that most optimized applications will need to do a little bit of tweaking to get best performance on the G5.
That doesn't sound too good. Then again, you have pretty much the same situation here that you blame, for example ( :) ), the Opteron for having. Let's wait and see which 3D package will be G5-optimized first (I assume that for most of them this also means a compiler switch from CodeWarrior to gcc, which can lead to a whole bunch of other issues).
Does Metrowerks actually support the G5 yet?

Cheers,
Mike

tallscot
08-21-2003, 05:29 PM
I like how you continue to dismiss any benchmark that shows the G5 is faster, yet embrace any benchmark that shows it's slower, even though they are all equally suspect.

Lightwolf
08-21-2003, 05:37 PM
Originally posted by tallscot
I like how you continue to dismiss any benchmark that shows the G5 is faster, yet embrace any benchmark that shows it's slower, even though they are all equally suspect.
:cool: I'll try to actually do that in the future just to please you ;)
Actually, the CineBench test is quite neat because it is easy to replicate, and it targets rendering apps (I don't really have the time to run a full SPEC).
Note please, I also wrote: "Actually, I think the 158.2 sec for the G5 is a typo or something. I don't think it is _that_ slow... I would expect it to be more in the range of 90 - 120 seconds..."
I never doubted that the G5 will perform well; I'm just a bit surprised at the (imho) over-the-top fanboy hype. How good "well" is remains to be seen, and even guru Chris Cox seems to be quite aware of the issues.
I've just seen too many processor generation changes that didn't immediately pay off for the users because the software/compilers still had to be optimized. Anyone remember NT shipping 2 different LW versions for the Amiga?
Cheers,
Mike

Ed M.
08-21-2003, 05:43 PM
Well, no, Scot, you had it right the first time. Lightwolf (and others) dismiss any benchmark that shows the G5 as faster. It's probably OK for it to be faster than the G3 or G4 -- I mean, after all, that's the least Apple can do -- but saying that it's faster than the best that AMD and Intel can cook up is absolutely preposterous and must be completely made up. Add the fact that most of the tests were on a hamstrung compiler and crappy code (explore MacAch over at Ars) and it certainly can't be true.

The only tests that are valid are the tests that show the G5 to be the slug that it is. The piece of crap that $h%^-box IBM and tinker-toy Apple threw together -- all tech copied from the best companies, like AMD and Intel. It's all old technology, too. They had it all first. How can the G5 be faster? Remember, Dull... errr... Dell and BOXX have superior engineers. The benchmarks we've seen should have us lusting after one of their fantastic creations and... Yeah, Scot, we get the picture lol

--
Ed

Ed M.
08-21-2003, 05:46 PM
Then again, you have pretty much the same situation here that you blame, for example (:)) the Opteron for having.

Not exactly, Lightwolf. You mentioned that all that's needed for the Opteron is to *have the app recompiled* (into 64-bit?) and that it would be simple.

I'm saying that if what you state is true and there were no other issues or complications (because the same will hold for the drivers), then developers would be standing in line with pre-optimized (or partially optimized) versions ready to go.

Can developers get at any of those new features through their current apps and the current (32-bit) OS? If so, please list for us which of the new (64-bit) features are available for developers to take advantage of immediately in current applications. Also list which applications and developers have released such "enhanced-for-Opteron" tweaks. Where can we download them?

Again, if my sources are correct, to get at the new features the Opteron offers, developers have to recompile their wares. The same goes for the drivers. All of this needs to be tested as well, so there could be a lot more involved than you might think. 32-bit apps running on the Opteron only gain performance due to enhancements made to the design OTHER THAN simply going to 64-bit. Since there is no *official* OS, developers must be having a hell of a time testing all of this.
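
To show what I mean by the mode being baked into the binary rather than the chip, here's a trivial illustration of mine (assuming a GCC that targets both modes, built two ways, e.g. gcc -m32 vs. gcc -m64):

    #include <stdio.h>

    int main(void)
    {
        /* An Opteron runs this as a 32-bit or a 64-bit process depending
           solely on how it was compiled, not on what silicon it hits: */
        printf("pointer: %u bits, long: %u bits\n",
               (unsigned)(sizeof(void *) * 8),
               (unsigned)(sizeof(long) * 8));
        return 0;
    }

The 32-bit build never sees AMD64's extra registers or 64-bit pointers; whatever speed it gains on an Opteron comes purely from the core's design, which is exactly my point.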

Regarding 64-bit Windows: let's say it's in the works and almost ready.

- Will it be different from the 64-bit Windows running on Itanium?
- Will there be another 64-bit version for the Athlon64?
- Can we expect the performance of the Athlon64 to be faster or slower than the Opteron?

How about giving us an idea (a speculative timeline, perhaps) of when and how you think things will start to come together for AMD64: Athlon64 on the desktop, Opteron on the server.

- Do you have any idea when we might have an official version 1.0 of Windows64-server for the Opteron?
- How much longer for Windows64-desktop for the Athlon64?
- When do you expect the apps to start rolling in from developers?

Please give us your idea of how that scenario will pan out. I think a lot of us are curious to know (seriously).

--
Ed