
View Full Version : more: mini vs xserve



eblu
02-08-2005, 01:06 PM
a coworker did a cost-per-megahertz comparison of the Xserve and the mini. I was surprised by the results.

mini:

1.25 GHz (1250 MHz) / $499 =
2.50 dollars per megahertz


Xserve (using the lowest dual config):
2 x 2.0 GHz (2 x 2000 MHz) / $3999 =
1.00 dollars per megahertz.

while it's true that you could afford an effective 10 GHz of (G4) minis at the price of one Xserve at 4 GHz (2 x 2.0 GHz G5s), it appears that the more expensive option (the Xserve) is the more cost effective on a megahertz-to-megahertz basis.

but... based on the Blanos scores for the raytracing scene in Lightwave (subjective at best... I know):
the G4 (a tower, not a mini, but it's a good place to start, right?) got 167 seconds
the G5 (2 x 2 GHz tower, again not what we're talking about, but reasonably acceptable)
received a 58.

Check my math, but that gives us a 2.87 ratio, i.e. it takes three low-end minis to outperform a dual-processor 2 GHz Xserve. The cost ratio there is 1500 vs 4000, or about 2.6 (as in, the Xserve costs 2.6 times more money than a similar amount of iron in minis). It's obvious that the megahertz-per-dollar ratio is influencing the outcome of the cost ratio, but not enough to really hurt the cost effectiveness of the mini vs the Xserve.


Quick note: all of this is speculation and ignores things like RAM, disk speed, and the like. I imagine that in the real world, minis will be hampered by their RAM limit, but the disk speed issue will in all likelihood not be a large factor. Even if they are affected, the amount of impact would have to be almost squared before the Xserve becomes more cost effective. This is because you can buy 8 minis for the price of the dual 2 GHz Xserve, and it only takes 3 minis to "roughly equal" the rendering power of the Xserve. If the minis were hobbled so badly that it took 9 of them (3 x 3 = 9) to equal the power of an Xserve, then the Xserve would be the better choice. But if that were to happen, the effective megahertz of the minis would be something like... 400 MHz... heh, funny. That's about where the Cubes were.
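The arithmetic above can be restated in a few lines of Python. The prices and Blanos times are the ones quoted in this post; nothing here is an independent measurement:

```python
# Numbers quoted above: Blanos raytrace times and US list prices.
g4_tower_secs = 167    # single G4 tower, Lightwave raytrace scene
g5_dual_secs = 58      # dual 2.0 GHz G5 tower
mini_price = 499
xserve_price = 3999    # lowest dual-G5 config

# ~2.88: roughly three G4 minis to match one dual-G5 Xserve on this scene
render_ratio = g4_tower_secs / g5_dual_secs

# Cost of those three minis vs the one Xserve
cost_ratio = xserve_price / (3 * mini_price)

print(round(render_ratio, 2), round(cost_ratio, 2))  # 2.88 2.67
```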

Captain Obvious
02-08-2005, 02:01 PM
Your math is all backwards. For the Xserve:

4000MHz for $4000 equals 1MHz per dollar, that's true, but you reached it the wrong way. See here:


2x2000/3999 ~= 1

But let's say it cost $5999 instead! That gives

2x2000/5999 ~= 0.67 "dollars per megahertz"

A higher price means a lower number? That doesn't make sense! You should divide the price by the performance!

3999/(2x2000) ~= 1
499/1250 ~= 0.4
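The corrected division, restated as a couple of lines of Python (same list prices as above; the point is just which number goes on top):

```python
xserve_price, xserve_mhz = 3999, 2 * 2000   # dual 2.0 GHz G5
mini_price, mini_mhz = 499, 1250            # 1.25 GHz G4

# Price divided by performance gives dollars per megahertz:
print(xserve_price / xserve_mhz)   # ~1.00 $/MHz
print(mini_price / mini_mhz)       # ~0.40 $/MHz

# Dividing the other way round gives megahertz per dollar instead:
print(xserve_mhz / xserve_price)   # ~1.00 MHz/$ (coincidentally similar)
print(mini_mhz / mini_price)       # ~2.51 MHz/$
```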

Or this way:

$3999 is the same as 8 Minis. The Xserve has 4GHz of G5, versus the 10GHz of G4 the minis give, for the same price, just like you said later on.



I don't mean to be mean, though ;)

Alliante
02-08-2005, 02:20 PM
Don't forget that the power requirements and management overhead for the Xserve are much lower than for that many minis.

(Some render managers have a per-seat license fee, too.)

I'd go for the Xserve personally; we use one as our webserver at work.

Captain Obvious
02-08-2005, 02:30 PM
Also, wouldn't it make more sense to use Cluster Node Xserves for a render farm? Aren't they $2999 and 2.3GHz?

Alliante
02-08-2005, 02:49 PM
I'm not amazingly familiar with Apple's cluster node technology, but doesn't Lightwave's renderer have to be built with Xcode to take advantage of that?

(a tad off topic, but still running very parallel to the discussion)

eblu
02-08-2005, 03:32 PM
wow, that's a lot of posts.
I guess.

umm...

The big deal with the minis is that they don't offer much leverage technology (OS X Server, the crazy level of robustness, PCI slots) compared to the Xserve. And for most applications (read: usages) that's a bad thing. But ScreamerNet is just about the least compatible you can get and still be compatible with OS X, so the neato whiz-bang features of the Xserve are wasted on ScreamerNet. For instance, G5s are leagues ahead of G4s in a lot of ways, but... for the most part... ScreamerNet only benefits from the difference in megahertz. Likewise, ScreamerNet will *never* be able to grid/cluster render with the Apple technology. If Newtek redeveloped ScreamerNet, then it's possible, but... heh... there are a lot more issues ahead of that one.

So in the end, it looks like the extra stuff you pay for in the Xserve is not worth it... if... and I stress the if... you are building a dedicated ScreamerNet render farm.

captain... I applaud your math skills. I will tell my co-worker his ratio is backwards.

Captain Obvious
02-08-2005, 04:06 PM
Apple's cluster nodes are just Xserves with fewer drive bays and such. It's not like you need more than 80 gigs of storage per render node, anyway. ScreamerNet is just as compatible with the Cluster Node as it is with the mini, and you get the advantage of gigabit ethernet and 2 x 2.3 GHz of G5 (though Lightwave does seem to perform about the same, megahertz for megahertz, on G5 and G4).

marinello2003
02-16-2005, 11:08 AM
Your math is all backwards. For the Xserve:

4000MHz for $4000 equals 1MHz per dollar, that's true, but you reached it the wrong way. See here:


2x2000/3999 ~= 1

But let's say it cost $5999 instead! That gives

2x2000/5999 ~= 0.67 "dollars per megahertz"

A higher price means a lower number? That doesn't make sense! You should divide the price by the performance!

3999/(2x2000) ~= 1
499/1250 ~= 0.4

Or this way:

$3999 is the same as 8 Minis. The Xserve has 4GHz of G5, versus the 10GHz of G4 the minis give, for the same price, just like you said later on.



I don't mean to be mean, though ;)

"2x2000/5999 ~= 0.67 "dollars per megahertz"

Isnt that backwards? Don't you mean 0.67 megahurts/dollar or 0.75 dollars/MHZ? You are dividing 4000 MHZ by $5999, which gives you a MHZ/Dollar answer not dollar/MHZ.

The other thing that has not been considered is that it takes a minimum of 768 MB of RAM to run LW or a ScreamerNet node properly for an LW render. So you need to factor the additional cost of the 512 MB of RAM into the total cost.

The other cost not considered is MHz/watt, or how much power you consume per MHz of output. When you buy a crapload of Mac minis, they will eat up a lot more power per MHz than an Xserve will, so you will find yourself with higher electricity costs depending on how many you have running at one time. If you are running five or fewer, the cost won't be significant, but if you have 25 or 50 minis, that is when you would see an increase in your power bill, which may justify the Xserve.
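The watts-per-MHz point can be framed as a quick sketch. The wattage and electricity figures below are pure placeholders (the post gives none), so which machine wins depends entirely on the measured draw you plug in:

```python
# Placeholder power-draw figures -- NOT measured; replace with real data.
MINI_WATTS = 85       # assumed worst-case draw of one G4 mini
XSERVE_WATTS = 400    # assumed worst-case draw of a dual-G5 Xserve
KWH_PRICE = 0.10      # assumed electricity price in $/kWh

def yearly_power_cost(watts, kwh_price=KWH_PRICE):
    """Dollar cost of running a machine flat-out for one year."""
    hours = 24 * 365
    return watts / 1000 * hours * kwh_price

# Watts per effective MHz, as the post frames it:
print(3 * MINI_WATTS / 3750)   # three minis, ~3750 MHz of G4
print(XSERVE_WATTS / 4000)     # one Xserve, 4000 MHz of G5

# Yearly electricity for a 25-mini farm under these assumptions:
print(round(yearly_power_cost(25 * MINI_WATTS), 2))
```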

The third issue is that MHz alone is not really a good cross-comparison between the G4 and the G5, since the G5 is a much more advanced chip. The floating-point power of the G5 is much stronger than that of the G4. It is like comparing a VW Bug running its engine at 5000 rpm with a Ferrari running its engine at 2500 rpm: RPMs are not a good comparison between a VW and a Ferrari engine. The same goes for MHz. That is why my single-processor 1.4 GHz G4 renders LW in roughly the same time as my 2.0 GHz Dell P4. They are simply not the same chip.

What I would do is use a benchmark LW scene render. See what the mini renders it in, then see what a dual 2.3 Xserve renders it in. Then you can start making price/performance comparisons based on $/render time.
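That suggested comparison could be scored like this. The render times below are just the Blanos numbers from earlier in the thread standing in as placeholders; the idea is to substitute your own benchmark results:

```python
def dollars_per_throughput(price, render_secs):
    """Price divided by frames-per-hour on your benchmark scene.
    Lower is better."""
    frames_per_hour = 3600 / render_secs
    return price / frames_per_hour

# Placeholder times (the Blanos figures quoted earlier in the thread);
# replace with your own measured renders of the same scene.
mini = dollars_per_throughput(499, render_secs=167)
xserve = dollars_per_throughput(3999, render_secs=58)
print(round(mini, 2), round(xserve, 2))  # 23.15 64.43
```

On these placeholder numbers the mini wins comfortably, which matches the thread's rough conclusion, but the whole point of the exercise is that real benchmark times decide it.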


-Brent

Johnny
02-16-2005, 11:21 AM
The other thing that has not been considered is that it takes a minimum of 768 MB of RAM to run LW or a ScreamerNet node properly for an LW render. So you need to factor the additional cost of the 512 MB of RAM into the total cost.


I don't think that tidbit is an absolute fact. I am rendering a scene right now that is taking less than 200 MB per node to render. I have rendered many other scenes on a G3 iBook with 640 MB of total RAM, and that iBook turned in its frames in about the time you'd expect given its MHz rating compared to that of the G5 I use. I have also run LW on that same iBook.

I'm sure it's possible to create animations which demand more RAM, but my experience tells me that there are plenty of scenes which render happily with less than 768MB per node.

As for making comparisons of render capabilities based on MHz, there's enough anecdotal evidence on this board to show that a G5 rated at 2 GHz will be about 4 or 5 times faster than a 450 MHz G4 handling the same scene. Again, your mileage may vary, but the scaling seems to be linear, megahertz for megahertz, unlike, say, FCP, in which a G5 would vaporize any G4 rendering the same footage.
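If the scaling really is linear in clock speed for LW rendering, as the anecdotes here suggest, then a render-time estimate is just a clock ratio. This is only as good as the linearity assumption, and it plainly does not hold for apps like FCP:

```python
def estimate_render_secs(known_secs, known_mhz, target_mhz):
    """Naive linear-in-MHz render-time estimate; valid only if the
    renderer scales with clock speed, as discussed above."""
    return known_secs * known_mhz / target_mhz

# A 2 GHz G5 vs a 450 MHz G4 is a ~4.4x clock ratio, consistent with
# the "4 or 5 times faster" anecdote.
print(round(2000 / 450, 1))  # 4.4

# E.g. scale the 167 s G4-tower time from earlier to a 2 GHz clock:
print(estimate_render_secs(167, 1250, 2000))  # 104.375
```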

J

eblu
02-16-2005, 12:51 PM
The third issue is that MHz alone is not really a good cross-comparison between the G4 and the G5, since the G5 is a much more advanced chip. The floating-point power of the G5 is much stronger than that of the G4. It is like comparing a VW Bug running its engine at 5000 rpm with a Ferrari running its engine at 2500 rpm: RPMs are not a good comparison between a VW and a Ferrari engine. The same goes for MHz. That is why my single-processor 1.4 GHz G4 renders LW in roughly the same time as my 2.0 GHz Dell P4. They are simply not the same chip.


Brent,
as discussed ad nauseam, the G5 vs the G4 in Lightwave has an almost MHz-to-MHz speed ratio; that is, Lightwave's speed increases almost exactly in proportion to its MHz increases on the Mac. It's fact. It's sad. It implies bad things... Lightwave apparently doesn't benefit from the extras in the G5. That being said, as far as the G5 vs. the G4 goes, MHz matters most. Your metaphor should go like this...

It's like comparing a VW Bug running at 5000 rpm with a severely nearsighted grandma driving it, to a Ferrari running its engine at 2500 rpm with a severely nearsighted grandma driving it... both incapable of driving stick.

As for the price-per-MHz ratio... it was conceded a long time ago to be backwards and faulty; in fact, even if it were accurate, I myself disproved its validity in my very first post. So while I appreciate the help with the math, I'm afraid you went to the trouble for no reason. In the end... minis outperform the same monetary amount of Xserves (theoretically speaking; I don't have any real-world experience, and I know I said that at the start of all of this).

As for energy requirements... if you're going to go that far, let's talk about noise, space, heat, and weight. I don't have specs in front of me, but we do have an Xserve. They are loud, they constantly blow hot air, they weigh a ton, and they have a huge footprint... even bigger than the towers (it's their height that makes them fit in a rack all snug).