New Computer for new ver.9



zonezapper
05-29-2006, 02:47 PM
Greetings, I am in need of some expert advice, and fast. I have a budget of approximately $3,000, which includes the upgrade price for LW3D ($469 with printed manual from Safe Harbor... does anyone know of a less expensive place?).

I am upgrading to LW3D v.9 from v.8. Since I never really mastered v.8 (or used my previous v.5 to its fullest extent, for that matter...), I primarily spent my time with v.5 building models for VRML, which DID teach me to optimize file size and build items polygon by polygon. I have a B.S. degree in filmmaking and know my way around Layout fairly well - at least v.5's Layout...

My current *best* computer is a 5-year-old Dell Dimension 8200 with two 30 GB HDs, 512 MB of RAM and a Pinnacle video capture graphics card.

I have some skills, having installed hard drives and CD/DVD burners in several computers. However, I am a bit lost when it comes to SLI, risers, dual-core Xeons, the Pentium D and most of the newer technology. My fear is that I will spend a lot of money on a system that will not utilize Lightwave's capability to its fullest potential, or that has features Lightwave does not need.

So far, from reading the users group boards, it appears that a pair of Xeon processors is the best starting point for a base workstation... I would only save a minimal amount of money using the Pentium D chips. Also, one high quality Graphics card is better than two mediocre ones used together. And, as always, the more RAM the better...

FIRST QUESTION: Are the twin Xeon processors really that much better for LW3D than the Pentium D (at the same CPU speed)?

I can only afford 2 GB of RAM right now (2nd on the priority list), putting my budget into the processors first and a decent graphics card third.
If I get a personally configured HP from Costco, I can get 4 GB of RAM running at 667MHz, but it is not the FBD type used in the Dell workstation.
I can get a decent Xeon workstation from HP, but it is too overpriced for me... same goes for the Tsunami at Safe Harbor.

SECOND QUESTION: Is it correct to conclude that faster Xeon CPUs (3.2GHz) with less RAM (2GB at 533MHz) are better than slower CPUs (2.8GHz Pentium D) with more RAM (4GB at 667MHz)?

THIRD QUESTION: Should I get the cheapest video option they have and install my own XFX GeForce 7900 GT card with 256MB after I receive the computer?
I don't think I will ever need more than 2 video outputs.

The rest of the options involve the HDDs. It appears that a RAID 0 array using SATA drives (I can afford two 160 GB drives) will be both economical and provide the performance needed to keep up with the Xeon chips as they render images. I think I will need a separate boot drive (which will double as my media drive for audio as well)... Dell does not have a specific setup to address this need, that I am aware of. However, it appears that I may have chosen a configuration with a RAID 0 array plus a separate bootable HD.

The X-fi sound blaster audio card seems to be a good deal for $99 extra. I have numerous speakers to run on it.

I have a decent 19" CRT monitor for modeling and plan to use my 42" plasma for display of final animations and demonstration purposes for potential investors in my project.

I have no intention of connecting this unit to the internet, but will burn finished-product DVDs on it. I plan to use up to two other (Dell) computers I currently own: one for audio/sound-FX composition/creation and the other for editing the finished product. I will not use this computer for gaming, or for running any programs other than Lightwave and various programs in direct support of it.

=====

Currently, the system I have picked out is:

Dell Precision Workstation 690 (1KW power case - 64-bit... Q4: does this represent the motherboard architecture or the OS?)

Windows® XP Professional, x64 Edition with Media (Q5: My LW3D is optimized for 64-bit, correct?)

Twin Dual-Core Intel® Xeon® 5060 Processors, 3.20GHz, 2 x 2MB L2, 1066MHz FSB

2GB, DDR2 SDRAM FBD Memory, 533MHz, ECC, In Riser (2 DIMMS)

128MB PCIe x16 nVidia Quadro FX550 (D)*, Dual VGA or DVI + VGA (I will not be getting the SLI or dual graphics card option)

Two 160GB SATA 3.0Gb/s, 7200 RPM hard drives with 8MB DataBurst Cache™ in a RAID 0 configuration, with the drives set to the NTFS file system.
One 250 GB boot/storage drive (to store model/animation/media data).
Q6: Should this boot drive be FAT32 or NTFS?

The system comes with an integrated Broadcom 1 Gigabit Ethernet controller.
Q7: Should I add a 2nd Broadcom NetXtreme 10/100/1000 Gigabit Ethernet PCI Express card? (It costs $50 - could I then use it in one of my other computers?)

Sound Blaster® X-Fi™ XtremeMusic (D), w/Dolby® Digital 5.1 (it's $99 extra, but I may swap it out with one of my older computers)

I priced Tsunami Xeon-based workstations at Safe Harbor and had a hard time beating the Dell price. I wouldn't mind building my own, but the only place I was familiar with was apparently bought out and went offline, or changed its name.

Thanks for any suggestions, advice or input in general.

KillMe
05-29-2006, 03:31 PM
i heard the intel dual core chips are a bit of a hack job and that programs won't even utilise them fully

might be better looking into an amd opteron solution

that said, that's just what i heard - might be bollocks

adrencg
05-29-2006, 07:15 PM
Get an AMD Athlon X2 dual-core CPU. I've had one for almost a year and I love it. Best bang for your buck.

Scott_Blinn
05-30-2006, 11:14 AM
i heard the intel dual core chips are a bit of a hack job and that programs won't even utilise them fully

might be better looking into an amd opteron solution

that said, that's just what i heard - might be bollocks

It's bollocks. :-)


I just built a new computer this past week and it runs LW very, very well. Ran me about $2400 to buy all the parts and build it myself.

Pentium D 940 (dual core 3.2Ghz)
2GB DDR2 RAM at 800Mhz
ATI 1900XTX 512MB video
Soundblaster X-Fi ExtremeMusic audio
System HD- 74GB Raptor 10000RPM
Data HD- 300GB Maxtor 7200RPM
Plextor dual layer DVD/CD burner 18X
Full Tower ATX Case
Enermax Noisetaker 600W power supply
Win XP Pro w/ SP2

Tested and runs awesome with:
LW 8.5 and 9 Beta
Maya 7
Max 8
ZBrush 2
Photoshop CS2
and any game I throw at it.

If you went with a high-end AMD CPU you would get a bit better performance - but it could come at a cost. I have always found some little issue caused by AMD and stick with Intel now.

I'll back that up with a specific case I see at work. Running a dual-core AMD setup at work, we have issues between the Nvidia video card drivers, the AMD dual-core CPUs (even with the AMD patch and the Windows hotfix) and Motion Builder. We have yet to find a combination of software and drivers that makes it all work smoothly. Dual-core Intels work just fine.

Granted, that is a specific case - but over the years I always seem to run into these "specific cases" a lot with AMD. Other people not using the range of software I do seem to do just fine with AMD.

My 2 cents. Hope it helps! :-)

zonezapper
05-30-2006, 04:14 PM
Thanks for the input so far... I did read somewhere in here that AMD kept having *little "specific cases"*, and felt that two dual-core Xeon 3.20GHz CPUs should be at least equal to two dual-core Opteron 265 1.8GHz CPUs.

It's a tough call, although I'm tempted to just get the new LW3D v.9 now, and buy the new computer later. Like I said above, I upgraded to lw3d v.8 from v.5 but never learned v.8 completely. So, spending 6 months smoothing out the learning curve on the new software might be the best choice for me.

Please, keep the comments coming :-)

TheDynamo
05-30-2006, 04:53 PM
I would suggest you might attempt to purchase one of the new Mac G5 workstations (with the Intel chips) with the specific purpose of dual-booting it with Windows. I can honestly tell you that the new Intel Core Duo chips are incredibly powerful. My laptop with a 1.6GHz Core Duo is pushing render times that compete rather well with dual 2.6GHz Xeons.

It makes me wonder what the faster chips can do.

-Dyn

-edit.. ack... they don't exist yet! Sorry I seem to get ahead of the times :)

zapper1998
05-30-2006, 05:51 PM
Sounds Cool.....

KillMe
05-30-2006, 07:00 PM
funny, but i read about many cases of intel dual cores where the second core barely fires up in lightwave among other apps - meaning that despite rendering with 8 threads, the second core just wasn't getting involved and cpu usage never went above 55%

obviously not all the time, but i heard more about that than i have for opterons

i've been running amd cpus for years now and never had a problem, and i still think they are better bang for your buck, even the high end expensive ones - you can nearly always still buy a slower but more expensive intel chip

Captain Obvious
05-30-2006, 07:29 PM
If you're going to go Intel, go Core Duo. The current Xeons and Pentiums are pretty crappy, next to the Core Duo or Athlon64/Opteron.

lots
05-30-2006, 09:11 PM
The Pentium D is a hack job of a dual-core CPU. The Athlon 64 X2 is a superior design. I won't say that a Pentium D doesn't run on all cylinders (I'm sure it does), but it is not an ideal design, and here's why:

1) The Pentium D's two "cores" are pretty much just two Prescott CPUs linked together and stuffed in the same package. AMD's X2 cores are tightly integrated - so much so that you can't produce two cores and slap them together the way Intel does with the Pentium D. That integration buys a great deal of speed in core-to-core communication and sharing of data.

2) The two cores have no way of talking to each other except by going OUT of one core over the FSB and then BACK IN to the other core via the FSB. As you can imagine, the FSB is rather slow in comparison to internal CPU functions, and it is highly inefficient to leave the CPU only to come back in the other side. Note: Yonah and Conroe fix this problem, but on the desktop, the Pentium D is all you can get from Intel until Conroe (Core 2 Duo) is out.

3) The Pentium D runs very hot and wastes a lot of power, thanks to its lineage: the aging, highly inefficient NetBurst architecture.

4) The chip lacks the I/O bandwidth that the AMD chips have. For example, the Pentium D is still functioning off the rather dated FSB -> north bridge -> south bridge approach to computer design. Intel is working on a solution, but it is still a year or so away. AMD hands-down wins this comparison; Intel has NOTHING to combat it. AMD's HyperTransport is a dedicated serial link between the CPU and any I/O logic out on the motherboard. It is also not limited by the FSB, since RAM has its own pathway into the CPU's memory controller.

The Pentium D also lacks the onboard memory controller that an Athlon 64 X2 has. Both cores have to share communication with the rest of the system through the FSB. This includes all traffic to and from system RAM, traffic from each core to the other, and any traffic bound for devices out on the chipset. All of this over the FSB? It seems rather unrealistic :P
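To put a rough number on that shared-bus argument, the theoretical peak bandwidth of a front-side bus is just its transfer rate times its width. A quick sketch (the 1066 MT/s, 64-bit figures match the Xeon quoted earlier in the thread; treat the calculation as illustrative):

```python
def fsb_bandwidth_gb_s(mega_transfers: float, width_bits: int = 64) -> float:
    """Peak bus bandwidth in GB/s: transfers per second times bus width in bytes."""
    return mega_transfers * 1e6 * (width_bits / 8) / 1e9

# A 1066 MT/s, 64-bit front-side bus tops out around 8.5 GB/s --
# and that single pipe is shared by both cores, all RAM traffic,
# and everything hanging off the chipset.
print(round(fsb_bandwidth_gb_s(1066), 1))
```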

What the Pentium D does have going for it is price. Intel's manufacturing capabilities are much better than AMD's - not in quality, but in sheer quantity. Intel can produce a far greater number of CPUs than AMD can, and the way the Pentium D is designed, it can have a higher yield than AMD's X2 CPUs. The Pentium D may not perform on par with an X2, but it is cheap.

I may sound fanboy-ish with my statements, but the Pentium D is really a joke of a CPU in comparison to the X2. It is basically a stop-gap until Intel's next CPU architecture, Core 2 Duo - which may in fact prove to be a far superior chip to AMD's lineup for the rest of the year, and all without an onboard memory controller or advanced I/O interfaces like HyperTransport. Come next year, we should see some exciting advances in CPU technology from both AMD and Intel.

brainysmurf
06-01-2006, 04:04 PM
if you're talking about prescott, you're right. presler however fixes many issues, even with netburst. of course, conroe will win the race above all. don't knock the FSB - it works fine for woodcrest, merom, and conroe. yeah, it sucked for prescott, because prescott was a flaming inefficient architecture. the front side bus works fine, though, and has bandwidth to spare for most single/dual core apps.

nerdyguy227
06-03-2006, 12:24 PM
Right now, for a <$2,500 system, the best bang for the buck I would say is the FX-60 proc with 4 gigs of RAM. If you want dual proc + dual core, then Opteron seems to be more cost-effective than Intel, plus it runs cooler (with a decent budget like yours you may want to consider liquid cooling).

Building your own system is a good idea here, since you already have some of the parts that you may otherwise be paying for. Putting it together isn't too big of a challenge for someone without all that much experience; these days they have made it half self-explanatory. The only downsides are that if there is a problem, you need to know how to find it so you can get it covered under the right warranty, and there is no tech support other than for each part.

Ram while rendering can be significantly more important than cpu speed.

O.T., KillMe, you have reached the death point in posts

Captain Obvious
06-03-2006, 01:02 PM
Best bang for the buck? The Athlon64 X2 3800+. Overclock it to 2.6GHz or so, and you have yourself a kick-*** system for next to nothing.

nerdyguy227
06-03-2006, 08:28 PM
yeah, but if you have more buck you might as well pay for more bang. With $2,500, go for the FX-62 and the AM2 socket with 4 gigs of the XMS RAM. Lately I have been looking at this British PSU that looks like it was taken out of The Matrix - and it's 580 watts for $99: http://www.frozencpu.com/psu-163.html

lots
06-03-2006, 09:28 PM
if you're talking about prescott, you're right. presler however fixes many issues, even with netburst. of course, conroe will win the race above all. don't knock the FSB - it works fine for woodcrest, merom, and conroe. yeah, it sucked for prescott, because prescott was a flaming inefficient architecture. the front side bus works fine, though, and has bandwidth to spare for most single/dual core apps.
Presler is a nice step for NetBurst, but the architecture is really out of date. Intel's own Conroe and Merom show that AMD's design philosophies were the right direction this time around. Even though Presler was a positive step, it still can't really compete. Conroe will change this, but until then AMD is the definite king of the hill.

If the FSB is so wonderful, I guess Intel is developing its equivalent to HT, and possibly an integrated memory controller, for no reason then. The FSB in the traditional sense is on its way out. I don't think you understand the significance of HyperTransport. HyperTransport gives hardware makers the ability to create a direct connection to the CPU. This means you can run hardware at exactly the same clock as your CPU. There is a LOT of bandwidth, and very low latency, making things like a PPU more effective, or any other kind of co-processor you can think of (a dedicated rendering engine, say). All of this with cache coherency, and that massive pool of system RAM. If you take a look at some of AMD's recent press releases about HyperTransport, and where they plan on taking it, you can see a whole slew of applications beyond just a connection to the north bridge on a chipset. This is a technology that Intel lacks, and it is right where AMD plans to compete with Conroe until the K8L appears. Well, that and creating consumer-oriented dual-socket motherboards (silly as the idea is...)

Conroe will be a great move for Intel; it will finally be back to actually competing. So long as both companies keep the run going, I think we'll see great things for consumers (faster, cheaper hardware), and who can argue with that :P

zonezapper
06-04-2006, 03:47 AM
Ok, I ordered LW3D and am holding off on getting the computer for a few months. I got what I think is a good quote from BOXX... If all things stay the same, does this sound like a decent system?

Configuration 7400 Series:

Dual Opteron Model 265 (Dual Core)
2GB DDR400 ECC (4 DIMMS)
NVIDIA Quadro FX 1500 256MB PCI-E
80GB 7,200rpm Serial ATA 8MB Cache Drive
240GB Serial ATA RAID 0 Array (2x 120GB 7,200rpm)
Windows XP Professional x64 Edition
Black 103 Key Keyboard
Logitech MX310 Corded Optical Mouse

BOXX Technologies

System Total: $3,330.00
Shipping Cost: $50.00
*Grand Total: $3,380.00

Thanks to all of you... (even those who confused the sh!t out of me)

IMPERIAL
06-04-2006, 05:02 AM
don't know your needs, but I would buy 2 PCs for that amount, and 2 LW seats.
The way you have it, you get 64-bit but only 2 GB of RAM (what's the point?),
and not everything works in 64-bit yet... plug-ins, other software...
e.g. (MSI, Abit) nForce Ultra 400, AMD X2 3800, 2GB RAM, 6600GT, 2x120GB HDD, 1x80GB HDD, case, plus LW should be around $1,500.
The other PC can be the same but with 64-bit Windows, and you can add another 2GB of RAM.

nerdyguy227
06-04-2006, 09:12 AM
Good, you are now thinking more of going with AMD; that's the best choice now. That system is decent, but only if you think bang for the buck includes support. Boxx is generally higher priced because of this. You might want to think of buying from a brand with more competitive pricing, or building your own system and then spending the rest on training material (which can be extraordinarily helpful).

Scott_Blinn
06-04-2006, 02:06 PM
funny, but i read about many cases of intel dual cores where the second core barely fires up in lightwave among other apps - meaning that despite rendering with 8 threads, the second core just wasn't getting involved and cpu usage never went above 55%

obviously not all the time, but i heard more about that than i have for opterons

i've been running amd cpus for years now and never had a problem, and i still think they are better bang for your buck, even the high end expensive ones - you can nearly always still buy a slower but more expensive intel chip


Just an FYI - I just did a bunch of render tests under LW9. The Pentium D 940 used both cores to the fullest when rendering with LW. Here is an example...

zapper1998
06-05-2006, 05:06 AM
sounds good price is right

mattclary
06-05-2006, 06:36 AM
I'm glad Lots is around; he says what I think, but is patient enough and eloquent enough to actually type it all out. :thumbsup:

As he said, Intel is good if PRICE is a concern, but AMD dual cores are much faster chips.

I am also not an AMD fanboy (I reserve that for LightWave); my 4200+ X2 system just replaced a P4 3.0 system that kicked a lot of a** back in the day.

The new chips coming out from Intel will probably smoke the AMD chips, so when they come out, I will probably recommend them.

Both AMD and Intel make great chips, but events are cyclical, and I am only beholden to the one who will give me the fastest renders on the day I hit the "add to cart" button.

There are plenty of benchmarks on the net that prove the performance advantage. If I have a chance, I will try to run a render to compare to Scott's outcome.

Zonezapper:
Try to get the memory in 1GB sticks rather than four 512MB sticks; that will maximize upgradeability later. If you can drop the Quadro for a "gaming" card, I would do so. You are using LightWave, not Maya. That card is wasted on LightWave.

mattclary
06-05-2006, 09:31 PM
Well, this sucks. In 8.5 it renders in ~29 minutes, then in 9 it renders in 35 minutes.

Makes me wonder about the whole intel optimization thing again.

I really need to get a license key for my 6.5 code so I can test the theory...

X2 4200+ 2gb RAM

Captain Obvious
06-05-2006, 10:18 PM
If you have an X2, why did you set the number of render threads to one!?

mattclary
06-06-2006, 04:47 AM
If you have an X2, why did you set the number of render threads to one!?

Awww **ck!!

I checked it when I rendered it with 8.5, and I forgot to check it with 9! Thanks for pointing out my error! I actually lost a little sleep last night wondering why it was so freaking slow. :bangwall:

lots
06-06-2006, 08:07 AM
*snicker*

Isn't that new info window handy? :)

mattclary
06-06-2006, 08:46 AM
Bite me! ;)

Yeah, it is!

Captain Obvious
06-06-2006, 10:01 AM
So what's the render time with two threads?

mattclary
06-06-2006, 10:37 AM
Unfortunately, I have to work (non-CG or anything else fun related) for a living, so I won't be able to test until tonight.

Luckily I have internet access so I can hang out here at least to make the day go by quicker. ;)

Scott_Blinn
06-06-2006, 11:26 AM
For fun I tried the render on my work system (Dual CPU Xeon 3.2Ghz)...

nerdyguy227
06-06-2006, 01:10 PM
matt, should be at least twice as fast now right?

mattclary
06-06-2006, 01:29 PM
matt, should be at least twice as fast now right?

mmmmmm.... Maybe. Exactly double would be wishful thinking, I think. I am going to guesstimate around 18.5-20 minutes.

Lightwolf
06-06-2006, 01:42 PM
mmmmmm.... Maybe. Exactly double would be wishful thinking, I think.
Try the Perspective Camera then, it scales much better using multiple threads... :D

Cheers,
Mike

mattclary
06-06-2006, 02:59 PM
Just as I predicted. The perspective camera was only about 20sec faster.

Captain Obvious
06-06-2006, 10:41 PM
From 29 to 19 minutes, even on such a simple scene? That's pretty good, I think. :)

Now if only they improved the irradiance caching for 9.1 or something such, we'd have a real radiosity killer on our hands!

mattclary
06-07-2006, 05:00 AM
Actually, it was from 35 minutes to 19 minutes. The speedup is very significant when you actually use both cores. ;)
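The gain above works out to near-ideal two-core scaling; a quick check in Python, using the 35-minute and 19-minute times from the posts above:

```python
def speedup(before_min: float, after_min: float) -> float:
    """Ratio of one-thread render time to two-thread render time."""
    return before_min / after_min

s = speedup(35, 19)   # 35 min on 1 thread -> 19 min on 2 threads
eff = s / 2 * 100     # parallel efficiency on a dual-core chip
print(round(s, 2))    # 1.84
print(round(eff, 1))  # 92.1
```

About 1.84x on two cores, i.e. roughly 92% parallel efficiency - close to the theoretical maximum of 2x.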

mattclary
06-07-2006, 05:17 AM
Based on Scott's benchmarks and mine, it looks like you could go with a Pentium D 950 and not be too far off in performance from an X2 4200+. The 950 is $16 cheaper than the 4200+ at NewEgg. It's only 200MHz faster than Scott's 940, so I doubt it would COMPLETELY catch up to the 4200+, but it would be close.

If you do the math assuming it scales linearly, I come up with a 6.25% speed difference between the 940 and the 950, which would take about a minute and a half off Scott's time, leaving the 4200+ only 30 seconds faster - which is pretty insignificant when accounting for testing variations.

Conclusion: Intel knows how to price their CPUs based on performance. Odds are, if you decide to spend X amount of money, at that price point it is probably moot as to which brand you choose, from a pure performance assessment.
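The linear-scaling arithmetic above is easy to check in a couple of lines. The 3.2GHz and 3.4GHz clocks are the published speeds of the 940 and 950; the 24-minute base time is purely hypothetical, standing in for whatever the slower chip measures:

```python
def linear_scaled_time(minutes: float, old_ghz: float, new_ghz: float) -> float:
    """Assume render time scales inversely with clock speed (a best-case estimate)."""
    return minutes * old_ghz / new_ghz

clock_gain_pct = (3.4 - 3.2) / 3.2 * 100   # the 6.25% quoted above
print(round(clock_gain_pct, 2))            # 6.25

# A hypothetical 24-minute render on the 3.2GHz 940 would then drop to:
print(round(linear_scaled_time(24, 3.2, 3.4), 1))   # 22.6 minutes
```

Real renders rarely scale perfectly with clock (memory speed and cache matter too), so treat the result as an upper bound on the gain.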

Also, the expense of buying two Xeons and the required mobo isn't worth it; it only gains a little over two minutes on the 4200+.

I would be curious about speeds on other systems. If someone could bench a dual Opteron system, and maybe a dual core Xeon, that would be awesome.

Lightwolf
06-07-2006, 05:20 AM
The 950 is $16 cheaper than the 4200+ at NewEgg.
I guess the price difference gets eaten up after a year on your electricity bill alone ;)

Cheers,
Mike

mattclary
06-07-2006, 05:37 AM
I guess the price difference gets eaten up after a year on your electricity bill alone ;)


The 950 is built on the 65nm process, does it really suck more juice than the 90nm AMD chip, or are we just remembering the... what is it, Prescott...? that ran so hot?

Lightwolf
06-07-2006, 05:46 AM
The 950 is built on the 65nm process, does it really suck more juice than the 90nm AMD chip, or are we just remembering the... what is it, Prescott...? that ran so hot?
It still sucks up more... it is a lot better than the previous ones though.

Cheers,
Mike

Scott_Blinn
06-07-2006, 10:21 AM
The 950 is built on the 65nm process, does it really suck more juice than the 90nm AMD chip, or are we just remembering the... what is it, Prescott...? that ran so hot?

Yes, the 9xx series is all 65nm. You can google reviews that compare watts and temp from the older Pentium Ds.

Mine runs around 60C while rendering and idles at 48C. I hear an aftermarket CPU fan can greatly reduce that.

Right now I would suggest an AMD AM2 setup - mostly because, if you are buying a new AMD system at this point, the future for upgrading will be with the new 940-pin socket motherboards (AM2 being the first). Also, getting DDR2 memory should boost things as well.

Shortly, though, I will be back to saying go Intel - when the Core 2 Duo arrives. :-)

It's a never ending battle. ;-)

Scott_Blinn
06-07-2006, 10:24 AM
I would be curious about speeds on other systems. If someone could bench a dual Opteron system, and maybe a dual core Xeon, that would be awesome.


:agree: As would I. I can't find any recent LW CPU benchmarks online - and there are none benchmarking v9 yet.

Do we have an old benchmark thread here we can revive (I have not looked myself) - or maybe we should start a new one? I'm sure it would be helpful to point people to...

Captain Obvious
06-07-2006, 11:28 AM
Actually, it was from 35 minutes to 19 minutes. The speedup is very significant when you actually use both cores. ;)
I meant from LW8 to 9! ;)


The 65 nanometer Pentium Ds can still suck up well over 100 watts under stress, I think. Also, there's one more thing you need to consider: your typical X2 3800+ can easily be overclocked to about 2.5GHz or so. That's a significant performance gain.

mattclary
06-07-2006, 12:17 PM
Overclocking isn't worth the risk to me. I know lots of people do it, but I'm not one of them.

lots
06-07-2006, 08:18 PM
Here is a series of tests. The first is the classic camera with 8 threads (to match Matt's test). The second is perspective with 8 threads (again to match Matt). The last is perspective with 4 threads. Note that the 4-threaded perspective test is nearly a minute faster than the classic camera with 8 threads (and during this test I was using the computer for other things :P). I suspect that going to 2 threads will gain me some more ground in rendering, but not too much. The new render engine is much more effective at breaking the scene down for multiple threads. I also expect that the performance gain between 4 threads and 8 is because of the smaller overhead, which is why I think two threads will have an even better effect.

Note: this is after several days of uptime and other things having happened - not a fresh system. Most notably, the GF's logged in to her account (using fast user switching) and has recently had a several-hour session of The Sims 2 ;) My tests are not scientific in any way, and probably represent a real-world system more than a test machine.

All in all, my system comes in about where I expected it to in its current configuration (dual Opteron 246s): close to Matt's scores, but somewhat behind due to the clock speed and memory speed advantages of an X2-based system. Maybe when Socket F Opterons come out, I can investigate swapping the CPUs out for two 280s, at which point I expect my render times to drop below the Xeon times earlier in this thread.
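For anyone wanting to replicate this kind of thread sweep outside of LightWave, a minimal timing harness might look like the following Python sketch. `render_bucket` is a hypothetical stand-in for the renderer's per-bucket work, not anything from LightWave itself:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def render_bucket(n: int) -> int:
    """Hypothetical stand-in for rendering one bucket of a frame (pure busy work)."""
    return sum(i * i for i in range(n))

def timed_render(threads: int, buckets: int = 16, work: int = 50_000) -> float:
    """Run `buckets` chunks of work on `threads` workers; return wall-clock seconds."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=threads) as pool:
        list(pool.map(render_bucket, [work] * buckets))
    return time.perf_counter() - start

# Sweep thread counts the way the tests above do:
for t in (2, 4, 8):
    print(t, "threads:", round(timed_render(t), 3), "s")
```

One caveat: this pure-Python workload is serialized by the interpreter's GIL, so varying the thread count won't actually change the wall time here. The point is the harness shape - fixed total work, varying worker count, wall-clock timing - which is how the render comparisons in this thread were done; a native renderer's threads do scale.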

Silkrooster
06-07-2006, 08:45 PM
Well, I finally was able to order my new system. It may not be up to par with some of yours, but I thought I would post it to compare with the original poster's.
I decided to go with HP, mostly because price was a factor.

HP Pavilion d4500y customizable Desktop PC
Microsoft® Windows® XP Media Center Edition
Intel® Pentium® D 940 (3.2GHz, 4MB, 800MHz FSB)
2GB DDR2-667MHz dual channel SDRAM (2x1024)
160GB 7200 rpm SATA Hard Drive
SAVE $30! LightScribe 16X DVD+/-R/RW SuperMulti drive
9-in-1 memory card reader, 2 USB, 1394, S-video
No Modem
Single NTSC TV Tuner with PVR, FM Tuner, Remote
256MB DDR ATI Radeon X1600XT, TV-Out and DVI
Sound Blaster Audigy X-Fi, 24-bit Xtreme Fidelity
HP Multimedia Keyboard, HP Optical Mouse
Microsoft® Works 8.0/Money 2006/MSN Encarta Plus
HP Home & Home Office Store in-box envelope

All of these real cool features (for me anyway :D ) came to $1,583.99 + tax.
Silk

lots
06-07-2006, 08:53 PM
I was curious, so I OC'ed my system to 2.3GHz. This puts my Opterons right square in between an Opteron 248 and a 250, and only 100MHz slower than an Athlon 64 X2 4800+.

My method for overclocking does not require a system restart, thus nothing aside from the overclock has changed. My system is still going for several days.

The results were interesting :) I repeated my 4-threaded perspective camera test. My render time came out to 18m 36s - even below Matt's faster-clocked X2 (though I think the 4 threads had something to do with this). I can imagine that an Opteron 252 or 254 would outpace the Xeon mentioned earlier (even with the 8 threads everyone seems to be using).

I cannot OC any higher than this without voltage mods, etc., and the K8WE does not allow them in the BIOS. Since I'm in 64-bit Windows my software options are somewhat limited, so I can only get a 300MHz OC out of my system.

Here's a screenie

mattclary
06-07-2006, 09:33 PM
Tested with 4 threads, made a pretty big difference. Dropped me down to 18m 2s. :rock:

I'll try it with 2 threads tomorrow night, time for bed! :sleeping:

Darksuit
06-08-2006, 02:05 AM
Hey Matt, I've been a little out of the loop. I pulled down the latest rev of the 9 beta, but I was wondering where that test scene is that you are all using...

Never mind - I looked through my old content folder and there it was, in benchmarks...

nlightuk
06-08-2006, 05:18 AM
Ok, I ordered LW3D and am holding off on getting the computer for a few months. I got what I think is a good quote from BOXX...

I currently run two dual-Athlon BOXX workstations at home, with Quadro 980XGLs. They are getting kinda long in the tooth, but I can say I have been EXTREMELY satisfied with them.

The build quality is great (MUCH better than my Alienware box), and they have been 100% reliable since I have had them (touch wood!)

Given that EVERY other PC I have owned has exhibited some sort of major glitch sooner or later, that's the best recommendation/advice I can give :)

mattclary
06-08-2006, 05:41 AM
(touch wood!)


hehehehehe.... He said "wood"... hehehehe.... He's going to "touch it".... huhuhuhuh...

nlightuk
06-08-2006, 06:44 AM
hehehehehe.... He said "wood"... hehehehe.... He's going to "touch it".... huhuhuhuh...

:lol:

bobakabob
06-08-2006, 10:23 AM
I can say I have been EXTREMELY satisfied with them.


nlightuk,

Which UK vendor did you buy your Boxx machines from and have they offered good warranty / tech support? And would you recommend Opterons over the new dual core Xeons? I've had two solid dependable Dell workstations in the past, but thinking of moving on.

nlightuk
06-08-2006, 12:40 PM
nlightuk,

Which UK vendor did you buy your Boxx machines from and have they offered good warranty / tech support? And would you recommend Opterons over the new dual core Xeons? I've had two solid dependable Dell workstations in the past, but thinking of moving on.

Since the machines in question were dual Athlon MPs, I can't comment on the Opteron/Xeon issue. I will be buying a dual, dual-core Opteron machine when I next upgrade, but that's partly because some of the other software I use gets more bang for the buck out of AMD chipsets.

I bought my current machines ex-demo, very cheaply, from a company called Reality Computing Ltd. They came with 18 months of the manufacturer's warranty still to run, and it was transferred to me on purchase. They were in good condition and, as stated, have been solid and dependable, so I have not had to resort to warranty claims or tech support; but my queries during and after purchase were dealt with VERY promptly. They offered me a range of machines at the time, but for my purposes the two Athlon machines worked out the best deal.

If Boxx had a reseller in Australia (where I am relocating in the near future) I would buy Boxx again. With any luck, by the time I upgrade they will have a base in the Antipodes and I won't have the worry and cost of "halfway round the world" shipping! :)

Hope this helps...

monovich
06-08-2006, 02:37 PM
It may be pointless, but I have to cast a vote for AMD. I just finished a fairly big project (6 minutes at 960x540) and had a render farm that allowed me to compare CPUs. I had one AMD 4400+ X2, one dual 3.2GHz Xeon, and two 2.0GHz Pentiums.

The two Pentiums were laughable in render performance by comparison. The Xeons did okay, but the X2 blew them away even though it was rated at a lower clock speed - it was typically 40-90% faster than the Pentiums. These were real-world numbers on the same scenes; I just watched the frame render times on the render farm.

I know you guys are talking about Intel dual cores, and I don't have experience with them, but my AMD system performed so much better than I could have even wished that I'm going to stick with AMD until I see some nice numbers coming out of the next generation of Intel dual cores, which I'm excited about.

lots
06-08-2006, 03:27 PM
Take a look at AnandTech's article on the future Xeon CPUs based on the Core architecture. They have fairly promising server numbers. We'll see if these numbers translate into workstation speed in later benchmarks (I suspect they will).

mattclary
06-08-2006, 06:22 PM
Well, two threads took it down to 17m 30s!

lots
06-08-2006, 10:09 PM
This seems to suggest that the perspective renderer works best with the thread count matched to the number of CPUs (at least in this situation).

Further tests would probably help determine when a specific number of threads is best used.

StereoMike
06-09-2006, 05:57 AM
Perspective cam, 8 threads: 9:10

Made a test with 4 threads which gave almost the same result (9:11)

Mike

@lots, what do you use for OC? The Supermicro board also doesn't allow BIOS overclocking. Some tips would be nice :)

lots
06-09-2006, 06:07 AM
I use ClockGen. Unfortunately I can only increment my HTT speed by 1MHz at a time. So I borrowed a script (and modified it a tad) to OC my system with the click of a button :P

mattclary
06-09-2006, 06:51 AM
Awesome speed, Mike!

Lots, I know when I first built my machine I did some benches with Rad Refl Things, and more threads was faster. In the CPU usage graph, you could see usage drop off on a regular basis as a thread completed. With 2 threads the lag was pretty big and the frequency of spikes was low. With 8 threads there was less lag; it seemed like the threads were "queued" better.

lots
06-09-2006, 07:24 AM
That was with the old render engine, though. From what I've noticed, this new engine is much better in terms of multithreaded organization.

Before, in pre-LW9 versions, if you had your render split into 2 threads, the engine basically chopped the image in half: one thread rendered the top, the second rendered the bottom. If the top thread finished early, that CPU went idle. This is why adding more threads tended to speed things up - with more threads, you could keep the second CPU busy longer by giving it other portions of the image to render.

In LW9, the render engine is way smarter. If you have a two-thread render going, the image is split in half, much like before. But if the top half finishes rendering in a few seconds while the bottom of the image is still chugging along, the remainder of the bottom half is split in half and the freed-up CPU starts rendering that. If one of the CPUs finishes and there's still more to be done, the remainder is divided in half again. The process repeats until everything is rendered.

This is just from my observations during the rendering process. But this is why you can set the thread count to the exact number of CPUs (cores) you have instead of needing more threads than CPUs. The slowdown you see with the perspective camera when upping the thread count is probably due to the CPU overhead needed to manage that many threads. At least that's my guess :)
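The split-on-idle scheme described above can be sketched in a few lines of Python. This is a toy model of the behaviour as observed, not NewTek's actual code; the row costs and region sizes are made up for illustration:

```python
import heapq

def adaptive_makespan(row_costs, cpus=2, min_rows=4):
    """Toy model of the LW9-style scheme: start with one region per
    CPU, and keep halving any still-large region so a freed-up CPU
    can take over part of the remaining work."""
    n = len(row_costs)
    # Initial even split, one region of scanlines per CPU.
    pending = [(i * n // cpus, (i + 1) * n // cpus) for i in range(cpus)]
    free = [(0.0, c) for c in range(cpus)]  # (time this CPU goes idle, id)
    heapq.heapify(free)
    makespan = 0.0
    while pending:
        t, cpu = heapq.heappop(free)   # next CPU to become free
        start, end = pending.pop(0)
        if end - start > min_rows:     # region still big: render half,
            mid = (start + end) // 2   # defer the rest for whichever
            pending.append((mid, end)) # CPU goes idle next
            end = mid
        t += sum(row_costs[start:end])  # "render" this slice of rows
        makespan = max(makespan, t)
        heapq.heappush(free, (t, cpu))
    return makespan

# Top half cheap, bottom half expensive: a fixed pre-LW9 split leaves
# one CPU idle while the other renders all the slow rows.
costs = [1] * 8 + [10] * 8
fixed = max(sum(costs[:8]), sum(costs[8:]))  # old-style fixed halves
print(fixed, adaptive_makespan(costs))       # 80 vs 48.0
```

With these invented costs the fixed split takes 80 time units against 48 for the adaptive one, which is consistent with why piling on extra threads matters less in LW9.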

bobakabob
06-09-2006, 09:39 AM
nlightuk, thanks for the info :)

Interesting article on GamePC (http://www.gamepc.com/labs/view_content.asp?id=xeon5000&page=1) comparing the new 5000 series Xeons with Opterons:


For a workstation environment, it’s a toss-up. The Opteron used to dominate these performance benchmarks, but Intel’s new high-end Xeons are certainly giving them a run for their money. Depending on which application you use, the Xeon can, in many cases, provide faster performance compared to AMD’s top of the line Opteron, which is a pretty major feat.

Verlon
06-09-2006, 09:46 AM
All right... not reading all of this, just my $.03 (it's worth more because I actually work in a fab).

Overclocking:

CPUs (both AMD and Intel) are built to a spec so that they are expected to last 11.2 years (it's how the math works out, not some arbitrary number). Overclocking will reduce this. By how much is anyone's guess. A small overclock probably will not make an appreciable difference, but know what you're doing if you go that route. The different speed grades of the same core are not different designs.

Imagine a sheet of cookies. The best cookies have 25-35 chocolate chips in them, but the chips are distributed so that there are 20-40 chips in each one, with 30 being the average. Cookies with 20 chips are worthless since you promise a minimum of 25 chips per cookie. Cookies with 40 chips are so rare that you can't assure your customers a box of them, but some people are lucky enough to find them.

Also, cookies cooked in the EXACT center of the sheet are the best because they receive the ideal bake cycle. All the cookies came from the same sheet, but you sort them by number of chips (excluding burned or undercooked).

So you sell boxes of 25-29 chip cookies for $2, 30-34 chip cookies for $4, and 35+ chip cookies for $8. However, someone willing to put the time and effort into it could break up a few boxes of 35+ chip cookies and MAKE a box of 40-chip cookies to sell for $12 if there were a market, OR they could grab the ones from the exact center of the sheet and sell them for a premium because they are perfectly baked.

So there's nothing immoral about an overclock. You're just assuming your cookie is a little better than what AMD or Intel said (and given the margin for error built into testing, it almost certainly is). Also be sure to provide adequate cooling.

Verlon
06-09-2006, 09:57 AM
Next up, Intel vs AMD.

AMD chips are better at least until Conroe...What has been said before on the matter covers it.

Microsoft certifies AMD chips 100% Windows compatible. Intel chips have historically had WAY more errata than AMD. I think the Pentium III had 35 pages of errata compared to 1 entry for the AMD Athlon (K7). These have to be corrected in code or CPU microcode, and both slow things down.

Every single error I have (personally) seen blamed on an AMD CPU vs Intel since the K7 has been either a hardware problem (using a non-spec power supply, for example) or 3rd party software (like a bad copy of DirectX on the NASCAR Racing CD).

Both companies have the occasional lemon slip through the cracks, or one that gets zapped after it is shipped. There is no help for that.

Intel has recently slashed prices to a) offload inventory in prep for the Conroe release and b) fight off AMD market gains. You might find a bargain because of this. CPU price war = consumer gains.

Verlon
06-09-2006, 10:07 AM
Finally, the computer itself:

Building one from scratch is no biggie. Read the instructions and it will go fine. You probably know 5 people who have done it. If you are near Austin, PM me and I'll help if we can get schedules to work out.

Don't bother with a sound card unless you are very picky about sound. If you listen to music in MP3 format, the onboard sound on a good motherboard is fine. If you even considered a Quadro over a GeForce, you obviously aren't so concerned about your Quake framerates that you can't spare 1-2% CPU utilization for sound sometimes.

I use a GeForce over a Quadro since I also play games, and I have no trouble with LightWave. I have no experience with the Quadro and do not know if there is something amazing I am missing, but I do not have issues with large poly scenes using my 6800. You could save a few bucks doing this, but ask some of the workstation guys before you do (and ask for EVIDENCE, not marketing lines).

You can buy cheaper than Boxx... and get more RAM or hard drive space or faster CPUs.

ESPECIALLY if you cannibalize the old system to build the new one. Can you save the old case, hard drive, and keyboards?

A RAID array won't do much for you unless you are doing video capture. A single SATA drive on this system (SATA 1) can capture video from my VX2000 in full DV without dropping frames, as long as I don't do something silly like defrag a hard drive at the same time. If you are paying extra for RAID, save the money to get more computer.

StereoMike
06-09-2006, 01:12 PM
Funny thing: I just baked myself a tasty chocolate cake and you start talking about chocolate cookies :)
mhhhh, I'll just allow myself another piece...

Regarding OC: I buy/build a new main computer every two years, so I think I won't see a chip dying of old age.
In two years I might get 8 or 16 cores in one machine for decent money, so regardless of how well my current system runs, I guess it will end up as a render slave sooner or later.

But I'm really keen on having more power for those times when the deadline comes uncomfortably near; 20% more equals an additional 3GHz P4 for free.

Mike

lion111
06-09-2006, 02:32 PM
@ stereomike

I am very impressed by the render times you get out of your comp.
I am also looking for good comp specs to build a new one,
so it would be very nice if you could tell me all the inside secrets of yours and, if possible, how much it would cost me.

thanks in advance

lots
06-09-2006, 02:50 PM
Yeah, even though OC decreases longevity, the useful life span of a CPU is considerably less than its time to failure.

As for Mike's render times, I could easily hit those by upgrading my CPUs to what he has. But I feel I will wait and see if there are further price cuts to the dual-core Opteron line before making that plunge ;) I hear there are price cuts coming for AMD CPUs...

StereoMike
06-09-2006, 03:08 PM
Because of a tight schedule and close deadlines I decided against building the system myself and bought it from the guys I get my LightWave from here in Germany (C.A.I).
Building the system myself would have saved a lot of money, but would have meant dealing with the following points:

1.) getting the knowledge together about Win x64, compatibility of all components, SATA RAID, reliable parts... etc. (that means hanging around in forums and reading reviews and recommendations all day long)
2.) ordering all the parts (I wasn't able to get even one Opteron 270 by myself; all were sold out)
3.) assembling the rig
4.) setting up the system
5.) problems in any of the above points

Because of the deadline I needed the new system really badly (I accepted the job counting on a faster computer) and had no time for points 1-5, so buying was the only option. And I'm not sorry for it; the system works great and is really silent. I paid 2900 EUR incl. 16% VAT for:
a nice Lian Li case
2x Opteron 270 with silent cooling
4GB RAM
3x WD 250GB SATA HDs (two of them as RAID 0)
FireWire card
Win x64
all assembled, installed and tested
prepared for my own 6800GT (only needed to plug it in after it arrived)

You could get the parts for 2200-2500 EUR (incl VAT).

System is real fun, can't imagine using the old single core P4 again :)

Mike

btw, thanks to the guys at CAI, you saved my butt :)

lion111
06-09-2006, 03:37 PM
thanks a lot for the info

Weepul
06-10-2006, 08:23 AM
Perspective cam, 8 threads: 9:10
Made a test with 4 threads which gave almost the same result (9:11)

Hey, that's almost the same as what I got - 9m10s with 4 threads while playing music and 'net surfing - so now we have a pretty good equal mark between platforms to use for comparisons.

a3dmind
06-11-2006, 02:30 AM
if you haven't yet, check this one out...
http://www.tomshardware.com/2006/05/10/dual_41_ghz_cores/

:eek: :eek: :eek: :eek:

zapper1998
06-11-2006, 09:54 AM
Here is my test.

And a question?? The image has the question on it...

4 processors so 4 threads..

lots
06-11-2006, 10:11 PM
Since it is estimating... the default value is likely 0.1 sec until it has a good basis to estimate the time left.

Guessing how much time is left in a render (or most anything related to computers) is pretty difficult. Just look at the times Windows reports as you copy files :P
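For what it's worth, the usual naive estimator is just "average frame time so far, times frames left", which is exactly why it needs a placeholder before anything has finished. A minimal sketch of that common approach (my guess at the idea, not LW's actual logic):

```python
def time_left(elapsed, done, total, placeholder=0.1):
    """Naive remaining-time estimate: average per-frame time so far,
    multiplied by the frames still to render. With nothing finished
    yet there is no basis at all, hence the placeholder value."""
    if done == 0:
        return placeholder
    return (elapsed / done) * (total - done)

# 3 frames took 30s, so 10s/frame average; 7 frames left -> 70.0s.
print(time_left(30.0, 3, 10))
```

Since early frames often aren't representative of later ones, the estimate jumps around just like the Windows file-copy dialog does.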

habañero
06-12-2006, 06:20 AM
I have been looking into system building the last few years and my main experiences are as follows:

* A sound card adds problems to a system and generally isn't worth it if there aren't problems (bad noise) with the onboard sound. Creative's cards have tended to sink my computers since 1992 or thereabouts, so I would recommend the HDA X-Mystique 7.1 Gold for better value and particularly compatibility with nVidia chipsets. A USB solution is also an interesting option, which means you don't use up card slots and can carry it around.

* Overclocking is just about completely safe on fit hardware *as long as you don't turn up the voltages too much*. At most I would recommend +5%, and only if necessary. An Opteron or X2 can usually go to about 2.5GHz completely stable, and the risk would typically be nothing worse than a crash; then adjust it to 2.45 and continue without issues for like 3 years. Reading up in advance on what you are doing is recommended, though; just ignore the part about adjusting up the voltage for work machines. You could try hardforum.com for lots of guides. Heat also won't be much worse if you don't touch the voltage. Almost forgot: you can have it done automagically from Windows these days if you have an nVidia board, using a program called nTune.

* nVidia's HD controllers I don't trust anymore, and I would never consider any kind of RAID on them. Just a cheap (but recognized brand) Silicon Image controller on a PCI/PCI-E port is my preferred option these days, and it will offer you eSATA as well. RAID 0 should generally not be considered on onboard controllers: it kills lots of troubleshooting options, doesn't combine well with what I would call the safe part of overclocking, and doesn't offer big real-world performance gains. On a decent controller card, though, it can be a good thing given you have backups; for large storage, RAID 5 is obviously a nice investment. On an nVidia board, I would put the boot disk on the second HD controller if it has one.

* Arctic Cooling's Freezer Pro is a very simple installation: a good, quiet and cheap cooling solution.

* The PSU is darn important, and what counts is quality and efficiency, not a high specified wattage. High watts are actually bad; they mean a hotter computer and a higher electricity bill. If I were to recommend a series it would be Seasonic's silent line: expensive but worth it, and a 380W unit can run a computer (and support an overclock well) that a 550W generic brand couldn't.

* HDs are important, and I research them at www.storagereview.com, where you can find rather hard numbers instead of the classic "I never had a problem with my 4 Maxtor drives". I personally favor Hitachi T7K250 SATA II drives now, or Samsung/Seagate 250GB, and WD generally should be okay as well; but importantly, I would recommend clearly against Maxtor. While I did have a drive blow up, it is really the numbers I have compared with friends, the survey (40,000 drives) at StorageReview, etc. RAID controllers and USB cases I also check there.

* Expensive memory ain't worth it, although I buy brand memory and maybe a step up from the "value-est" CAS 3 option. CAS 2.5 or maybe CAS 2 PC3200 is enough for an Athlon/Opteron; there is lots of bandwidth (the MHz number, or PC4000), and lower timings (the "CAS", or the number line 2.5-3-3-6) will be better than higher MHz in about all normal cases, to the best of my knowledge. Also, if you add memory in all four slots, you will likely have to adjust the speed down a little: either the timings (CAS, or 1T in particular) or the "divider" (to PC2700, 5/6, 166 or similar). You could just do a render test on which alternative is faster; I would bet slightly on adjusting the divider instead of the timings, as mentioned. I favor some budget sets from TwinMOS for the time being, or G.Skill, which offers slight overclocking potential. If the price is right, maybe OCZ Platinum 3200, but I wouldn't consider extending the budget beyond that.

* Consider a backup solution...! I have been successful with Acronis True Image and external USB disks. This is money far better spent than RAID 1 solutions, for example; done properly, you can recover from a complete HD crash, bad virus or kid attack within an hour, working from where you left off. With RAID 1 you would just have a copy of the disk with said virus...

* nVidia's 7600GT would be my general recommendation for a LW card for a typical X2 machine; as a bonus, the card has the best performance per watt of them all, I think.

* Rid your machine of any product made by Norton. They make computers slow and unstable, and they don't offer real data or threat security. I know this might sound strange to some, but I know what I am talking about. As an example, I am currently on a boat doing a little cooking and everybody has laptops. (We have $3000/month satellite broadband.) Several of them were insanely slow and described as "not working" or "virus machines". Removing Norton and installing AVG and Spybot Search & Destroy fixed the speed (up to doubled) and revealed around 20-30 security problems on average. I wouldn't call it a scam, since that could make people upset or sound alarmist; I would use a finer word for it. You might need a tool called SymNRT as well, which is a story in itself: it's Norton's own tool that removes their applications as if they were viruses, i.e. it bypasses the normal Windows uninstall procedures. Just try a little Google, or maybe user reviews of Norton products about any place, like CNET. I have personally removed literally thousands of viruses from Norton-protected machines.

And finally, the best value seldom is at the highest end of the price range. Anyway, I have to go make some delicious apple pie! :)

mattclary
06-12-2006, 08:08 AM
* nVidia's 7600GT would be my general recommendation for a LW card for a typical X2 machine; as a bonus, the card has the best performance per watt of them all, I think.

* Rid your machine of any product made by Norton.


Couldn't agree more. I agree with most of what you said, but particularly these points.

I disagree slightly on the PSU choice. Just because a PSU is rated for 500W doesn't mean it draws 500W; it just won't break a sweat when pulling 350W. Also, a (good quality) PSU designed to deliver 500W will be designed to handle the heat generated by 500W of usage, so if you only use 350W, the unit will be stressed less by the generated heat and will hopefully move enough air to help cool your other components.

I have an Xclio PSU (I forget the exact wattage, but ~450-500 I think) with a 14cm fan; the thing is absolutely silent and rock solid. FYI, Xclio is made by the same OEM who builds Antec PSUs. The funny thing is, I actually like this PSU better than any Antec I've ever owned.

lots
06-12-2006, 08:57 AM
Well, higher-watt PSUs do generally suck up more power, but Matt is right: it won't draw 650W constantly, only when it needs to peak, which ideally is never.

BTW, my PSU is advertised as 650W but can peak at 710W. Most of the time I'm drawing around 450W though (dual socket mobo = power hungry).

Captain Obvious
06-12-2006, 03:28 PM
The new iMac is rated at 170 watts, I think. According to one wall outlet measurement I saw, it used about 50 watts during normal use. So don't just look at the rating, and assume that's what the computer will actually use.

Defiance
06-12-2006, 08:22 PM
I disagree slightly on the PSU choice. Just because a PSU is rated for 500W doesn't mean it draws 500W; it just won't break a sweat when pulling 350W.


If you spec your PSU higher than you need, you end up hitting the PSU's efficiency curve at a lower load fraction. (PSUs hit peak efficiency at a certain percentage of their maximum load; a max load way higher than your typical usage means your efficiency is lower.)

habañero
06-13-2006, 04:59 AM
Christ the spelling in that post really is something ...

What makes PSU efficiency important is that 90% efficiency means 10% of the energy goes to *heat*, and 80% means 20% goes to heat. In itself this sounds bad, but what really makes the question interesting is that the efficiency (and max output) is dependent on the temperature of the PSU. So an inefficient PSU can then sink to 60-70% efficiency, which really makes for some house heating. The solution to this heat problem is bigger, more and faster fans, i.e. noise, and generally you will heat your case a little as well.

(Cheap) PSUs are also often advertised with their peak wattage, not what they can output sustained; and often that wattage is for the hypothetical case of the PSU being at 20 degrees C under full load, when it will be 50 degrees C in reality, making the real max wattage around 2/3 of the 20-degree number.
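The two numeric claims above are easy to turn into back-of-envelope formulas. This is illustrative arithmetic only; the 2/3 derating is the post's rule of thumb, not a measured spec for any particular unit:

```python
def heat_watts(load_w, efficiency):
    """Waste heat for a given DC load: the PSU pulls load/efficiency
    watts from the wall, and the difference is dumped as heat."""
    return load_w / efficiency - load_w

def sustained_watts(label_w, derate=2/3):
    """Rough sustained output when the label quotes a cool-case
    (20 degrees C) peak, using the post's ~2/3 rule of thumb."""
    return label_w * derate

# A 300W load wastes ~75W as heat at 80% efficiency but only ~33W at
# 90%, and a cheap "550W" label may mean only ~367W sustained.
print(heat_watts(300, 0.80), heat_watts(300, 0.90), sustained_watts(550))
```

The temperature effect then compounds this: as the waste heat warms the PSU, its efficiency drops further, producing still more heat.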

Without being able to give a technical explanation, the other big difference is the quality of the power output, which, if it is all jumpy and wiggly, will stress your components more. Also, cheap ones might give higher or lower voltage than spec, both of which can generate instability and/or more stress (shorter part life).

I remembered the name of the PSU series I mentioned: it's the "S12", but there are others as well that can do similar feats. This one, though, is very quiet as well as great quality, so it carries a little price premium, but I consider it worth it. If you add a little stress and save money by overclocking, you make it up again by spending a little bit on this part.

Again, there are some rather learned discussions on such questions at Hardforum.com or, I think, Ars Technica; I content myself with having found out roughly how things work, and a few options that actually work...

mattclary
06-13-2006, 06:07 AM
The S12 is made by Seasonic, right? http://www.newegg.com/Product/Product.asp?Item=N82E16817151022


Yeah, they are very good power supplies.

If you want the best power supply available, take a look at www.PCPowerAndCooling.com they are expensive, but they are good.

This is my PSU:
http://www.newegg.com/Product/Product.asp?Item=N82E16817189012

StereoMike
06-13-2006, 06:42 AM
Regarding the PSU: it's stylish to have connectors for all the cables on the PSU (less clutter), but it's bad for overall efficiency because you just add another connection with power loss. Each interruption of the cable is bad.

I have this one (600w)
http://www.driverheaven.net/reviews/Be-Quiet%20P6-470%20DarkPower/index.htm

mattclary
06-13-2006, 06:51 AM
Yeah, I heard PC Power and Cooling rant about that in MaximumPC magazine a while back, but last I heard they were making a modular unit of their own. Funny how the market gets driven. I didn't really want modular cables for that reason, but when you start looking for SLI-certified PSUs and such, it can actually be hard to find a PSU that ISN'T modular.

lots
06-13-2006, 07:44 AM
Mine's SLI certified, and not modular :P Though it has 70% efficiency (I'd have to look at what temperature that rating is at, but I believe it was a normal temperature :P)

mattclary
06-13-2006, 08:16 AM
Yeah, you can get them. When I was looking for mine, it seemed all the ones that fit my criteria were modular. Wanted PFC, wanted it quiet, etc... In the end I just let go of the non-modularity requirement; I figured it was probably just not that big of a deal.

As a side note, my wife just told me to get myself a second video card for SLI for Father's Day! Trying to decide if I want to do that or want something else. I'm currently playing Elder Scrolls: Oblivion. My system has been handling it really well, but I've noticed a stutter here and there. I've read that this is one of the most demanding games in existence right now. It actually makes use of dual cores too!

Verlon
06-13-2006, 01:25 PM
On that 7600GT card... Gigabyte makes a 'silent pipe' version that has no fans (reducing noise and power requirements - it doesn't even need a separate connector from the PSU). It seems to test pretty well.

mattclary
06-13-2006, 02:23 PM
The 7600GT had just come out when I bought mine and you couldn't find a heat pipe version at the time even though they had been announced. Yeah, none of the 7600GTs (or GSs) have a power connector, that rocks. I have a Zalman cooler on mine, so it's pretty silent. My whole system was designed to be silent from the get go.

lots
06-13-2006, 02:49 PM
matt, for fathers day you should get me two Opteron 285s :)

Not that I'm a father... or close to it :P

mattclary
06-13-2006, 09:44 PM
Sure, hang on...

OK, they are in the mail! Enjoy! :thumbsup:

kenc3dan
06-16-2006, 02:17 PM
Here are my plans thus far. I'm still waiting for more feedback on the RAM I have suggested here, but I think everyone is in agreement on the other stuff.
Overkill, maybe, on the power supply but so be it.
Any opinions?

Goal
Barebones, 2 CPU, 4 core Opteron with 2GB RAM
Overclocked from 1.8GHz per core to 2.0GHz+
No drives, no graphics card
Will use with existing drives and PCI-X graphics card for Lightwave, After Effects,
video conversion and compression, and gaming

Cost
$1400-$1550

Motherboard
MSI K8N Master2-FAR (30x25cm)
http://www.zipzoomfly.com/jsp/ProductDetail.jsp?ProductCode=241183
250.00

Processors
2x Opteron 265
http://www.monarchcomputer.com/Merchant2/merchant.mv?Screen=PROD&Store_Code=M&Product_Code=120775
730.00

CPU coolers
2x Thermalright SI-9XV - Intel Xeon heatsink
http://www.jab-tech.com/product.php?productid=3010
70.00

RAM
4x 512MB PC3200 ECC Registered Memory
Timing advice varies...
Some say low latency (CL2) is so good for rendering and might overclock to 230MHz+
Some say that balanced RAM (3-3-x-x) overclocks better and that makes all the difference

2x Kingston HyperX 1GB (2x512MB) 3-3-3-8 timing, CL3
http://www.newegg.com/Product/Product.asp?Item=N82E16820144130
408.00

-or-

2x Crucial (2x512MB) unknown timing, CL3
http://www.crucial.com/store/MPartspecs.Asp?mtbpoid=5A9B4DB5A5CA7304&WSMD=MS%2D9620+%28K8N+Master2%2DFAR%29&WSPN=CT2KIT6472Y40B
316.00

-or-

2x Patriot (2x512MB) 2-3-2-5 timing, CL2
http://www.monarchcomputer.com/Merchant2/merchant.mv?Screen=PROD&Store_Code=M&Product_Code=140638
250.00

Power supply
Enermax EG651P-VE-24P ATX 1.3 EPS/SSI 550W Power Supply
Single rail 12V 36A, -5V too
http://directron.com/eg651pvefm24p.html
105.00

Case
Coolermaster Centurion
http://www.newegg.com/Product/Product.asp?Item=N82E16811119068
34.00 after rebate

Lightwolf
06-16-2006, 02:21 PM
I'd go for a different board. Mainly to have 4 memory slots per CPU... if you spread your memory modules to have 2 per CPU, it can improve memory performance dramatically. Not really needed for rendering, but if you plan on compositing...

Cheers,
Mike

zapper1998
06-16-2006, 08:28 PM
hmmmmm
interesting............

hstewarth
06-17-2006, 01:53 PM
My next machine should run Lightwave quite nicely..


2x 3GHz Xeon 5160 (Woodcrest)
Supermicro X7DA3/i - SAS motherboard
Supermicro 750 Case
Supermicro ZCR Zero channel raid with 256M
4GB FB-DIMM (likely 2x 2GB sticks)
eVGA 7900GT signature edition


Already have
------------
5x Seagate 320GB 7200.10 in RAID 5
Creative Labs X-Fi Elite Pro
Dell 2405f 24in LCD
Logitech G15 keyboard
Logitech G5 mouse
NEC 16X DVD burner


Likely later
2 or 3 Seagate 15K SAS drives

mattclary
06-17-2006, 02:15 PM
CPU coolers
2x Thermalright SI-9XV - Intel Xeon heatsink
http://www.jab-tech.com/product.php?productid=3010
70.00


Will this work with the Opteron? And why the big-a** font?

lots
06-17-2006, 03:33 PM
Being a Xeon heatsink, my bet is no :P

Verlon
07-25-2006, 05:09 PM
Has anyone seen Core 2 Duo (Conroe core) benchmarks involving LightWave, Windows x64, LW64, or Vista anywhere?

I haven't seen anything at my usual haunts... and I have a desire to upgrade.

monovich
07-25-2006, 11:35 PM
I read some of the thread but then gave up. Has anyone here mentioned the AMD X2 price cuts? They are dramatic.
p.s. Another vote for an X2 system. I've got a 4400+, 4GB, Win64, and I'm a happy happy camper.

A compelling reason NOT to wait to upgrade your computer, if you just upgraded to 9, is that you won't be able to make the most of many of the new features. Forget rendering millions of polys, SSS, Ani-reflections, radiosity, or any of the other intensive calculations on an old system. It's hard enough with a fast system.