New video card for Core?



Sting
02-19-2009, 12:03 PM
I plan to upgrade to an Intel i7 920, but I'm not sure about the video card. I'm not familiar with how Maya and other 3D applications take advantage of the video card in ways not available to LW at this time. The Core FAQ says that it is capable of leveraging GPU power where applicable, but that description seems too vague. Does it mean we might eventually see it take advantage of more features of the GPU, or could it just mean it will be the same as now, but with better OpenGL performance?

I'm currently using an NVIDIA 7900 GT. Should I just keep using this card with the i7 until we have more information, or should I go ahead and upgrade the card? I was thinking about getting an ATI 4850. I'm not into gaming anymore and I don't want to overclock my system. I just want a decent card that is quiet and doesn't run too hot like the rest of the system does. I'm also not sure how good OpenGL is with ATI versus NVIDIA at this point. Is there something else I should consider, or is there too much speculation to make a good decision at this point?

geothefaust
02-19-2009, 01:18 PM
I'd wait a few more *days before you make that decision. :)

*Which may turn to a week, but again, wait a little while. You shall see!

kfinla
02-19-2009, 01:36 PM
LW 9.x had some minor OpenGL improvements, but LW has always been a poor performer in this area and has a lot of CPU dependencies in its redraw. If you use other 3D apps, then you will see gains in those apps from upgrading your card. I personally have never noticed big differences in LW after upgrading my video card. Hopefully in LW Core (the first public beta build is slated for sometime between now and March 31, 2009) we'll see performance similar to (or better than) other 3D apps, and maybe even more utilization of the GPU, perhaps aiding with rendering.
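
To give a rough picture of what I mean by CPU dependencies in the redraw, here's a sketch of the general technique; I'm not claiming this is how LW is written internally, and the GLEW/GLUT setup and the little triangle are just placeholders. Old-style "immediate mode" OpenGL re-feeds every vertex through the CPU on every frame, while a vertex buffer object (VBO) parks the mesh in video memory so a redraw is a couple of cheap calls:

/* Sketch: CPU-bound immediate mode vs. a vertex buffer object (VBO).
   Assumes GLEW and GLUT are installed; the triangle is a placeholder mesh. */
#include <GL/glew.h>
#include <GL/glut.h>

#define NUM_VERTS 3
static const float verts[NUM_VERTS][3] = {
    {-0.5f, -0.5f, 0.0f}, {0.5f, -0.5f, 0.0f}, {0.0f, 0.5f, 0.0f}
};
static GLuint vbo;

void draw_immediate(void)
{
    /* The CPU re-sends every vertex to the driver on every redraw;
       with a million-poly mesh this loop alone can peg a core. */
    glBegin(GL_TRIANGLES);
    for (int i = 0; i < NUM_VERTS; ++i)
        glVertex3fv(verts[i]);
    glEnd();
}

void draw_vbo(void)
{
    /* The mesh already lives in video memory; a redraw is a few cheap
       calls and the GPU does the heavy lifting. */
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, (void *)0);
    glDrawArrays(GL_TRIANGLES, 0, NUM_VERTS);
}

void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    draw_vbo();                 /* swap in draw_immediate() to compare */
    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutCreateWindow("vbo sketch");
    glewInit();

    /* Upload the mesh to the card once, up front. */
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);

    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}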

geothefaust
02-19-2009, 03:25 PM
Here's your answer!

http://www.newtek.com/forums/showthread.php?p=837190

Now go buy yourself that sweet, sweet video card. :D

IMI
02-19-2009, 03:48 PM
Not into gaming? IMO, no reason to get an ATI card then, you'd be better off with Nvidia, particularly for the new fancy GPU app-related things coming 'round the bend. IMO, that is. :)

One thing you said which struck me:


I just want a decent card that is quiet and doesn't run too hot like the rest of the system does.


Hold on there... If I were you I'd do something about that "too hot..." thing. Not just for its own sake, but any powerful GPU you add is going to increase your heat quite a bit. Your current 7900 GT doesn't put out any heat even comparable to, say, an 8800 GTS/GTX/Ultra, so if your system is already "too hot", you're going to make it even hotter.
I haven't yet checked what they supply with the i7 as far as heatsink/fan (HSF) combos go, but the typical Intel HSF is pretty lame. I brought my quad core down 20 degrees C just by using a 3rd-party copper-fin Zalman HSF. And your case might need better fans and/or a cleaning, too.
And that thermal goop attached to the Intel stock HSF is kinda lame, too. Arctic Silver is better, and is very inexpensive.
Unless you're just exaggerating about the "too hot" thing... but anyone who's running apps on a multicore proc, with a lot of RAM and a powerhouse GPU, really needs to be concerned that it doesn't get close to the thermal threshold, even after hours at full power, even if not overclocking.

Sting
02-19-2009, 04:06 PM
Here's your answer!

http://www.newtek.com/forums/showthread.php?p=837190

Now go buy yourself that sweet, sweet video card. :D


Wow. Thanks for that update from Jay. That makes me feel better about any of my choices. :)

Sting
02-19-2009, 04:53 PM
Not into gaming? IMO, no reason to get an ATI card then, you'd be better off with Nvidia, particularly for the new fancy GPU app-related things coming 'round the bend. IMO, that is. :)

One thing you said which struck me:


Hold on there... If I were you I'd do something about that "too hot..." thing. Not just for its own sake, but any powerful GPU you add is going to increase your heat quite a bit. Your current 7900 GT doesn't put out any heat even comparable to, say, an 8800 GTS/GTX/Ultra, so if your system is already "too hot", you're going to make it even hotter.
I haven't yet checked what they supply with the i7 as far as heatsink/fan (HSF) combos go, but the typical Intel HSF is pretty lame. I brought my quad core down 20 degrees C just by using a 3rd-party copper-fin Zalman HSF. And your case might need better fans and/or a cleaning, too.
And that thermal goop attached to the Intel stock HSF is kinda lame, too. Arctic Silver is better, and is very inexpensive.
Unless you're just exaggerating about the "too hot" thing... but anyone who's running apps on a multicore proc, with a lot of RAM and a powerhouse GPU, really needs to be concerned that it doesn't get close to the thermal threshold, even after hours at full power, even if not overclocking.

Maybe I shouldn't have said that I wasn't into gaming. The last game I played was Battlefield 2 and I stopped that about a year ago. I wouldn't want to play Crysis with my current card and I haven't had an interest in any other games lately.

The user reviews at Newegg seem to indicate that Gigabyte and MSI both have 4850 cards that are quiet, with very good cooling compared to the other manufacturers. I also know a guy who has the Gigabyte card and he agrees with those reviews. I chose ATI because they are generally quieter and not as hot as the equivalent NVIDIA card. If you can tell me that ATI still has issues with OpenGL or particular problems with LW, I would go back to NVIDIA.

As for the i7, the Noctua NH-U12P SE1366 120mm SSO CPU Cooler has all positive reviews. It appears to be the quietest cooler for the i7 and only 1 or 2 degrees hotter than a Thermaltake cooler that is rated as the coolest for an i7.

IMI
02-19-2009, 05:58 PM
I chose ATI because they are generally quieter and not as hot as the equivalent NVIDIA card. If you can tell me that ATI still has issues with OpenGL or particular problems with LW, I would go back to NVIDIA.



I've never owned an ATI card, but after 10 years of doing 3D and reading online forums, I've seen an awful lot of complaints about ATI's performance in 3D apps, as opposed to nvidia.
I'm not sure what ATI is up to, but recently nvidia has been reaching out to the 3D app developers with tools like CUDA, and in the future apps will be taking advantage of real-time rendering with the GPU (or even multiple GPUs, in SLI), along with the CPU. As I understand it, that is. And Core might benefit more from nvidia than from ATI, since it seems that's what they're using at NewTek, although they're using Quadros.
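
For anyone curious what CUDA actually looks like from the developer's side, here's about the smallest possible example; a toy sketch, obviously, nothing to do with how Core or anyone's renderer is actually written. You write one small function that handles a single data element, and the card runs thousands of copies of it in parallel (compiles with nvcc):

// Toy CUDA sketch: add two big arrays in parallel on the GPU.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void add(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per element
    if (i < n)
        c[i] = a[i] + b[i];
}

int main()
{
    const int n = 1 << 20;                     // about a million floats
    size_t bytes = n * sizeof(float);

    // Fill two host-side arrays with test data.
    float *ha = (float *)malloc(bytes), *hb = (float *)malloc(bytes),
          *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Copy them to the card, run the kernel, copy the result back.
    float *da, *db, *dc;
    cudaMalloc(&da, bytes); cudaMalloc(&db, bytes); cudaMalloc(&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    add<<<(n + 255) / 256, 256>>>(da, db, dc, n);  // thousands of GPU threads
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);

    printf("hc[0] = %f\n", hc[0]);             // expect 3.0
    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}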

Personally, though, I would wait until Core's final release before sinking a lot of money into a video card I was planning on using mostly for 3D and not games.
Although I'm pretty sure you can use that particular ATI card with the current LW, by the time Core comes out it will be way obsolete. You could do just as well with an 8800 GTS in the current LW, or even a very inexpensive 8600. I doubt the early versions of Core will be utilizing much of the available GPU power, and they'll still rely mostly on the CPU. I suspect the early (and maybe even later) versions of the Core beta will be optimized more for Quadro or FireGL.

precedia
02-19-2009, 07:05 PM
I have two 8-core Mac Pros. This is not really an apples-to-apples comparison, so to speak, because the latter is of newer vintage with a higher clock speed.

The first one (3GHz) has the ATI X1900 XT graphics card.

The second one (3.2GHz) has the nVidia GeForce 8800 GT.

I can't tell the difference between the two machines even when working in large scenes in Textured Shaded Solid mode.

That could be, however, a function of my work habits, my scenes, or a side-effect of the LightWave Mac OpenGL bindings.

Definitely not a scientific benchmark but definitely an end-user gut-reaction data point.

Daniel

Sting
02-19-2009, 07:27 PM
Thank you guys for your feedback. I will wait a while to see what happens during Core's development. NVIDIA may become the obvious choice, but it is early and I'm in no rush to make a decision.

kfinla
02-19-2009, 09:16 PM
I had a X1900 XT in my mac pro (1st gen) and then upgraded to the 8800 GT. I did see a difference in Modo.

IMI
02-20-2009, 01:26 AM
I had a X1900 XT in my mac pro (1st gen) and then upgraded to the 8800 GT. I did see a difference in Modo.

When I upgraded my (then) main box from an 8600 GTS to an 8800 GTS, I saw a considerable difference in performance in modo (301 at the time). Very, very little difference in LW, though; but as has been pointed out numerous times here in this forum, currently GPU power means little to LightWave. LW's OpenGL is handled mostly by your CPU.
I just now had a look at a very high-poly object being flipped and moved around in Layout and saw that CPU cores 1 and 4 were being used at about 85% each, while my 8800 GTS's temperature barely budged up one degree. I wish there were a utility to see, in graph or percentage form, how much of your GPU is being used at any time, but I've not yet been able to find one.
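
For what it's worth, NVIDIA does have a management library (NVML) that can report a utilization percentage directly, assuming your driver actually ships it, which I haven't verified for cards as old as ours. A rough sketch under that assumption (link with -lnvidia-ml):

/* Rough sketch: ask the NVIDIA driver for GPU utilization via NVML.
   Assumes nvml.h and the NVML runtime are installed with the driver. */
#include <stdio.h>
#include <nvml.h>

int main(void)
{
    nvmlDevice_t dev;
    nvmlUtilization_t util;

    if (nvmlInit() != NVML_SUCCESS) {
        fprintf(stderr, "NVML init failed\n");
        return 1;
    }
    nvmlDeviceGetHandleByIndex(0, &dev);        /* first GPU in the box */
    nvmlDeviceGetUtilizationRates(dev, &util);  /* sampled by the driver */
    printf("GPU core: %u%%  video memory: %u%%\n", util.gpu, util.memory);

    nvmlShutdown();
    return 0;
}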

JBT27
02-20-2009, 01:46 AM
This is all useful stuff, and I'm curious to see what my now fairly old Quadro will do with the first builds.

I reckon we'll be buying new hardware towards the end of the year - I'm told I should definitely go with i7, so that will allow a few months for them to settle in and me to figure out what to buy.

It would be very good to have a central LW page or three that helps out the hardware-challenged, rather than having to trawl through long threads by people who know what they're doing and talk shop about all these components.

Julian.

IMI
02-20-2009, 01:56 AM
This is all useful stuff, and I'm curious to see what my now fairly old Quadro will do with the first builds.



I guess we'll have to wait and see if they continue to write strictly for Quadro-type OpenGL usage or if they choose to take advantage of the advancements being made for mainstream consumer video cards.
Personally, I hope they dive into it and finally do something about it. I mean, look at all that Mudbox 2009 can do, for example, even with what is basically a gaming card.
There's some serious work that's been done there, and LightWave could definitely stand to have a piece of that action.

JBT27
02-20-2009, 02:24 AM
I guess we'll have to wait and see if they continue to write strictly for Quadro-type OpenGL usage or if they choose to take advantage of the advancements being made for mainstream consumer video cards.
Personally, I hope they dive into it and finally do something about it. I mean, look at all that Mudbox 2009 can do, for example, even with what is basically a gaming card.
There's some serious work that's been done there, and LightWave could definitely stand to have a piece of that action.

I think that economics as well as tech advancement may well dictate how NT develops the OGL performance - I agree with your comments there, and in many respects that will make Core more and more appealing to far more people.

As for me, well, I'm just going to let some months pass by, as I say, and then take stock.....I've been buying and using this stuff for enough years to learn that often, caution really is the better part of valour, as it were ..... :D

Julian.

colkai
02-20-2009, 04:29 AM
Well, all my money will be going to Core; I can't afford a new video card, so I'm hoping my lowly GeForce 6200 will at least behave as 'quickly' (ahem :p) as it does now.
I'm greedy, mind; a part of me is holding out hope it will be quicker. ;)