
View Full Version : Graphics Card Recommendations?



jlp.media
12-29-2011, 12:00 PM
I'm sure a lot of you have seen this thread title before. Call me lazy, but when I tried to research an ideal graphics card for animation, most of the discussions seemed to be outdated, even the one found on NewTek's webpage:

Does LightWave support SLi/Crossfire?

As of this writing (LightWave 9.3.1), no version of LightWave supports OpenGL multi-threaded rendering. So while SLi- or Crossfire-enabled computers will certainly run LightWave, the extra GPU(s) will not be taken advantage of.
What graphics cards are recommended for LightWave 9?

LightWave 9.x requires an OpenGL 2.0 (or higher) compliant graphics card with an nVidia or ATI GPU.

Minimum (64MB of RAM):
nVidia GeForce FX 5200
or
ATI Radeon 9500

Recommended (128MB of RAM):
nVidia GeForce 7300/Quadro FX 350 or higher
or
ATI Radeon X1300/FireGL V5200 or higher

So what exactly should I be looking for? PCIe 2.1? Clock speeds? (Budget: $200-$400.)

BigHache
12-29-2011, 12:37 PM
To answer part of your question: SLi/Crossfire is supported in fullscreen programs, like games. Windowed programs cannot take advantage of it.

What is your budget? $200-400 is quite a range. If $400 is the max you're ready to invest, saying so will help others give an answer.

What motherboard will this go in?

jlp.media
12-29-2011, 02:01 PM
No mobo yet. But yes, it is quite a range in budget. Would the performance difference between a $200 card and a $400 one be that noticeable? Well, let me rephrase that: would the investment be worth it? Everyone knows how fast technology moves, and there's no need for a little guy such as myself (freelance animator) to invest a large sum in something that will be outdated in a few years...

Can I conclude that windowed programs rely on OpenGL and games on DirectX?
That would make sense if multi-GPU rendering only works for games... I'm at such a loss when it comes to this stuff, lol.

Rayek
12-29-2011, 04:18 PM
Nvidia: CUDA support; OpenGL/CUDA performance deliberately crippled on consumer cards to sell Quadro/Tesla cards (still fast for CUDA, not so much in OpenGL); Stereo3D support; awesome, though proprietary, Linux drivers.
In particular, glReadPixels() and backface shading for double-sided triangles are extremely slow on GF400+ cards - multiple times slower than on GF200-series cards. Nvidia maintains it's for the best: otherwise customers would have to pay too much because the cards would be too good, so some features are simply locked out.
And no, a Quadro does not give you any real benefit in LightWave, and it has crippled CUDA performance as well, so you would need a Tesla too.

AMD: No CUDA; occasional driver issues (haven't heard of any lately, though); some cards have slow GL_SELECT picking; cranky Linux drivers; very good OpenGL performance in general; Eyefinity.

(Taken from a different forum, but it sums things up well.)
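
For anyone wondering what that glReadPixels() complaint refers to: it is the OpenGL call that copies the framebuffer back to system memory, which apps use for things like color-based picking and preview grabs. A minimal sketch in plain C (my own illustration, not LightWave code; the setup and sizes are made up):

#include <GL/gl.h>
#include <stdlib.h>

/* Read the current framebuffer back to system memory. The call
   stalls until the GPU finishes drawing, and that synchronous
   GPU-to-CPU transfer is the part reported as slow on GF400+. */
void grab_framebuffer(int width, int height)
{
    unsigned char *pixels = malloc((size_t)width * height * 4); /* RGBA */
    if (!pixels)
        return;

    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    /* ... use the pixels (picking, thumbnails, etc.) ... */
    free(pixels);
}

The slow GL_SELECT noted for AMD is the selection-mode cousin of the same problem: glRenderMode(GL_SELECT) picking stalls the pipeline in a similar way.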

Also check this:
http://www.cgchannel.com/2011/10/review-professional-gpus-nvidia-vs-amd-2011/

For reference, my ATI 5870 outperforms all of these in the Cinebench OpenGL benchmark: 69.9 fps.

Layout 9.6's OpenGL viewport is very slow compared to my other apps, though, even with VBOs activated.
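
(For the curious: VBOs are vertex buffer objects - the mesh gets uploaded to GPU memory once instead of being resent every frame. A rough sketch of the idea in generic OpenGL 1.5-era C, not Layout's actual code; on Windows these entry points are fetched through an extension loader:)

#include <GL/gl.h>

GLuint vbo;
float verts[] = { 0,0,0,  1,0,0,  0,1,0 }; /* one stand-in triangle */

void upload_once(void)
{
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    /* Single upload to GPU memory; GL_STATIC_DRAW hints it won't change. */
    glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);
}

void draw_every_frame(void)
{
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, (void *)0); /* sourced from the VBO */
    glDrawArrays(GL_TRIANGLES, 0, 3);
    glDisableClientState(GL_VERTEX_ARRAY);
}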

A question on the side: has v10 improved OpenGL performance, and if so, by how much? For example, this model:
http://www.scifi3d.com/details.asp?intGenreID=10&intCatID=10&key=592
(the same one as in the review above) runs at about 21 fps in Layout. In a certain open-source app I can load eight of these and the viewport still runs smoothly - more smoothly than Layout's does with just one.

I am thinking about upgrading at some point, but my scenes have become progressively more complex lately - a reasonably fast OpenGL viewport is now a requirement.

Back on topic: at this point the FirePro V7900 probably offers the best price/performance deal (as far as workstation cards go) - they can be found online for around $700.

Otherwise, I would say a good AMD/ATI consumer card will work well. Try to get a 2 GB model.

I can't say I would advise you to get an nVidia card, however. A year ago I bought a GTX 480, and the general OpenGL performance was dismal, though it did depend on the application: Maya and Blender were horrid to work with, while in LightWave I did not really notice much of a difference. So mileage may vary.

jlp.media
12-29-2011, 04:33 PM
Uh... what? You might as well be talking to a wall.

I just want to know what to look for in a card, minus all the technical glReadPixels() crap.

And perhaps you misunderstood me: my price range is $200-$400, not $700... :sigh:

Burchigb
12-29-2011, 04:58 PM
Go to newegg.com and look up EVGA.

They have some good cards that are not way over your budget.
Also look at Tom's Hardware; they publish good reviews of cards.

Ernest
12-29-2011, 05:28 PM
No. There is no difference at all between a $200 and a $400 card in LightWave. Yes, I measured it. In games there is; in LW there isn't.

As you know, GPUs have been made on a 40nm process for a few years now and, in April, the 28nm GPUs will be released (earlier for AMD, if that is an option for you; I use 3D-Coat, so AMD doesn't exist for me). Considering that, I don't think there is any justification for spending on anything bigger than a 1GB GTX 560, at around $180, until that happens.

Make sure you do get at least 1GB of video memory, since that is the minimum for most GPU renderers, in case you ever become interested in those.
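
(Side note: if you want to check how much memory a card actually reports, both vendors expose it through OpenGL extensions. A quick C sketch - the enum values are copied from the GL_NVX_gpu_memory_info and GL_ATI_meminfo extension specs, and note that the ATI query reports free texture memory rather than the total:)

#include <GL/gl.h>
#include <stdio.h>
#include <string.h>

/* Values from the extension specs; both report kilobytes. */
#define GPU_MEMORY_INFO_DEDICATED_VIDMEM_NVX 0x9047
#define TEXTURE_FREE_MEMORY_ATI              0x87FC

void print_vram(void) /* requires a current GL context */
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    GLint kb[4] = {0}; /* the ATI query writes four values */

    if (ext && strstr(ext, "GL_NVX_gpu_memory_info")) {
        glGetIntegerv(GPU_MEMORY_INFO_DEDICATED_VIDMEM_NVX, kb);
        printf("Dedicated VRAM: %d MB\n", kb[0] / 1024);
    } else if (ext && strstr(ext, "GL_ATI_meminfo")) {
        glGetIntegerv(TEXTURE_FREE_MEMORY_ATI, kb);
        printf("Free texture memory: %d MB\n", kb[0] / 1024);
    }
}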

Surrealist.
01-01-2012, 11:10 PM
Uh... what? You might as well be talking to a wall.

I just want to know what to look for in a card, minus all the technical glReadPixels() crap.

And perhaps you misunderstood me: my price range is $200-$400, not $700... :sigh:

It is always a good idea to look beyond LightWave and into the future for any investment, in my opinion.

http://forums.newtek.com/showthread.php?t=114261

I would consider a card that has CUDA support, for instance, which narrows the field down to NVIDIA - which is also "supposed to be" better for LightWave than ATI. At least that has been the case in the past, although these days I am not sure. I use an ATI card without significant problems.

CUDA, for instance, is being used in Blender's new Cycles render engine. I don't know why they chose CUDA, but they did. With a graphics card that has CUDA support, you get faster render times in Cycles.

http://www.nvidia.com/object/cuda_home_new.html

I am a little confused about OpenCL. It is listed as the other option, for cards like ATI's, so that would be something to research further.

http://sicg.atmind.nl/index.php?option=com_content&view=article&id=30

What does this have to do with LightWave?

Well, at some point there is going to be GPU support, and researching which way it is going would be good. See the thread above, ask around, Google it, etc.

Looking beyond LightWave, however, you might want to do more research. I found, for instance, that my graphics card has significant problems with Softimage and Houdini, putting them out of the question for now as options.

So, you can evaluate on your own.

I would say, in general: get the card with the fastest GPU and as much RAM as you can afford. That is just smart thinking all the way around. LightWave will change over time, and other apps are gravitating more and more toward using the GPU. So don't completely discount that as a factor.

So in short:

Lean toward NVIDIA - but research for yourself, and take into account compatibility with other apps you use or may use, as well as which way LightWave will go in the future.

Then, once you pick a platform, get as much RAM and GPU speed as you can afford. Shop around, and stick with the main manufacturers, not the offshoots.

And finally, try to lean more toward the professional-level cards, not the gaming cards. I think you'll have better luck.

Put this search into Google:

"professional workstation graphics cards"

See what comes up, and then shop according to platform (NVIDIA or ATI), GPU speed, RAM, and your budget.

Here is one that comes within your price range:


PNY VCQFX1800-PCIE-PB NVIDIA Quadro FX 1800 Workstation Graphics Card
Create innovative designs with the NVIDIA® Quadro® FX 1800 by PNY, a professional graphics solution architected for leading-edge manufacturing and design companies. The next-generation solution within the most requested mid-range professional graphics family, the Quadro FX 1800 provides a balanced combination of price and performance. Featuring the NVIDIA® CUDA™ parallel computing architecture, 30-bit color accuracy, and automatic configuration of application display settings for optimal performance, the Quadro FX 1800 sets the standard for power efficiency while delivering a rich user experience.

Good luck!

Rayek
01-02-2012, 02:56 AM
OpenCL is gaining a lot of traction lately. Most of the industry and the universities involved with GPU-based computation seem to agree that a proprietary technology such as CUDA is a step backward - probably the main reason why nVidia has open-sourced part of its tech. In principle, CUDA drivers can now be compiled to run on AMD cards as well.
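
The nice thing about OpenCL is that one vendor-neutral API enumerates everybody's hardware - NVIDIA GPUs, AMD GPUs, even CPUs. A small sketch against the standard OpenCL headers (my own illustration, not from any renderer; error checks omitted):

#include <CL/cl.h>
#include <stdio.h>

int main(void)
{
    cl_platform_id platforms[8];
    cl_device_id devs[8];
    char name[256];
    cl_uint nplat = 0, ndev = 0;

    clGetPlatformIDs(8, platforms, &nplat);
    if (nplat > 8) nplat = 8; /* the call reports the total available */

    for (cl_uint p = 0; p < nplat; ++p) {
        clGetPlatformInfo(platforms[p], CL_PLATFORM_NAME,
                          sizeof(name), name, NULL);
        printf("Platform: %s\n", name);

        /* GPUs and CPUs alike show up here, whatever the vendor. */
        clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL, 8, devs, &ndev);
        if (ndev > 8) ndev = 8;
        for (cl_uint d = 0; d < ndev; ++d) {
            clGetDeviceInfo(devs[d], CL_DEVICE_NAME,
                            sizeof(name), name, NULL);
            printf("  Device: %s\n", name);
        }
    }
    return 0;
}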

Blender's Cycles is still under heavy development, and OpenCL WILL be supported.

For an external renderer that works with OpenCL, try LuxRender:
http://www.luxrender.net

It actually supports simultaneous CPU/GPU OpenCL rendering. (Why LightWave isn't supported, I do not know - it's a great unbiased render engine, and free!)
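
Under the hood that is just OpenCL's device model at work: a context can hold several devices from the same platform, and AMD's platform exposes the CPU as a device too, which is what makes the CPU+GPU split possible. A rough sketch of the idea (my own illustration, not LuxRender's actual code):

#include <CL/cl.h>

/* One context holding every device on the first platform, CPU and
   GPU alike; a renderer can then create a command queue per device
   and feed each its share of the frame. Error checks omitted. */
cl_context make_shared_context(void)
{
    cl_platform_id plat;
    cl_device_id devs[8];
    cl_uint ndev = 0;

    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_ALL, 8, devs, &ndev);
    if (ndev > 8) ndev = 8;

    return clCreateContext(NULL, ndev, devs, NULL, NULL, NULL);
}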

LuxRender works very well on my ATI 5870.

2012 will be a very interesting year for GPU-based rendering. At this point CUDA still has more commercial support, but this will probably (hopefully?) change.

@jlp.media: I just mentioned the best price/performance option - I concur with Surrealist: research some of the video card options. That is why I included the link to that workstation card comparison/review.

As far as the price/performance ratio goes, AMD workstation cards are currently much less expensive than the nVidia cards.

For example: a FirePro V4800 at $154 performs better in LightWave than a Quadro 4000 at $729 (according to that review).

However, it really depends on your requirements and the applications you are going to use the card for. CUDA and/or OpenCL? More texture memory or less? And so forth.

This benchmark may also be helpful:
http://www.videocardbenchmark.net/high_end_gpus.html

And yes, consumer AMD cards might have some problems in certain apps - though in my experience so far, my OpenGL performance is much better than it was with the GTX 480 I had before.

You will find there is no perfect or easy answer to this question. It depends on you - so a bit of research is required.



Traveler
01-02-2012, 08:23 AM
If you want to buy a card specifically for LightWave and not for games, I'd say get something in the $200 range (perhaps even less). Really, LightWave's requirements are low. I run an ATI (AMD) 4870 (not even sure you can still buy those) and it's more than capable. If you want to get the most out of LightWave, invest in a proper CPU (i7), memory (16 GB), and, if you can, an SSD. (The latter only if you really want to, but I promise you'll love it.)

JonW
01-02-2012, 01:22 PM
If you want to buy a card specifically for LightWave and not for games, I'd say get something in the $200 range (perhaps even less). Really, LightWave's requirements are low. I run an ATI (AMD) 4870 (not even sure you can still buy those) and it's more than capable. If you want to get the most out of LightWave, invest in a proper CPU (i7), memory (16 GB), and, if you can, an SSD. (The latter only if you really want to, but I promise you'll love it.)

Get a cheaper card and buy an SSD. You will see far more performance improvement, especially with all that single-core work LW does.

My better cards are all old - a GTX 280, a GTX 260, a GeForce 9800 - and they are all perfectly fine with LightWave. Even the built-in graphics on my old 2.0GHz Mac mini, driving a 30" screen, are manageable.

jasonwestmas
01-02-2012, 01:49 PM
It doesn't matter much for LW, other than getting enough video RAM if you want nice OpenGL texture previews; 1 GB or more of VRAM is usually plenty. That will also work better with 3D-Coat, if you plan on using that application alongside LightWave.

Performance-wise, LightWave is predominantly a CPU-based program.

Surrealist.
01-03-2012, 12:03 AM
Certainly.

However, having a better, faster GPU and as much video RAM as you can afford is just a good idea all around. You'll have a much more fluid experience overall, because you never use just LightWave. And there are many benefits to be had - not to mention the future benefits of your investment, provided you do your homework.

Ernest
01-03-2012, 04:48 PM
Certainly.

However, having a better, faster GPU and as much video RAM as you can afford is just a good idea all around. You'll have a much more fluid experience overall, because you never use just LightWave. And there are many benefits to be had - not to mention the future benefits of your investment, provided you do your homework.

I don't think there are too many future benefits in an industry with a 12-month product cycle. Unless you buy at the very beginning of a cycle (and right now we're actually at the very end of one), there will always be a cheaper, much more powerful card just a few months away. This is one of the few products where I've never thought the future had any real importance. In other words, by the time those future benefits actually arrive, the cards that can take advantage of them will be better and cheaper than today's highest-end cards, so it's better to buy just the card you really need today.