Considering Nvidia Quadro K4000 - is it worth it?



unstable
05-26-2013, 04:46 PM
As the subject line states, I'm considering a new graphics card. I currently have an

ATI Radeon HD 5870
◾PCI Express® based PC is required with 1 X16 lane graphics slot available on the motherboard
◾500 Watt or greater power supply with two 75W 6-pin PCI Express® power connectors recommended (600 Watt and four 6-pin connectors for ATI CrossFireX™ technology in dual mode)

I purchased this card with a Dell Studio XPS desktop, so I assume my motherboard and power supply meet these criteria. What I don't know is whether it meets the criteria for the K4000.

Quadro K4000 requirements
System Interface PCI Express 2.0 x16
Maximum Power Consumption 80W
Auxiliary Power Required - Yes

One reason I want to upgrade is that every time I've tried to update my video driver using updates from Dell and ATI (AMD), my system crashes and I have to restore the old 2009 driver. I've also had failures occur with LW and the 2009 driver where the screen goes white and I get a display driver failure message.

Questions:
1. Is this card worth it?
2. Will a 2.0 x 16 card work in a 1 x 16 slot? (probably not)
3. Where do I check for power supply wattage? (If mine is 500w, won't it fry a card that only needs 80w)
4. Is there anything else I should consider?

I'm just not much good at hardware so I appreciate any feedback received. Thanks

chikega
05-26-2013, 07:45 PM
My work just ordered a Dell workstation with a Quadro K4000. From what I understand, LW, as well as a lot of other 3D apps, runs better on Nvidia cards. I believe a consumer gaming card such as the Geforce GTX 660 Ti would serve you well.

Greenlaw
05-26-2013, 08:04 PM
Regarding the Nvidia Quadro card, I think it depends on what you intend to do with it. It was explained to me by an engineer that the main feature of a Quadro is compatibility and performance with many CAD applications, including backward compatibility with older versions of CAD software. This would be of greatest value to engineers and industrial designers.

However, if you run software that leverages gaming technology, specifically applications that use the GPU, performance will be much higher with modern Geforce GTX cards. The programmers at iPi Software, for example, highly recommend a good GTX card over a Quadro for their motion capture software because it uses the GPU to perform its dynamics calculations. Some of the recent 'realtime' renderers may also perform better with a GTX card. The 'downside' of a high-end gaming card is that, since the emphasis is on the highest performance and cutting-edge technology, you may be sacrificing compatibility with older applications. This is generally not a problem for 3D artists who work in the entertainment and gaming industries, who tend to stay up to date with their software.

To summarize, if you generally work with CAD applications then choose a Quadro card, and for everything else you should probably consider a good Geforce GTX card. If budget is a concern, Geforce GTX cards are cheaper than Quadro cards.

Regarding Nvidia vs. ATI, I prefer Nvidia--I think the drivers are better for a wider range of applications. This is a personal opinion of course.

G.

m.d.
05-26-2013, 10:14 PM
I have 2 quadros....only for 10bit color output....
I also have 2 GTX and an AMD mobile

If you don't need that, I would avoid them like the plague.... A GTX 470 will run circles around a Quadro 4000....and you can get the 470 for about $85
even the Quadro 6000 will be smoked by a GTX 570...maybe even a 480...
It does have a lot more RAM....but for $3500 it should

the only one worth looking at is the k5000

but like I said....I only have them for 10-bit color....which Nvidia disabled in the GTX, to justify the Quadro's premium....otherwise that would be the first card to get ripped out of my machine.


There are some certified drivers for certain CAD apps....like Greenlaw pointed out....so if you run one of those apps, the extra $1000 you would spend on the Quadro probably wouldn't worry you.


I would like to recommend AMD for best bang for the buck.....but there are way too many CUDA-accelerated apps out there now.

If you don't need CUDA, go with AMD.....if you do, grab a GTX..... If you need 10-bit or have extra cash that needs a home, go with a Quadro....
If you have a bit more money, and want brute force coupled with 6GB of RAM...go with a Titan

m.d.
05-26-2013, 10:19 PM
Questions:
1. Is this card worth it?
2. Will a 2.0 x 16 card work in a 1 x 16 slot? (probably not)
3. Where do I check for power supply wattage? (If mine is 500w, won't it fry a card that only needs 80w)
4. Is there anything else I should consider?

I'm just not much good at hardware so I appreciate any feedback received. Thanks

to answer your questions

1. NO
2. Yes it will....if you were a massive PC gamer you might see a hit of a few FPS...otherwise you won't notice....and I think you mean a PCIe 2.0 card in a PCIe 1.0 x16 slot....a PCIe x1 slot is very small....PCIe 2.0 runs at double the speed of version 1....but is backwards compatible.
3. The power is regulated....you will probably need a bigger power supply with modern cards.....you will never fry your video card because of too big a PS...that's 500W for the whole system
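
To put rough numbers on that point: a power budget is just the sum of worst-case component draws checked against the supply rating. The wattages below are only illustrative guesses, not measurements of any particular system; a quick C++ sketch:

// Rough power-budget check. All wattages are illustrative guesses,
// not measurements of the Dell system discussed above.
#include <cstdio>

int main() {
    const double cpuWatts   = 130.0;  // CPU under full load (assumed)
    const double gpuWatts   = 80.0;   // e.g. Quadro K4000 maximum board power
    const double boardWatts = 60.0;   // motherboard, RAM, fans (assumed)
    const double driveWatts = 30.0;   // hard drives / optical (assumed)
    const double psuWatts   = 500.0;  // power supply rating

    const double total = cpuWatts + gpuWatts + boardWatts + driveWatts;
    std::printf("Estimated draw: %.0f W of a %.0f W supply (%.0f%% load)\n",
                total, psuWatts, 100.0 * total / psuWatts);

    // The supply rating is a ceiling, not a push: each component only pulls
    // the current it needs, so a bigger PSU never harms an 80 W card.
    return 0;
}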

Greenlaw
05-26-2013, 10:35 PM
3. The power is regulated....you will probably need a bigger power supply with modern cards.....you will never fry your video card because of too big a PS...that's 500W for the whole system

That's a good point. In my situation, my computer case is smaller than normal and I didn't really want to install a bigger power supply, so I found a GTX 460 that is shorter than usual and has a lower power draw than normal. This card cost me a little extra for the 'green' tech but it works great in my computer--with any other model of this specific card, I would have had to upgrade to a bigger computer case and power supply to install it. This system was put together a few years ago and, even though I like what I have now, if I was configuring a new system today I would start off with a computer that could easily support a modern GTX card with plenty of RAM.

Just something to consider.

G.

m.d.
05-26-2013, 11:04 PM
yup, always go as big as you can...or you might regret it
I got a 1300-watt power supply in mine....but it doesn't draw over 800

I would even go so far as to get an open-concept case...
so you could use double-slot cards (with a PCIe ribbon extension cable) mounted above the motherboard and use every single slot on the motherboard.
http://www.ebay.com/itm/PCIe-x16-Flexible-Extender-Cable-w-Molex-/150620070281

Right now I have a seven-slot PCIe x16 board (I think 3 of them are 8x electrical) with a Quadro, 2x GTX 470's, a Red Rocket....and everything is jammed full....I literally have 2 video cards lying around that I could use for Octane...but I can't fit them...
Not to mention the massive overheating issues when rendering Octane....have to take off the case door and blow a big house fan right into it to keep the cards at 90 Celsius....

first world problems....I get no sympathy from my friend in Bangladesh running an orphanage

unstable
05-26-2013, 11:13 PM
Thanks for all the good feedback folks. You've given me a lot to think about. I rarely play games, so this would be for doing 3D graphics using LW. I have TFD and I know it utilizes CUDA if you have it. So I started looking at Nvidia. Safe Harbor has the K4000 for less than $800, which caught my interest, but if a GTX card will work nearly as well, I wouldn't mind saving the money. Now I'm not sure what to do.

My ATI card has been good but from what I've seen in some of the threads here Nvidia seems to work well with LW, TFD, Houdini, and most other 3d products. Guess I'll keep thinking.

JonW
05-26-2013, 11:53 PM
If you are running 2 monitors & you want to calibrate each monitor you will most likely need 2 cards. I had to do this with my setup, but it is old & maybe this has been sorted out by now.

m.d.
05-27-2013, 12:40 AM
So I started looking at Nvidia. Safe Harbor has the K4000 for less than $800, which caught my interest, but if a GTX card will work nearly as well, I wouldn't mind saving the money. Now I'm not sure what to do.

My ATI card has been good but from what I've seen in some of the threads here Nvidia seems to work well with LW, TFD, Houdini, and most other 3d products. Guess I'll keep thinking.

Nearly as well....LOL
A K4000 has 768 CUDA cores
A GTX 670 has over 1300 CUDA cores....
A GTX 690 has 3072 CUDA cores....

If you need more RAM....a Titan has 2688 CUDA cores, but 6 gigs of RAM....

For the price of a K4000 you could buy a 690 or maybe the new 780, and you would literally see 3-4 times the performance in CUDA applications....

Quadros are all marketing, some specialized drivers (none for LightWave)....and some slightly higher specs....10-bit color....specialty stuff, at a mega premium price

Basically if you don't know if you need a quadro.....you don't

In the old days you used to be able to flash a GTX with a Quadro BIOS and have a cheap Quadro...but Nvidia figured that one out
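
For anyone who wants to check what a given card actually reports before trusting the spec sheets, here is a minimal device-query sketch against the standard CUDA runtime API (compile with nvcc). Note the API reports multiprocessor (SM) count rather than 'CUDA cores'; the core count is SM count times the cores per SM of that architecture, e.g. 192 on Kepler-class GTX and Quadro K cards.

// Minimal CUDA device query using the CUDA runtime API (build with: nvcc query.cu).
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        std::printf("No CUDA-capable device found.\n");
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        std::printf("Device %d: %s\n", i, prop.name);
        std::printf("  Compute capability : %d.%d\n", prop.major, prop.minor);
        std::printf("  Multiprocessors    : %d\n", prop.multiProcessorCount);
        std::printf("  Global memory      : %.0f MB\n",
                    prop.totalGlobalMem / (1024.0 * 1024.0));
        // "CUDA cores" = multiProcessorCount * cores per SM; the cores-per-SM
        // figure depends on the architecture (192 on Kepler, for example).
    }
    return 0;
}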

Greenlaw
05-27-2013, 12:46 AM
I individually calibrate my two monitors (a Cintiq 12WX and an old Dell LCD) connected to a single Geforce GTX card. Back when I ran under XP Pro, I needed to add a separate Windows color utility to be able to use two separate color profiles simultaneously, but I haven't needed this utility since Windows 7 Pro came out. I use the Spyder4 Elite system to calibrate each monitor, but I don't think it matters which calibration system you have in order to use multiple monitor profiles with a single graphics card in Windows 7.

I just checked and my two monitors show up as separate devices under the regular Windows Color Management window and each monitor has its own associated color profile.

G.

LW_Will
05-27-2013, 01:59 AM
I really don't want to brag... aw, hell, sure I want to brag! ;-)

My Nvidia GTX 660 has a base speed of 1080MHz, 2GB RAM, 960 CUDA cores (Kepler), OpenGL 4.2... and cost $210. It's the 6XX series, so its power draw is much lower than any of the 5XX series of Geforce cards. I think that it is one of the best cards for the money you can get.

And I think that the nvidia cards win because of their driver software. Hands down.

Do not waste your money on a Quadro. Just don't... IMHO

geo_n
05-27-2013, 07:19 AM
Buy a Quadro if you're using apps like 3ds Max, Maya or CAD software. They're optimized to run better than gaming cards. The particles in 3ds Max, for example, are super fluid while scrubbing the timeline with Quadro cards.
For LightWave and most media software, a gaming card is OK.

m.d.
05-27-2013, 09:58 AM
It's too bad AMD doesn't have CUDA......
for LightWave they smoke Quadros
http://www.tomshardware.com/reviews/firepro-w8000-w9000-benchmark,3265-7.html

Greenlaw
05-27-2013, 10:02 AM
Yes, especially for Turbulence FD and 3D Coat.

G.

Airwaves
05-28-2013, 05:05 PM
So with all this talk of video cards...... I have a rather dumb question: how important is the video card for LightWave?

I have LightWave 11.5, and my video card has issues and needs to be replaced. I was going to look at getting a really good video card for LightWave, but some people have told me it doesn't matter much since LW mostly uses the processor for calculations. What do you guys recommend I get? Thanks

Digital Hermit
05-28-2013, 06:28 PM
I don't use my machine for LW only; I use it with Adobe Premiere and AE as well. Are the GTX cards capable of handling these programs without choking? I currently have a dual-monitor setup with a Quadro 2000 in my system and I am thinking of changing it out.

Airwaves
05-28-2013, 06:41 PM
I don't use my machine for LW only; I use it with Adobe Premiere and AE as well. Are the GTX cards capable of handling these programs without choking? I currently have a dual-monitor setup with a Quadro 2000 in my system and I am thinking of changing it out.

I am glad you mentioned this. I do use Adobe products, but not very much. I was told by a friend of mine that the GTX cards would do well with Premiere and AE, but I cannot tell you from experience. Does LightWave itself need a good graphics card, then?

Greenlaw
05-29-2013, 12:16 AM
As mentioned previously, a few LightWave plug-ins can take advantage of GPU processing and some supporting applications like 3D Coat can most definitely benefit from CUDA technology. If you use anything more than stock LightWave, it's worth getting a decent video card. And by 'decent', I don't mean expensive--you can get a lot of bang from a sub-$200 Nvidia Geforce GTX card. And, IMO, if you can afford a better model GTX card, go for it.

Also, I agree with the majority of posters above that for most LightWave users, a Quadro card is a waste of money. Unless you have a specific need for a Quadro card, you're actually much better off running with a good GTX card.

In my case, for example, I use 3DC and iPi Mocap Studio as 'support programs' for what I do in LightWave, and both these programs run much faster with a GTX card than with a Quadro.

G.

m.d.
05-29-2013, 12:38 AM
I don't use my machine for LW only; I use it with Adobe Premiere and AE as well. Are the GTX cards capable of handling these programs without choking? I currently have a dual-monitor setup with a Quadro 2000 in my system and I am thinking of changing it out.

If you don't need a 10-bit video pipeline...requiring a 10-bit monitor...

Then a Quadro will do nothing over a GTX for PPro or AE....

The Mercury Playback engine in PPro uses CUDA to accelerate certain effects.....just some, and performs a Lanczos resize on scaling operations.
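
As a rough illustration of what a Lanczos resize computes (this is just the standard Lanczos-3 windowed-sinc weight, not Adobe's actual implementation), each output pixel is a weighted sum of nearby source pixels, with weights taken from a kernel like this:

// Lanczos-3 kernel weight: sinc(x) * sinc(x/3) for |x| < 3, otherwise 0.
// Illustrative sketch of the standard windowed-sinc formula only.
#include <cmath>
#include <cstdio>

const double kPi = 3.14159265358979323846;

double sinc(double x) {
    if (x == 0.0) return 1.0;
    return std::sin(kPi * x) / (kPi * x);
}

double lanczos3(double x) {
    const double a = 3.0;                 // window radius (number of lobes)
    if (std::fabs(x) >= a) return 0.0;
    return sinc(x) * sinc(x / a);         // windowed sinc
}

int main() {
    // Weights a resampler would apply to source pixels at these distances.
    for (double d = 0.0; d <= 3.0; d += 0.5)
        std::printf("L(%.1f) = %+.4f\n", d, lanczos3(d));
    return 0;
}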

In AE a CUDA card is a must if you intend to use AE's ray tracer...which is either CUDA or CPU....and I have tested....a CUDA render is about 15 times faster.

Since the Quadro cards have drastically reduced CUDA core counts...(the K5000 being the exception) you will see diminished performance in these apps vs any modern GTX....

Look at the specs on Nvidia....ignore the marketing


Where a Quadro really shines is massive amounts of polygons and OpenGL....

As Greenlaw said, you can get a lot for a $200 video card....

m.d.
05-29-2013, 12:47 AM
So with all this talk of video cards...... I have a rather dumb question: how important is the video card for LightWave?

I have LightWave 11.5, and my video card has issues and needs to be replaced. I was going to look at getting a really good video card for LightWave, but some people have told me it doesn't matter much since LW mostly uses the processor for calculations. What do you guys recommend I get? Thanks

You need some type of video card for any decent OpenGL performance....look for a good modern card with lots of RAM....
Best bang for the buck is usually 2 steps below the highest-end model...

If you don't use many other apps, and don't care about all this CUDA talk, then grab a good AMD card.

Digital Hermit
05-29-2013, 01:26 AM
Where a Quadro really shines is massive amounts of polygons and OpenGL....

As Greenlaw said, you can get a lot for a $200 video card....

Thanks guys. So, as chikega mentioned, the 660 Ti's specs look like it could beat the pants off my Quadro 2000, and for half the price!... anyone want to buy a slightly used vid card :D

LW_Will
05-30-2013, 01:13 PM
IMHO... the GTX 660 is the perfect card, a good balance of cost ($220ish) and power (960 CUDA Cores).

BigHache
05-30-2013, 07:31 PM
My Z800 workstation at work has a Quadro 4000 card. The adjacent system with a GTX 670 is a much better performer (in AE).

erikals
05-31-2013, 05:19 AM
Quadro doesn't help in LightWave

if you use Turbulence / 3DCoat / Octane, you'd want many CUDA cores
these might be of interest >

GTX 690 - 3072 CUDA cores
GTX 780 - 2304 CUDA cores
GTX 660Ti - 1344 CUDA cores

(remember to get the Ti version if you go for the 660...)

Rayek
05-31-2013, 11:48 AM
For a different perspective: I run an AMD 7970 to drive the viewport (three screens connected), and an Nvidia GTX 590 for CUDA stuff only. For LightWave, Cinema 4D and Blender this is the best of both worlds: those three applications' viewports run (sometimes much) better on an AMD/ATI card than on an Nvidia one. And 3dCoat makes use of CUDA as well. SoftImage runs really well too, smoking gigantic amounts of polys.

I have no experience with Max or Maya, though.

Another benefit is that I can use Octane and Cycles to render on the GPU in the background while I happily continue working. After Effects renders incredibly fast with the 590.

The drawbacks: Houdini (free version) dislikes AMD/ATI cards. A lot. I had to give up on Houdini because of all sorts of OpenGL issues. Modo also dislikes AMD/ATI cards, and although it runs, viewport performance is quite dismal: I get 36fps in scenes that run at 150~200fps on an Nvidia 680 card.

As far as Quadros go: unless dedicated drivers for software are available (Maya, Max, Autocad, etc) they're generally not worth spending the extra money on. They are slow for CUDA, and a high-end consumer card beats most Quadros (but for the best Quadro, which is ridiculously high priced).

To conclude, I would let the choice of video card depend on the software you use.

Firehawk
06-22-2013, 08:42 AM
Hi guys. I found this great thread. I need a new computer and am having a hard time deciding on a GPU. It appears the Quadro K4000 is newer and better than the Quadro 4000 and closer in performance to the old Quadro 5000. Is that correct?

I'm looking at a GTX 780 3GB or a Quadro K4000.
My machine will probably be an i7 3930 6-core 3.2 GHz with 32GB RAM.
I will use it primarily with Adobe CS6 (Premiere & After Effects) and also 3ds Max 2009.
I'm guessing the 780 would offer better performance in the Adobe stuff due to more CUDA cores and the K4000 would have an advantage in 3ds Max. However, since I'm using an older version of Max, can it utilize those advantages?
I will be using Adobe more than Max, so if there is a huge difference one way or the other I may need to keep that in mind.
Thanks in advance for any help with this. It helps to be able to ask people who know more.




unstable
06-22-2013, 08:50 PM
Well, I finally made my decision and purchased the GTX 780 with 3GB memory. There might be a little improvement in LW, but I think most of the improvement there is due to doubling my system RAM. Where I do see a VAST improvement is with RealFlow and 3DCoat. Those CUDA cores help a lot. I haven't tried it with AE yet, but I am looking forward to doing so. So far I think it was a great investment.

juanjgon
06-23-2013, 02:14 AM
Well, I finally made my decision and purchased the GTX 780 with 3GB memory. There might be a little improvement in LW, but I think most of the improvement there is due to doubling my system RAM. Where I do see a VAST improvement is with RealFlow and 3DCoat. Those CUDA cores help a lot. I haven't tried it with AE yet, but I am looking forward to doing so. So far I think it was a great investment.

This is a nice card for Octane, the best one for CUDA applications, only a little slower than a GTX Titan ;)

-Juanjo

BeeVee
06-23-2013, 03:55 PM
I replaced my GTX560Ti with a GTX660Ti and got a reduced power draw and roughly three times better performance from Octane - it's not often you can say you got a 3x improvement by spending €300 these days... :)

B

Digital Hermit
06-23-2013, 07:19 PM
I think I am going to hold out, for the next few days, until the GTX 760 Ti is released... I think it's priced the same as or better than the 660 Ti, with better performance... I hope.

Cozfx
08-25-2013, 03:05 PM
http://www.tomshardware.com/reviews/best-workstation-graphics-card,3493-12.html

No one is really advocating for the Quadro, but these results are impressive for the Quadros in LightWave. Anyone know the specifics of the SPECviewperf tests for Light01?

spherical
08-25-2013, 03:52 PM
Haven't messed with Quadros for a long time. What they are good for, as one poster mentioned, is manipulating a bazillion points and a bazillion lines. Their Z-buffers are optimized for that. That is what CAD packages require. A CAD package doesn't always work in polys. Even though 3D applications have a wireframe display mode, it's not quite the same: those are edges of polygons. You would get some speed increase when running a Quadro in these modes, but it certainly isn't worth the price. This is partly why display modes in LW that employ smoothing are way faster on a GTX or equivalent.

jboudreau
08-25-2013, 05:09 PM
http://www.tomshardware.com/reviews/best-workstation-graphics-card,3493-12.html

No one is really advocating for the Quadro, but these results are impressive for the Quadros in LightWave. Anyone know the specifics of the SPECviewperf tests for Light01?

I have the Quadro K5000 and it rocks in LightWave. It blows any gaming card out of the water, especially when it comes to OpenGL performance. I can verify that the benchmarks above are correct. The K5000 is very fast in LightWave, especially if you use some of the driver presets for the Quadro card.

In the mammoth performance scene I can rotate around the viewport with ease and smoothness, and that's after adding 1000s of instances of the mammoth to the scene.

Cozfx
08-25-2013, 05:26 PM
I have the Quadro K5000 and it rocks in LightWave. It blows any gaming card out of the water, especially when it comes to OpenGL performance. I can verify that the benchmarks above are correct. The K5000 is very fast in LightWave, especially if you use some of the driver presets for the Quadro card.

In the mammoth performance scene I can rotate around the viewport with ease and smoothness, and that's after adding 1000s of instances of the mammoth to the scene.

That's what I was hoping, since I just ordered a workstation with a K4000 card. I am like most: I want to have my cake and eat it too. I like to game on the side with my home office computer, but I need more power in LightWave. The question one should ask is not whether a gaming card will perform in LightWave but rather whether a workstation card will work for gaming. From what I have researched, gaming, even on a K2000, is just fine. Whether you can justify the cost, again, depends on how much your time is worth. I hit the wall with dual gaming cards on relatively low-poly models, and as my fidelity continues to increase, I'm looking forward to production boosts with this new rig. (Dual 6-core, hyper-threaded Xeons)

I have not seen any press from NewTek on whether LightWave is coded to take advantage of workstation cards, but the data seems to support it. (I should rephrase that...whether the Quadro drivers are tweaked for LightWave is the better question.)

jboudreau
08-25-2013, 05:43 PM
That's what I was hoping, since I just ordered a workstation with a K4000 card. I am like most: I want to have my cake and eat it too. I like to game on the side with my home office computer, but I need more power in LightWave. The question one should ask is not whether a gaming card will perform in LightWave but rather whether a workstation card will work for gaming. From what I have researched, gaming, even on a K2000, is just fine. Whether you can justify the cost, again, depends on how much your time is worth. I hit the wall with dual gaming cards on relatively low-poly models, and as my fidelity continues to increase, I'm looking forward to production boosts with this new rig. (Dual 6-core, hyper-threaded Xeons)

I have not seen any press from NewTek on whether LightWave is coded to take advantage of workstation cards, but the data seems to support it.

Once you get your new card, set your 3D presets to 'visual simulation' in the Nvidia driver. This will give you a huge boost in performance. I told this to the LightWave support team and they set theirs to that setting and couldn't believe the difference in performance. The OpenGL is extremely fast and the viewport redraw speed is awesome; even Layout opens up faster. The support team even told me they would talk to Nvidia to see why there was such a difference in performance when using the visual simulation or video app presets. It makes that much of a difference; you will seriously be quite shocked.

I created a post a while back explaining what those settings were and what they did. The post was titled something like 'huge boost in performance Quadro K5000 card'. Do a search and I'm sure you will find it.

These settings only work for Quadro cards, especially the Kxxxx-series Quadro cards.

Cozfx
08-25-2013, 07:24 PM
Once you get your new card, set your 3D presets to 'visual simulation' in the Nvidia driver. This will give you a huge boost in performance. I told this to the LightWave support team and they set theirs to that setting and couldn't believe the difference in performance. The OpenGL is extremely fast and the viewport redraw speed is awesome; even Layout opens up faster. The support team even told me they would talk to Nvidia to see why there was such a difference in performance when using the visual simulation or video app presets. It makes that much of a difference; you will seriously be quite shocked.

I created a post a while back explaining what those settings were and what they did. The post was titled something like 'huge boost in performance Quadro K5000 card'. Do a search and I'm sure you will find it.

These settings only work for Quadro cards, especially the Kxxxx-series Quadro cards.

Thanks for the tip! I'll reply back with results as well. Should have my new box in a couple of weeks. Probably sooner.

jasonwestmas
08-25-2013, 09:01 PM
It's too bad AMD doesn't have CUDA......
for LightWave they smoke Quadros
http://www.tomshardware.com/reviews/firepro-w8000-w9000-benchmark,3265-7.html

OpenCL will be the answer to that, if it ever becomes standard.

tcoursey
08-27-2013, 03:50 PM
If you are running 2 monitors & you want to calibrate each monitor you will most likely need 2 cards. I had to do this with my setup, but it is old & maybe this has been sorted out by now.

I don't think this is a feature of the CARD but rather your OS. I use one Quadro FX 3700 (I know it's OOOLLLDDD lol) calibrated with Spyder Elite. The Spyder loads a color profile through Windows for each individual monitor.

It worked pretty well. I have a 24" LED and a very old 19" LCD. They calibrated very close...


Greenlaw
08-27-2013, 05:17 PM
I have a single GTX 460 but I can calibrate my desktop display and Cintiq separately. I think this is a function of my current calibration software (Spyder 4 Elite) but I used to be able to do this a few years ago under XP using a Microsoft color utility that I had to install separately.