
GTX or Quadro for LW?



Taro Yoshimoto
02-17-2014, 09:16 AM
Hi, I've ordered a GTX 780 Ti card for my workstation. I just saw an article saying that a Quadro card is something like 10x faster at handling OpenGL polygons.

I am fed up with Modeler and Layout slowing down when using heavy objects/displacement/DPont Part Move node (Mac Pro 2013, ATI HD 5870).

I thought that for LightWave and C4D a GTX card was OK, but since LW uses OpenGL and not DirectX, maybe the Quadro is the way to go.

Thanks.

Taro Yoshimoto
02-17-2014, 09:23 AM
Please look here:

http://www.tomshardware.com/reviews/best-workstation-graphics-card,3493-12.html

Quadro K4000: 97.15 fps
GTX Titan: 22.25 fps
GTX 680: 22.38 fps

That's quite dramatic. Almost insulting.

Dexter2999
02-17-2014, 09:47 AM
I was going to mention that this discussion happened not too long ago here in the forums. The long-standing answer has always been "don't bother with the Quadro; the extra money isn't worth the negligible performance increase".

But things appear to be changing. Someone posted that very same Tom's Hardware article (I believe they may have been part of the testing) and showed a considerable increase in performance. That, combined with the near-astronomical price of a top consumer card like the 780 Ti or Titan, means you have to re-evaluate.

I believe in the other thread they stated that the testing was not done with LW11. I would love to see LW3DG (or some user group like the LA group) do a side-by-side performance test to come up with a "recommended" card, or maybe a "best bang for buck" card. A LW-specific test for the user base covering Modeler performance, Layout performance, and perhaps Octane performance.

Taro Yoshimoto
02-17-2014, 10:15 AM
I have access to a K4000 here at my job. I might be able to swap it in and run some speed tests.

What tool do you use to see the FPS in Modeler and Layout?
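
In the meantime, here's a minimal sketch of how a generic OpenGL FPS counter works outside of LW, just as a point of reference. It uses GLFW and PyOpenGL, which are my assumptions here (not LightWave tools); it simply counts buffer swaps per second with vsync turned off:

```python
# Minimal OpenGL FPS counter sketch (not a LightWave tool).
# Assumes the 'glfw' and 'PyOpenGL' packages are installed.
import time
import glfw
from OpenGL.GL import glClear, GL_COLOR_BUFFER_BIT, GL_DEPTH_BUFFER_BIT

def main():
    if not glfw.init():
        raise RuntimeError("GLFW failed to initialise")
    window = glfw.create_window(1280, 720, "FPS test", None, None)
    if not window:
        glfw.terminate()
        raise RuntimeError("Could not create an OpenGL window")
    glfw.make_context_current(window)
    glfw.swap_interval(0)              # turn vsync off so FPS isn't capped

    frames, t0 = 0, time.perf_counter()
    while not glfw.window_should_close(window):
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT)
        # ...draw heavy test geometry here...
        glfw.swap_buffers(window)
        glfw.poll_events()
        frames += 1
        now = time.perf_counter()
        if now - t0 >= 1.0:            # report roughly once per second
            print(f"{frames / (now - t0):.1f} fps")
            frames, t0 = 0, now

    glfw.terminate()

if __name__ == "__main__":
    main()
```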

Taro Yoshimoto
02-17-2014, 10:45 AM
Looks like you can hard-mod a GTX into a Quadro once again:
http://www.eevblog.com/forum/chat/hacking-nvidia-cards-into-their-professional-counterparts/

Quite interesting!

souzou
02-17-2014, 11:03 AM
I would love to see LW3DG (or some user group like the LA group) do a side-by-side performance test to come up with a "recommended" card, or maybe a "best bang for buck" card. A LW-specific test for the user base covering Modeler performance, Layout performance, and perhaps Octane performance.

This would be valuable not only for the user base but surely for LW3DG as well, so they can see how LW is performing on a range of cards. I know they can't test every combination, but you'd think a range of 6-7 cards would be feasible (both in time and cost).
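
To make that concrete, here's a rough sketch of how pooled results could be tabulated once testers report their numbers. The card names, test names and FPS values below are placeholders for illustration, not real measurements:

```python
# Hypothetical aggregation of community benchmark results.
# All cards, tests and numbers are placeholders, not measurements.
results = {
    "Card A": {"Modeler tumble": 30.0, "Layout playback": 24.0},
    "Card B": {"Modeler tumble": 90.0, "Layout playback": 60.0},
}

tests = sorted({t for fps in results.values() for t in fps})
print("Card".ljust(10) + "".join(t.rjust(18) for t in tests))
for card, fps in sorted(results.items()):
    print(card.ljust(10) + "".join(f"{fps.get(t, 0.0):18.1f}" for t in tests))
```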

Dexter2999
02-17-2014, 12:33 PM
Interesting, yes, but I don't have the nerve to take a soldering iron to my graphics card.

What I also find interesting is that Nvidia is drawing a clear delineation between their pro and consumer cards. They cripple the pro features in consumer cards, and in their pro cards they disable the consumer functions that tie into their "Shield" line and other software.

So, kids who think they are going to get pro performance and top gaming performance out of the same card are disappointed to find it's not gonna happen.

jboudreau
02-17-2014, 02:37 PM
Please look here:

http://www.tomshardware.com/reviews/best-workstation-graphics-card,3493-12.html

Quadro K4000: 97.15 fps
GTX Titan: 22.25 fps
GTX 680: 22.38 fps

That's quite dramatic. Almost insulting.

I can definitely confirm these results are accurate. I own a K5000 and a K5000M, and I can tell you they are extremely fast in LightWave. If anybody needs me to do any tests or answer any questions, just let me know.

Here is a post I wrote a year ago about settings you can use to get a huge performance boost: http://forums.newtek.com/showthread.php?133958-Tip-for-a-HUGE!!!-Increase-in-OpenGL-Performance!!-Lightwave-11-to-11-5&highlight=quadro

I wonder if Tom's Hardware knows about these settings for the Quadro cards.

Thanks,
Jason

fazi69
02-17-2014, 03:58 PM
There is no need for soldering. You can easily change your GTX to accept Quadro drivers; it's an easy, few-minutes job. I have my lame 550 Ti with Quadro drivers and it is 20 frames faster in Cinebench R15. Here: http://forums.guru3d.com/showthread.php?t=377158
I'm sure it will not unleash all the power, but it works!

realgray
02-17-2014, 08:39 PM
This is a great discussion. I was leaning toward a 780 but after reviewing the benchmarks I'm now leaning toward the K4000 for a LW/AE/Premiere workflow.

spherical
02-17-2014, 10:11 PM
Just remember that the benefits of one do not exist in the benefits of the other. Anything that requires the capabilities of a consumer card is pretty much negated in a pro card; and vice-versa. IOW, they have compartmentalized nearly everything. Tumbling in a viewport will be great in a Quadro. Rendering with one will not be so great, nor will anything that needs similar capabilities that would be employed in a game environment.

I don't think that people are grasping the essential difference and are just glomming onto benchmarks of one type or another, without realizing what they really mean or pertain to.

3dWannabe
02-17-2014, 10:35 PM
GTX Titan & Octane!! The CUDA results in the same Tom's Hardware review favor the Titan:
http://www.tomshardware.com/reviews/best-workstation-graphics-card,3493-20.html#

realgray
02-18-2014, 05:51 AM
GTX Titan & Octane!! The CUDA results in the same Tom's Hardware review favor the Titan:
http://www.tomshardware.com/reviews/best-workstation-graphics-card,3493-20.html#

Octane is a beautiful engine, but I would like to keep the topic (with the OP) centered on pro vs. gamer cards for LightWave (and possibly Adobe), if that's OK.

realgray
02-18-2014, 05:57 AM
Just remember that the benefits of one do not exist in the benefits of the other. Anything that requires the capabilities of a consumer card is pretty much negated in a pro card; and vice-versa. IOW, they have compartmentalized nearly everything. Tumbling in a viewport will be great in a Quadro. Rendering with one will not be so great, nor will anything that needs similar capabilities that would be employed in a game environment.

I don't think that people are grasping the essential difference and are just glomming onto benchmarks of one type or another, without realizing what they really mean or pertain to.

Thank you for the clarification. I plan on rendering in LW, so I'll go beefier on the CPU. When you say the game card will render better, are you talking about just 3D CUDA-enabled engines or everything, like Premiere and AE? It's a delicate choice, since the price difference between a 780 and a K4000 is less than 100 dollars. Nvidia is making this all the more confusing as time goes on :)

OFF
02-18-2014, 11:01 AM
I have a low-end Quadro 600 card for two displays and one GTX card (560 Ti) for GPU calculations. The Quadro is much smoother in large-polygon scenes than the GTX.

spherical
02-18-2014, 03:00 PM
I plan on rendering in LW, so I'll go beefier on the CPU. When you say the game card will render better, are you talking about just 3D CUDA-enabled engines or everything, like Premiere and AE?

Premiere CS5 and AE CS6 use CUDA for some of their functionality. Encoding/decoding is done by the CPU.
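
If you want to see which cards those CUDA code paths can actually use, you can enumerate the devices yourself. A minimal sketch using PyCUDA (my choice here, not something Adobe or LW ship):

```python
# Minimal CUDA device enumeration sketch.
# Assumes the 'pycuda' package and an NVIDIA driver with CUDA support.
import pycuda.driver as cuda

cuda.init()
count = cuda.Device.count()
print(f"CUDA devices found: {count}")
for i in range(count):
    dev = cuda.Device(i)
    major, minor = dev.compute_capability()
    mem_gb = dev.total_memory() / (1024 ** 3)
    print(f"  [{i}] {dev.name()}  compute {major}.{minor}  {mem_gb:.1f} GB")
```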

Taro Yoshimoto
02-19-2014, 07:51 AM
"The Quadro is much smoother in large-polygon scenes than the GTX"

This is seriously bad news. I always thought that the crippling only affected Maya, Softimage and 3ds Max.

LightWave can't use OpenCL? Or DirectX? (I don't know much about these things.)

OFF
02-19-2014, 09:28 AM
Nope, only GL. But even with a low-end Quadro 600 card, viewport manipulation speed in LightWave is higher than in 3ds Max (in my case at least).
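
One quick way to check which renderer and driver the OpenGL context reports (the same strings LW's viewport ends up talking to) is to ask GL directly. A minimal sketch assuming the GLFW and PyOpenGL packages, nothing LightWave-specific:

```python
# Print the OpenGL vendor/renderer/version strings for the default GPU.
# Assumes the 'glfw' and 'PyOpenGL' packages are installed.
import glfw
from OpenGL.GL import glGetString, GL_VENDOR, GL_RENDERER, GL_VERSION

if not glfw.init():
    raise RuntimeError("GLFW failed to initialise")
glfw.window_hint(glfw.VISIBLE, glfw.FALSE)   # hidden window, just for a context
window = glfw.create_window(64, 64, "gl-info", None, None)
glfw.make_context_current(window)

for label, token in (("Vendor", GL_VENDOR),
                     ("Renderer", GL_RENDERER),
                     ("Version", GL_VERSION)):
    print(f"{label}: {glGetString(token).decode()}")

glfw.terminate()
```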

Taro Yoshimoto
02-19-2014, 10:39 AM
Well well well... Our technical director just contacted me. HP will not warranty the use of a GTX card on the workstation (z820). I asked for a K5000 to replace the GTX 780 Ti. Four to six weeks of waiting for the monster.

spherical
02-19-2014, 03:55 PM
HP will not warranty the use of a GTX card on the workstation (z820).

What's the point of that!? A card from the same manufacturer won't harm the machine in the least. They're essentially the same card, with straps open/closed in order to turn on/off capability. In fact, there are people adding/removing surface mount resistors to change the hardware ID, so that the drivers will recognize the card as a Quadro and turn on the Pro features. Arbitrary line drawing like this, for no real reason at all, is just stupid.
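
For anyone curious, the "hardware ID" those straps and resistors change is just the card's PCI vendor/device ID pair, which the driver reads before deciding which features to enable. On Linux you can read it straight from sysfs; a minimal sketch (standard sysfs layout assumed):

```python
# List PCI vendor:device IDs for NVIDIA display adapters (Linux sysfs).
# 0x10de is NVIDIA's PCI vendor ID; the device ID is what tells the
# driver whether it is looking at a GeForce or a Quadro.
from pathlib import Path

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    pci_class = (dev / "class").read_text().strip()
    if not pci_class.startswith("0x03"):      # 0x03xxxx = display controller
        continue
    vendor = (dev / "vendor").read_text().strip()
    device = (dev / "device").read_text().strip()
    if vendor == "0x10de":                    # NVIDIA
        print(f"{dev.name}: {vendor}:{device}")
```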

Dexter2999
02-19-2014, 06:10 PM
I just watched the VPR vs Octane vid on YouTube. It says it was uploaded three years ago (seems hard to believe it's been that long). That test says it was done with "2x X5650 vs GTX 460".

Would love to see what different benchmarks in LW would be with current cards vs CPUs. Xeons vs i7s. CPU vs GPU (VPR vs Octane). Modeler performance vs Layout performance. Rated performance plotted against budget.
All to help users determine the best system for their budget.

Octane with a Titan sounds awesome, but that is over $1400 and not really in my budget as a casual user. Whereas power users might find it indispensable for use with clients (needing faster real-time feedback). Non-client-based houses might find other features more important, such as faster Modeler feedback, and spend the extra money on a render farm machine.
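
On the performance-vs-budget idea, once FPS numbers exist the ranking itself is trivial. Here's a tiny sketch of FPS per dollar, with placeholder prices and scores (assumptions, not quotes or benchmark results):

```python
# FPS-per-dollar ranking sketch; all prices and scores are placeholders.
cards = [
    ("Card A", 22.0, 1400.0),   # (name, average fps, price in dollars)
    ("Card B", 97.0, 800.0),
]
for name, fps, price in sorted(cards, key=lambda c: c[1] / c[2], reverse=True):
    print(f"{name}: {fps / price:.3f} fps per dollar")
```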

Taro Yoshimoto
02-20-2014, 03:37 PM
What's the point of that!? A card from the same manufacturer won't harm the machine in the least. They're essentially the same card, with straps open/closed in order to turn on/off capability. In fact, there are people adding/removing surface mount resistors to change the hardware ID, so that the drivers will recognize the card as a Quadro and turn on the Pro features. Arbitrary line drawing like this, for no real reason at all, is just stupid.

I think the HP seller convinced our TD that it was better that way (he wants to sell that card!). I am not going to contradict them, as they are replacing an $800 card with a $2000 one. The Quadro might have fewer CUDA cores, but I'm sure the driver is going to crunch more polys in OpenGL.

Point is, I work at a TV station and the people buying the stuff are rather inefficient. They have no clue. A bit like your aunt and uncle would be. This time I got lucky. Normally it's tedious.

spherical
02-20-2014, 04:20 PM
The OpenGL performance will be better, yes. For reference, here's a comparison of render performance between a few GPUs by Thea Render (http://www.thearender.com/cms/index.php/news/edition-13.html):

[Attached image: Thea Render GPU render benchmark comparison chart]

Was going to post this in the thread you started today but trying to keep these related things all in one place, 'cuz this video card thing is gaining a life of its own.

Dexter2999
02-20-2014, 05:04 PM
I think the HP seller convinced our TD that it was better that way (he wants to sell that card!). I am not going to contradict them, as they are replacing an $800 card with a $2000 one. The Quadro might have fewer CUDA cores, but I'm sure the driver is going to crunch more polys in OpenGL.

Point is, I work at a TV station and the people buying the stuff are rather inefficient. They have no clue. A bit like your aunt and uncle would be. This time I got lucky. Normally it's tedious.

I went through something similar many years ago with my AVID workstation. To get customer support from AVID you had to have a Quadro card installed. The first question they asked when you called support was "Are all components of your system AVID Certified?" If you said "no", that was the end of your call. Back then we paid double for a Quadro card with lower performance than I could have gotten from a high-end consumer card, simply to maintain the warranty and get support.

This could be the exact same situation. Specs mean nothing. If you want the warranty/support, you have to fulfill the terms of the agreement.

Sorry to hear this kind of thing is still going on.

Taro Yoshimoto
02-22-2014, 06:51 AM
Well, the seller usually takes care of all our Avid workstations. That might explain it. However, on the HP website you have the option to get it with no card at all. It's really just the seller trying to make our TD nervous about gaming cards so he'll buy the big monster card.

spherical
02-22-2014, 03:37 PM
That tells the story, right there.