
Nvidia Quadro FX series video cards not accelerating render time



jesblood
10-06-2009, 04:27 PM
Does anyone here know why my Nvidia Quadro FX 570 mobile video card isn't accelerating my renders, or is it? I'm confused. I used to be a Max user, but I switched to the light side of the force. Now, as a user of Lightwave, I'm very happy with the animation toolset and personally feel it is superior and a lot easier to understand. However, I have to give Discreet their props: when I went to render a still or an animation, it fully made use of the video hardware and kicked out my renders with lightning speed. Am I missing something? Do I need to adjust Lightwave's settings somewhere, or my video card's? Or is it that Lightwave will use the hardware's acceleration for OpenGL only? My viewports run fine with the acceleration; I'm just not seeing any real rendering speed.

Please, someone with a bigger cranium than mine, HELP!!!

IgnusFast
10-06-2009, 04:57 PM
You're right - the only benefit you'll get from a video card is OpenGL performance. Rendering is purely CPU-bound, at least for the current Lightwave generation.

Hieron
10-06-2009, 06:55 PM
Is Max helped by GPU speed on final renders?

biliousfrog
10-07-2009, 03:00 AM
No current render engines, aside from the new iray and the now-dead Gelato, use GPUs.

jesblood
10-07-2009, 08:06 AM
biliousfrog, I played with Max v5. I've never handled the latest Max, so I don't know what the current one is like, but when I used Max on this video card I saw a powerful difference in render time. Animation that would have taken hours shrank to mere minutes. I don't know if that's because that version of Max was old or what, but it did speed up the render time.

By the way, impressive website and accomplished art projects as well. I'm humbled. Very cool stuff.

To everyone else -- Thanks for your help in this matter. I was horribly confused. Thanks again.

P.S. - Has anyone found the best settings for camera, radiosity, and motion blur for rendering? I just recently rendered an 8-second animation of nothing but a spinning model, with radiosity and motion blur applied, at 800 x 600, and the bloody thing took 12 hours to kick out. Does anyone have any suggestions? I'm still relatively new to Lightwave. Any advice is beyond welcome. Thanks.

Hieron
10-07-2009, 02:20 PM
Max v5 is from 2002; there's no way that render time was aided by the GPU. Heck, nothing is aided by the GPU now in 2009 (bar the ones mentioned).

Without example images it's quite impossible to say anything sensible about where to improve AA/GI other than: read tutorials.
For example, this exce(pt)llent one: http://www.except.nl/lightwave/RadiosityGuide96/index.htm


Edit: I do remember from Maya that it's possible to select some OpenGL (or DirectX, no idea) thing as the renderer. I used it once to composite image maps onto monitors. Sure, it will help there, but it's pretty useless for rendering general things.
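
For what it's worth, a rough back-of-envelope on those 12 hours (assuming roughly 30 fps output, since you didn't say the frame rate):

8 seconds x 30 fps = 240 frames
12 hours = 720 minutes
720 minutes / 240 frames = about 3 minutes per frame

A few minutes per frame with radiosity and motion blur on everything isn't crazy, which is why trimming the GI settings per the guide above is where most of that time will come back.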

jesblood
10-07-2009, 08:43 PM
Hieron, thanks for the link, I really appreciate it. As for the Max v5 thing: look guys, I believe you about the GPU. I'm the last guy I'd ever call an authority on the subject, ever. However, I saw a massive difference between having this video card and not having a 3D graphics acceleration card at all. I don't know, maybe I'm in the twilight zone or something. I was just shocked that Max was kicking out renders faster than Lightwave. Even so, I still like Lightwave better anyway: less confusion, better controls.

Oh well thanks for the info and the help everyone. I really appreciate it.

JonW
10-07-2009, 10:50 PM
Rendering is all CPU. The programs would need to be rewritten from the ground up to move rendering to the GPU.

Maybe the graphics card reduced the CPU resources being used, but it definitely was not doing the rendering.

Titus
10-08-2009, 12:08 AM
Of course there are programs that render using the GPU:

http://www.studiogpu.com/

jesblood
10-08-2009, 12:52 AM
JonW - I'm sure that's it! What you said is the only thing that seems to fit the situation. Thanks for the suggestion. That has to be it.