View Full Version : Best GPU for Lightwave 2015



Blazoneer
11-20-2016, 02:04 PM
Hi all,

After bouncing around between trial versions of Maya, Houdini, Poser, Blender, and considering iClone, I am thrilled to finally get around to trying LightWave. With its animation and mocap plugin, library of tutorials, and proven production pipeline, it seems to be exactly the solution I've been seeking (although I love Houdini's physics and water simulation).

Anyway, I'm currently using an old Dell T7500 with a GTX 570. In some of the older threads on this site, I'm finding that LightWave doesn't (or didn't) benefit much from more powerful GPUs. I'm thinking of upgrading my computer to a used Dell 7600 and putting in an NVIDIA 1070, but I couldn't find any advice as to whether LightWave 2015 uses mostly the CPU or the GPU.

Any advice would help!

rustythe1
11-20-2016, 02:53 PM
CPU and RAM do all the rendering; the GPU is just for the OpenGL display, so no, you don't need anything powerful. It will quite happily run on any tablet, unless you get Octane Render.

Danner
11-20-2016, 03:04 PM
Welcome to the forums; take advantage of us. We tend to trip over ourselves to help a fellow LightWaver (a sharp contrast to places like the Unity forums, where I have three unanswered questions I posted months ago). A 1070 will definitely be faster than a 570, but even with a 1070 it will get sluggish at high polygon counts in Modeler (Layout handles this much better).

The way we get around this is to keep models optimized and separated, so we can work on only parts of the scene, and to clone as much as we can in Layout. For example, for large outdoor scenes with a lot of vegetation, we send only one plant of each type and clone them in Layout.

The next version of LightWave has made this a priority, and it has been shown to handle much better. We don't have an ETA on it and some folks are getting restless; I of course want better and faster, but right now LightWave is getting the job done.
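To see why cloning in Layout helps so much, here is a back-of-the-envelope sketch (illustrative only, not LightWave's actual internals or API): each full copy of a mesh carries all of its vertex data, while an instance only needs a transform matrix.

```python
# Hypothetical illustration of instancing vs. duplicating geometry.
# A full copy stores every vertex; an instance stores only a transform.

BYTES_PER_VERTEX = 3 * 4      # x, y, z as 32-bit floats
BYTES_PER_TRANSFORM = 16 * 4  # 4x4 float matrix per clone

def full_copies_bytes(vertex_count, clones):
    """Memory if every clone carries its own copy of the mesh data."""
    return clones * vertex_count * BYTES_PER_VERTEX

def instanced_bytes(vertex_count, clones):
    """Memory with one shared mesh plus one transform per clone."""
    return vertex_count * BYTES_PER_VERTEX + clones * BYTES_PER_TRANSFORM

# A 50,000-vertex plant cloned 1,000 times:
copies = full_copies_bytes(50_000, 1_000)   # 600,000,000 bytes (~600 MB)
instances = instanced_bytes(50_000, 1_000)  # 664,000 bytes (~0.66 MB)
```

The real savings in any 3D package depend on what per-instance attributes it stores, but the ratio makes clear why sending one plant and cloning it beats exporting a forest.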

jwiede
11-20-2016, 03:14 PM
It all depends on the plugins you intend to use (TurbulenceFD uses CUDA if available), whether you intend to use a GPU-based renderer, etc.

I'd recommend an NV 9x0-gen GPU over 10x0-gen right now, because CUDA support for 10x0-gen is still experimental and a bit spotty (e.g. the LW TFD plugin didn't yet support the Pascal/10x0-gen architecture, last I checked). I went down the same path recently and ultimately decided an NV 980 Ti made more sense than an NV 1080 (but I'm also on Mac, where 10x0-gen drivers have yet to be released, YMMV). Moving from an NV 570 to an NV 980 Ti roughly doubled performance for GPU-based calls and rendering, BTW.
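The Pascal issue above comes down to CUDA compute capabilities: kernels are compiled for specific GPU generations, and a plugin built before a new architecture shipped needs an updated binary (or JIT-compilable PTX) to run on it. A rough sketch of the version check, using the real compute capabilities of the cards mentioned in this thread (the `plugin_supports` logic itself is a simplified assumption, not any vendor's actual API):

```python
# Why a CUDA plugin built before Pascal can fail on 10x0-series cards:
# each card reports a "compute capability", and a plugin only runs on
# generations it was compiled for (or newer ones it ships PTX for).

ARCH_BY_CC_MAJOR = {2: "Fermi", 3: "Kepler", 5: "Maxwell", 6: "Pascal"}

# Actual compute capabilities of the cards discussed in the thread.
GPUS = {"GTX 570": (2, 0), "GTX 980 Ti": (5, 2), "GTX 1070": (6, 1)}

def plugin_supports(gpu_cc, max_supported_cc):
    """Simplified check: the plugin ships binaries up to max_supported_cc."""
    return gpu_cc <= max_supported_cc

# A hypothetical plugin built when Maxwell (5.x) was the newest target:
MAX_CC = (5, 3)
for name, cc in GPUS.items():
    status = "ok" if plugin_supports(cc, MAX_CC) else "needs an update"
    print(f"{name} ({ARCH_BY_CC_MAJOR[cc[0]]}): {status}")
```

Under that assumption, the 980 Ti (Maxwell, 5.2) works while the 1070 (Pascal, 6.1) needs a plugin rebuild, which matches what was being reported for TFD at the time.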

rustythe1
11-21-2016, 01:12 AM
And the NV 970 is really cheap if you can find one, and they will support VR if you move that way. I expect you could pick up four of them for less than one 1080; I only paid 250 each for mine two years ago. Each one will run three UHD monitors (I'm running four 1080p 32" monitors and a Wacom Cintiq).

mav3rick
11-21-2016, 02:39 AM
Whatever GPU you get, it will be a waste unless you are going to use it with Octane.

Norka
11-21-2016, 07:18 AM
Not a total waste, if one plays the occasional game... Blazoneer, I have three 980 Ti Hybrids and can't speak highly enough about them. One 980 Ti is actually as powerful in OctaneRender as two 580s! And a single 980 Ti is even powerful enough to set on low priority in Octane, use with your monitor with no problems, and still get ridiculously fast renders.

And maybe consider building your own workstation. You can get so much more bang for your buck.

Blazoneer
11-21-2016, 10:11 AM
Thanks, Danner. For my models, I have been attempting to find a sweet-spot ratio between low poly count and quality; in Blender, this seems to be about 768 tris. I am currently moving into Mudbox and will experiment with LightWave's Modeler. I have also been wondering about Poser's add-on Shade 3D, which advertises a slick-looking polygon-reduction tool. That said, the decimators in Blender and Mixamo usually destroy the model, so I'm hesitant about Shade 3D.

Blazoneer
11-21-2016, 10:14 AM
jwiede... interesting on the 10x0; I hadn't heard that. I had also been wondering about some of the dynamic add-ons. Also, thanks, Norka, rustythe1, and mav3rick. Years ago I purchased Octane Render for DAZ, and I was actually wondering if Otoy's forthcoming Brigade would be a useful tool.

Danner
11-21-2016, 03:12 PM
Well, object decimation tends to destroy the poly flow, but it is a good alternative in many cases, as long as you save an editable version with the loops and quads intact in case you need to modify it further. Of course, if it's something like an animated character, then you'll have to retopologize (not a dictionary word, but we all know what it means =D) to reduce the polygons.
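A minimal sketch of why automatic decimation wrecks poly flow, using vertex clustering (one generic decimation technique; none of the tools mentioned above necessarily use this exact method): vertices are snapped to a grid and merged, so nearby but topologically distinct vertices collapse together, and edge loops and quad structure are lost.

```python
# Vertex-clustering decimation sketch: merge vertices per grid cell,
# then drop any triangle whose three corners collapsed together.
# This is why decimators destroy loops: merging ignores topology.

def decimate(vertices, triangles, cell_size):
    """Return (new_vertices, new_triangles) after grid-cell merging."""
    cell_of = {}        # old vertex index -> new vertex index
    merged = {}         # grid cell -> new vertex index
    new_vertices = []
    for i, (x, y, z) in enumerate(vertices):
        cell = (int(x // cell_size), int(y // cell_size), int(z // cell_size))
        if cell not in merged:
            merged[cell] = len(new_vertices)
            new_vertices.append((x, y, z))
        cell_of[i] = merged[cell]
    new_triangles = []
    for a, b, c in triangles:
        a2, b2, c2 = cell_of[a], cell_of[b], cell_of[c]
        if len({a2, b2, c2}) == 3:   # skip degenerate (collapsed) triangles
            new_triangles.append((a2, b2, c2))
    return new_vertices, new_triangles

# Two small triangles whose vertices share grid cells vanish entirely:
verts = [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0), (0.0, 0.1, 0.0), (1.5, 0.0, 0.0)]
tris = [(0, 1, 2), (1, 3, 2)]
v2, t2 = decimate(verts, tris, cell_size=1.0)  # 2 vertices, 0 triangles left
```

Real decimators (quadric edge collapse, etc.) are far smarter about error, but all of them optimize for shape, not edge flow, which is why retopology is still the answer for animated characters.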

OFF
11-21-2016, 06:32 PM
Any low-budget professional GPU, like an NVIDIA Quadro K600 for example, will help you a lot with heavy scenes and models.

bkmvlswe
12-13-2016, 04:10 AM
I'm using the PNY Quadro P5000 myself, a 16 GB beast, but I intend to use the new dual-CPU workstation for LW, ZBrush, video editing, and games. I did a test animation with LW and could see all 40 threads working; not bad. Adobe software is not as good on that front.

How well do curved 21:9 monitors work with applications like LightWave? I can imagine they'll be nice for games and movies, but for 3D and CAD the curve might be a problem, I guess. I'm torn between Dell's 34" ultrawide and Dell's 27" 5K monitor.