View Poll Results: GPU Rendering in LightWave?

Voters: 29
  • Yes, Please! - 27 (93.10%)
  • I don't need it. - 1 (3.45%)
  • Other - 1 (3.45%)
Page 2 of 2
Results 16 to 30 of 30

Thread: GPU support - Rendering

  1. #16
    RETROGRADER prometheus's Avatar
    Join Date
    Aug 2003
    Location
    Stockholm, Sweden
    Posts
    15,119
    Quote Originally Posted by Tim Parsons View Post
    Pretty strong opinion. We might do 20 - 40 product renders a day with an average render time of 3 minutes apiece - hardly the end of the world. However, speed is king, and while I think the LW renderer has an amazing look and the shading system is really good, it's not the speediest. It was indeed very discouraging to hear that they let Mark G. go, as no one knows the LW renderer better, and I'd put money on it that his next course of action after shipping the product was to optimize the hell out of it. Can that still happen? Maybe.
    My beliefs about how LW's future will go can only be judged in hindsight.
    This is a topic-related thread, so I'm just giving a warning about it.

    The omens are out there, with users still using LW agreeing that it is too slow, and recognizing a certain open source software that provides CPU, GPU, and also a realtime engine.

    There have been quite a few users, including former LightWave developers, who seem to have left LightWave for that particular software, and I am focusing more on it as well.

    I still have hopes for LW, but this part is so important for me that I can't hang on to it if we do not see massive speed improvements on the render side, natively.

    Why would I, when a free tool can offer me much faster speed in most areas, together with features LightWave does not have?

    At a certain point I would have to weigh what I cannot do in the other software against what I can do in LightWave.

    As for your render times and product rendering:
    for simpler product renders you may get away with those 3-minute renders, which might take 1 minute or less with GPU.

    But if you need FiberFX characters, the acceptable rendering time will most likely not be within that low comparison range.
    Other rendering tasks would need to be compared as well.

    So sure, for some studios, businesses or single artists it would probably be tolerable, since they may value their own knowledge and the workflow they have established, but I suspect that list will shrink in relation to how much impact and buzz the other renderer makes.

    Well, it's my warning, heads-up and suggestion to the LW team: they seriously need to act on it.

    In the near future, and in hindsight, we may see whether I and others were wrong or right.

    Oh, sorry for some spelling errors... in bed on the mobile phone, a bit sleepy.

  2. #17
    RETROGRADER prometheus's Avatar
    Join Date
    Aug 2003
    Location
    Stockholm, Sweden
    Posts
    15,119
    Quote Originally Posted by DCJ View Post
    I just ordered a new workstation with dual RTX 4000s. LightWave is the only software I use regularly that won't benefit from them much. Most of my work is in video, but when I need 3D, LightWave is my go-to tool. Or at least it has been. I've been using LW for 23 years and am dreading moving to something else, but development seems way behind. I was just checking third-party plugins and couldn't find any released in a year. Maybe I'm just not looking in the right place.
    I mentioned omens in my previous post; a lack of third-party plugin releases in recent times may be one of them. There are, however, others who thankfully keep updating their plugins, like our hero Denis P.

    But otherwise the lack of new releases worries me as well.

    With all that GPU power, maybe you should give that free tool a go. You would not only have both CPU and GPU to choose from, with the same materials and the same render look output, but you would also be able to use Octane for free, up to a certain CUDA core level, that is.

    If it's too complicated or messy to use and learn, then you should of course stick to LightWave.

  3. #18
    Octane and Maxwell, as I've read, don't access LW's native surfaces or nodes, which to me is a deal breaker for using those.

    There are a lot of features I'd love to see added/updated, but the renderer is what I would like to see them focus on, either for the point release or even the major release.

    Of course, it would be great if the devs would chime in, but they're in silent mode again... hehe.
    Rob Depew
    Modeler/Animator/Editor
    Web Portfolio
    My Patreon
    YouTube
    Lightwave3D 2019, RF10, Gaea and more..
    Threadripper 32 core , 48gb, GTX-1070
    Asus 752vt i7 2.6 x 4, 48gb, GTX-970

  4. #19
    Quote Originally Posted by lwanmtr View Post
    ...and it really doesn't reduce time at all, from the few times I've tested it out.
    ok, now you're just pulling my leg... right??  
    LW vidz   DPont donate   LightWiki   RHiggit   IKBooster   My vidz

  5. #20
    Registered User
    Join Date
    Dec 2008
    Location
    Ghana, West Africa
    Posts
    849
    Quote Originally Posted by prometheus View Post
    I believe a native GPU solution is a must in the near future.
    I honestly don't believe LightWave will survive without it... unless they do magic with the code for CPU.
    Rendering is irrelevant for LightWave atm.
    I don't believe those who left LW for Blender did so for GPU rendering. There were other reasons. In fact, most of those people would have happily stayed in LW if it had all the functionality they needed.
    So for LW to stay relevant, it needs to start addressing long-standing legacy issues and adding missing functionality along with modern workflows.

    The GPU market is already too saturated for LightWave to compete in, and with realtime rendering also starting to take off, developing a GPU renderer would be wasted effort for LightWave at this point.
    Last edited by Hail; 09-18-2019 at 03:39 AM.

  6. #21
    Rendering is quite relevant, if you use LightWave for rendering, which many do.
    Rob Depew
    Modeler/Animator/Editor
    Web Portfolio
    My Patreon
    YouTube
    Lightwave3D 2019, RF10, Gaea and more..
    Threadripper 32 core , 48gb, GTX-1070
    Asus 752vt i7 2.6 x 4, 48gb, GTX-970

  7. #22
    RETROGRADER prometheus's Avatar
    Join Date
    Aug 2003
    Location
    Stockholm, Sweden
    Posts
    15,119
    Quote Originally Posted by Hail View Post
    Rendering is irrelevant for LightWave atm.
    I don't believe those who left LW for Blender did so for GPU rendering. There were other reasons. In fact, most of those people would have happily stayed in LW if it had all the functionality they needed.
    So for LW to stay relevant, it needs to start addressing long-standing legacy issues and adding missing functionality along with modern workflows.

    The GPU market is already too saturated for LightWave to compete in, and with realtime rendering also starting to take off, developing a GPU renderer would be wasted effort for LightWave at this point.
    At this point?
    So later, or never, may be a good choice?

    Unless CPU rendering improves drastically, through hardware or coding development, NewTek will most likely lose me as a customer. They may survive the loss of me as a customer... but then they need to evaluate how many others would be switching. Just saying.

    Sure, other reasons... but are you simply excluding GPU rendering altogether because you know for sure that is the case? So it wasn't at least one of the reasons?

  8. #23
    RETROGRADER prometheus's Avatar
    Join Date
    Aug 2003
    Location
    Stockholm, Sweden
    Posts
    15,119
    Quote Originally Posted by lwanmtr View Post
    Octane and Maxwell, as I've read, don't access LW's native surfaces or nodes, which to me is a deal breaker for using those.

    There are a lot of features I'd love to see added/updated, but the renderer is what I would like to see them focus on, either for the point release or even the major release.

    Of course, it would be great if the devs would chime in, but they're in silent mode again... hehe.
    True, Octane has its own materials.
    But from my point of view I prefer a GPU solution that works like Blender's: you go with either CPU or GPU, and the only difference is render speed, not the look of the materials or the render output. That is what I would like to see implemented in LightWave as well.
    Octane for Blender is free up to a certain level, of course, and it makes me think it is a bit sad we couldn't get that for LightWave, considering Lino Grandi now works for OTOY/Octane. It would be nice since it is free anyway, and it can't be the code that is blocking it if they already got it working with the commercial version. Could it be that Blender has a larger market they can aim at currently, or is something else blocking an implementation for LightWave?

    I tried Octane for Blender, and it is fast. Sadly there are not many materials to choose from, and I had issues with importing VDB where it just crashed, but I've only tried it a couple of times, and I would need to forward those issues to OTOY and see what can be solved, if it can, since it may require the paid version for more cores.

    I have run into issues with volumetric fluid and smoke scenes where I can no longer render with GPU. Fine, then I switch to CPU, but the fact is I can evaluate a scene at a certain stage and get almost instant feedback on how it looks before I commit to CPU.

    Rendering VDB or TFD in LightWave feels much slower than using fluids in Blender, and even though TFD doesn't use the new volumetric PBR engine by default, it is still slower than Blender's fluid volumetrics, especially with multiple scattering.

    Rendering of hair in Blender is so much faster. I just can't stand the CPU render speed of LightWave FiberFX hair, so a GPU renderer for that would be welcome. That said, maybe they can improve the CPU rendering of hair; Blender's CPU render is also faster than LightWave's, at least for previewing. I would have to do final render tests between Blender and LightWave to back that up, though.

    I believe it's the same with subsurface scattering: faster in Blender GPU than LightWave CPU. It may depend on other subsurface scattering settings as well, but settings near the defaults seem to yield that result.

    Now for GI, I think it's the same there: faster in Blender.
    For PBR materials in general, there may not be much of a difference.

    What still attracts me to LightWave is everything else not connected to render speed: I know it better than Blender, some interchange flows more easily, and it has a much, much larger set of procedurals that I miss in Blender. I still like the way volumetrics are set up, even though at a certain level they can become too slow to render.

    From what I can gather from user input, I hear a lot of: noise, noise; render speed slow; when do we get modeling tools in Layout, if ever?
    I think those issues are some of the biggest the LightWave team may have to wrestle with currently, followed by the UI and undo system, and further on, implementing innovative new stuff.

  9. #24
    RETROGRADER prometheus's Avatar
    Join Date
    Aug 2003
    Location
    Stockholm, Sweden
    Posts
    15,119
    Quote Originally Posted by lwanmtr View Post
    The GPU noise reduction was a good start... but it's really useless unless you're doing print resolutions, and it really doesn't reduce time at all, from the few times I've tested it out.
    If that statement was in reference to a CPU rendering engine, you are completely wrong. If you meant noise reduction, that's another thing.

  10. #25
    RETROGRADER prometheus's Avatar
    Join Date
    Aug 2003
    Location
    Stockholm, Sweden
    Posts
    15,119
    Quote Originally Posted by erikals View Post
    Back when Jay Roth was in charge, he wrote that NT would very probably include GPU support for rendering in the future.
    (Unfortunately, I couldn't find the quote.)
    But I also vaguely recall... I think he stated (I shouldn't fuel speculation) that GPU wasn't an option; they decided it wasn't worth it.

  11. #26

    I think you are mixing that up with Brad Peebler?

    Brad stated those exact things.
    Last edited by erikals; 09-18-2019 at 10:22 PM.
    LW vidz   DPont donate   LightWiki   RHiggit   IKBooster   My vidz

  12. #27
    Super Member Chris S. (Fez)'s Avatar
    Join Date
    Feb 2003
    Location
    Virginia
    Posts
    2,955
    He is.

  13. #28
    Registered User
    Join Date
    Aug 2016
    Location
    a place
    Posts
    1,850
    Quote Originally Posted by prometheus View Post
    But I also vaguely recall... I think he stated (I shouldn't fuel speculation) that GPU wasn't an option; they decided it wasn't worth it.
    genius!

  14. #29
    RETROGRADER prometheus's Avatar
    Join Date
    Aug 2003
    Location
    Stockholm, Sweden
    Posts
    15,119
    Quote Originally Posted by erikals View Post
    I think you are mixing that up with Brad Peebler?

    Brad stated those exact things.
    Yeah, sorry, you are most likely right about that.
    Who's the mutant who doesn't need GPU-accelerated rendering?

  15. #30
    I think that at the time the 'no GPU' statement was made, GPUs were still underpowered compared to CPUs (I could be wrong, because I don't remember when they said that).

    Still, for anyone today to say that GPU rendering isn't needed is silly, especially when you see what the realtime engines are doing with them.
    Rob Depew
    Modeler/Animator/Editor
    Web Portfolio
    My Patreon
    YouTube
    Lightwave3D 2019, RF10, Gaea and more..
    Threadripper 32 core , 48gb, GTX-1070
    Asus 752vt i7 2.6 x 4, 48gb, GTX-970

