
Thread: LightWave + RTX

  1. #1

    LightWave + RTX


    worth a read... >
    http://www.cgchannel.com/2019/03/rev...idia-titan-rtx

    also note the "Conclusion" part at the bottom

    Last edited by erikals; 09-03-2019 at 01:16 AM.

  2. #2
    Registered User | Join Date: May 2012 | Location: Virginia | Posts: 431
    Ayup... that's why I got two Titan RTXs... so damn expensive, but will pay off for me in the end.

  3. #3
    Almost newbie Cageman | Join Date: Apr 2003 | Location: Malmö, SWEDEN | Posts: 7,650
    With Otoy's subscription model coming soon, and since I still use LW, I would rather get two AMD Ryzen Threadripper 2950X CPUs (around $1,800) and then upgrade the two machines to 128GB of RAM. That pretty much adds up to the cost of ONE of those Titans.
    Senior Technical Supervisor
    Cinematics Department
    Massive - A Ubisoft Studio
    -----
    Intel Core i7-4790K @ 4GHz
    16GB Ram
    GeForce GTX 1080 8GB
    Windows 10 Pro x64

  4. #4
    You're probably looking at closer to $2,000 for two of those Threadrippers, and each is going to need at least a $300-400 motherboard. And depending on where your machine is at now, just 64GB of RAM is going to be $500-600 per machine. So you're looking at around $3,000 just for the CPU, memory and motherboard (I just paid around $3,000 for my 2950X rig), versus the $2,500 of a Titan RTX.

    If you want to utilize RTX, you're better off getting a couple of 2080 Tis (about $1,200 each). Of course, in LW that's only going to help if you use Octane or some other third-party renderer that supports it.
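
    Tallying those figures (a rough sketch in Python; every number below is just the approximate mid-2019 USD price quoted in posts #3 and #4, so treat the totals as ballpark only):

    Code:
    # ballpark totals from the round numbers quoted in this thread (mid-2019 USD)
    threadripper_2950x = 900     # each; post #3 puts two at "around $1800"
    motherboard        = 350     # "3 to 4 hundred dollar mobo" per machine
    ram_64gb           = 550     # "5 or 6 hundred each machine"

    two_cpu_nodes = 2 * (threadripper_2950x + motherboard + ram_64gb)

    titan_rtx  = 2500            # one Titan RTX
    rtx_2080ti = 1200            # one GeForce RTX 2080 Ti

    print(f"Two Threadripper nodes (CPU + mobo + RAM): ~${two_cpu_nodes}")
    print(f"One Titan RTX:                             ~${titan_rtx}")
    print(f"Two RTX 2080 Tis:                          ~${2 * rtx_2080ti}")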

  5. #5

    1 CPU + 1 GPU + 1 GPU is absolutely the best combo. Hands down.

    Check how Blender does it.

    Hopefully NT can copy the idea.
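
    For reference, this is roughly how Cycles exposes that hybrid CPU+GPU setup from Python (a minimal sketch, assuming Blender 2.8x with a CUDA-capable card; nothing here is an NT/LW API):

    Code:
    import bpy

    # point Cycles at the CUDA backend and enable every detected device,
    # GPUs and CPU alike, so render tiles get split across all of them
    prefs = bpy.context.preferences.addons['cycles'].preferences
    prefs.compute_device_type = 'CUDA'
    prefs.get_devices()
    for device in prefs.devices:
        device.use = True

    # render on the GPU device set; the enabled CPU then pitches in on extra tiles
    bpy.context.scene.cycles.device = 'GPU'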

  6. #6
    Super Member vncnt | Join Date: Sep 2003 | Location: Amsterdam | Posts: 1,576
    I wouldn't mind buying a few Titan RTX GPUs if Nvidia also offered a renderer plugin for LightWave. It would be in their own interest.

  7. #7
    Registered User | Join Date: Feb 2019 | Location: Lancashire, England | Posts: 9
    At the moment RTX is a proprietary technology exclusive to Nvidia. AMD and Intel will soon be releasing graphics cards with hardware ray-tracing features. It might make sense for the LightWave programmers to hang fire for now and see whether an API emerges, if one does not already exist, that exposes hardware ray tracing without being strictly proprietary to Nvidia. There might also be an update to OpenCL that adds hardware ray-tracing features in a newer version. I know a lot of people want GPU rendering native inside LightWave, but it might be a good idea to wait, not too long, and see how things pan out with AMD, Intel and open APIs such as OpenCL or something better/new.

  8. #8
    Quote Originally Posted by erikals View Post
    1 CPU + 1 GPU + 1 GPU is absolutely the best combo. Hands down.

    Check how Blender does it.

    Hopefully NT can copy the idea.
    NT can. If they rewrite the renderer they just rewrote a few years back.

  9. #9
    Registered User | Join Date: Jun 2014 | Location: Right here | Posts: 1,535
    Quote Originally Posted by erikals View Post
    1 CPU + 1 GPU + 1 GPU is absolutely the best combo. Hands down.

    Check how Blender does it.

    Hopefully NT can copy the idea.
    Yes they can copy the idea - but not more. (You can also do that)

    NT doesn't have the skills or resources to create a GPU render engine, which takes years to develop and refine. They don't even have their own CPU render engine developer anymore.

    See how long it took Chaos Group (V-Ray) and AD (Arnold) to implement GPU rendering/shading in their engines, and it's still not production ready, feature complete or fast compared to Redshift or Octane (Cycles is also much slower). And consider how many years it took the specialized teams behind those GPU engines to get there.

    Edit: It would be (would have been actually) wiser to implement Cycles or ProRender instead. Or write a Redshift bridge.
    Last edited by Marander; 09-04-2019 at 10:35 AM.

  10. #10
    Registered User | Join Date: May 2012 | Location: Virginia | Posts: 431
    Quote Originally Posted by Marander View Post
    Yes they can copy the idea - but not more. (You can also do that)

    NT doesn't have the skills or resources to create a GPU render engine, which takes years to develop and refine. They don't even have their own CPU render engine developer anymore.

    See how long it took Chaos Group (V-Ray) and AD (Arnold) to implement GPU rendering/shading in their engines, and it's still not production ready, feature complete or fast compared to Redshift or Octane (Cycles is also much slower). And consider how many years it took the specialized teams behind those GPU engines to get there.

    Edit: It would be (would have been actually) wiser to implement Cycles or ProRender instead. Or write a Redshift bridge.
    With ProRender, it's not like you're starting from scratch; you have the advantage of an SDK. I don't know how robust it is, but it's a far cry from building a render engine from the ground up.

    That shaves a significant amount of time off of building a plug-in or native implementation.

    It's not "tomorrow" but it's also not years.

  11. #11
    Registered User | Join Date: Jun 2014 | Location: Right here | Posts: 1,535
    Quote Originally Posted by RPSchmidt View Post
    With ProRender, it's not like you're starting from scratch; you have the advantage of an SDK. I don't know how robust it is, but it's a far cry from building a render engine from the ground up.

    That shaves a significant amount of time off of building a plug-in or native implementation.

    It's not "tomorrow" but it's also not years.
    Yes, I agree that it's faster than developing from the ground up; that's why I mentioned it.

    Still, in the case of Cycles and ProRender it took several years to get a good integration in Cinema. I prefer using Cycles4D; it's more versatile than ProRender, which is really only usable in the latest version.

  12. #12
    Registered User | Join Date: May 2012 | Location: Virginia | Posts: 431
    Quote Originally Posted by Marander View Post
    Yes, I agree that it's faster than developing from the ground up; that's why I mentioned it.

    Still, in the case of Cycles and ProRender it took several years to get a good integration in Cinema. I prefer using Cycles4D; it's more versatile than ProRender, which is really only usable in the latest version.
    I guess my point was that implementation isn't outside the realm of "doable" now. It's just a question of whether, and which, resources are dedicated to it.

    Cycles is excellent, but I think it's likely that there will be a heavy development shift to Eevee. They already seem to be leaning very heavily in that direction.

    In this case, it might be a safer bet for LW to go with ProRender.

  13. #13
    Eevee won't be a replacement for Cycles anytime soon. Comparatively, it's like saying Unreal could be a replacement for LW's renderer; they are just two different things. While you can get some really impressive results from both Unreal and Eevee, there are many things that can't be done in real time, and some things are only approximations of real-world lighting or shading that are normally intensive to simulate. In other words, Cycles won't be taking a back seat to Eevee; both are important. And Cycles should be receiving the ability to render with RTX in 2.81, which should be out by November. It likely won't be as fast as Octane or V-Ray, but it's no slouch already, and with RTX it should be quite performant.

    In all honesty, if I had to choose one or the other, Cycles is the better renderer and more fully featured. Maybe either would be possible in LW, but as Marander said, it took years to get a good integration in C4D. They added it to Modo within the last year, but it's early days and it needs a lot of work; I wouldn't use it in the state it's in. People should be realistic about hoping for GPU rendering in LW, as the outlook for something featured enough to be useful is slim to none. You're best off using Octane, which has quite a good implementation in LightWave, or using something external to render with.

  14. #14

    Quote Originally Posted by Cageman View Post
    With Otoy's subscription model coming soon,...
    Because of this, and other Octane policies, I'm leaning towards LW or B.

  15. #15
    Super Member Chris S. (Fez) | Join Date: Feb 2003 | Location: Virginia | Posts: 2,955
    $20 a month for Octane for LW, correct?
