
Thread: GPU Rendering - Anywhere in the development pipeline for Lightwave?

  1. #61

    only 3ds Max can open a 3ds Max scene
    yep
    LW vidz   DPont donate   LightWiki   RHiggit   IKBooster   My vidz

  2. #62
    Quote Originally Posted by cresshead View Post
    Having to develop for 2 different applications and THEN also make sure they line up to talk to each other with data communication must be really
    inefficient development wise....would be like having to develop maya AND 3dsmax at the same time.
This is probably why the Genoma preview window is broken in 2019 when you save a rig.

    and why development is nearly always layout centric over the last 10+ years.
    they really need a roadmap. (hm, i can recall someone saying the same thing... )

  3. #63
    Registered User
    Join Date
    Aug 2016
    Location
    a place
    Posts
    1,853
    Quote Originally Posted by cresshead View Post
    only 3ds Max can open a 3ds Max scene
yeah i know, just the idea. i guess that's the reason for this:

    https://graphics.pixar.com/usd/docs/index.html

    would be great if it gets good traction
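For context, USD is a plain-text-capable scene description, which is part of what makes it attractive as a neutral interchange format. A minimal hand-written .usda file (a made-up example, not from any real pipeline) looks something like this:

```
#usda 1.0

def Xform "root"
{
    def Mesh "plane"
    {
        int[] faceVertexCounts = [4]
        int[] faceVertexIndices = [0, 1, 2, 3]
        point3f[] points = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
    }
}
```

Any application with a USD importer can read geometry like this, which is exactly the kind of traction being hoped for.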

  4. #64
    Registered User
    Join Date
    Aug 2016
    Location
    a place
    Posts
    1,853
    Quote Originally Posted by erikals View Post
    they really need a roadmap. (hm, i can recall someone saying the same thing... )
    “strong three-year roadmap”

    http://www.cgchannel.com/2011/06/rob...-year-roadmap/

    must have been one at some point. unification looked to be one, 8 yrs later...

    anyway, wgas

  5. #65
    Registered User
    Join Date
    Jun 2014
    Location
    Right here
    Posts
    1,535
    Quote Originally Posted by robertoortiz View Post
    Honestly, the way things are going, they would be better off offering support for Blender scenes. Since that is too complicated, I would settle for full Alembic support.
    https://en.wikipedia.org/wiki/Alembi...uter_graphics)
    EDIT
    LW 2019 does offer it. My bad.
    No, LightWave 2019 and earlier don't offer proper Alembic integration.

    In most scenarios you need changing geometry (for particle or liquid simulation, for example), which leads to a mess in LW.

    Works well in all other applications I use and I hope NewTek will fix this in a future version of LW.

  6. #66
    Registered User
    Join Date
    Aug 2016
    Location
    a place
    Posts
    1,853
    i think this was one of Rob's goals, right? really good interchange.
    it’s a real kick in the pants when you hit one of those moments.

  7. #67
    Registered User
    Join Date
    Aug 2016
    Location
    a place
    Posts
    1,853
    Quote Originally Posted by Marander View Post
    Works well in all other applications I use
    this is one of the main reasons people end up switching. half implemented or buggy/lacking features.

    sigh, don’t mind me, one of those days..

  8. #68
    Registered User
    Join Date
    May 2012
    Location
    Virginia
    Posts
    431
    So I am seriously thinking of going with Marmoset Toolbag.

    It's cheaper than Octane, has a perpetual license without restriction for systems that aren't connected to the internet, and doesn't require any additional hardware.

    I don't think it's a better renderer; but my choices are few and far between and I need a solution.

  9. #69
    Super Member OlaHaldor's Avatar
    Join Date
    May 2008
    Location
    Norway
    Posts
    998
    The downside with Marmoset Toolbag is that you can't store the lookdev so it's picked up in a new scene. Yes, it can import textures for you (if the model file is exported properly), but it won't load any tweaks you've done, such as using the sliders to fine-tune the spec/gloss or metal/roughness, AO, emissive, displacement, opacity etc.


    If you're only doing stills, with only a few different scenes, Marmoset is pure joy and awesome.
    3D Generalist
    Threadripper 2920x, 128GB, RTX 2080 Ti, Windows 10

  10. #70
    Registered User
    Join Date
    May 2012
    Location
    Virginia
    Posts
    431
    Quote Originally Posted by OlaHaldor View Post
    The downside with Marmoset Toolbag is that you can't store the lookdev so it's picked up in a new scene. Yes, it can import textures for you (if the model file is exported properly), but it won't load any tweaks you've done, such as using the sliders to fine-tune the spec/gloss or metal/roughness, AO, emissive, displacement, opacity etc.


    If you're only doing stills, with only a few different scenes, Marmoset is pure joy and awesome.
    I'm still holding out for a bit... Kray keeps teasing GPU render and I'm hoping that the LWDG will at least start working with some of the other existing GPU renderers to get plugins developed.

  11. #71
    www.Digitawn.co.uk rustythe1's Avatar
    Join Date
    Feb 2006
    Location
    england
    Posts
    1,280
    I think that's only a couple of features, and only via OpenCL; Kray is still CPU-based and geared toward multithreaded CPUs (that's why all the tests seem to be on dual Xeons)
    Intel i9 7980xe, Asus Rampage Vi extreme, 2x NVIDIA GTX1070ti, 64GB DDR4 3200 corsair vengeance,
    http://digitawn.co.uk https://www.shapeways.com/shops/digi...ction=Cars&s=0

  12. #72
    Registered User
    Join Date
    May 2012
    Location
    Virginia
    Posts
    431
    Quote Originally Posted by rustythe1 View Post
    I think that's only a couple of features, and only via OpenCL; Kray is still CPU-based and geared toward multithreaded CPUs (that's why all the tests seem to be on dual Xeons)
    In the current version, yes; but they have pretty much stated that they will be introducing full GPU rendering in an upcoming release.

  13. #73
    Registered User ianr's Avatar
    Join Date
    Oct 2006
    Location
    Chiltern Riviera
    Posts
    1,401
    Quote Originally Posted by RPSchmidt View Post
    In the current version, yes; but they have pretty much stated that they will be introducing full GPU rendering in an upcoming release.
    We need GPU solvers, because how can you get past that speed issue without

    discrete GPU solvers on board that detect a GPU? Jasha knew that with TurbulenceFD, did he not?

    So they should be starting to seed them in LW for 2020.

  14. #74
    Registered User
    Join Date
    Feb 2019
    Location
    Lancashire, England
    Posts
    9
    Do you think there is enough space on a CPU die to add RTX-type units? I know Nvidia have introduced RT cores on their graphics cards; I am just wondering if they could add an equivalent to a CPU die instead of a GPU.

    Another thing I was thinking of was the CPU wars between AMD and Intel, which have given us cheaper CPUs with loads of cores, like the 32-core Threadripper 2. This year there might even be a 64-core Threadripper 3. If this trend continues, CPU rendering might start catching up to GPUs performance-wise, or at the very least the gap might get smaller.

  15. #75
    Registered User
    Join Date
    May 2012
    Location
    Virginia
    Posts
    431
    Quote Originally Posted by MichaelBeckwith View Post
    Do you think there is enough space on a cpu die to add rtx type units on a cpu? I know Nvida have introduced RTX croes for the gfx card, I am just wondering if they could add an equivalent to a cpu die instead of a gpu.

    Another thing that I was thinking of was the cpu wars between AMD and INTEL which have provided us with cheaper cpus with loads of cores like the 32 core threadripper 2. This year there might even be a 64 core threadripper 3. Maybe if this continues as a trend cpu rendering might start catching up performance wise to GPUs, or at the very least make the difference smaller.
    At first, I thought this might be a possibility; but even though CPUs are climbing the curve by adding cores, architecturally, CPUs are still hampered compared with GPUs in this area.

    GPUs have thousands of cores in a parallel architecture that allows them to handle multiple tasks very quickly, but those tasks are typically focused and limited. They typically only handle a limited number of instruction sets. It makes them faster, but less flexible.

    CPUs use sequential serial processing, which lets them attack a single task very well; but that task can come from the extremely large set of instructions they are designed to handle. They are capable of handling far more kinds of tasks than a GPU, and they are typically more exacting with their calculations while doing it.

    In the end, you get much faster renders with GPUs, although you get more accurate renders with CPUs. There's a small trade-off in quality, but even that can be compensated for in your GPU render, although that adds time to the render. Even then, the GPU will still be faster than the CPU.
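    The "thousands of simple parallel cores vs. a few fast serial cores" point is easy to sketch, because per-pixel shading is embarrassingly parallel: every pixel can be computed independently of the rest, which is exactly the shape of work GPUs are built for. A toy Python sketch (shade() here is a made-up stand-in for a real lighting calculation, not any actual renderer's API):

```python
from concurrent.futures import ProcessPoolExecutor

WIDTH, HEIGHT = 64, 48

def shade(pixel):
    """Toy per-pixel 'shader': each pixel depends only on its own coordinates."""
    x, y = pixel
    return (x ^ y) & 0xFF  # stand-in for a real lighting calculation

def render_serial():
    """One core walks every pixel in order -- the CPU-style serial loop."""
    return [shade((x, y)) for y in range(HEIGHT) for x in range(WIDTH)]

def render_parallel(workers=4):
    """Farm independent pixels out to worker processes -- the GPU-style pattern."""
    pixels = [(x, y) for y in range(HEIGHT) for x in range(WIDTH)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(shade, pixels, chunksize=256))

if __name__ == "__main__":
    # Both strategies produce the identical image; only the scheduling differs.
    assert render_serial() == render_parallel()
```

    On a CPU the parallel version tops out at a handful of worker processes; a GPU runs the same per-pixel pattern across thousands of cores at once, which is where the render-time gap comes from.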

    Personally, I don't see that changing any time in the near future.

    So in my opinion, it's extremely important that Lightwave increases its interoperability with as many rendering solutions as possible, and hopefully, at some point, switches over to GPU rendering so that they can become more competitive in the broader market.

