
Thread: LW 2018 Comments/Opinions

  1. #436
    Registered User
    Join Date: Jan 2016 | Location: Stockholm | Posts: 1,449
    Quote Originally Posted by dee View Post
    No offence, but this remains to be proven.

    As I see it, NewTek made the wrong decision writing a new CPU renderer at a time when everyone else is moving towards GPU. Some here say they can add GPU support later. AFAIK they would need to rewrite the whole thing for GPU, materials and shaders included.
    CPUs scale better than GPUs, are more flexible, and are easier for LWG to maintain. Most importantly, they are not dependent on GPU IHVs and the fluctuations of their hardware technologies, drivers, and so on. I think they made the correct decision when they opted for CPU, especially when we all know how pressed for time they are.

  2. #437
    Registered User
    Join Date: Aug 2016 | Location: a place | Posts: 1,850
    I'm finding VPR to be painful, and because of that I really think they should have considered some use of the GPU for this version.

  3. #438
    Super Member samurai_x
    Join Date: Jul 2015 | Location: lalaland | Posts: 1,231
    They don't have a GPU developer. Did they fill the mesh engine dev position?

  4. #439
    Registered User tyrot
    Join Date: Mar 2006 | Location: in lights | Posts: 2,167
    CPU the correct decision? Oh man, come on now...!

  5. #440
    www.Digitawn.co.uk rustythe1
    Join Date: Feb 2006 | Location: england | Posts: 1,279
    Quote Originally Posted by samurai_x View Post
    They don't have a GPU developer. Did they fill the mesh engine dev position?
    I thought they had Dominick Spina from Nvidia?
    Intel i9 7980XE, Asus Rampage VI Extreme, 2x NVIDIA GTX 1070 Ti, 64GB DDR4 3200 Corsair Vengeance
    http://digitawn.co.uk https://www.shapeways.com/shops/digi...ction=Cars&s=0

  6. #441
    Super Member samurai_x
    Join Date: Jul 2015 | Location: lalaland | Posts: 1,231
    Quote Originally Posted by rustythe1 View Post
    I thought they had Dominick Spina from Nvidia?
    He left a long time ago, AFAIK.

  7. #442
    Frequenter
    Join Date: Sep 2003 | Location: Munich | Posts: 352
    Quote Originally Posted by tyrot View Post
    CPU the correct decision? Oh man, come on now...!
    LW needs a good native render engine - and going GPU-only would surely be a bad decision at this point.
    And honestly - not only my opinion, I can assure you - GPU rendering might be the future, but it's just not there yet for serious production...
    Scalability, memory constraints, and access to GPU-based routines - ask Juanjo, for instance - are still serious issues for bigger productions.
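    For a concrete sense of the memory-constraint point, here is a minimal CUDA sketch (my own illustration, assuming a machine with the CUDA toolkit; the 48 GB scene figure is made up) that checks whether a scene could even fit in VRAM:
    Code:
        // Minimal sketch, assuming the CUDA toolkit is installed; the
        // scene size below is a hypothetical illustration, not a figure
        // from any real production.
        #include <cstdio>
        #include <cuda_runtime.h>

        int main() {
            size_t free_bytes = 0, total_bytes = 0;
            // Ask the active GPU how much VRAM is actually available.
            cudaError_t err = cudaMemGetInfo(&free_bytes, &total_bytes);
            if (err != cudaSuccess) {
                std::fprintf(stderr, "CUDA error: %s\n", cudaGetErrorString(err));
                return 1;
            }
            // A production scene (geometry, textures, acceleration
            // structures) can exceed VRAM by a wide margin; a CPU
            // renderer pages against system RAM instead.
            const size_t scene_bytes = 48ull << 30;  // hypothetical 48 GB scene
            std::printf("VRAM: %.1f GiB free of %.1f GiB total\n",
                        free_bytes / double(1ull << 30),
                        total_bytes / double(1ull << 30));
            if (scene_bytes > free_bytes)
                std::printf("Scene won't fit on the GPU without out-of-core tricks.\n");
            return 0;
        }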
    Last edited by fishhead; 01-10-2018 at 11:01 AM. Reason: weird double posting...

  8. #443
    Super Member samurai_x
    Join Date: Jul 2015 | Location: lalaland | Posts: 1,231
    Redshift and Octane are widely used in serious production.
    Do people think they went with CPU and not a hybrid CPU/GPU because of the reasons mentioned, or because they don't have a GPU developer?

  9. #444
    Frequenter
    Join Date: Sep 2003 | Location: Munich | Posts: 352
    Quote Originally Posted by samurai_x View Post
    Redshift and Octane are widely used in serious production.
    Do people think they went with CPU and not a hybrid CPU/GPU because of the reasons mentioned, or because they don't have a GPU developer?
    Well, of course those renderers are used for productions - productions where the feature set and constraints match the demands. But there are productions that simply need more, or simply something different...
    Be it the amount of data that has to fit into RAM, or just the scalability; the need for GPUs can also quite easily outgrow a given budget...
    And you simply cannot code, for instance, a new camera (which would be needed for certain formats - our case...), whereas the host software's SDK would - in probably most cases - give you access to the hooks needed for this.
    We asked the Redshift developers and also Otoy about something like that; it just doesn't work right now...
    So I, for one, am very happy to have a native CPU renderer with the necessary access, for now.

    Well, I surely believe NT/LWG will consider going GPU (possibly even as an option/hybrid, who knows...) sometime in the future. Maybe they even have a GPU developer already. Why not... :-]

  10. #445
    Registered User
    Join Date: Mar 2012 | Location: Socal | Posts: 403
    GPU rendering only works well when all the cores are busy doing the same thing; in that case you can get a huge improvement. When the threads go off in different directions, doing different things, you lose the parallelism, and the performance drops from orders of magnitude faster to much less. Unbiased rendering is just one of those things where it is very hard to keep all the threads doing the same work. I never had much success with Blender's GPU implementation: when it worked at all, it was only a few times faster at best, and usually only about 40% faster than the CPU.
    It takes much more time and effort to get it right. I'm not saying it can't be done, just that right now their time could be spent on unification and other things, and they can come back to it later. We all want the best LightWave; it's just a question of how best to spend the time right now.
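    To make the divergence point concrete, here is a toy CUDA sketch (hypothetical kernels, not anything from LW or Blender). In the second kernel, odd and even threads take different branches, so each warp executes both paths serially with half its threads masked off:
    Code:
        // Toy illustration of warp divergence; both kernels are hypothetical.
        #include <cuda_runtime.h>

        // Coherent: every thread in a warp takes the same branch, so the
        // hardware runs the warp in lockstep at full throughput.
        __global__ void coherent(float* out, int n) {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i < n) out[i] *= 2.0f;  // identical work for all threads
        }

        // Divergent: odd and even threads branch differently, so the warp
        // executes both paths one after the other, masking off half the
        // threads each time - the "threads going off in different
        // directions" cost described above. Path tracers hit this whenever
        // rays in the same warp bounce into different materials or depths.
        __global__ void divergent(float* out, int n) {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i >= n) return;
            if (i & 1)
                for (int k = 0; k < 64; ++k) out[i] += 1.0f;
            else
                for (int k = 0; k < 64; ++k) out[i] -= 1.0f;
        }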
    Last edited by Dan Ritchie; 01-10-2018 at 04:04 PM.

  11. #446
    Super Member samurai_x
    Join Date: Jul 2015 | Location: lalaland | Posts: 1,231
    Quote Originally Posted by fishhead View Post
    Well, of course those renderers are used for productions - productions where the feature set and constraints match the demands. But there are productions that simply need more, or simply something different...
    I think the majority of current LW users aren't creating AAA movies like Marvel or Lord of the Rings.
    LightWave is currently used for budget movies, low-end game cinematics, viz work, and TV with tight deadlines.
    A hybrid renderer would have been better; it serves small to medium studios well.
    I don't believe GPU rendering for LightWave is on the roadmap. You need a specialist to develop it; they'd have to hire one, and they don't come cheap.

  12. #447
    Registered User
    Join Date: Jan 2016 | Location: Stockholm | Posts: 1,449
    Don't know where you get the idea that LightWave isn't used in top AAA movies: https://www.youtube.com/watch?v=kJstfy0WvFo

    As for GPU code... LWG can use these:

    https://gpuopen.com/firerays-2-0-ope...g-ray-tracing/
    https://gpuopen.com/open-source-radeon-prorender


    Goodies:
    https://github.com/GPUOpen-LibrariesAndSDKs
    http://raytracey.blogspot.com


    That should give a good head start.
    Last edited by MichaelT; 01-11-2018 at 12:24 AM.

  13. #448
    GPU is definitely a specialized build.

    I wouldn't have gone with a new renderer anyhow...

    The future is going to be specialized, compartmentalized features, developed by third-party developers and bolted onto a centralized geometry and display engine.

    The 3D world is becoming more unified in its approach to formats. We are on the verge of universal shading languages, universal geometry formats, universal animation formats, and universal image formats.
    I think the best effort going forward would be a virtual '3D OS': a stripped-down shell with industry-standard SDK hooks to incorporate different deformers, solvers, and renderers... and build the features you can handle.
    IMHO

  14. #449
    Quote Originally Posted by MichaelT View Post
    Don't know where you get the idea that LightWave isn't used in top AAA movies: https://www.youtube.com/watch?v=kJstfy0WvFo
    So a small part of the previz, 17 years ago... that would be what, LightWave 6 maybe?

  15. #450
    Super Member samurai_x
    Join Date: Jul 2015 | Location: lalaland | Posts: 1,231
    Quote Originally Posted by MichaelT View Post
    Don't know where you get the idea that LightWave isn't used in top AAA movies: https://www.youtube.com/watch?v=kJstfy0WvFo
    Animatics? Purposely out of context.

    Please read up; we're talking about rendering. Besides the shark movies and the Nazi-dino movie, which are budget movies, in which AAA movies in the class of Marvel or LOTR was the LW renderer used?
    The question was why they went with CPU instead of a CPU/GPU hybrid like Indigo, Cycles, etc. The reasons given are not convincing. Scalability, good for serious production - sure, if the studios that use the renderer were making AAA LOTR stuff. That's not the case.
    There are probably more independent/freelancer and small "not serious?" productions that could have benefited from a hybrid LightWave renderer, because it's easier and cheaper to build a GPU farm. This is why Redshift and Octane are so popular with small to medium studios on limited budgets - the same clients NewTek is catering to, not AAA studios.
    So why didn't NewTek do it?
    It's more logical that NewTek simply doesn't have a GPU developer.
    Last edited by samurai_x; 01-11-2018 at 12:54 AM.
