
Thread: Brent's Lightwave 2019 Render Challenge

  1. #91
    Founding member raymondtrace's Avatar
    Join Date
    May 2003
    Location
    Ohio
    Posts
    861
    Quote Originally Posted by Pepper Potts View Post
    ...So to try to shut down someone who seems to be genuinely starting a debate, not a fight, about both the pros and cons of the current LW seems crazy to me...
    It is indeed crazy, because there is no debate. There's no confusion about the benefits and limitations of GPU, either by NT or most of its customers. Those that want to render via GPU in LW can do so with external renderers, just as Brent is already doing with Octane for both LW and Blender. While NT does not yet offer GPU rendering natively, they've been working to make it easier to interact with external GPU renderers with each release.

    I valued the information in this thread until the unnecessary intellectual posturing in post #48. Couple that with interactions observed in other threads and on other social media ...and I can understand Steve's POV.
    LW7.5D, 2015, 2018, 2019 running portably on a USB drive on an Amiga 2500 running Wine.

  2. #92
    Hardcore Lightwaver Photogram's Avatar
    Join Date
    Sep 2010
    Location
    Montreal
    Posts
    294
    Quote Originally Posted by Tim Parsons View Post
    Changed the camera samples to 3 and lowered the GI rays to 16. Render came out really clean and nice.
    My render time is 39.5 seconds with these settings.
    Dual Xeon E5-2670 @ 2.6 GHz
    Dual Xeon E5-2670 @ 2.6 GHz // 64 GB DDR3 // 32 cores // Windows 10

  3. #93
    Registered User
    Join Date
    Jun 2014
    Location
    Right here
    Posts
    1,534
    Render time: 36.2 seconds on an aging i7 (6C/12T).

    My next workstation will most likely be something like a 32- to 64-core Threadripper, so these render times look promising!
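
    Just to put rough numbers on that: below is a back-of-the-envelope Amdahl's-law sketch. The 95% parallel fraction and the assumption that per-core speed stays roughly the same are mine, not measurements, so treat the output as a ballpark only.

    ```python
    # Rough Amdahl's-law estimate of render-time scaling from a 6C/12T i7
    # to a hypothetical 32- or 64-core Threadripper. The parallel fraction
    # (0.95) is an assumption, not a measurement, and per-core speed is
    # assumed equal on both machines.

    def scaled_time(base_time_s, base_cores, target_cores, parallel_fraction=0.95):
        """Estimate render time on target_cores from a measured base_time_s."""
        serial = base_time_s * (1.0 - parallel_fraction)    # part that never parallelizes
        parallel = base_time_s * parallel_fraction          # part that scales with cores
        return serial + parallel * (base_cores / target_cores)

    base = 36.2  # seconds, as reported on the aging 6-core i7
    for cores in (32, 64):
        print(f"{cores} cores: ~{scaled_time(base, 6, cores):.1f} s")
    ```

    Under those assumptions the same frame lands somewhere around 5-8 seconds on 32-64 cores, before accounting for clock speed, memory bandwidth, or the denoiser.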

    I'm impressed with the LW2019.1 + NVidia Denoiser speed.

    Without the denoiser the image would be awfully noisy; the denoiser is a game changer for me.

    I also rendered the scene in Cinema 4D's Physical Render and in V-Ray on CPU; it's not possible to reach these render times and this quality using GI without a denoiser. V-Ray has one built in, but it sucks in my opinion.

    It is only a clay render, so things can look different once textures, reflections, refractions, transparency, subsurface scattering, GGX/Beckmann shaders, etc. are involved, but nevertheless I'm impressed.

  4. #94
    Registered User Rayek's Avatar
    Join Date
    Feb 2006
    Location
    Vancouver, BC
    Posts
    1,456
    Optimized scene from #24 https://forums.newtek.com/showthread...=1#post1564456

    1m30s on my ancient i7 [email protected] and Lightwave 2019.

    Quite impressive. When I tested rendering in 2018, I wasn't impressed by the render times needed to arrive at noiseless renders, but 2019 has matured and works well now.
    Win10 64 - i7 [email protected], p6t Deluxe v1, 48gb, Nvidia GTX 1080 8GB, Revodrive X2 240gb, e-mu 1820. Screens: 2 x Samsung s27a850ds 2560x1440, HP 1920x1200 in portrait mode

  5. #95
    Banned OnlineRender's Avatar
    Join Date
    Dec 2008
    Location
    Glasgow , Scotland , UK
    Posts
    6,359
    i7-5930K - Default scene: 3 min 35 sec
    Optimized: 1 min 12 sec
    Low end "but passable": 36 sec
    -------------------------------------------
    Octane: 24 sec
    -------------------------------------------
    GarageFarm: 2 sec


    comparing GPU vs CPU is like having sex with a bathing suit on, pointless!

    in the next 10 years most things will probably be done on the cloud, though I have yet to see anyone make the argument that CPUs are actually getting pretty damn fast.
    GPU rendering brings a whole new set of problems: initial startup costs, for example ... a decent PSU, a decent case or rack...

    each to their own

  6. #96
    Registered User
    Join Date
    Mar 2016
    Location
    Oxford, UK
    Posts
    793
    Quote Originally Posted by OnlineRender View Post

    comparing GPU vs CPU is like having sex with a bathing suit on, pointless!

    in the next 10 years most things will probably be done on the cloud
    Including the sex, probably between Daz characters on Genesis 37 while we tap in wirelessly. Still, I guess we shouldn't knock it till we've at least tried it once....

  7. #97
    Super Member Qexit's Avatar
    Join Date
    Feb 2003
    Location
    Warrington, UK
    Posts
    1,078
    Quote Originally Posted by OnlineRender View Post
    comparing GPU vs CPU is like having sex with a bathing suit on, pointless!

    in the next 10 years most things will probably be done on the cloud, though I have yet to see anyone make the argument that CPUs are actually getting pretty damn fast.
    GPU rendering brings a whole new set of problems: initial startup costs, for example ... a decent PSU, a decent case or rack...

    each to their own
    Very true about the costs. My LW PC is reliable but not exactly state-of-the-art. It has a 3GB Quadro K4000 graphics card that isn't really suitable for GPU rendering, only 24GB of DDR3 RAM, and a pair of Xeon E5620 CPUs running at 2.4 GHz, so it can run LW2019 quite happily but doesn't let me get anything done in a hurry. I am hoping to replace the whole thing at some point this year, as it is now 5 years old, but with a budget that wouldn't cover any of the current top-spec gear. High-end graphics cards and GPU renderer licenses are simply not on my shopping list, as I cannot afford them; I only have a K4000 now because I picked it up for a ridiculously low price as a refurbished unit four years ago. Nice gear is nice to have...but not everyone has access to it.
    Kevin F Stubbs

    Remember...one size does NOT fit all

  8. #98
    Super Member OlaHaldor's Avatar
    Join Date
    May 2008
    Location
    Norway
    Posts
    998
    Quote Originally Posted by TheLexx View Post
    Here's a crazy convoluted scenario - import a fully animated LW scene into Cinema 4D and render in Redshift just to see what happens. I don't think anyone would ever do that, but it is possible in theory?
    When I studied 3D a few years ago we had to use Maya for modeling, rigging, and animation. But I hated the look of Mental Ray, so I moved everything to LightWave and rendered with Octane.
    Anyone could do that.



    Re the "ads for Modo"... It's not that I particularly wanted to use Modo for the test render with Octane, so it wasn't meant as an ad. I stated that I cannot open the file and render with Octane in the LW2019 trial, nor could I export. Whether I had used Octane in LW or in Modo really doesn't matter; it'd still be Octane. Sorry if I stepped on some toes by mentioning the enemy...
    Last edited by OlaHaldor; 02-12-2019 at 06:26 AM.
    3D Generalist
    Threadripper 2920x, 128GB, RTX 2080 Ti, Windows 10

  9. #99
    Registered User
    Join Date
    Mar 2016
    Location
    Oxford, UK
    Posts
    793
    Quote Originally Posted by OlaHaldor View Post
    When I studied 3D a few years ago we had to use Maya for modeling, rigging, and animation. But I hated the look of Mental Ray, so I moved everything to LightWave and rendered with Octane.
    Anyone could do that.
    I guess I was just curious about speed differences between Octane and Redshift, and wondered whether LW could also access Redshift (can't let C4D have all the fun!). Interesting that in theory LW will render in RS. Good that LW has three GPU options, with one being free.

    If Marander does manage a test of LW native with a 64-core Threadripper, that would be interesting too.


  10. #100
    Medical Animator mummyman's Avatar
    Join Date
    Aug 2009
    Location
    Connecticut
    Posts
    1,028
    I'm slowly doing this, going from LW to Maya to render in Redshift. Redshift is amazingly fast for me, and very, VERY similar to LW's new renderer, so it's easy for me to understand. Being able to bake out instances using OD Tools is fantastic. Sadly... I don't have much to show for a comparison test. Loving this thread; hopefully I can do some tests down the road. But something that is still slightly grainy and crawly in LW with a 1 min 45 sec render can be pretty damn clean in about 30 seconds in Redshift. To me it might be worth working in LW and converting to FBX / baking to use RS. Sorry for going off-topic.

  11. #101
    Just a note about the CPU vs GPU thing, especially as it relates to Arnold...

    Arnold is still mostly CPU-bound - we use it at our school here, and while the GPU helps with iterating on the lighting in the viewport... it completely fails with mayabatch.exe / render.exe on a render farm.

    Also, one note about OptiX - Nvidia is working on it, but in its current state the denoising actually has TWO problems: in addition to not being temporally aware, it also makes the final image unsuitable for compositing, even if you can live with the shimmering denoise. V-Ray's denoiser is much the same.
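
    To make the "not temporally aware" point concrete, here is a tiny illustrative sketch. It uses plain NumPy and a box blur as a stand-in denoiser (it is not OptiX, V-Ray, or any renderer's actual API): two noisy renders of the same static frame, denoised independently, still come out slightly different from each other, and that per-frame residual is what reads as shimmering in an animation.

    ```python
    # Illustrative only: why a denoiser applied per frame, with no temporal
    # awareness, leaves residual differences between consecutive frames of a
    # *static* scene. A simple box blur stands in for the denoiser; real
    # denoisers (OptiX, V-Ray) are far smarter, but the frame-independence
    # issue is the same.
    import numpy as np

    rng = np.random.default_rng(0)
    clean = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))  # static gradient "frame"

    def render(frame):
        """Fake a Monte Carlo render: the clean image plus fresh noise."""
        return frame + rng.normal(0.0, 0.1, frame.shape)

    def denoise(img, k=5):
        """Stand-in denoiser: box blur over a k x k neighborhood."""
        pad = k // 2
        padded = np.pad(img, pad, mode="edge")
        out = np.zeros_like(img)
        for dy in range(k):
            for dx in range(k):
                out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
        return out / (k * k)

    frame_a = denoise(render(clean))  # "frame 1" of a static shot
    frame_b = denoise(render(clean))  # "frame 2" of the same static shot
    print("mean frame-to-frame difference:", np.abs(frame_a - frame_b).mean())
    # Nonzero: each frame converges to a slightly different image -> shimmer.
    ```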

    Thanks for the scene - I'll try it on some varied hardware when I get a chance :-)
    please power off before disconnecting connecting connectors

  12. #102
    Almost newbie Cageman's Avatar
    Join Date
    Apr 2003
    Location
    Malmö, SWEDEN
    Posts
    7,650
    LightWave 2019 CPU + Houdini Mantra are great if you have a big farm like we have. I mean, the potential to have over 400 render nodes overnight (usually it is more around the 200 mark) turns things around quite quickly. There is no GPU solution that would be cost-effective for us as a replacement for that farm.

    1) Octane has a very shady licensing model that I do not like from a studio perspective.
    2) Redshift costs 500-600 USD per license, plus you have to get some fairly expensive hardware to go with each license.
    3) Out-of-core features, if used extensively (large complex scenes with high-res textures, millions of instances, and 10-20 fully deforming characters with hair, etc.), have a tendency to make GPU rendering go a tad slower than its original potential, and in those cases higher-end CPUs start to catch up.

    The LightWave 2019 upgrade was $395 per seat with pretty much unlimited render nodes.

    So, those are some of the reasons I see CPUs being winners for our situation.
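
    For what it's worth, here is a rough cost sketch along those lines, using the license prices quoted above. The seat count and the per-node GPU hardware cost are placeholder assumptions, and it deliberately ignores render-speed differences, power, and maintenance:

    ```python
    # Back-of-the-envelope farm cost comparison using the figures quoted in
    # this post. The existing CPU farm hardware is treated as already paid
    # for (the question is whether to replace it); the GPU hardware cost per
    # node and the seat count are assumptions for illustration only.
    NODES = 200                    # typical overnight node count mentioned above

    lw_upgrade_per_seat = 395      # USD, LW 2019 upgrade, render nodes ~free
    seats = 10                     # assumed number of artist seats (hypothetical)
    cpu_route = lw_upgrade_per_seat * seats

    redshift_per_license = 500     # USD, low end of the 500-600 quoted
    gpu_hw_per_node = 800          # USD per node, assumed mid-range GPU (placeholder)
    gpu_route = NODES * (redshift_per_license + gpu_hw_per_node)

    print(f"CPU route (licenses only): ~${cpu_route:,}")
    print(f"GPU route ({NODES} nodes):  ~${gpu_route:,}")
    ```

    With numbers in that ballpark, licensing the existing CPU farm costs a few thousand dollars, while putting a licensed GPU into a couple of hundred nodes runs well into six figures, before any of it renders a single frame faster.
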
    Senior Technical Supervisor
    Cinematics Department
    Massive - A Ubisoft Studio
    -----
    Intel Core i7-4790K @ 4GHz
    16GB Ram
    GeForce GTX 1080 8GB
    Windows 10 Pro x64

  13. #103
    www.Digitawn.co.uk rustythe1's Avatar
    Join Date
    Feb 2006
    Location
    england
    Posts
    1,279
    Quote Originally Posted by Cageman View Post
    LightWave 2019 CPU + Houdini Mantra are great if you have a big farm like we have. I mean, the potential to have over 400 render nodes overnight (usually it is more around the 200 mark) turns things around quite quickly. There is no GPU solution that would be cost-effective for us as a replacement for that farm.

    1) Octane has a very shady licensing model that I do not like from a studio perspective.
    2) Redshift costs 500-600 USD per license, plus you have to get some fairly expensive hardware to go with each license.
    3) Out-of-core features, if used extensively (large complex scenes with high-res textures, millions of instances, and 10-20 fully deforming characters with hair, etc.), have a tendency to make GPU rendering go a tad slower than its original potential, and in those cases higher-end CPUs start to catch up.

    The LightWave 2019 upgrade was $395 per seat with pretty much unlimited render nodes.

    So, those are some of the reasons I see CPUs being winners for our situation.
    And this was exactly one of my points: the cost is even worse for us in Europe. In the States I believe you can pick up a 2080 for around $600, if forum posts are to be believed, but here in the UK they can set you back nearly £1,600, which is over $2,000. Intel chips seem to go the other way: I only paid just over £1,200 for my 7980XE (I think in the States they are $2,000 or more). Don't quote me on anything, it's just hearsay, but GPUs are well overpriced in Europe compared to the States and the East, and then, like you say, there is the software cost itself.
    Intel i9 7980xe, Asus Rampage Vi extreme, 2x NVIDIA GTX1070ti, 64GB DDR4 3200 corsair vengeance,
    http://digitawn.co.uk https://www.shapeways.com/shops/digi...ction=Cars&s=0

  14. #104
    ..since 4.0 and beyond brent3d's Avatar
    Join Date
    Mar 2005
    Location
    Maryland
    Posts
    345
    Quote Originally Posted by raymondtrace View Post
    It is indeed crazy, because there is no debate. There's no confusion about the benefits and limitations of GPU, either by NT or most of its customers. Those that want to render via GPU in LW can do so with external renderers, just as Brent is already doing with Octane for both LW and Blender. While NT does not yet offer GPU rendering natively, they've been working to make it easier to interact with external GPU renderers with each release.

    I valued the information in this thread until the unnecessary intellectual posturing in post #48. Couple that with interactions observed in other threads and on other social media ...and I can understand Steve's POV.
    If you aren't interested in or don't agree with the subject of this thread, then simply don't take part in it; no one is forcing you to read.
    No need to try to slander me, since anyone with a mouse (or trackball) can easily find my training videos and see my opinions and perspectives on LW, Modo, and Blender over the years on my YouTube channel, and I'm on the Octane/Blender Facebook group as well. On my YouTube channel, if you scroll way down, you will see that I uploaded for free some of the LightWave lectures and demonstrations I created for the classes I taught at Howard University when I ran their 3D Art and Animation program; we had a full LightWave lab and a 20-machine ScreamerNet farm that I ordered, set up, and arranged. I'm glad to say I taught hundreds of students LightWave during those years, which I'm sure added to NewTek's revenue, and hopefully reputation, at that time. I just had to add these bits since I've always argued for LightWave: as Cinematic Director at Firaxis Games during Civilization III production, as a professor, and as an art director for mobile games. But hey, maybe I don't like LightWave and maybe I'm just a troll trying to create division on the LightWave forum; go figure. Remember, you don't have to read any of this; just take a bite of the cookie and everything will be as right as rain.

    I would like to personally thank Rayek for his post #53, which I believe caps the discussion, OlaHaldor for his unique perspective coming from the Modo side, and the countless users who participated in this exchange with their thoughts and ideas; it felt like the old days.

    Thank you for mentioning my post #48! When users can answer these important questions for themselves, they will have a clearer understanding of where things are going in the short term. But once again, no one is forcing anyone to read these questions or to answer them, and they are not meant as a snarky response or any form of attack on one's intelligence.

    brent3d's Post #48:
    When is the last time you saw Autodesk marketing Mental Ray?
    What was their latest big company purchase in regards to rendering?
    What rendering software was just integrated into Maya, and soon into 3ds Max?
    What did that software's developers announce 10 months ago that they had achieved with the renderer, despite being six years behind other companies?
    What is V-Ray, and what does V-Ray now support and why?
    What does Nvidia's RTX stand for, and which 3D software packages have announced upcoming support for it?
    What branches of the government does Nvidia contract with?
    There are no fruits, only a basket.

    My YouTube channel:
    https://www.youtube.com/user/alleyne3d/videos
    Last edited by brent3d; 02-12-2019 at 10:25 AM.

  15. #105
    'the write stuff' SBowie's Avatar
    Join Date
    Feb 2003
    Location
    The stars at night are big and bright
    Posts
    19,394
    Quote Originally Posted by Pepper Potts View Post
    I'm not sure if some of the statements that you are making are actually fair to say.
    No offense taken, but let's review what I wrote:

    • There are plenty of pro-GPU threads here: Clearly true (I assume something true is also 'fair')
    • (so no one) is debating something that isn't allowed: True; see above.
    • I'm not the only one here who thinks this was a ploy: Also true.
    • you could have just started a pro-GPU thread (as many others have done before) and no-one would have called you out for doing so: True again.

    So I'm not seeing it, sorry. Any valuable information in this thread would have accrued just as easily in a straight-up GPU-versus-CPU thread without the bait-and-switch approach.
    --
    Regards, Steve
    Forum Moderator
    ("You've got to ask yourself one question ... 'Do I feel lucky?' Well, do ya, spammer?")
