Page 4 of 12 (Results 46 to 60 of 174)

Thread: Brent's Lightwave 2019 Render Challenge

  1. #46
    www.Digitawn.co.uk rustythe1's Avatar
    Join Date
    Feb 2006
    Location
    england
    Posts
    1,128
    Quote Originally Posted by thomascheng View Post
    If we are comparing Eevee, we should also bring in some Unreal with the bridge. Anyone want to try?
    I use Unreal all the time now. I just rendered a two-minute animation of an entire school fly-through with millions of polys and background elements, full dynamic lighting with no baking and around 80 lights, plus full exterior grounds with dynamic vegetation, and it rendered at 4K cinematic quality in 15 minutes. The Unreal bridge is the best thing to happen to LightWave in a long while (I even animated the camera sequences in LightWave, and one click gives an instant Unreal movie!).
    As for comparing GPU and CPU, it's pointless, as there are so many arguments for and against each, and on the CPU side there are massive differences between the CPUs themselves. As I've argued on other threads, even the way the Intels handle FP values makes a huge difference: my current CPU renders 5 times faster than my old 5960X. I rendered the original scene in 32 seconds, almost 4 times as fast as Matt's time, and the optimized scene in under 10 seconds. So if you want to compare CPU to GPU, you first need to compare all CPUs to each other. And if you're comparing a CPU to a top-end GPU like a 2080 Ti, should you not, say, compare an 8-core system against a standard single GTX 1050?
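    To put that arithmetic side by side, here is a minimal sketch of the speedup ratios (the 32-second and sub-10-second times are quoted above; both baseline times are illustrative assumptions, not measurements):

```python
# Render-time speedup is just a ratio of times. The 32 s figure is
# quoted in the post above; both baseline times are assumed for
# illustration, not measured values.

def speedup(baseline_s: float, new_s: float) -> float:
    """How many times faster the new time is versus the baseline."""
    return baseline_s / new_s

# Assumed ~120 s for Matt's render of the original scene:
print(f"original scene: {speedup(120.0, 32.0):.2f}x")  # 3.75x -> "almost 4 times"

# Assumed ~160 s for the same scene on the old 5960X:
print(f"vs. old 5960X:  {speedup(160.0, 32.0):.2f}x")  # 5.00x -> "5 times faster"
```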
    Intel i7 5960X Extreme, Asus Rampage V extreme, 3x 4gb NVIDIA Geforce GTX970, 32GB DDR4 2666 corsair dominator
    http://digitawn.co.uk https://www.shapeways.com/shops/digi...ction=Cars&s=0

  2. #47
    ..since 4.0 and beyond brent3d's Avatar
    Join Date
    Mar 2005
    Location
    Maryland
    Posts
    345
    Quote Originally Posted by OlaHaldor View Post
    1 x EVGA RTX 2080 Ti. No overclock or anything.
    (seeing this I *really* want to get another one if I get some jobs to pay for it)
    That's insanely fast for one card! If anyone wants to know why you would need to render that fast with Path Traced, Brute Force, or Direct Lighting, think VR rendering and animation at 4K and 8K; it's here, and that truly is a GPU market.

  3. #48
    ..since 4.0 and beyond brent3d's Avatar
    Join Date
    Mar 2005
    Location
    Maryland
    Posts
    345
    Quote Originally Posted by rustythe1 View Post
    I use Unreal all the time now. I just rendered a two-minute animation of an entire school fly-through with millions of polys and background elements... so if you want to compare CPU to GPU, you first need to compare all CPUs to each other.
    You are ignoring what's going on in the market. When is the last time you've seen Autodesk marketing Mental Ray? What was their latest big company purchase in regards to rendering? What rendering software was just integrated into Maya, and soon 3ds Max? What did that software's developers announce 10 months ago they had achieved with the renderer, although they were six years behind other companies? What is V-Ray, what does V-Ray now support, and why? What does Nvidia's RTX stand for, and which 3D packages have announced upcoming support for it? Which branches of the government does Nvidia contract with?
    There are no fruits, only a basket.

  4. #49
    ..since 4.0 and beyond brent3d's Avatar
    Join Date
    Mar 2005
    Location
    Maryland
    Posts
    345
    Quote Originally Posted by OlaHaldor View Post
    1 x EVGA RTX 2080 Ti. No overclock or anything.
    (seeing this I *really* want to get another one if I get some jobs to pay for it)




    Ah, OK. Sounds very interesting.
    While you can't do a ton of AA samples etc. on the fly in Marmoset, you can when you save an image (or image sequence). I didn't add a directional light in my example, but that would make the shapes read better by adding a hard shadow.

    (Marmoset really is my go-to app for quick and dirty renders. It's so fast to set up something that looks good enough.)


    I can't really tell why Octane used more VRAM here. I've seldom thought about VRAM usage when I render; I only care about it if I get an error. (A rough sketch of where render memory goes follows below.)
    For clarity and still in BETA:
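    As a rough illustration of why two engines can report different VRAM numbers for the same scene, here is a back-of-the-envelope estimator for the texture and geometry footprint alone. This is generic arithmetic under stated assumptions, not Octane's actual memory accounting:

```python
# Back-of-the-envelope VRAM estimate for textures and geometry only.
# Real engines also allocate BVH acceleration structures, framebuffers,
# and per-engine overhead, which is one reason two renderers report
# different usage for the same scene.

def texture_mb(width: int, height: int, channels: int = 4,
               bytes_per_channel: int = 1, mipmaps: bool = True) -> float:
    """Approximate texture footprint in MB; a full mip chain adds ~33%."""
    base = width * height * channels * bytes_per_channel
    return base * (4 / 3 if mipmaps else 1) / (1024 ** 2)

def geometry_mb(triangles: int, bytes_per_vertex: int = 32) -> float:
    """Approximate mesh footprint: 3 vertices per triangle plus 12 bytes of indices."""
    return triangles * (3 * bytes_per_vertex + 12) / (1024 ** 2)

# Hypothetical scene: ten 4K textures plus 2M triangles.
print(f"textures: {10 * texture_mb(4096, 4096):.0f} MB")  # ~853 MB
print(f"geometry: {geometry_mb(2_000_000):.0f} MB")       # ~206 MB
```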

  5. #50
    I'd love to see NewTek add GPU support too, but I think it might be better for them to optimize the current CPU renderer first; I think there's still a lot more optimization that can happen. The GPU side would require a lot of work, judging by how long it took V-Ray and Arnold to implement GPU rendering. The resources might be better spent on working with AMD to integrate ProRender into LW, with some additional code to get it to match the LW renderer as closely as possible. Who knows, maybe the future will be PCIe CPUs competing with GPUs.

  6. #51
    ..since 4.0 and beyond brent3d's Avatar
    Join Date
    Mar 2005
    Location
    Maryland
    Posts
    345
    Quote Originally Posted by thomascheng View Post
    I'd love to see NewTek add GPU support too, but I think it might be better for them to optimize the current CPU renderer first...
    Arnold "had" to go GPU, watch their talks on the issue. They knew they were 6 years behind the competition. The Autodesk partnership funded it which gave them direct access to Nvidia with Optix. (Optix is more of a selling point for users since Arnold got into GPU late in my opinion).
    Native GPU/CPU supported Path-Traced PBR is now quickly becoming the norm, the short-term rendering future is RTX, realtime Path Tracing at 4k-8k, and 4k-8k VR. If Lightwave's 2019's renderer is the closest to an Octane Direct Lighting render setting then what are you optimizing to, oblivion?

    Arnold GPU tech talk:

  7. #52
    Founding member raymondtrace's Avatar
    Join Date
    May 2003
    Location
    Ohio
    Posts
    662
    Quote Originally Posted by brent3d View Post
    You are ignoring what's going on in the market...
    Is there a chance that you were ignoring what's going on in the post to which you were replying?
    LW7.5D, 2015, 2018, 2019 running portably on a USB drive on an Amiga 2500 running Wine.

  8. #53
    Registered User Rayek's Avatar
    Join Date
    Feb 2006
    Location
    Vancouver, BC
    Posts
    1,409
    I agree. The C4D developers realized they had to add GPU render tech to their software ASAP to remain relevant and competitive. I think their situation was comparable to LightWave's: an aging CPU-based render engine while all the competition had already gone, or was going, GPU.

    Instead of reinventing the wheel, Maxon decided to integrate ProRender, which was/is a smart move. It would have been an even smarter move for LightWave's development: a mature combined GPU/CPU renderer out of the box, and fewer headaches over how to divide and strategize development resources. Existing assets (material libraries and so on) could have been easily re-used.

    When LightWave 2018 was released, I did not understand the lack of GPU rendering in the new render engine. Aside from first-release bugs and rough edges, I thought it was a great-quality engine - but the competition offers far more flexibility with BOTH CPU and GPU support, at the same quality and beyond. 2018's new renderer was years behind the competition the moment it was released.

    The NewTek team could have focused on other long-outstanding improvements instead of starting a CPU renderer from scratch that would be years behind the competition when released in 2018. I still can't wrap my head around it, particularly given the development of (almost) real-time GPU rendering based on game render tech. NewTek could have made a smashing comeback if the new render engine had been a CPU/GPU hybrid. Such a lost opportunity.

    Still, water under the bridge. The only way forward now is to add GPU rendering to LightWave, preferably combined CPU/GPU, to stay relevant in the semi-long term. That is a VERY specialized coding job, and developers in this field are very hard to hire and keep hold of. So I see NewTek's LightWave team going forward without GPU support, because of the time and effort already invested in their new CPU-only renderer, even to the detriment of long-term survival (the sunk cost fallacy). Many LightWave users will have to invest in an alternative GPU renderer (Octane) to remain competitive in their work, and based on what I read here, many already have. That marginalizes the built-in 2018/19 render engine.

    Modo is sort of in the same boat, but can at least boast one of the best modelers on the market.

    Having said all this, I might be proven wrong, and the LW dev team may surprise us all with GPU support next year, or the year after that. The question is whether it will matter to anyone by then.
    Win10 64 - i7 [email protected], p6t Deluxe v1, 48gb, Nvidia GTX 1080 8GB, Revodrive X2 240gb, e-mu 1820. Screens: 2 x Samsung s27a850ds 2560x1440, HP 1920x1200 in portrait mode

  9. #54
    Valiant NewTeKnight Matt's Avatar
    Join Date
    Feb 2003
    Location
    San Antonio, Texas, USA
    Posts
    13,054
    Quote Originally Posted by Tim Parsons View Post
    Changed the camera samples to 3 and lowered the GI rays to 16. The render came out really clean and nice.

    I love the speed of Octane as well as the look, but it's a pain in the *** to surface stuff in it. I mostly do interior stills, so my workflow is generally to work all day on the scenes, then load them into RenderQ and go home. So native is just fine by me, and 2019 is faster and cleaner than 2018, so I can't complain. If a GPU renderer becomes available that works with LW shaders etc., I'll check it out for sure.
    What's the render time when you don't change the scene?
    UI / UX Designer @ NewTek
    __________________________________________________
    www.pixsim.co.uk : LightWave Video Tutorials & Tools


  10. #55
    ..since 4.0 and beyond brent3d's Avatar
    Join Date
    Mar 2005
    Location
    Maryland
    Posts
    345
    Quote Originally Posted by raymondtrace View Post
    Is there a chance that you were ignoring what's going on in the post to which you were replying?
    No, I was addressing the CPU argument, not the use of Unreal or anything like that.

  11. #56
    ..since 4.0 and beyond brent3d's Avatar
    Join Date
    Mar 2005
    Location
    Maryland
    Posts
    345
    Quote Originally Posted by Rayek View Post
    I agree. The C4D developers realized they had to add GPU render tech to their software ASAP to remain relevant and competitive...
    Well said and directly on target.

  12. #57
    Founding member raymondtrace's Avatar
    Join Date
    May 2003
    Location
    Ohio
    Posts
    662
    Quote Originally Posted by brent3d View Post
    No, I was addressing the CPU argument...
    I beg to differ that you were addressing anything by launching into a quiz game.

    Yes, GPU rendering is a thing, and big-name companies are incorporating it. What's going on in the market is that there are separate renderers that 3D animation apps utilize. Even AD couldn't excel at this internally and had to purchase a renderer from outside. Isn't LW part of this trend of external rendering options by continuing to integrate with Kray, Octane, and UE? Isn't LW part of this trend by leaning into PBR with its native renderer...inherently making it easier to move materials to external renderers? Is Russell really ignoring the market by positioning himself like everyone else (being able to render via both CPU and GPU)?
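    That PBR point is worth making concrete: a metallic/roughness material boils down to a small, standardized set of parameters plus texture references, which is why it travels reasonably well between native and external renderers. A minimal sketch, with parameter names following the common glTF-style convention rather than any particular renderer's API:

```python
# A PBR metallic/roughness material is essentially a handful of
# standardized parameters plus texture references, which is what makes
# it portable between renderers. Names follow the common glTF-style
# convention; no specific renderer's API is implied.
from dataclasses import dataclass, field

@dataclass
class PBRMaterial:
    base_color: tuple = (1.0, 1.0, 1.0)   # linear RGB albedo
    metallic: float = 0.0                  # 0 = dielectric, 1 = metal
    roughness: float = 0.5                 # 0 = mirror-like, 1 = fully diffuse
    textures: dict = field(default_factory=dict)  # slot name -> image file path

# Example: the same description maps onto LW's native PBR surfaces,
# Octane, or a game engine with a straightforward translation step.
brushed_steel = PBRMaterial(base_color=(0.62, 0.62, 0.64),
                            metallic=1.0, roughness=0.35,
                            textures={"roughness": "steel_rough.png"})
```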

    I recently watched a video of a guy who had wanted to animate in Softimage since college and didn't get a license until around the time Softimage was being dragged toward the grave by AD. All of us live in ignorance of what will happen tomorrow. Let's just enjoy the ride of discovery and use the tools that work for us.
    LW7.5D, 2015, 2018, 2019 running portably on a USB drive on an Amiga 2500 running Wine.

  13. #58
    ..since 4.0 and beyond brent3d's Avatar
    Join Date
    Mar 2005
    Location
    Maryland
    Posts
    345
    Quote Originally Posted by raymondtrace View Post
    I beg to differ that you were addressing anything by launching into a quiz game...
    Rayek's post below says it best:

    "I agree. The C4D developers realized they had to add GPU render tech to their software ASAP to remain relevant and competitive. I think their situation was comparable to Lightwave: an aging CPU based render engine while all the competition was already, or were going, GPU.

    Instead of reinventing the wheel, Maxon decided to integrate ProRender, which was/is a smart move. It would have been an even smarter move for Lightwave's development: a mature combined GPU/CPU renderer out of the box, and less headaches how to divide and strategize development resources. Existing resources that could have been easily re-used (existing material libraries), and so on.

    When Lightwave 2018 was released, I did not understand the lack of GPU rendering in the new render engine. Aside from first-release bugs and rough edges, I thought it is a great quality engine - but the competition offers far more flexibility with BOTH CPU and GPU support, and offer the same quality and beyond. 2018's new renderer was years behind the competition the moment it was released.

    The Newtek team could have focused more on other long outstanding improvements, instead of starting a CPU renderer from scratch that would be years behind the competition when released in 2018. I still can't wrap my head around it. In particular seeing the development behind (almost) real-time GPU based rendering based on game render tech. Newtek could have really made a smashing comeback if the new render engine would have been a CPU/GPU hybrid one. Such a lost opportunity.

    Still, water under the bridge. The only way forward now is to add GPU rendering to Lightwave, preferably combined CPU/GPU to keep being relevant in the semi-long term. Which is a VERY specialized coding job, and developers in this field are very hard to hire and keep hold of. So I see Newtek's Lightwave team going forward without GPU support because of the invested time and effort to develop their new CPU only renderer, even to the detriment of long-term survival (sunken cost fallacy). Many Lightwave users will have to invest in an alternative GPU renderer (Octane) to remain competitive in their work, and based on what I read here, many already have. Which marginalizes the built-in new 2018 / 19 render engine.

    Modo is sort-of in the same boat, but can at least boast being one of the best modelers in the market.

    Having said all this, I might be proven wrong, and the LW dev team is going to surprise us all with GPU support next year, or the year after that. The question is whether it will matter anymore to anyone at that time."

  14. #59
    So what is the solution? At this point it would be hard for NewTek to abandon CPU rendering, and I doubt they have the resources to do both. Should they not even bother to optimize the CPU renderer?
    Last edited by thomascheng; 02-11-2019 at 01:07 PM.

  15. #60
    ..since 4.0 and beyond brent3d's Avatar
    Join Date
    Mar 2005
    Location
    Maryland
    Posts
    345
    Quote Originally Posted by thomascheng View Post
    So what is the solution? At this point it would be hard for NewTek to abandon CPU rendering, and I doubt they have the resources to do both.
    The solution is clearly stated in Rayek's post; re-read it.
