Page 9 of 12
Results 121 to 135 of 174

Thread: Brent's Lightwave 2019 Render Challenge

  1. #121
    'the write stuff' SBowie's Avatar
    Join Date
    Feb 2003
    Location
    The stars at night are big and bright
    Posts
    19,353
    OK, listen up peeps ... movin' on. Despite the questionable genesis of this thread, the main conversation in it is not particularly objectionable, and I don't think anyone really has a problem with it. Apart from the moderation already imposed (regarding which I refer anyone who is interested to the moderation policy thread), I don't see any reason it can't continue - without unhelpful and irrelevant personal commentary. That which does not fit this model will simply be moderated without further ado.
    --
    Regards, Steve
    Forum Moderator
    ("You've got to ask yourself one question ... 'Do I feel lucky?' Well, do ya, spammer?")

  2. #122
    ..since 4.0 and beyond brent3d's Avatar
    Join Date
    Mar 2005
    Location
    Maryland
    Posts
    345
    Quote Originally Posted by Ryan Roye View Post
    The reason why the sentiment of "comparing apples to oranges" is accurate is because it really is comparing two very different systems optimized for two very different tasks. It's like judging a fish on its ability to climb trees.

    Try rendering volumetrics in Octane... you'll find that Lightwave produces results much more quickly and with a much higher level of control. It'd be an inefficient use of my time to try and explain why the comparison between LW and Octane isn't just "this is faster so use program X over Y", I can only ask that you do your research and try to determine why myself and others would all come to the same conclusion.

    I say this as someone who uses Octane in my workflow as well. It's a great renderer for scenes that require good GI, not so great for any non-photoreal stuff like visual effects or styled imagery.
    Yes, I've liked being able to manipulate looks and renders within the standard raytrace engine. Path tracing is for photoreal work, or close to it.
    I don't agree on the Octane volumetric stuff, though; see my example below:

  3. #123
    Quote Originally Posted by brent3d View Post
    Show us how fast your CPU really is in Lightwave 2019 and post your video results in this brute force render comparison. The test scene is in the video description. Let the games begin!

    https://youtu.be/jx0q1CGP4vg

    Attachment 144061
    I personally like render challenges. Users can then play with their own settings and learn from each other.

    If you do more in the future, I have a request: use better, more optimized render settings, so as not to teach bad habits. Especially before making YouTube videos, which can reach a wide audience.

    You could also use your own models. You know, to show off something self-made.
    Last edited by MarcusM; 02-12-2019 at 01:41 PM.

  4. #124
    Almost newbie Cageman's Avatar
    Join Date
    Apr 2003
    Location
    Malmö, SWEDEN
    Posts
    7,644
    Quote Originally Posted by brent3d View Post
    No, I was addressing the CPU argument, not the use of Unreal or anything like that.
    No, you didn't. You simply _ignored_ the fact that CPUs are taking another big step forward in computing power. No one here in this thread is saying GPUs aren't faster for rendering... but you seem to deny that CPUs are making great strides in this area.
    Senior Technical Supervisor
    Cinematics Department
    Massive - A Ubisoft Studio
    -----
    Intel Core i7-4790K @ 4GHz
    16GB Ram
    GeForce GTX 1080 8GB
    Windows 10 Pro x64

  5. #125
    Almost newbie Cageman's Avatar
    Join Date
    Apr 2003
    Location
    Malmö, SWEDEN
    Posts
    7,644
    Quote Originally Posted by brent3d View Post
    If you aren't interested or don't agree with the subject of this thread then simply don't take part in it, no one is forcing you to read.
    So, if I say that 400 CPUs will be faster than anything you can throw together as a single person (machines with GPUs, etc.), then I should not take part in this thread?

    I mean... I work at a Ubisoft studio that has more than 400 pretty good workstations. I would be mad to do anything less than put those in a renderfarm overnight, even if I can only get 200 of them... and instead I should go GPU? Is that what you are suggesting?
    Senior Technical Supervisor
    Cinematics Department
    Massive - A Ubisoft Studio
    -----
    Intel Core i7-4790K @ 4GHz
    16GB Ram
    GeForce GTX 1080 8GB
    Windows 10 Pro x64

  6. #126
    Quote Originally Posted by Cageman View Post
    No, you didn't. You simply _ignored_ the fact that CPUs are taking another big step forward in computing power. No one here in this thread is saying GPUs aren't faster for rendering... but you seem to deny that CPUs are making great strides in this area.
    I totally agree CPUs are a very good route to go for many, depending on your needs. I'm sure we will see CPUs make even larger steps forward in the next couple of years. I give AMD all the credit for this, since they are the ones who started it with Threadripper. Intel has made some big strides as well, but only as a reaction, in an attempt to remain competitive. If Threadripper had not come to market, GPU-based rendering would look more attractive than ever.
    Threadripper 2990WX, X399 MSI MEG Creation, 64GB 2400Mhz RAM, GTX 1070 Ti 8GB

  7. #127
    Registered User
    Join Date
    Apr 2015
    Location
    France
    Posts
    93
    Quote Originally Posted by brent3d View Post
    Yes, I've liked being able to manipulate looks and renders within the standard raytrace engine. Path tracing is for photoreal work, or close to it.
    I don't agree on the Octane volumetric stuff, though; see my example below:
    Cool video!

    But honestly, and no offense, I prefer the LW volumetrics; maybe it just renders more "polished". Otherwise it needs a closer comparison of both.

  8. #128
    Almost newbie Cageman's Avatar
    Join Date
    Apr 2003
    Location
    Malmö, SWEDEN
    Posts
    7,644
    Quote Originally Posted by Nicolas Jordan View Post
    I totally agree CPUs are a very good route to go for many, depending on your needs. I'm sure we will see CPUs make even larger steps forward in the next couple of years. I give AMD all the credit for this, since they are the ones who started it with Threadripper. Intel has made some big strides as well, but only as a reaction, in an attempt to remain competitive. If Threadripper had not come to market, GPU-based rendering would look more attractive than ever.
    AMD are really innovating now... their Threadripper series is only getting started, and still they manage to keep the costs quite low compared to Intel. I am very impressed, indeed.
    Senior Technical Supervisor
    Cinematics Department
    Massive - A Ubisoft Studio
    -----
    Intel Core i7-4790K @ 4GHz
    16GB Ram
    GeForce GTX 1080 8GB
    Windows 10 Pro x64

  9. #129
    Almost newbie Cageman's Avatar
    Join Date
    Apr 2003
    Location
    Malmö, SWEDEN
    Posts
    7,644
    Quote Originally Posted by MarcusM View Post
    I personally like render challenges. Users can then play with their own settings and learn from each other.

    If you do more in the future, I have a request: use better, more optimized render settings, so as not to teach bad habits. Especially before making YouTube videos, which can reach a wide audience.

    You could also use your own models. You know, to show off something self-made.
    I have to agree. The LW scene was one of the sloppiest, most un-optimized scenes I've seen in years, especially when LW2019 has OptiX.
    Senior Technical Supervisor
    Cinematics Department
    Massive - A Ubisoft Studio
    -----
    Intel Core i7-4790K @ 4GHz
    16GB Ram
    GeForce GTX 1080 8GB
    Windows 10 Pro x64

  10. #130
    Almost newbie Cageman's Avatar
    Join Date
    Apr 2003
    Location
    Malmö, SWEDEN
    Posts
    7,644
    Also... this... multitasking that doesn't hog your computer. In this video, he is encoding a queue of videos while working in Maya+Arnold and recording a video of it.

    If this had been Octane or Redshift, your computer would have been hogged quite badly unless you went into Task Manager and adjusted process priorities...

    https://www.youtube.com/watch?v=ceihAHySwdw&t=
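The priority juggling described above can also be done explicitly when launching the background job. A minimal POSIX-only sketch in Python; the ffmpeg invocation is purely illustrative, not from the video:

```python
import os
import subprocess

def run_niced(cmd, niceness=19, **popen_kwargs):
    """Start a CPU-heavy background job (e.g. a video encode) at low
    priority so interactive apps stay responsive (POSIX only)."""
    return subprocess.Popen(
        cmd,
        preexec_fn=lambda: os.nice(niceness),  # lowers only the child's priority
        **popen_kwargs,
    )

# Illustrative: queue an encode in the background while you keep working.
# run_niced(["ffmpeg", "-i", "take01.mov", "take01.mp4"])
```

This is roughly what a well-behaved CPU renderer does for you automatically: the render threads run niced, so the OS scheduler favors your interactive session.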
    Senior Technical Supervisor
    Cinematics Department
    Massive - A Ubisoft Studio
    -----
    Intel Core i7-4790K @ 4GHz
    16GB Ram
    GeForce GTX 1080 8GB
    Windows 10 Pro x64

  11. #131
    Not so newbie member lardbros's Avatar
    Join Date
    Apr 2003
    Location
    England
    Posts
    5,837
    There are a few inconsistencies I've read in your posts, which kind of shows a lack of understanding, or something.

    Autodesk don't market Mental Ray because NVIDIA stopped developing it. Instead, they're focusing on Iray and their AI stuff.

    Also... Arnold using NVIDIA OptiX means nothing at all. The Arnold renderer has its own built-in denoiser called Noice. Did you know that LW 2019 has the OptiX denoiser built in too??
    It just seems like you're skipping the comments which are valid but which you disagree with, yet the comments which back up your view/opinion are all okay.
    It would have been nicer if you'd spent a bit more time weighing up the benefits of GPU versus CPU and vice versa. Believe it or not, there are benefits to using CPU over GPU too, you know.
    LairdSquared | 3D Design & Animation

    Desk Work:
    HP Z840, Dual Xeon E5-2690 v2, 32GB RAM, Quadro K5000 4GB
    Desk Home:
    HP Z620, Dual Xeon E5-2680, 80GB RAM, Geforce 1050 Ti 4GB

  12. #132
    Quote Originally Posted by Cageman View Post
    So, if I say that 400 CPUs will be faster than anything you can throw together as a single person (machines with GPUs, etc.), then I should not take part in this thread?

    I mean... I work at a Ubisoft studio that has more than 400 pretty good workstations. I would be mad to do anything less than put those in a renderfarm overnight, even if I can only get 200 of them... and instead I should go GPU? Is that what you are suggesting?
    I'm not sure that a 400-CPU renderfarm has much to do with your average user, so I'm not sure why the comparison.

    But on that subject, you mentioned earlier that it wouldn't be cost-effective to replace the render farm with a GPU solution. It's only not cost-effective because you already have a system in place that is likely working. But it would be more cost-effective than, say, replacing the whole farm with a newer CPU farm, and you certainly wouldn't need anywhere near 400 GPUs to match or even beat the CPU network.

    You also mentioned that CPUs are making notable advancements. They are, but they're not keeping up with Moore's law; not even the newest Ryzens are. GPUs, though, are advancing even faster. I have one of the newer Threadripper chips, but I only bought it for more all-around performance in my apps. The future is GPU.

  13. #133
    Registered User
    Join Date
    Apr 2015
    Location
    France
    Posts
    93
    Quote Originally Posted by hrgiger View Post
    The future is GPU.
    Not sure. The GPU is quite fast, but my NVIDIA GTX 780 is a bit old; when I compare in Blender, the GPU renders with one thread while the CPU uses many threads that work fast too. We can't deny that the CPU still has a future, as does the GPU.

  14. #134
    Hi, I don't post much here, but I'll add my general input on the CPU speed test conversation.

    I saw some render-results data for a renderer that can use CPU or GPU, and in those results an AMD 16-core Threadripper (32 threads) was on par with an NVIDIA 1080 or 1080 Ti, depending on the scene rendered.
    So for me that's a good 'ballpark' on where each is at right now.

    I found this video interesting on how both the CPU and GPU development curves will flatten out soon; the gains will be TINY.
    Once the CPU reaches 64 cores, the curve flattens out.
    Once the GPU reaches 128 RT cores... not much is going to get better either, even with 4,000 RT cores...

    Long story short, it might be multiple CPUs again... either in one PC or on a network,
    or multiple GPU cards hung off the main workstation.

    So... the future needs another shift in the next 5 years.
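The flattening described above is consistent with Amdahl's law: once the serial part of the work dominates, extra cores barely help. A quick sketch; the 95% parallel fraction is an assumption for illustration (path tracers are usually more parallel than this), not a measured figure:

```python
def amdahl_speedup(cores, parallel_fraction=0.95):
    """Amdahl's law: speedup = 1 / (serial + parallel/cores).
    The serial fraction caps the speedup no matter how many cores."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

for n in (8, 16, 32, 64, 128, 256):
    print(f"{n:4d} cores -> {amdahl_speedup(n):.1f}x")
```

With a 95% parallel fraction the ceiling is 20x: going from 64 to 256 cores only moves you from about 15.4x to about 18.6x, which is exactly the kind of flattening curve the post describes.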


    stee+cat
    real name: steve gilbert
    http://www.cresshead.com/

    Q - How many polys?
    A - All of them!

  15. #135
    Quote Originally Posted by 3dslider View Post
    Not sure. The GPU is quite fast, but my NVIDIA GTX 780 is a bit old; when I compare in Blender, the GPU renders with one thread while the CPU uses many threads that work fast too. We can't deny that the CPU still has a future, as does the GPU.
    Going a bit off-topic here, but on GPU, tile size impacts render times quite a bit... don't use CPU render tile sizes like 48x48... go much bigger, like 256x256, when using GPU.
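One reason the tile-size advice works is dispatch overhead: bigger tiles mean far fewer kernel launches on the GPU. A quick sketch of the tile counts using the sizes from the post (note this applies to Blender versions before Cycles X, which changed how tiling works):

```python
import math

def tile_count(width, height, tile):
    """How many tiles the renderer dispatches for one frame,
    rounding partial tiles at the edges up to a full tile."""
    return math.ceil(width / tile) * math.ceil(height / tile)

# A 1920x1080 frame with the sizes from the post:
print(tile_count(1920, 1080, 48))   # CPU-style small tiles: 920 tiles
print(tile_count(1920, 1080, 256))  # GPU-style large tiles: 40 tiles
```

Small tiles keep many CPU cores busy near the end of a frame; on GPU each tile is already spread across thousands of shader cores, so fewer, larger tiles waste less time on per-tile overhead.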
    stee+cat
    real name: steve gilbert
    http://www.cresshead.com/

    Q - How many polys?
    A - All of them!
