
Thread: Choose your weapon - GPU or CPU rendering?

  1. #31
    A.K.A "The Silver Fox" Gungho3D's Avatar
    Join Date
    Aug 2009
    Location
    Oztralia
    Posts
    284
    Quote Originally Posted by Lewis View Post
    ... In V4 there is also out-of-core for geometry, so that basically means it's an almost unlimited scene size/model count you can render in V4
    That is great news; I wasn't aware of that. I've been hitting an upper polygon limit in Octane v3 that outright crashes LW/Octane (although LW native handles it OK).

  2. #32
    Super Duper Member kopperdrake's Avatar
    Join Date
    Mar 2004
    Location
    Derbyshire, UK
    Posts
    3,107
    Likewise - good to hear about the geometry handling. That was why I had to comp one arch viz shot that had loads happening inside and outside.

    Another caveat I've bumped into with Octane is that when two sets of geometry overlap, say an object and its morph target, you can get some rendering weirdness, even if one is totally dissolved. There are a few small gotchas like that which creep up, but I've found a workaround for most. There are also the animations where you need to reload the scene each frame due to movement (I think it was instance movement that needed a scene reload for each frame rendered), and that reload can eat a chunk of your rendering time.

    But the reason I went with Octane was a simple project where depth of field and motion blur were crippling the native renderer (2015.3). I tested Octane on trial with my GTX970 and was so impressed with how it dealt with motion blur and DoF out of the box that I bought it and a single 1080Ti, using the 1080Ti in tandem with the GTX970. Each time I did a job, I put aside the money I'd have spent on rendering and built up a pot for another 1080Ti to replace the GTX970. The next step would be to build an external box for more 1080Ti cards, I guess, but I haven't needed to do that yet.
    - web: http://www.albino-igil.co.uk - 2D/3D Design Studio -
    - PC Spec: Intel i7-5960X OC@4.2GHz | 32Gb | 2 x GeForce GTX 1080 Ti 11Gb | Windows 10 Pro -

  3. #33
    Grafiks iz us dsol's Avatar
    Join Date
    Sep 2003
    Location
    London, UK
    Posts
    1,854
    The thing that limits Octane is the number of GPUs you can stick in a PC - and this is massively affected by the CPU and chipset you have. So, I've just upgraded to a cheap 1st-gen 12-core Threadripper (the CPU was less than £300!) and will be transplanting my twin 1080 Tis from my old i7 system to it. With the extra PCIe lanes that Threadripper brings, I can stick at least another two GPUs in there in future (once the 2080s come down in price). I only do GPU rendering now (with Octane) - I just love the look of the renders so much.
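
    As a rough sketch of why the platform matters for GPU count (the lane figures and the x8-per-card assumption are mine, ballpark numbers rather than a spec sheet for any particular board):
    Code:
    # Back-of-envelope: how many GPUs a platform can feed over PCIe.
    # Assumes each Octane card is acceptable on an x8 link and that a few
    # lanes are reserved for NVMe/chipset - assumptions, not measured facts.
    def max_gpus(cpu_lanes, lanes_reserved=4, lanes_per_gpu=8):
        return max(0, (cpu_lanes - lanes_reserved) // lanes_per_gpu)

    print("Typical consumer i7 (16 CPU lanes):", max_gpus(16))    # ~1 GPU at x8
    print("1st-gen Threadripper (64 CPU lanes):", max_gpus(64))   # ~7 GPUs at x8, slots permitting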

    Buuuuuuut..... I think it's worth pointing out the downsides with Octane as well. Though it renders brute-force GI incredibly fast, its Achilles' heel is that you are pretty much stuck with using a single PC for rendering (network rendering via ORC is supposed to be a nightmare). And their license agreement means that 3rd-party render farms aren't allowed. They are planning a rather clever blockchain-based distributed render system called RNDR, but god knows when that will actually be up and running. Octane 4 is close to full release, but there are still quite a few bugs in there.

    But getting back to your original post, personally I'd try to get hold of a cheap 16-core Threadripper and one 1080 Ti. Then later, when prices drop, you can either upgrade to a 32-thread Threadripper 2 (if you want to go the CPU route) or get an extra 2 x 2080s - which will have the ability to share a RAM pool via NVLink, so 2 x 8GB cards will create a shared 16GB space to work alongside the original 1080 Ti's 11GB.
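
    Just to make that memory maths explicit (this assumes NVLink really does pool the two 2080s into one space and that the 1080 Ti stays its own separate pool - the card sizes are the ones mentioned above, nothing official from OTOY):
    Code:
    # Hypothetical VRAM layout for the upgrade path described above (all figures assumed).
    cards_gb = {"2080 #1": 8, "2080 #2": 8, "1080 Ti": 11}

    nvlink_pool = cards_gb["2080 #1"] + cards_gb["2080 #2"]   # 16 GB shared, if pooling works as hoped
    separate    = cards_gb["1080 Ti"]                         # still capped at its own 11 GB

    print(f"Pooled 2080 space: {nvlink_pool} GB, alongside the 1080 Ti's own {separate} GB")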
    Dan Sollis : Independent CGI Designer & compositor
    Digital Distortion : Post Production for Film and Television
    http://www.digitaldistortion.net

    http://www.londonlightwave.org.uk/

  4. #34
    Super Duper Member kopperdrake's Avatar
    Join Date
    Mar 2004
    Location
    Derbyshire, UK
    Posts
    3,107
    Quote Originally Posted by dsol View Post
    The thing that limits Octane is the number of GPUs you can stick in a PC - and this is massively affected by the CPU and chipset you have.
    I told myself that if I ever needed more than the two cards, I'd look at something like the Amfeltec Expansion Cluster. They don't seem too bad in price, especially when compared to the cards themselves, plus they'd keep you warm in the winter.

    http://amfeltec.com/products/gpu-oriented-cluster/

    - web: http://www.albino-igil.co.uk - 2D/3D Design Studio -
    - PC Spec: Intel i7-5960X OC@4.2GHz | 32Gb | 2 x GeForce GTX 1080 Ti 11Gb | Windows 10 Pro -

  5. #35

    In regards to clusters, or three cards or more: doesn't Octane 4 require you to buy more licenses when using more than 2 GPU cards, though?
    LW vidz   DPont donate   LightWiki   RHiggit   IKBooster   My vidz

  6. #36
    Quote Originally Posted by erikals View Post
    In regards to clusters, or three cards or more: doesn't Octane 4 require you to buy more licenses when using more than 2 GPU cards, though?
    No, you got it wrong.
    RAM-Studio
    WS - Dual Xeon E5-2698v4/128GB/Win10x64/x TitanX(M) + 2x GTX 1080Ti + GTX 1080
    My LWM Video Car Modeling Tutorial

  7. #37

    [relief]   Glad I got that wrong.
    LW vidz   DPont donate   LightWiki   RHiggit   IKBooster   My vidz

  8. #38
    Stuck in a very big cube Waves of light's Avatar
    Join Date
    Aug 2007
    Location
    South Yorkshire, England
    Posts
    2,499
    I'm an Octane convert too. I purchased 3 x 780 Tis from eBay (around £100 each) and a second-hand motherboard with 4 PCIe slots (two x16 and two x8), which came with an old i7 2600k and 8GB of RAM. I then built a home-made render box with 4 intake fans and 4 exhaust fans for cooling. I had to buy an expensive riser (don't buy cheap ones - they crash Octane), and I haven't yet had the need to upgrade.

    I'm sure I will once my scenes require more VRAM - and that's where the 780 Tis may not be good enough for you. They only have 3GB, and even if you add, say, a 1080 Ti (with 11GB), Octane will only use the VRAM limit of the smallest card. There is something called out-of-core memory, which uses system RAM as a backup for heavy scenes, but I've never needed it.
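
    A tiny illustration of that "smallest card" rule and the out-of-core idea (the min-VRAM behaviour is just what's described above; the out-of-core figure is a made-up example, not what Octane actually reserves):
    Code:
    # Effective scene budget on a mixed rig: without out-of-core, the scene
    # has to fit on the *smallest* card, so the 1080 Ti's extra memory goes unused.
    vram_gb = {"780 Ti #1": 3, "780 Ti #2": 3, "1080 Ti": 11}   # assumed card specs

    on_card_budget = min(vram_gb.values())        # 3 GB
    out_of_core_gb = 16                           # hypothetical slice of system RAM used as overflow

    print(f"On-card scene budget: {on_card_budget} GB")
    print(f"With out-of-core enabled: roughly {on_card_budget + out_of_core_gb} GB, paged over PCIe (slower)")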

    I went down the subscription route ($20 per month) and that will, once V4 comes out, allow me to use up to 19 GPUs and will give me two seats (one for my main PC and one for the render box). The free version will allow up to 2 GPUs, but may not include all the third-party plugins (e.g. the LightWave, Blender and Cinema 4D plugins), though you'll get the standalone renderer to try out.

    For product viz it is stupidly fast, and with the new denoiser in V4, even quicker. And you get no GI flicker.

    I did a cartoon test animation recently and it was coming in at 4-6 secs per frame (using just two 780 Tis). It was all Octane Toon shaded:
    http://rsldesigns.co.uk/downloads/Te...on_Grimm_2.mp4

    And that's the best part... you can start off with old tech and still get really good render times. Then upgrade when necessary (or when a bigger job requires it).


    System Spec: i7 4930k (OC @ 4.5GHz), ASUS P9X79 MB, 32GB Ballistix RAM, H60 cooling, Samsung SSD 120GB, WD 1TB

  9. #39
    Registered User
    Join Date
    Mar 2016
    Location
    Oxford, UK
    Posts
    520
    @Waves of light, very interesting, and at HD resolution too, with what appears to be DoF blur on the hand. I imagine Octane would make very short work of character animation at SD resolution, which I find quite exciting.

    In my understanding, HD is about a quarter of 4K resolution, but four times SD resolution, so I was wondering if the rendering time scales in "fours" in direct proportion to those output resolutions.

  10. #40
    Stuck in a very big cube Waves of light's Avatar
    Join Date
    Aug 2007
    Location
    South Yorkshire, England
    Posts
    2,499
    Quote Originally Posted by TheLexx View Post
    @Waves of light, very interesting, and at HD resolution too, with what appears to be DoF blur on the hand. I imagine Octane would make very short work of character animation at SD resolution, which I find quite exciting.

    In my understanding, HD is about a quarter of 4K resolution, but four times SD resolution, so I was wondering if the rendering time scales in "fours" in direct proportion to those output resolutions.
    @TheLexx -

    Ok, I optimised the scene and the results are as follows (no denoiser needed as I'm not using GI for the toon shader):

    i7 2600k @ 3.4GHz, 8GB of RAM, 2 x 780 Ti

    100 samples, MB done inside of Octane:

    SD (1280x720) - 1 sec per frame (system RAM used 2014MB, 1698 by Octane - avg)
    HD (1920x1080) - 3 secs per frame (system RAM used 2254MB, 1864 by Octane - avg)
    4k (3840x2160) - 15 secs per frame (system RAM used 2944MB, 1998 by Octane - avg)
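
    And to TheLexx's "scales in fours" question, a quick pixel-count sanity check against those timings (just arithmetic on the numbers above, ignoring any per-frame overheads):
    Code:
    # Does render time track pixel count? Timings are the per-frame figures listed above.
    timings = [("1280x720", 1), ("1920x1080", 3), ("3840x2160", 15)]   # (resolution, secs/frame)

    base_px, base_secs = 1280 * 720, timings[0][1]
    for res, secs in timings:
        w, h = (int(n) for n in res.split("x"))
        print(f"{res}: {w * h / base_px:.2f}x pixels -> {secs / base_secs:.0f}x time")
    # 3840x2160 has 4x the pixels of 1920x1080 (and 5x the render time here), so
    # "fours" roughly holds at the top end; 1920x1080 is only 2.25x the pixels of
    # 1280x720, so the bottom step is less dramatic.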

    I need to look more into the denoiser, as for some reason it's taking a long time to produce and save out the denoised render (like 15 secs per frame at HD). This may be down to my hardware, or because it's the beta version of Octane 4.

    But at SD I was able to render out all those frames in just under 6 mins, stick them into Fusion and produce this video:
    http://rsldesigns.co.uk/downloads/Te...Grimm_2_SD.mp4


    System Spec: i7 4930k (OC @ 4.5GHz), ASUS P9X79 MB, 32GB Ballistix RAM, H60 cooling, Samsung SSD 120GB, WD 1TB
