
Thread: RTX 2080 Ti + Octane (benchmark tests)

  1. #1
    Super Member OlaHaldor's Avatar
    Join Date
    May 2008
    Location
    Norway
    Posts
    996

    RTX 2080 Ti + Octane (benchmark tests)

    Since the latest OctaneBench doesn't support the RTX series (yet), I have compared it to data I have collected since I owned the first-gen Titan, then the GTX 1080 and now the 2080 Ti. Thought it might be of interest to some of you.

    TL;DR: The 2080 Ti is a massive upgrade from the 1080.

    [Attachment: result.jpg (211.9 KB) — benchmark comparison chart; click to view full size]
    3D Generalist
    Threadripper 2920x, 128GB, RTX 2080 Ti, Windows 10

  2. #2

    Thanks! A nice comparison there.

    Thinking of getting a used 1080 Ti; wonder how that would compare?
    LW vidz   DPont donate   LightWiki   RHiggit   IKBooster   My vidz

  3. #3
    Electron wrangler jwiede's Avatar
    Join Date
    Aug 2007
    Location
    San Jose, CA
    Posts
    6,494
    Depends very highly on workload, obviously, but in many reviews the 1080 Ti is (very roughly) somewhere in the 20-25% improvement range over the 1080. See this article for another comparison of the 2080 Ti, 2080, 1080 Ti and 1080.
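    For anyone wanting to do this math themselves: the percentage figures quoted in reviews are just (new score / old score − 1) × 100 on benchmark scores where higher is faster. A quick sketch — the scores below are made-up, illustrative numbers, not actual OctaneBench results:

    ```python
    def pct_improvement(new_score: float, old_score: float) -> float:
        """Percent speedup of new_score over old_score (higher score = faster)."""
        return (new_score / old_score - 1.0) * 100.0

    # Hypothetical benchmark-style scores for illustration only
    gtx_1080 = 150.0
    gtx_1080_ti = 185.0

    print(f"1080 Ti over 1080: {pct_improvement(gtx_1080_ti, gtx_1080):.1f}%")
    ```

    With those placeholder numbers the 1080 Ti comes out about 23% faster, which is in the ballpark the reviews describe; plug in your own scores once OctaneBench supports your cards.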
    John W.
    LW2015.3UB/2018.0.7 on MacPro(12C/24T/10.13.6),32GB RAM, NV 980ti

  4. #4

    thanks, fixed link
    https://www.forbes.com/sites/antonyl.../#5e07f9b159fe

    will check it out

  5. #5
    Super Member OlaHaldor's Avatar
    Join Date
    May 2008
    Location
    Norway
    Posts
    996
    I would love to have something more beefy to show off, but... performance isn't half bad.

  6. #6

    In regards to performance/price it is a very nice card.
    Tempting to go for this one, just too much $ for me atm 

  7. #7
    I will surely sell my two GTX 1080 Ti then replace them with two RTX 2080 Ti or more…
    Eddy Marillat - PIXYM
    WS : MB ASUS X299-Pro/SE - i9 7980XE 2,6ghz 18C/36T - 32GB Ram- 2 GTX 1080 TI - Win 10 64
    GPU Net Rig : MB Biostar - Celeron 2,8 ghz 2C/2T - 8GB Ram - 2 RTX 2080 TI - Win 7 64

  8. #8
    Registered User
    Join Date
    Aug 2006
    Location
    UK
    Posts
    21
    Quote Originally Posted by OlaHaldor View Post
    Since the latest OctaneBench doesn't support the RTX series (yet) I have compared it to data I have collected since I owned the first gen Titan, then the GTX 1080 and now the 2080 Ti. Thought it might be of interest to some of you.

    TLDR; The 2080 Ti is a massive upgrade from the 1080

    [Attachment: result.jpg (211.9 KB) — benchmark comparison chart; click to view full size]
    Absolutely spanks my iMac's 680MX! Virtually 'real-time' compared to mine. I don't have Octane. But when I first got LW2018, the CPU renderer 'seemed fast' compared to 1997 3D software (extreme comparison... but I remember going away to make tea for 30 minutes and coming back... now it's 'just' a minute-ish, depending.) Now it's GPU rendering in seconds... not minutes! Insane. I hope in LW2019 NewTek can deliver a GPU/CPU hybrid that burns rubber on the tyres.

    2080Ti massive upgrade. A rig to be pleased with. I hope it's killing it for your productivity...

    But didn't Nvidia really 'bump' the prices this time around? That's what happens when you don't have competition in the GPU arena. Either way, time is money. I'm sure the card will pay for itself...

    In that regard, I hope AMD/Radeon get back in the game next year and offer some competition.

    Any chance you can do some gpu renders eg. The white tiger image from Newtek's LW gallery? Or anything else?

    If this is what GPUs are doing this year, next year's should be even more impressive when I'll be looking to upgrade this '2014' iMac (8 gigs of ram, i7 4 core, 680Mx gpu) to something else.

    Kyle.
    Last edited by kyleprometheus; 10-11-2018 at 10:38 AM.

  9. #9
    LightWave documentation BeeVee's Avatar
    Join Date
    Feb 2003
    Location
    Pessac
    Posts
    5,051
    The problem with the Mac, when it comes to Octane, is that you're locked into what Apple chooses for you. I have a Mac friend who's looking into getting an external box with Thunderbolt for a pair of NVidia cards, but it's big money.

    B
    Last edited by BeeVee; 10-11-2018 at 04:44 PM. Reason: doubled
    Ben Vost - NewTek LightWave 3D development
    LightWave 3D Trial Edition
    AMD Threadripper 1950X, Windows 10 Pro 64-bit, 32GB RAM, nVidia GeForce GTX 1050Ti (4GB and 768 CUDA cores) and GTX 1080 (8GB and 2560 CUDA cores) driver version 430.86
    AMD FX8350 4.2 GHz, Windows 7 SP1 Home Premium 64-bit, 16GB RAM, nVidia GeForce GTX 1050Ti (416.34, 4GB and 768 CUDA cores)
    Dell Server, Windows 10 Pro, Intel Xeon E3-1220 @3.10 GHz, 8 GB RAM, Quadro K620
    Laptop with Intel i7, nVidia Quadro 2000Mw/ 2GB (377.83 and 192 CUDA cores), Windows 10 Professional 64-bit, 8GB RAM
    Mac Mini 2.26 GHz Core 2 Duo, 4 GB RAM, 10.10.3

  10. #10
    Quote Originally Posted by BeeVee View Post
    The problem with the Mac, when it comes to Octane, is that you're locked into what Apple choose for you. I have a Mac friend who's looking into getting an external box with Thunderbolt for a pair of NVidia cards, but it's big money.

    B
    I'm running a cheap, small refurbished HP Z600 workstation on Windows 7 Pro with an external PCIe expansion box for this, together with my Macs. Works very well! Even IPR response is fast.

    Inside the Mac Pro workstations (modular versions), 980 Ti cards work without any issues. Nvidia does a good job of supporting them with drivers. Maybe it's time to replace them with newer gear... Of course, iMacs and later Mac Pros don't have that expandability. A new modular Mac Pro is supposed to be released in 2019, however. Fingers crossed.
    Last edited by 3dworks; 10-12-2018 at 02:42 AM.
    3dworks visual computing
    demo reel on vimeo
    instagram

    OSX 10.12.x, macpro 5.1, nvidia gtx 980 ti, LW2015.x / 2018.x, octane 3.x

  11. #11
    LightWave documentation BeeVee's Avatar
    Join Date
    Feb 2003
    Location
    Pessac
    Posts
    5,051
    Fingers crossed indeed! I do like Macs, but Apple seems intent on dumbing them further and further down. My Power Mac G5 tower is still the best Mac I ever had.

    B

  12. #12
    Registered User
    Join Date
    Aug 2006
    Location
    UK
    Posts
    21
    Quote Originally Posted by BeeVee View Post
    Fingers crossed indeed I do like Macs, but Apple seems intent on dumbing them further and further down. My Power Mac G5 tower is still the best Mac I ever had.

    B
    I've got everything crossed. Apple creative users have been crying out for a G5-style tower... with a choice of Nvidia or AMD card (to taste...) for a modest price. It was state of the art. The perfect tower design. They're making hard work of updating it for 2019. Dell, HP et al. can all do a tower; it's not rocket science. Credit due to HP, see some of their radical tower designs on their site: that fun sense of drama and colour that Apple seems to have lost on their way to boutique desktop designs.

    I like my iMac. No complaints. Super quiet. 4-5 years old and still going strong. It can handle LW3D alright. At least in terms of what I'm doing with it.

    Don't get me wrong, I like the iMac 5K and the iMac Pro (where you can only upgrade the RAM at purchase? Colour me cynical.) My first Mac was a 'clone', and it was a tower (equivalent to the Mac 8500) with a 604e chip in it. Great stuff. Photoshop, Poser, Ray Dream Studio, Painter. Back then I was eyeing up LightWave 4...

    Those days were fun. I loved the G3, G4 and G5 Power Mac towers. All kick-*** machines.

    I'd still buy that kind of design IF Apple provided one for the prices they used to!

    Apple seems more keen on locking down machines for upsell reasons to maximise shareholder value. That's what happens when the marketing guys take over rather than the product guys. They get a little greedy. (Yes, we know Apple have never been the cheapest...) But when they jacked the price of the 'can' 'Mac Pro' to £2500 (without keyboard and mouse...*Looks skyward.) you kinda get the sense that they'd lost touch with reality and the creative folks in that £1000-£2000 price bracket who don't have millions in stock options. And then they recently apologised for dropping the ball with it...when it was obvious to all that we just wanted a reasonably priced tower with some flexibility on upgrading parts.

    I get that the iPhone is their cash cow... but the dust-collecting neglect of the Mac Mini and Mac Pro is criminal for a company with a $1 trillion capitalisation.

    I haven't ruled out augmenting my Mac setup with an AMD (how many cores!?) and a dual-GPU setup. I'll see how the competitive landscape looks in 2019 when the new 'Mac Pro' is finally released. (But I'm betting they go down the starting-at-£3-5-grand route. And that simply won't do for me.)

    Sorry, that turned into a rant.

    At least they're allowing eGPUs over Thunderbolt. A double-chassis tower could augment a 5K iMac, I guess..?

    Regards,

    Kyle.

  13. #13
    Registered User
    Join Date
    Aug 2006
    Location
    UK
    Posts
    21
    Quote Originally Posted by BeeVee View Post
    The problem with the Mac, when it comes to Octane, is that you're locked into what Apple choose for you. I have a Mac friend who's looking into getting an external box with Thunderbolt for a pair of NVidia cards, but it's big money.

    B
    Apple lost sight of the value it used to give with the G5 Power Mac towers. Sure, you COULD pay up to £3 grand or more for a fully decked-out one... but you could also get on the bottom of the ladder for £1,500 instead of the eye-watering £2,500!

    Apple seems to have locked out Nvidia for political reasons; only they will know. But after a class action regarding the GPUs in their MacBooks some while back, Nvidia GPUs stopped appearing in Macs. Just a coincidence?

    I have an Nvidia 680MX in my iMac. It was in the top ten gpus of its time. Miles behind now though...

    Apple are trying to lock things down. (Using tape in their iMacs and making them sealed units... with no RAM access on the iMac Pro..?) It's simple, Apple: give us an affordable tower that we can put SSDs, RAM and GPUs in... with 'just a little' choice. It's not a trend I like, because you get squeezed for extra RAM at Apple prices at time of purchase, or pay a premium for a terabyte SSD. Premium? Or creamium?

    I'll see how things are in 2019. Maybe they'll use Britain's eventual Brexit as an excuse to jack up prices even further...

    Regards,

    Kyle.

  14. #14
    Super Member OlaHaldor's Avatar
    Join Date
    May 2008
    Location
    Norway
    Posts
    996
    Quote Originally Posted by kyleprometheus View Post
    Any chance you can do some gpu renders eg. The white tiger image from Newtek's LW gallery? Or anything else?
    Is that a demo scene in LW 2018? Or do you know where I can get it? I'd love to give it a shot.
    I don't have LW 2018; I don't see a big reason for me to upgrade over 2015 as it stands now.

  15. #15
    Registered User
    Join Date
    Aug 2006
    Location
    UK
    Posts
    21
    Quote Originally Posted by OlaHaldor View Post
    Is that a demo scene in LW 2018? Or do you know where I can get it? I'd love to give it a shot.
    I don't have LW 2018, I don't see a big reason for me to upgrade over 2015 as it stands now.
    https://www.lightwave3d.com/community/gallery/

    Hello Ola,

    The link above is on the main LightWave 2018 page, in the 'Check out the Gallery' section. Model: Mauro Corveloni. Render: Lino Grandi.

    [Attachment: lightwave_gallery_2018_4x2.jpg (124.2 KB); click to view full size]

    I couldn't find it on YouTube, nor in my LightWave 2018 scene or object files (unless I overlooked it...), but an email to Lino may yield the source file to test. It's a fine image.


    https://www.youtube.com/watch?v=4cd14roSHr4
    https://www.youtube.com/watch?v=mhqf1n2xq80
    https://www.youtube.com/watch?v=Zqfj0535hQk
    https://www.youtube.com/watch?v=LYDAnxrT92g

    These links show Octane, from 2010 up to more recent days with 4x Nvidia Titan cards on a 'lego' test object/scene.

    Intriguing stuff. I have looked at Octane before on YouTube, i.e. some of the videos above. I didn't quite 'get it' watching the older videos, but looking at the four-Titan example, I sure get it now!

    Hope this helps.

    Regards,

    Kyle.
