
Thread: Octane for LW 2 on a Retina Display MacBook Pro

  1. #1
    raw-m (Big fan of coffee)
    Join Date: Jul 2003 | Location: London | Posts: 2,342

    Octane for LW 2 on a Retina Display MacBook Pro

    Anyone got it on a 2012 Retina Display MacBook Pro and, if so, how does it perform? Would you recommend it?

    I'll be getting a new Mac Pro in the next few months and am assuming it'll be good on that. In the meantime I would like to get up and running on my MBP. I haven't tried the demo yet; I don't want to get my hopes up too early!

  2. #2
    creacon
    Join Date: Nov 2005 | Location: Belgium | Posts: 1,300
    If you're talking about the new model of the Mac Pro (better known as the trashcan), that would be completely useless for Octane. Octane is written in CUDA C and only runs on NVIDIA cards.
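
    If you want to see what Octane would have to work with, you can ask the CUDA runtime directly. This is just an illustrative sketch, nothing Octane-specific, and it assumes the CUDA toolkit is installed; on an AMD-only Mac it will simply report no devices:

    Code:
    // check_cuda.cu - list CUDA-capable GPUs (build with: nvcc check_cuda.cu -o check_cuda)
    #include <cstdio>
    #include <cuda_runtime.h>

    int main() {
        int count = 0;
        cudaError_t err = cudaGetDeviceCount(&count);
        if (err != cudaSuccess || count == 0) {
            // No NVIDIA/CUDA device: a CUDA-based renderer has nothing to run on.
            std::printf("No usable CUDA devices found.\n");
            return 1;
        }
        for (int i = 0; i < count; ++i) {
            cudaDeviceProp prop;
            cudaGetDeviceProperties(&prop, i);
            std::printf("Device %d: %s, compute capability %d.%d, %.1f GB VRAM\n",
                        i, prop.name, prop.major, prop.minor,
                        prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0));
        }
        return 0;
    }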

    And a laptop is not a good idea for GPU rendering either.

    creacon

  3. #3
    raw-m (Big fan of coffee)
    Join Date: Jul 2003 | Location: London | Posts: 2,342
    Thanks creacon. That's, er, disappointing.

  4. #4
    Perhaps, but it's understandable given the tech.

    A Titan is not exactly small.
    Lots of metal-melting heat, too.

  5. #5
    raw-m (Big fan of coffee)
    Join Date: Jul 2003 | Location: London | Posts: 2,342
    So does it just not work, or is performance so bad that it's counterproductive?

  6. #6
    BeeVee (LightWave documentation)
    Join Date: Feb 2003 | Location: Pessac | Posts: 5,125
    It just doesn't work if you have an AMD graphics card. It only works with NVIDIA.

    B
    Ben Vost - NewTek LightWave 3D development
    LightWave 3D Trial Edition
    AMD Threadripper 1950X, Windows 10 Pro 64-bit, 32GB RAM, nVidia GeForce GTX 1050Ti (4GB and 768 CUDA cores) and GTX 1080 (8GB and 2560 CUDA cores) driver version 430.86
    AMD FX8350 4.2 GHz, Windows 7 SP1 Home Premium 64-bit, 16GB RAM, nVidia GeForce GTX 1050Ti (416.34, 4GB and 768 CUDA cores)
    Dell Server, Windows 10 Pro, Intel Xeon E3-1220 @3.10 GHz, 8 GB RAM, Quadro K620
    Laptop with Intel i7, nVidia Quadro 2000M w/ 2GB (377.83 and 192 CUDA cores), Windows 10 Professional 64-bit, 8GB RAM
    Mac Mini 2.26 GHz Core 2 Duo, 4 GB RAM, 10.10.3

  7. #7
    raw-m (Big fan of coffee)
    Join Date: Jul 2003 | Location: London | Posts: 2,342
    Cheers, Ben. I'll perhaps hold out to see if the next Mac Pro update has NVIDIA options (I know, extremely optimistic!).

  8. #8
    BeeVee (LightWave documentation)
    Join Date: Feb 2003 | Location: Pessac | Posts: 5,125
    It seems the MacBook Pro has a GeForce 750M (going by a Google search result from Oct 2013), which is nice for display, but it only has 384 CUDA cores, and that's the number you need to look at for Octane. That's twice as many as my Quadro 2000M-powered Windows laptop, but I'd rather use the desktop machine in my sig, which has 1344 cores on a graphics card that cost less than 200 quid new.
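
    The driver doesn't report CUDA cores directly; tools work them out from the multiprocessor count and the cores per SM for that architecture. A rough sketch of the arithmetic (the per-architecture numbers below come from NVIDIA's published Fermi/Kepler/Maxwell specs and are only illustrative):

    Code:
    // cuda_cores.cu - estimate CUDA core counts (build with: nvcc cuda_cores.cu -o cuda_cores)
    #include <cstdio>
    #include <cuda_runtime.h>

    // Cores per streaming multiprocessor for a few common architectures.
    static int coresPerSM(int major, int minor) {
        if (major == 2) return (minor == 1) ? 48 : 32; // Fermi  (e.g. Quadro 2000M: 4 SMs x 48 = 192)
        if (major == 3) return 192;                    // Kepler (e.g. GeForce 750M: 2 SMs x 192 = 384)
        if (major == 5) return 128;                    // Maxwell
        return 0;                                      // unknown generation: look it up
    }

    int main() {
        int count = 0;
        cudaGetDeviceCount(&count);
        for (int i = 0; i < count; ++i) {
            cudaDeviceProp p;
            cudaGetDeviceProperties(&p, i);
            std::printf("%s: %d SMs -> ~%d CUDA cores\n",
                        p.name, p.multiProcessorCount,
                        p.multiProcessorCount * coresPerSM(p.major, p.minor));
        }
        return 0;
    }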

    B
    Ben Vost - NewTek LightWave 3D development

  9. #9
    Tartiflette (Mike, in Monsters Inc)
    Join Date: Feb 2003 | Location: Montpellier, South France | Posts: 596
    I would advise NOT using any laptop for GPU rendering (even one with an NVIDIA card, though that brand is mandatory for Octane anyway): first because of the overheating you may encounter, and second because you really need at least two cards to do any serious GPU rendering, one for the display and the other dedicated to rendering only.

    Overheating is the first problem, even more so with a Retina MacBook Pro, which is really slim and already has to drive a big display area; it can make your laptop age more quickly.

    And GPU rendering with only one card is such a mess that you don't want to use it to make up your mind about this kind of rendering solution. A GPU doesn't have the mechanisms a CPU has for juggling many tasks at the same time (which is also why it's so fast), so when you're rendering with it, be prepared for a sluggish (to say the least) GUI. Everything takes ages to redraw, the basic UI becomes unresponsive and you end up frustrated with the whole thing, whereas it's a joy to use when you have a GPU entirely dedicated to the rendering process.
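
    At the CUDA level, "dedicating" a card is simply a matter of which device you select for compute work; a renderer does this through its own device settings, but a minimal sketch of the idea looks like this (the watchdog-timeout flag is only a rough heuristic for "a display is attached to this card", and the file name is made up for the example):

    Code:
    // pick_render_gpu.cu - prefer a GPU that isn't driving a display for compute work
    #include <cstdio>
    #include <cuda_runtime.h>

    int main() {
        int count = 0;
        cudaGetDeviceCount(&count);
        if (count == 0) { std::printf("No CUDA devices.\n"); return 1; }

        int chosen = 0; // fall back to device 0 if every card drives a display
        for (int i = 0; i < count; ++i) {
            cudaDeviceProp p;
            cudaGetDeviceProperties(&p, i);
            // The kernel-execution timeout (watchdog) is normally enabled
            // only on devices that have a display attached.
            if (!p.kernelExecTimeoutEnabled) { chosen = i; break; }
        }
        cudaSetDevice(chosen); // all later kernels and allocations target this GPU

        cudaDeviceProp p;
        cudaGetDeviceProperties(&p, chosen);
        std::printf("Rendering on device %d: %s\n", chosen, p.name);
        return 0;
    }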

    Hope it helps.


    Cheers,
    Laurent aka Tartiflette.

  10. #10
    Quote: Originally Posted by Tartiflette (post #9 above)
    I work on a Retina MacBook Pro, and even though I don't know about Octane, I seriously doubt that Apple would build the machine in a way that would risk overheating (except if you cover it with something). I have owned it since its release and it's almost perfect in every way. When using other GPU-intensive software it spins up the fans, just as it does when the CPU is under heavy load. Just my thoughts.

  11. #11
    toeknee (Member)
    Join Date: Oct 2005 | Location: Houston, Texas | Posts: 245
    HenrikSkoglund, dude, you don't need to be so sensitive. The advice BeeVee is giving is solid; he is not hating on the Mac. I would think he is saying this because the licensing is very clear about no refunds. It says something about New Zealand law and that they cannot give a refund.

    I have an ASUS laptop with a GTX 560M with 3GB of RAM, and when I use Octane it gets very hot. You can turn down the priority on the render, but then you are intentionally slowing down your render. This is why I added an extra GTX 760 to my desktop. I know that it and the original GTX 570 are not the best, but I do see a dramatic difference with the two-card solution. If you look online you can find people who use four Titans in one system, and the four-Titan solution is insanely fast.

    This also brings up the other issue, which is why I use PCs for my 3D work. First off, I work in an all-Mac office and I have two MacBook Pros and one large Mac system at the house. The point is that Octane gets more powerful the more (and faster) GPUs you throw at it. Apple is all about giving their customers what Apple can make the most money with. I say this with one huge caveat: on the support side the hardware they use is good, but it is not very fast, especially if you look at what you get for what you spend. I think the operating system has many advantages, especially for artists, but when you are building a render farm from GPUs, Macs will only frustrate you, because you can't change the motherboard to take advantage of very many video cards.
    The LW Beast From the East

  12. #12
    spherical (Super Member)
    Join Date: Dec 2004 | Location: San Juan Island | Posts: 4,686
    Different strokes... but I have been surprised at the number of users who run LightWave on a laptop. I did use one as a render box when my workstations were ancient Athlon Thunderbirds. Now that the new workstations are online, the laptop has returned to its normal role: a portable unit that gets the few necessary tasks done in a pinch (email, word processing) when traveling. I do know a couple of users who just leave the house every day, go to a coffee shop and work there. It's appealing not to be disturbed by the daily interruptions that happen in the studio (I need an office with a door), but the drive to the nearest espresso bar is about an hour. So I make my own espresso, and my own workstations too. Room to grow, heat not a problem, nor are speed and screen real estate.

    Of course, budget is a concern, at least for most of us. But given the choice, rather than getting a new laptop I would build a workstation instead and continue to use the laptop in its own realm of capability.
    Last edited by spherical; 08-24-2014 at 05:25 PM.
    Blown Glass · Carbon Fiber + Imagination

    Spherical Magic | We Build Cool Stuff!

    "When a man loves cats, I am his friend and comrade, without further introduction." - Mark Twain

  13. #13
    BeeVee (LightWave documentation)
    Join Date: Feb 2003 | Location: Pessac | Posts: 5,125
    It has to be said as well that tablets have taken over many of the uses laptops were put to before (says he, trying to use a laptop to edit something when he *really, really* needs a multi-screen setup and a proper keyboard).

    B
    Ben Vost - NewTek LightWave 3D development

  14. #14
    Quote: Originally Posted by Tartiflette (post #9 above)
    100% agreed!

    As a 'heavy' Octane tester I'd like to add some info here as well. I'm using Octane on a mid-2010 Mac Pro with an external Netstor PCI expander box, with three GTX 780s (one internal to the Mac Pro) and one GTX 680 4GB card. From what I have been able to test so far, Octane renders all kinds of simple to medium-complexity interior and exterior scenes fast and with excellent quality. However, I found Maxwell running on the dual 6-core CPU Mac Pro rendering a bit faster when processing a very heavy interior scene. I was testing this on a shopping mall scene with 12 million polygons and large textures, which was already at the limit of what the 3GB of VRAM in the NVIDIA cards will support for GPU rendering. I guess the reason is that Maxwell is extremely well optimized for Xeon CPUs. Apart from speed, there are still other advantages to rendering with Maxwell over Octane, like the ability to render in multilight mode and always getting antialiased material, ID, depth and other technical layers in one go, or being able to resume and refine a render or a sequence at any time.
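
    Since the scene data has to fit in each card's VRAM, it can be worth checking free memory before committing to a GPU render. A minimal stand-alone sketch (assuming the CUDA toolkit is installed; this is not part of the Octane plugin):

    Code:
    // vram_check.cu - report free vs. total VRAM per device (build with: nvcc vram_check.cu -o vram_check)
    #include <cstdio>
    #include <cuda_runtime.h>

    int main() {
        int count = 0;
        cudaGetDeviceCount(&count);
        for (int i = 0; i < count; ++i) {
            cudaSetDevice(i);
            size_t freeBytes = 0, totalBytes = 0;
            // Free memory already accounts for what the display and other apps are using.
            cudaMemGetInfo(&freeBytes, &totalBytes);
            cudaDeviceProp p;
            cudaGetDeviceProperties(&p, i);
            std::printf("%s: %.2f GB free of %.2f GB\n", p.name,
                        freeBytes / (1024.0 * 1024.0 * 1024.0),
                        totalBytes / (1024.0 * 1024.0 * 1024.0));
        }
        return 0;
    }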

    Summing it up, Octane is a good option for certain scenes but not for all. Exterior scenes, object renderings, scenes with instances or deformations, and animation work in general do well, but even then it is not always a 'speed wonder'. The interactive previewer is very fast, bringing back that old 'FPrime feeling', but final rendering can take a long time to refine, especially when aiming for the highest quality level using the path tracing rather than the direct lighting kernel. In any case you will need at least one fast latest-generation NVIDIA GPU or, much better, two of them, as one will always be partially busy processing your scene in OpenGL.

    As for using the Mac Pro with an external PCI enclosure like Netstor or Cubix, be warned: only up to OS X 10.9.1 are all CUDA devices detected correctly! There is a known bug from 10.9.2 onwards which makes a Mac ignore some (usually the last) device(s) in a PCI-E chain! Only black Titan cards apparently don't show this problem on recent OS X versions. Just Google the keywords if interested; there are lots of threads on the Blackmagic, netkas and MacRumors forums. Hoping that 10.9.5 or Yosemite will fix this.
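
    A quick way to check whether you're hit by that detection problem is to list every device CUDA can see, together with its PCI address, and compare against the cards the system reports. A minimal sketch (again assuming the CUDA toolkit is installed):

    Code:
    // list_pci.cu - list CUDA devices with PCI addresses, to spot cards the OS is ignoring
    #include <cstdio>
    #include <cuda_runtime.h>

    int main() {
        int count = 0;
        cudaGetDeviceCount(&count);
        std::printf("CUDA sees %d device(s)\n", count);
        for (int i = 0; i < count; ++i) {
            cudaDeviceProp p;
            cudaGetDeviceProperties(&p, i);
            // Any card installed in the chassis but missing from this list
            // is being ignored by the driver/OS combination.
            std::printf("  %d: %s at %04x:%02x:%02x\n",
                        i, p.name, p.pciDomainID, p.pciBusID, p.pciDeviceID);
        }
        return 0;
    }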

    Last but not least, Octane is running smoothly and totally stable in LightWave on the Mac, as Juanjo did an excellent job coding the plugin and cleaning it up for OS X!
    Last edited by 3dworks; 08-25-2014 at 07:57 AM.
    3dworks visual computing
    demo reel on vimeo
    instagram

    OSX 10.12.x, macpro 5.1, nvidia gtx 980 ti, LW2015.x / 2018.x, octane 3.x

  15. #15
    Tartiflette (Mike, in Monsters Inc)
    Join Date: Feb 2003 | Location: Montpellier, South France | Posts: 596
    Quote: Originally Posted by HenrikSkoglund (post #10 above)
    Don't worry, I didn't mean overheating in the sense of risking anything with the machine (or if it sounded like that, my apologies, it wasn't meant that way). I have had pretty much every MacBook Pro Apple has released over the years (I'm a Mac freak!) and I know they are good machines.

    I just wanted to say that performance would drop even further because of possible overheating, in which case the CPU or GPU simply throttles its own speed.
    Add to that the fact that I hate hearing my MacBook Pro's fans going crazy (I'm a bit over-emotional, I know!), and you'll see why I'm saying it's not recommended to use Octane on a laptop.


    Cheers,
    Laurent aka Tartiflette.
