PDA

View Full Version : 5 million + polygons...



mikkelen
05-10-2012, 07:05 PM
I run OSX and am currently working on a 2-3 million polygon project on an iMac (pain) -> secondhand Mac Pro in the mail... I guess my next project will be at least 5 million polygons or more... What kind of graphics card is needed to run such scenes in real time in Layout?

oobievision
05-10-2012, 07:29 PM
Well, I can only think of one: the Nvidia Quadro 6000 6 GB GDDR5 card, though it's not cheap at around $4000 US. PNY also makes a version of this card.

mikkelen
05-10-2012, 08:52 PM
I see, it's quite expensive and not OSX compatible yet. How is the Quadro 4000? It's quite cheap, but will there be much improvement over an ATI Radeon HD 5770?

Rayek
05-10-2012, 10:39 PM
A Quadro is a waste of money, in my opinion. And according to this:
http://www.cgchannel.com/2011/10/review-professional-gpus-nvidia-vs-amd-2011/

you will not see much of a viewport performance increase. The tests were performed in LW10 on a pair of six-core 32nm Xeon X5680 CPUs running at 3.33GHz, with 18 GB of RAM. Not exactly a lightweight rig.

One thing that bugs me about Layout (let alone Modeler) is that the OpenGL viewport seems so slow: in Blender I can easily go into the tens of millions and still have a smooth viewport. In Layout (11 trial) I can hardly go up to a third of that amount before the view starts to lag. In 9.6 it is much, much worse.

Anyway, I am digressing. I can tell you from experience that the Quadro will not really improve your performance much in Lightwave, if at all (or in any 3D app without custom performance drivers). On my old 5870 I got better or equal OpenGL frame rates in Lightwave and Cinema 4D than the best results mentioned in the review. Then again, that's on a Windows machine.

Unfortunately, on the Mac you do not have much choice regarding video cards. But I would stay away from the Quadro 4000. Not a good investment: one user on the BlenderArtists forum bought into a Quadro 6000 and was bitterly disappointed. At my work we have Quadro 2000 machines, and those lag quite badly as well at higher poly counts. At least, that is my experience on Windows machines.

CaptainMarlowe
05-10-2012, 10:52 PM
Hi, I have an iMac i7 2.8, not the latest generation, and I'm currently working on a 6.5 million polygon project (http://forums.newtek.com/showthread.php?t=127846), not counting instances. Of course, I sometimes jump back to wireframe or bounding box mode to change a view or set a camera path, but otherwise I don't have many problems (except that I deactivate the grass instance preview in OpenGL, because there are tens of thousands of them).
What are your iMac specs? I am thinking of changing the graphics card in my iMac, this said (hazardous operation, I know), but finding a seller for a Radeon HD 6970M (since drivers for this card are provided in Snow Leopard and Lion) is impossible in France.

OnlineRender
05-11-2012, 03:42 AM
you want one of these bad boys GeForce GTX 670

http://www.ozone3d.net/public/jegx/201205/nvidia-geforce-gtx-670-board-04.jpg

zapper1998
05-11-2012, 08:53 AM
you want one of these bad boys GeForce GTX 670

http://www.ozone3d.net/public/jegx/201205/nvidia-geforce-gtx-670-board-04.jpg

ditto, x2 in SLI mode

those are the 2 GB RAM versions of the card too

GTX 560 Ti (x2) in SLI mode. Awesome cards, 12 million polygons no problem

mikkelen
05-11-2012, 11:53 AM
Thank you for all the feedback. I just bought a GTX 580, as they can be run in OSX; my 8-core workstation with 64 GB RAM will arrive on Monday. I bought everything on eBay, except for the RAM. Under 3500 USD total, not a bad interim workstation until the Ivy Bridge Xeon workstations arrive (hopefully from Apple).

My current iMac (first gen i7 version) is driving me mad at this point...

Layout uses about 3-5 seconds to switch from bounding box mode to shaded mode... The pain! :p
http://dl.dropbox.com/u/20747143/Testrender/fasade_ivy_.jpg

monovich
05-11-2012, 01:40 PM
SLI doesn't accelerate LW does it?

JonW
05-11-2012, 04:21 PM
SLI doesn't accelerate LW does it?

No it doesn't.

Captain Obvious
05-12-2012, 03:34 AM
Are you people crazy? Admittedly, I use Windows rather than Mac OS X, but still... the performance difference can't be that big. My $1500 ThinkPad laptop can push 30+ million triangles at useful frame rates, even in GLSL mode. It's got a 2 GB Quadro 2000M card. I'm sure a cheap desktop GeForce card would provide the same level of performance. There is no need to spend thousands of dollars on a graphics card for a mere 5 million polygons.

edit: I uploaded a screenshot of a Layout viewport. Look at the number of triangles. That's over 46 million, and it's still running at 15+ frames per second.

http://forums.newtek.com/attachment.php?attachmentid=104253&stc=1&d=1336815393

OnlineRender
05-12-2012, 04:02 AM
kinda OT, but I wish LightWave handled like ZBrush

JonW
05-12-2012, 05:12 AM
Are you people crazy? Admittedly, I use Windows rather than Mac OS X, but still... the performance difference can't be that big. My $1500 ThinkPad laptop can push 30+ million triangles at useful frame rates, even in GLSL mode. It's got a 2 GB Quadro 2000M card. I'm sure a cheap desktop GeForce card would provide the same level of performance. There is no need to spend thousands of dollars on a graphics card for a mere 5 million polygons.

edit: I uploaded a screenshot of a Layout viewport. Look at the number of triangles. That's over 46 million, and it's still running at 15+ frames per second.


How is it with Modeler?


This is where Lightwave makes a snail look like F1 in comparison!

rsfd
05-12-2012, 07:52 AM
@mikkelen
I would completely agree with Rayek about the Quadros: they just don't cut it on OSX.

Since I assume you have bought a PC GTX 580, I would be interested in some info about its performance once you've managed to get it running properly.
The GTX 580 is said to run on Lion without injectors or other hacks, just with the new Nvidia drivers released earlier this year. AFAIK you will not get the OS boot screen, but it would be very interesting to read about the performance!

@Captain Obvious
the difference *is* that big.
It's due to several reasons: poor graphics drivers from Apple, poor OSX drivers from AMD and Nvidia, plus poor implementation in applications like LW and others.
OSX Lion ships with newer drivers, but almost no application actually uses them. They are all tied to the legacy OpenGL 2.1 drivers, which are still part of OSX for compatibility reasons.

With a little luck, we (Apple users) will see improvements with the upcoming Mountain Lion, and perhaps the rumors that Apple will use standard PC graphics cards in the future will be confirmed…

mikkelen
05-12-2012, 01:12 PM
@rsfd
I will get my workstation up and running within a week, including all components, RAM, graphics card etc... if they are not delayed in customs... You are right, it's not a specific Mac version of the GTX 580, but I'll terrorize both Apple and Nvidia customer support until it works properly. The drivers are officially made by Nvidia and Apple has opened up its graphics card policies, so there should be no real issues in theory. I'll post my experiences in this thread.

Rayek
05-12-2012, 01:20 PM
I, for one, would be deeply interested to hear whether that works. A good friend of mine, owner of an 11-month-old Mac Pro workstation, has been in contact with both Nvidia and Apple over the last two months to get clear information regarding video card upgrade options. So far only the Quadro has been suggested, and Apple's support was decidedly less than stellar - he is still confused, and Apple was no help at all. Nor was Nvidia.

I wish you the best of luck: please keep us informed whether or not you get that 580 working! It could make all the difference to my colleague as well (he is considering dropping the Mac once and for all, despite being a fervent Mac user since the very first Apple machines).


@rsfd
I will get my workstation up and running within a week, including all components, RAM, graphics card etc... if they are not delayed in customs... You are right, it's not a specific Mac version of the GTX 580, but I'll terrorize both Apple and Nvidia customer support until it works properly. The drivers are officially made by Nvidia and Apple has opened up its graphics card policies, so there should be no real issues in theory. I'll post my experiences in this thread.

Captain Obvious
05-12-2012, 01:58 PM
How is it with Modeler?
What, people still use that? I haven't used Modeler for anything serious since modo 103. Hang on, I'll check... edit: Modeler is pretty slow. Let's just leave it at that. modo is usable up to about 100 million triangles when working on subdiv meshes on my laptop. Modeler up to about 10 million. Yikes.




@Captain Obvious
the difference *is* that big.
It's due to several reasons: poor graphics drivers from Apple, poor OSX drivers from AMD and Nvidia, plus poor implementation in applications like LW and others.
OSX Lion ships with newer drivers, but almost no application actually uses them. They are all tied to the legacy OpenGL 2.1 drivers, which are still part of OSX for compatibility reasons.

With a little luck, we (Apple users) will see improvements with the upcoming Mountain Lion, and perhaps the rumors that Apple will use standard PC graphics cards in the future will be confirmed…
I don't believe that. modo and Maya are both faster on Windows, but not much faster. I'm not saying you're wrong about Lightwave on the Mac, but I wouldn't be so quick to point the blame. Other software performs fine on OS X. Why doesn't Lightwave?


And besides, it seems just as crazy to recommend a $4000 graphics card, when a $150 operating system would make a bigger performance difference. There's no shame in dual-booting.

mikkelen
05-12-2012, 03:49 PM
On modo and other 3D software... I'm not interested in learning any other package. I'd rather spend my time on other, non-technical things, like writing... If my primary occupation were doing 3D graphics, I would be using ZBrush, modo, Maya, MotionBuilder and other high-end software... But since I had too little to do when I was 14 years old, I know this software, and trust NewTek to develop it further.

Captain Obvious
05-12-2012, 04:12 PM
Well, if what everyone seems to say about Lightwave's performance on the Mac is true then I have to recommend that you invest in Windows, rather than graphics hardware.

If you dual-boot and use Windows for Lightwave and nothing else, you won't really have to put up with many of Windows' eccentricities.

mikkelen
05-12-2012, 07:43 PM
I'm not interested in using an operating system other than OSX. I just invested in a secondhand Mac Pro to avoid switching to Windows - it will also be used for color grading, and I just deleted my Windows partition, which is better used as scratch-disk space. I've been very happy with the performance until this project became complicated. I'm running a first-gen 27" i7 iMac... it's quite expected that it becomes slow at this point. Anyway, there is no reason for LightWave to underperform on OSX; if that is the case, I hope NewTek is working on it...

JonW
05-13-2012, 01:17 AM
What, people still use that? I haven't used Modeler for anything serious since modo 103. Hang on, I'll check... edit: Modeler is pretty slow. Let's just leave it at that. modo is usable up to about 100 million triangles when working on subdiv meshes on my laptop. Modeler up to about 10 million. Yikes

I only have my PCs for Lightwave, so LW is pretty privileged to have them all to itself. Modeler is pretty ordinary, but for my purposes LWCad makes all the difference. But if one needs to, e.g., rotate 100 layers of an architectural building, I can go & make a coffee & bake a cake!

It would be nice if LW used more than one core; it's such a waste having 15 cores sitting on their hands. What does help a bit, but barely, is an SSD.

Rayek
05-13-2012, 02:44 AM
Just found this Mac Quadro 4000 review:
http://arstechnica.com/apple/2011/05/ars-reviews-the-quadro-4000-mac-edition-nvidias-sole-mac-offering-a-promising-start/2/

Outperformed by, or at best equal to, a 5770. Plain sad.

mikkelen
05-13-2012, 07:32 AM
It would be nice if LW used more than one core, such a waste having 15 cores sitting of their hands. What does help a bit, but barely, is an SSD.

Are you kidding me? Is LightWave only using one core!!!???

Captain Obvious
05-13-2012, 11:26 AM
Are you kidding me? Is LightWave only using one core!!!???
Depends on what you're doing. Certain tasks are incredibly difficult to multi-thread, including many modelling tasks, dynamics, et cetera. The Bullet dynamics library apparently runs faster in a single thread than with multiple threads. The overhead of thread-to-thread communication is greater than the performance gain from splitting the work across many cores, I guess. While I'm sure there are plenty of areas in Lightwave that could do with a bit of multi-threading, you shouldn't get your hopes up too high. Most 3D software is mostly single-threaded. It's only really rendering that threads well, and Lightwave's render engine is fully threaded.
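A toy illustration of the overhead point above (Python, and purely my own sketch, nothing to do with LightWave's actual code): when each unit of work is tiny, dispatching it to a thread pool costs more than the work itself, so the threaded version produces the same result while usually taking longer.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def step(x):
    # A trivially cheap "simulation step" - the work per task is tiny.
    return x * x

data = list(range(100_000))

t0 = time.perf_counter()
serial = [step(x) for x in data]
t_serial = time.perf_counter() - t0

t0 = time.perf_counter()
with ThreadPoolExecutor(max_workers=4) as pool:
    # One tiny task per item: maximum coordination overhead.
    threaded = list(pool.map(step, data))
t_threaded = time.perf_counter() - t0

# Same answer either way; the threaded run is typically the slower
# one here, because scheduling 100,000 tiny tasks costs more than
# the arithmetic they contain.
assert serial == threaded
```

Exact timings depend on the machine, but the correctness check always holds: splitting the work changes the cost, not the result.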

Andy Meyer
05-13-2012, 05:36 PM
modeler is 100% outdated, but still useful for my everyday work.
modeler's problem is not only threading. it's the architecture, the data structure and the 20-year-old program code.
disassembled amiga source code combined with 16-bit barrier hacks will never do a decent job on a 16-core 64-bit cpu ;-)
[just kidding, but not so far from the truth i guess]

i work all day with modeler. if you know how to work around modeler's limits it's not bad for many tasks.

LW needs a new, up-to-date method for modeling.
NT: just keep modeler as a legacy poly edit tool and present a modern way to model inside LW12 (layout).

JonW
05-13-2012, 06:52 PM
modeler is 100% outdated, but still useful for my everyday work.
modeler's problem is not only threading. it's the architecture, the data structure and the 20-year-old program code.
disassembled amiga source code combined with 16-bit barrier hacks will never do a decent job on a 16-core 64-bit cpu ;-)
[just kidding, but not so far from the truth i guess]

i work all day with modeler. if you know how to work around modeler's limits it's not bad for many tasks.

LW needs a new, up-to-date method for modeling.
NT: just keep modeler as a legacy poly edit tool and present a modern way to model inside LW12 (layout).

Until Modeler is sped up, if one has a spare computer it would be best to have a fast CPU with fewer cores, & overclocked! Turn off hyperthreading & have an SSD, plus a fast graphics card.

If you use it for rendering with HT off it will be about 25% slower, but this may be a price worth paying if one is getting frustrated with Modeler, & Layout for that matter.


Edit:

I just tried the scene at http://forums.newtek.com/showthread.php?t=105742&page=11 - here are my benchmarks for LW11.0: 1:54 (114) HT on, & 2:25 (145) HT off. (I haven't got around to updating LW11)

So if you have a 60 minute render with HT off, it's only going to take about 47 minutes with HT on. Obviously it will vary a bit from render to render. Having HT off is not the end of the world, & will help in other ways!

LW 10.1: 3dspeed "teapots"

HT on
1:03 (63)

HT off
1:11 (71)

53 minutes v 60 minutes. I'm going to leave HT off unless I have days of renders!
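The estimates above are just a ratio scaling. A quick sketch of the arithmetic (Python; the helper name is mine, and the numbers are the benchmark times quoted in this post):

```python
def scaled_render_time(minutes, bench_a_seconds, bench_b_seconds):
    """Estimate how long a render that takes `minutes` on config B
    would take on config A, assuming render time scales linearly
    with the benchmark result."""
    return minutes * bench_a_seconds / bench_b_seconds

# LW 11.0 scene: 114 s with HT on vs 145 s with HT off
# -> a 60 minute HT-off render is about 47 minutes with HT on
print(round(scaled_render_time(60, 114, 145)))  # 47

# LW 10.1 "teapots": 63 s HT on vs 71 s HT off
print(round(scaled_render_time(60, 63, 71)))    # 53
```

So HT buys roughly 10-20% on these two scenes, which matches the "25% slower with HT off" ballpark only loosely; the gain clearly varies per scene.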

mikkelen
05-13-2012, 09:27 PM
I thought CORE was supposed to be/become a rewrite?

I've not used LW much in the last few years, but I still like it. BUT it's too bad that LightWave always feels/is outdated. As long as I've used LightWave this has been the case; it's never been exactly cutting edge, except maybe for the HyperVoxels feature in LW6?

mikkelen
05-13-2012, 09:33 PM
...to Newtek's credit, LW11 is much faster than LW10! It's moving in the right direction.

Rayek
05-13-2012, 10:28 PM
Yes, I noticed - I installed the trial, and OpenGL in Layout is at least twice as fast compared to 9.6. VPR is also nice.


...to Newtek's credit, LW11 is much faster than LW10! It's moving in the right direction.

Andy Meyer
05-14-2012, 02:08 AM
Until Modeler is sped up, if one has a spare computer it would be best to have a fast CPU with less cores, & overclocked! Turn off hyperthreading & have an SSD, plus a fast graphics card.

If you use it for rendering with HT off it will be about 25% slower, but this maybe a price worth paying if one is getting frustrated with Modeler, & Layout for that matter.


JonW, if you have HT off then you can overclock better, because HT off uses less voltage!

JonW
05-14-2012, 04:48 AM
JonW, if you have HT off then you can overclock better coz HT off uses less voltage!

Unfortunately I can't OC my MB. It's a dual CPU Supermicro server MB (2 x W5580).

rsfd
05-14-2012, 04:00 PM
@mikkelen:
thanks for being willing to post your findings about the GTX 580 later on!
Looking forward to your post(s)!
According to Netkas, the GTX 580 should run on Lion just with Nvidia's new drivers (http://www.nvidia.com/object/macosx-270.00.00f01-driver.html).
Obviously you will not get Apple's grey startup screen, but from the login on everything should work out of the box. I would be interested in info e.g. about the power supply: does the 580 need a second power cord, and how easy is it to get one that is Mac compatible?


@Captain Obvious:
OSX's OpenGL performance is a long-standing annoyance.
And it's no secret that many bigger software developers neglected their Mac support over the last few years. Things seem to have improved since the iOS boom cast a little halo over OSX, but there is still a lot to catch up on in several regards, one being support for more capable graphics cards, which can simply handle more polygons than the small range of officially available Mac graphics cards atm.

On my machine, LightWave performs significantly better under Windows; the difference with modo is smaller.
But as I don't find the dual-boot solution very convenient, and with LW being the only app that forced me to install Windows, I decided to skip the 11 upgrade in favour of modo 601. It was just the more useful update for my needs. I'll stay away from LW for now, keeping an eye on its development, but maybe I will just add Maxwell to modo and use that combination in the future (still/print work).

btw, I never recommended a $4000 graphics card. And I would never recommend a Quadro to any Mac user at this time. But for OSX users who don't want to dual-boot (!), the option to plug a PC graphics card into a Mac Pro is a light on the horizon. A relatively small investment could boost a Mac Pro's performance significantly.

Darth Mole
05-15-2012, 02:17 AM
I am hugely interested in how the GTX card runs on the Mac. I have an ATI HD 5870 which is top-of-the-range for a Mac, but a dinosaur compared to the latest PC cards. If it works for you with no issues (fan noise, performance), I'll go and treat myself!

Darth Mole
05-20-2012, 09:01 AM
Bumping this up!

mikkelen
05-20-2012, 10:17 AM
I've received the GTX 580, but am waiting for the PCI power cables: 6-pin-mini to 6-pin, and 6-pin to 8-pin.

However, the first time I ran Lightwave on my new Mac Pro, it was super smooth. But on the second and subsequent runs, it's back to being extremely slow. I'm trying to find the source of the problem. I'll update to the latest version and see if that helps.

lwanmtr
05-21-2012, 02:48 PM
I would be interested also... I'm wanting to upgrade from my 8800 GT to something more modern... and if there is a way to use a PC card, I'd love to know how.

rsfd
05-22-2012, 04:39 AM
@mikkelen
lately, I've found some info indicating that the GTX 580 does *not* run, at least in a MacPro 3,1 (early 2008), because it needs too much power, making an additional power supply necessary (such as this one (http://www.epowertec.com/power_cd.html)). I don't know if later Mac Pro models have a bigger power supply, though.
Also, it might be that the 580 runs with an additional cable from the mainboard, but gets unstable under demanding tasks.
The 570 seems to work with the default power supply from the mainboard.
A standard problem with Nvidia cards is the too-small EEPROM chip, which makes flashing with a Mac EFI ROM for full support (boot screen, monitor plugs…) impossible. There is one guy near L.A. selling flashed cards with exchanged EEPROM chips, which makes them nearly fully compatible (incl. boot screen, full PCIe 2.0 support…).
This thread (http://forums.macrumors.com/showthread.php?t=1360927) might give some more information about the 570/580 in a Mac Pro. (btw, it's the L.A. guy who started it)
For someone outside the U.S., it's probably more rewarding to wait at least until Mountain Lion is available, as it will have much improved graphics support over Lion, making possibly more possible ;)

mikkelen
08-25-2012, 09:37 AM
Sorry for not being around to give an update on the issue. The problem with layout-speed has something to do with Maxwell Render...

On GTX580: It works great!