
View Full Version : Lightwave graphic card performance



fazi69
07-12-2013, 03:03 PM
I just found a test of a few pro GFX cards in LightWave: http://www.tomshardware.com/reviews/best-workstation-graphics-card,3493-12.html
Take a look at the attached table (forum attachment 115578).
It is clear that as long as we use OpenGL in LW3D we will be under the dictate of the two devils. Both AMD and Nvidia clearly block performance
at the driver level.
Notice that the fastest "gaming" card is several times faster than any "pro" card in any game or D3D app, but then magic happens and
in an OpenGL app even the slowest "pro" card is faster.
Also notice that there is no real performance difference between otherwise very different cards (Quadro 4000, 5000 and 6000), but there are
differences between card classes... most likely written into the drivers.

It is real theft! Products are artificially limited and prices grow every year in this duopoly.

What do you think? Should we try to fight this? Maybe just switch to DirectX in future LightWave versions?

Tartiflette
07-12-2013, 03:24 PM
DirectX isn't an option for Mac users (I know, I speak mostly for myself as a Mac user, but still...), where OpenGL is the only way to display 3D content.

As for this situation, it's been like this for years now, ever since nVidia (and AMD by extension...) discovered that some talented hackers could get "gaming" cards to be recognized as "pro" cards by "just" rewriting some parts of the drivers.
Nowadays they have introduced limitations at the core level to ensure this kind of situation won't happen anymore, and you're right, they are more than abusive regarding the prices they charge for their "pro" cards, but I don't think we can do a lot about it, I'm afraid. :(
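(For the curious: this is roughly how any OpenGL application, LightWave included, finds out which card and driver it is actually talking to. Just a minimal sketch using GLFW to get a context, nothing LightWave-specific; the vendor/renderer strings it prints are where the "gaming" vs "pro" branding becomes visible to software.)

#include <stdio.h>
#include <GLFW/glfw3.h>   /* pulls in the standard OpenGL headers */

int main(void)
{
    if (!glfwInit())
        return 1;

    /* Create a small hidden window just to obtain an OpenGL context. */
    glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);
    GLFWwindow *win = glfwCreateWindow(64, 64, "gl-info", NULL, NULL);
    if (!win) {
        glfwTerminate();
        return 1;
    }
    glfwMakeContextCurrent(win);

    /* These are the strings the driver reports to every OpenGL app. */
    printf("GL_VENDOR   : %s\n", (const char *)glGetString(GL_VENDOR));
    printf("GL_RENDERER : %s\n", (const char *)glGetString(GL_RENDERER));
    printf("GL_VERSION  : %s\n", (const char *)glGetString(GL_VERSION));

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}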

The "good" news being that LightWave is more limited by the way it interacts with polygons than by its OpenGL speed anyway, so you wouldn't notice that much of a difference on heavy scenes/objects... :D


Cheers,
Laurent aka Tartiflette. :)

fazi69
07-12-2013, 03:45 PM
The "good" news being that LightWave is more limited by the way it interacts with polygons than by its OpenGL speed anyway, so you wouldn't notice that much of a difference on heavy scenes/objects... :D



I know that LightWave has some limitations in OpenGL, but still, take a look at the graphs I attached. It is clearly visible that heavy "software choking" is taking place here, and no amount of work on NewTek's side
will change anything.

It seems we need to find some kind of workaround, because there will never be a gain in performance without permission from AMD or NVIDIA.
The tests clearly show that performance is capped at the level of the slowest card in the series, so there is no reason to buy anything more than that.

This will end up like the monopoly in software... sooner or later Nvidia and AMD will ask us to buy monthly licenses and cloud subscriptions like Adobe and Autodesk.
I'm angry. I want to lynch some CEOs. Off-with-their-heads kind of mood.

Amurrell
07-13-2013, 03:02 AM
You should look at the other benchmarks in the article. OpenGL performance sucks with gaming cards, but simulations, CUDA performance, and even some of the CAD performance are much better with a gaming card. I guess the question really is: what are you looking for performance-wise, and where do you want it? For rendering it looks like gaming cards win, for OpenGL the professional cards; now all we need is something like what the camera industry has with its semi-professional line, translated over to graphics cards.

Of course I would love to see a DirectX toggle in LW like it has OpenGL and Legacy modes, but it would probably be too difficult to implement.

fazi69
07-13-2013, 07:39 AM
http://www.tomshardware.com/reviews/best-workstation-graphics-card,3493-6.html - Here, same professional apps, and performance is like it should be. So... I need straight answers... who is to blame? NewTek, or the AMD/Nvidia tandem?
If AMD/Nvidia is to blame... I want a DirectX port in LW3D!!!

DogBoy
07-13-2013, 12:04 PM
http://www.tomshardware.com/reviews/best-workstation-graphics-card,3493-6.html - Here, same professional apps, and performance is like it should be. So... I need straight answers... who is to blame? NewTek, or the AMD/Nvidia tandem?
If AMD/Nvidia is to blame... I want a DirectX port in LW3D!!!

I very much doubt LightWave will go DirectX, because it would mean feature disparity between Mac and PC. Before we all jump in and say we don't care about Mac performance, remember there are a lot of Mac users of LightWave, and going the DirectX route would give LW3DG headaches with their code going forward, as it forks more and more between platforms.

DirectX would be nice, but it isn't really an option, I think.