
Thread: Lightwave graphic card performance

  1. #1
    Registered User
    Join Date
    May 2003
    Location
    Poland
    Posts
    299

    Lightwave graphic card performance

    I just found a test of a few pro GFX cards in LightWave: http://www.tomshardware.com/reviews/...d,3493-12.html
    Take a look at this table:
    [Attached image: 03-OpenGL-SPECViewperf11-03-Lightwave-01.png]
    It is clear that as long as we use OpenGL in LW3d we will be under the dictate of the two devils. Both AMD and Nvidia clearly block performance
    at the driver level.
    Notice that the fastest "gaming" card is a few times faster than any "pro" card in games or D3D apps, but then magic happens and
    in an OpenGL app even the slowest "pro" card is faster.
    Also notice that there is no real performance difference between otherwise very different cards (Quadro 4000, 5000, 6000), but there are
    differences between card classes... probably written into the drivers.

    It is real theft! Products are artificially limited, and prices grow every year in this duopoly.

    What do you think? Should we try to fight this? Maybe just switch to DirectX in future LightWave versions?

  2. #2
    Mike, in Monsters Inc Tartiflette's Avatar
    Join Date
    Feb 2003
    Location
    Montpellier, South France
    Posts
    596
    DirectX isn't an option for Mac users (I know, I speak mostly for myself as a Mac user, but still...), where OpenGL is the only way to display 3D content.

    As for this situation, it's been like this for years now, ever since nVidia (and AMD by extension...) discovered that some talented hackers could get "gaming" cards recognized as "pro" cards by "just" rewriting some parts of the drivers.
    Nowadays they have introduced limitations at the core level to make sure this kind of thing can't happen anymore, and you're right, they are more than abusive with the prices they charge for their "pro" cards, but I don't think we can do a lot about it, I'm afraid.
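
    For what it's worth, the "recognition" is visible from the application side too: the driver advertises vendor and renderer strings through OpenGL, and that is roughly what an application sees. Here is a minimal sketch (assuming GLFW 3 is available just to create a context; the glGetString() calls themselves are plain OpenGL) that prints what your driver reports:

    /* Sketch: print the identity strings the OpenGL driver reports.
     * Assumes GLFW 3 is installed; glGetString() itself is core OpenGL. */
    #include <stdio.h>
    #include <GLFW/glfw3.h>

    int main(void)
    {
        if (!glfwInit())
            return 1;

        /* A hidden 1x1 window is enough to obtain a GL context. */
        glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);
        GLFWwindow *win = glfwCreateWindow(1, 1, "gl-info", NULL, NULL);
        if (!win) { glfwTerminate(); return 1; }
        glfwMakeContextCurrent(win);

        printf("GL_VENDOR   : %s\n", (const char *)glGetString(GL_VENDOR));
        printf("GL_RENDERER : %s\n", (const char *)glGetString(GL_RENDERER));
        printf("GL_VERSION  : %s\n", (const char *)glGetString(GL_VERSION));

        glfwDestroyWindow(win);
        glfwTerminate();
        return 0;
    }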

    The "good" news being that LightWave is more limited by the way it interacts with polygons than by its OpenGL speed anyway, so you wouldn't notice that much of a difference on heavy scenes/objects...


    Cheers,
    Laurent aka Tartiflette.

  3. #3
    Registered User
    Join Date
    May 2003
    Location
    Poland
    Posts
    299
    Quote Originally Posted by Tartiflette View Post

    The "good" news being that LightWave is more limited by the way it interacts with polygons than by its OpenGL speed anyway, so you wouldn't notice that much of a difference on heavy scenes/objects...
    I know that LightWave has some limitations in OpenGL, but still, take a look at the graphs I attached. It is clearly visible that heavy "software choking" is taking place here, and no amount of work on the NewTek side
    will change anything.

    It seems we need to find some kind of workaround, because there will never be a gain in performance without permission from AMD or NVIDIA.
    The tests clearly show that performance is capped at the level of the slowest card in the series, so there is no reason to buy anything other than that.

    This will end like the monopoly in software... sooner or later Nvidia and AMD will ask us to buy monthly licenses and cloud subscriptions like Adobe and Autodesk.
    I'm angry. I wanna lynch some CEOs. Off with their heads kind of mood.

  4. #4
    Professional amateur Amurrell's Avatar
    Join Date
    Dec 2005
    Location
    Lincoln, NE
    Posts
    1,350
    You should look at the other benchmarks in the article. OpenGL performance sucks with gaming cards, but simulation, CUDA, and even some of the CAD performance is much better with a gaming card. I guess the question really is: what are you looking for performance-wise, and where do you want it? Render-wise it looks like gaming cards win; for OpenGL, professional cards. Now all we need is something like the semi-professional line the camera industry has, translated over to graphics cards.

    Of course I would love to see a DirectX toggle in LW like it has for OpenGL and Legacy modes, but it would probably be too difficult to implement.
    --Andrew
    Bono Animo Esto


    Intel Core i5 3570k
    nVidia GTX 970
    8GB DDR3 1333MHz
    Windows 10 Pro 64bit

  5. #5
    Registered User
    Join Date
    May 2003
    Location
    Poland
    Posts
    299
    http://www.tomshardware.com/reviews/...rd,3493-6.html - Here, the same professional app, and performance is as it should be. So... I need straight answers... who is to blame? NewTek, or the AMD/NVIDIA tandem?
    If AMD/NVIDIA is to blame... I want a DirectX port in LW3d!!!

  6. #6
    Grumpy Faux-Waver DogBoy's Avatar
    Join Date
    Feb 2003
    Location
    Central London
    Posts
    1,549
    Quote Originally Posted by fazi69 View Post
    http://www.tomshardware.com/reviews/...rd,3493-6.html - Here, the same professional app, and performance is as it should be. So... I need straight answers... who is to blame? NewTek, or the AMD/NVIDIA tandem?
    If AMD/NVIDIA is to blame... I want a DirectX port in LW3d!!!
    I very much doubt LightWave will go DirectX, because it would mean feature disparity between Mac and PC. Before we all jump in and say we don't care about Mac performance, remember there are a lot of Mac users of LightWave, and going the DirectX route would give LW3DG headaches going forward, as their code forks more and more between platforms.

    DirectX would be nice, but it isn't really an option I think.
    fauxWaver \foʊ-wāv-er\, n. one who likes LightWave, but shuns car-analogies.
