View Full Version : Does Lightwave take advantage of hardware OpenGL?

10-11-2004, 09:47 AM
I'm looking to buy a new graphics card to replace my Radeon 9200, which is giving me a lot of problems now. I'm comparing two: the nVidia FX 5900 and the Quadro FX 1100. The Quadro has hardware OpenGL, but I'm not sure whether LightWave was written to take any benefit from hardware OpenGL or not.
So, could anyone out there please give me a technical answer?

Thanks :confused:

10-11-2004, 10:21 AM
Nope, it doesn't. Just get a good consumer level card, you will spend less and be just as happy.
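(Side note for anyone who wants to check for themselves whether a given card/driver combo is actually giving them hardware acceleration: the `GL_RENDERER` string of an active OpenGL context usually tells you. A minimal sketch in Python — the `is_software_renderer` helper and the marker strings are my own illustration, not anything from LightWave or nVidia:)

```python
# Hypothetical helper: classify an OpenGL GL_RENDERER string as software
# or hardware rendering. Known software rasterizers identify themselves
# with strings like "GDI Generic" (the Windows software fallback).
SOFTWARE_MARKERS = ("gdi generic", "software", "llvmpipe")

def is_software_renderer(renderer: str) -> bool:
    """Return True if the renderer string looks like a software rasterizer."""
    r = renderer.lower()
    return any(marker in r for marker in SOFTWARE_MARKERS)

# In a real program you would obtain the string from an active context,
# e.g. renderer = glGetString(GL_RENDERER) via an OpenGL binding.
print(is_software_renderer("GDI Generic"))               # True
print(is_software_renderer("GeForce 6800 GT/AGP/SSE2"))  # False
```

If you see the software fallback string, your driver install is broken and no amount of card shopping will help.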

10-11-2004, 10:34 AM
Don't buy a high-end card if LightWave is your only intensive OpenGL application! ;)

10-11-2004, 12:28 PM
Thanks guys,

10-19-2004, 07:36 PM
A PNY nVidia GeForce 6800 GT

10-20-2004, 05:53 AM
Just get a good consumer level card...

That pretty much meets the criteria of "good consumer level card".

10-22-2004, 10:29 AM
Matt, oh great sage of the newsgroup, you seem to be about the only regular contributor who understands LW's interaction with graphics cards.
What would you suggest as the biggest 'bang for the buck' (better than 2560x1024, 'full colour' & millions of polys) for dual 21" CRTs running LW, VT3 & Photoshop?

10-22-2004, 10:46 AM
LOL!!! You give me too much credit! :o

Expanding on what I said above, just get the most expensive (consumer-level) nVidia card you can afford. If you want to run high-res on two monitors, you will definitely want a card that has 256MB of RAM.
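(A back-of-the-envelope check on why the RAM matters — these figures are my own assumptions for illustration, not from anyone in this thread: 32-bit colour, double-buffered, plus a combined depth/stencil buffer. The framebuffers themselves turn out to be small; it's textures and geometry that eat the rest of the card.)

```python
# Rough VRAM estimate for a dual-head 2560x1024 desktop.
width, height = 2560, 1024     # total desktop across two 1280x1024 monitors
bytes_per_pixel = 4            # 32-bit colour

one_buffer = width * height * bytes_per_pixel
total = one_buffer * 3         # front + back + depth/stencil buffers

print(one_buffer // 2**20)     # 10  (MiB per buffer)
print(total // 2**20)          # 30  (MiB for all three)
# That leaves roughly 226MB of a 256MB card for OpenGL textures and
# vertex data, which is where heavy LW scenes actually spend memory.
```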

10-22-2004, 01:15 PM
I just created a thread under Feature Requests about OpenGL performance, please go add your two cents to let Newtek know how you feel.

11-02-2004, 07:52 AM
Hi all,

I'm with mattclary on this one: you definitely need 256MB of RAM. I actually have a 6800 GT 256MB and it's really good for LW, and with nVidia's drivers going from strength to strength, nVidia can only get better.