PDA

View Full Version : Video Cards and LW 8.5



maxxwv
08-26-2005, 08:18 AM
Ok - I think it was Ben Vost on the Siggie vid's talking about the OpenGL updates coming in 8.5, and he mentioned that it works with certain NVidia cards (I don't remember which one right now) and up - I currently have a GeForce4 Ti4600 and am happy with it until I decide to upgrade my entire system (not for a while yet). Anybody happen to know if this is in the range of cards that'll allow me to reap the OpenGL benifits?

Sorry if this seems like a dumb question - I hit the NVidia web site, but couldn't find a "timeline" comparison about what cards are better than others, etc. In fact, I didn't even see the GeForce4 Ti4600 in the Desktop, Workstation, Video/Audio, or Legacy sections. GeForce2, 3, MX, and 6 are there, but no 4 that I could find.

Anyway, just checking, and thanks in advance for advice/comments

Karmacop
08-26-2005, 09:16 AM
I think one of the videos states that it'll work on an nVidia 5200 and greater, or an ATI 9700 and greater. So basically any card with pixel shaders, I think. So no, your 4600 won't support it :(

PS I need a new card too :help: :p

Lewis
08-26-2005, 09:28 AM
Ben mentioned that a GeForce FX 5200 will be enough (the faster the better), so your older GF4 4600 might be a little weak, 'coz AFAIK it supports no more than OpenGL 1.3 (1.4 max) and the new LW utilizes 2.0.

But let's wait and see how it works :).

Gaze
08-26-2005, 10:02 AM
:thumbsup: Cool

I needed an AGP card (in a pinch) a few days ago, and without much time to research, picked up a card on sale at Fry's for $59. Just checked and turns out it features: 'GeForce FX5200'

may just work out okay :)

coremi
08-26-2005, 10:07 AM
I think the GF 4600 Ti will work just fine; it's way faster at OpenGL than the FX 5200. And yours does do pixel shaders and vertex shaders, but that's a DirectX feature, so it really doesn't matter. Stay put until LightWave 9 and then see what to buy; I don't think this is the moment for a new card for you.

maxxwv
08-26-2005, 10:36 AM
Thanks for the replies guys and gals (?). Looks like it's sit tight and wait time, then a quick trip to Best Buy if it doesn't work... :D

Scott_Blinn
08-26-2005, 03:21 PM
Any video card that has full driver support for OpenGL 2.0 should work, and of course you get what you pay for (the faster the card, the faster the LW display).

--SB

BeeVee
08-26-2005, 04:41 PM
Although the 4600 is a better model of the GeForce 4 series than the FX5200 is of the FX5 series, it won't display all the things that LightWave will be able to show in OpenGL, since its OpenGL performance isn't "high-level" enough. I have a Quadro 4 580XGL, which is based on the Quadro 4 chipset (the pro version of the 4xxx series of GeForce cards), and although OGL performance is good, I don't get procedural textures, etc.

B

cresshead
08-26-2005, 06:30 PM
ooh, looks like ben's running '9'! :D

Karmacop
08-26-2005, 09:28 PM
I think the GF 4600 Ti will work just fine; it's way faster at OpenGL than the FX 5200. And yours does do pixel shaders and vertex shaders, but that's a DirectX feature, so it really doesn't matter.

I should have explained my pixel shader comment before. I meant programmable pixel shaders, which the 4600 can't do. But coremi is right, the 4600 is faster than the 5200 (from my memory), but the 5200 has more features.

BeeVee
08-29-2005, 02:47 AM
ooh, looks like ben's running '9'! :D

Well, duh, how do you think I was demoing it at Siggraph! :D

Just kidding,

B

Earl
08-30-2005, 03:06 PM
So I'm guessing a Quadro4 700 GoGL card probably won't support the new display goodies in 8.5 and beyond? If only it were possible to replace the vid card in a Dell laptop... :cursin:

Scott_Blinn
08-30-2005, 04:57 PM
So I'm guessing a Quadro4 700 GoGL card probably won't support the new display goodies in 8.5 and beyond? If only it were possible to replace the vid card in a Dell laptop... :cursin:

In some Dell laptops you can, you may want to look into it.

Earl
08-30-2005, 05:39 PM
Really? Perhaps I'll give them a call tomorrow and see if mine is one of them. That would be nice (though I'm still betting they'll say no - my good luck at play).

Earl
08-31-2005, 04:00 PM
Anyone know what this means? Does it mean that a Quadro4 700 Go GL could in fact support OpenGL 2.0 fully with a driver update? Or is it total nonsense? :stumped: This was taken from a specsheet/PDF file from NVIDIA's site, located here:
http://www.nvidia.com/object/LO_20030203_7731.html (Quadro4 700 Go GL Product Overview)

http://www.agoracle.com/files/quadro4-700GoGL-OpenGL20.jpg

Karmacop
08-31-2005, 10:03 PM
I can't be 100% sure, but it looks as if it would with the right drivers.

maxxwv
09-01-2005, 06:57 AM
So here's an even dumber question about graphics cards: is OpenGL compliance dictated by the card or by the motherboard? My system's getting on in years, but I'm not quite able to replace it right now. So, if I upgrade to an OpenGL 2.0-compliant card, will it run fine on the current system, or would I simply be wasting cash that would be better saved toward upgrading the entire system? I think my mobo is AGP 2x- or 4x-compliant, if that makes a difference. Honestly, though, I'm not sure, and I can't remember for the life of me what mobo I've got in this sucker right now... :foreheads

Earl
09-01-2005, 08:38 AM
OpenGL 2.0 is determined by your video card. The motherboard won't play a role in whether or not it will work. If the video card supports it, you're good to go. Unless of course your motherboard simply doesn't work with the specific video card. Most modern AGP video cards will work with AGP x2 or x4, etc. However I would check with your motherboard manufacturer (or a local specialist) first before buying the new graphics card.
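One quick way to see what a card-plus-driver combination actually offers is to look at the version string OpenGL reports via glGetString(GL_VERSION) and compare its leading major.minor number against 2.0. A rough parsing sketch in Python (the sample strings below are illustrative, not readings from any real card):

```python
import re

def gl_version_at_least(version_string, major, minor):
    """Parse the leading 'major.minor' of an OpenGL version string
    (as returned by glGetString(GL_VERSION)) and compare it against
    a required version."""
    match = re.match(r"(\d+)\.(\d+)", version_string)
    if not match:
        raise ValueError("unrecognized GL version string: %r" % version_string)
    found = (int(match.group(1)), int(match.group(2)))
    return found >= (major, minor)

# Example strings in the usual "<GL version> <driver info>" format:
print(gl_version_at_least("1.4.0 NVIDIA 44.03", 2, 0))  # -> False (old driver)
print(gl_version_at_least("2.0.1 NVIDIA 78.01", 2, 0))  # -> True (newer driver)
```

Tuple comparison makes "2.1" correctly count as at least 2.0, which a plain string compare would get wrong.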

maxxwv
09-01-2005, 09:05 AM
Excellent! Thanks Earl - 'preciate the info.

Earl
09-01-2005, 03:28 PM
Anyone know what this means? Does it mean that a Quadro4 700 Go GL could in fact support OpenGL 2.0 fully with a driver update? Or is it total nonsense? :stumped: This was taken from a specsheet/PDF file from NVIDIA's site, located here:
http://www.nvidia.com/object/LO_20030203_7731.html (Quadro4 700 Go GL Product Overview)

To answer my own question: no, I don't think it will support the new lighting and effects in 8.5 and 9.0. The card does support OpenGL 2.0, but then so do pretty much all of NVIDIA's cards that have updated drivers, even back to the GeForce4s (this is based on the unified driver specs claiming OpenGL 2.0 support). However, supporting OpenGL 2.0 isn't enough; I think it needs programmable vertex and pixel shading in order to take full advantage of NewTek's enhancements. Only time will tell, but no use getting my hopes up. I'm just gonna have to wait until I can get a new laptop at work.
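One way to check that on an actual card, rather than trusting the box, is to look for the ARB programmable-shader extensions (GL_ARB_vertex_shader, GL_ARB_fragment_shader and GL_ARB_shading_language_100 are the real extension names) in what glGetString(GL_EXTENSIONS) reports. A rough string-checking sketch in Python; the example extension lists are made up for illustration:

```python
# The ARB extensions that together indicate a programmable
# vertex/pixel (fragment) pipeline with GLSL support.
SHADER_EXTENSIONS = {
    "GL_ARB_vertex_shader",
    "GL_ARB_fragment_shader",
    "GL_ARB_shading_language_100",
}

def has_programmable_shaders(extension_string):
    """Check a space-separated extension string (as returned by
    glGetString(GL_EXTENSIONS)) for the ARB programmable-shader trio."""
    available = set(extension_string.split())
    return SHADER_EXTENSIONS <= available  # subset test

# A card could report a GL 2.0 version string yet still lack these,
# which is exactly the "2.0 support isn't enough" situation above.
print(has_programmable_shaders(
    "GL_ARB_multitexture GL_NV_register_combiners"))  # -> False
```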

Wickster
09-07-2005, 07:55 PM
Quick question:
Is OpenGL 2.0 hardware-dependent or driver-dependent? Because I remember NT saying at SIGGRAPH that at minimum I need a Radeon 9x00 or a GeForce 5200 to take advantage of 8.5's OpenGL shading...

Now I'm looking for a good graphics card, and I found one: a GeForce 6600. But on the box it says it supports OpenGL 1.5 and below. Can I fix this with a recent nVidia driver?

Thanks in advance.

connerh
09-07-2005, 08:42 PM
6600's will support OGL2.0 with updated drivers. I'm not sure how far down the line the OGL2.0 support goes, but I'm assuming it's down to the 5200, as that is the minimum requirement.

Wickster
09-07-2005, 09:05 PM
6600's will support OGL2.0 with updated drivers. I'm not sure how far down the line the OGL2.0 support goes, but I'm assuming it's down to the 5200, as that is the minimum requirement.
Sweet! Thanks. Now I'm off, with the credit card on hand!

Fausto
09-07-2005, 11:44 PM
I just upgraded my personal system, and one of the things I added to it was a PCIe-based nVidia 7800 GTX. Man, it's incredible to watch realtime particle-based smoke and fire. So far so good!

Scott_Blinn
09-08-2005, 01:24 AM
I just upgraded my personal system, and one of the things I added to it was a PCIe-based nVidia 7800 GTX. Man, it's incredible to watch realtime particle-based smoke and fire. So far so good!

Hehe, congrats. I have one of those cards at work and it is an awesome card. :thumbsup:

Wickster
09-16-2005, 12:56 AM
I found this question and answer on a download page, when looking for some info on the latest driver for my graphics card (nVidia 6200). Hope it helps.


Q. I thought it enabled Opengl 2.0. It doesn't though.
A. OpenGL Shading Language 2.0 can only be enabled on 55.xx and higher Forceware drivers. It had not yet been implemented in older Detonator or pre 55 Forceware drivers.


Here is the site:
http://downloads.guru3d.com/download.php?det=815

FYI: Forceware is the name of the nVidia display drivers, which I think are now at version 78.xx.

Hope it sheds some light on the topic.

mattclary
09-16-2005, 04:33 AM
This may clear things up a little. This is from a PDF document from nVidia; someone linked to it in the OpenGL shaders thread. Basically, the 5xxx cards support OpenGL 2.0 via software, while the 6xxx and above do it in hardware. So I think it would definitely behoove one to own a 6xxx or better card when 9 becomes available. You might want to wait for reports from other users before dropping a buttload of cash on one of the uber-high-end cards, though.


A. Distinguishing NV3xGL-based and NV4xGL-based
Quadro FX GPUs by Product Names
As discussed in section 2, while NV3x- and NV3xGL-based GPUs support OpenGL 2.0,
the NV4x- and NV4xGL-based GPUs have the best industry-wide hardware-acceleration
and support for OpenGL 2.0.
For the consumer GeForce product lines, GeForce FX and GeForce 6 Series GPUs are
easily distinguished based on their product names and numbering. Any NVIDIA GPU
product beginning with GeForce FX is NV3x-based. Such GPUs also typically have a
5000-based product number, such as 5200 or 5950. GeForce GPUs with a 6000-based
product name, such as 6600 or 6800, are NV4x-based.
However, the Quadro FX product name applies to both NV3xGL-based and NV4xGL-based
GPUs and there is no simple rule to differentiate NV3xGL-based and NV4xGL-based GPUs
using the product name. The lists below will help OpenGL 2.0 developers
correctly distinguish the two NV3xGL- and NV4xGL-based Quadro FX product lines.
A.1. NV3xGL-based Quadro FX GPUs
Quadro FX 330 (PCI Express)
Quadro FX 500 (AGP)
Quadro FX 600 (PCI)
Quadro FX 700 (AGP)
Quadro FX 1000 (AGP)
Quadro FX 1100 (AGP)
Quadro FX 1300 (PCI)
Quadro FX 2000 (AGP)
Quadro FX 3000 (AGP)
A.2. NV4xGL-based Quadro FX GPUs
Quadro FX 540 (PCI Express)
Quadro FX 1400 (PCI Express)
Quadro FX Go1400 (PCI Express)
Quadro FX 3400 (PCI Express)
Quadro FX 4000 (AGP)
Quadro FX 4400 (PCI Express)
Quadro FX 3450 (PCI Express)
Quadro FX 4450 (PCI Express)
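The two lists above boil down to a simple lookup. A quick sketch that encodes them in Python (model names copied verbatim from the lists; the function name is just for illustration):

```python
# Quadro FX models by chip family, per NVIDIA's OpenGL 2.0 support appendix.
NV3X_MODELS = {"Quadro FX 330", "Quadro FX 500", "Quadro FX 600",
               "Quadro FX 700", "Quadro FX 1000", "Quadro FX 1100",
               "Quadro FX 1300", "Quadro FX 2000", "Quadro FX 3000"}
NV4X_MODELS = {"Quadro FX 540", "Quadro FX 1400", "Quadro FX Go1400",
               "Quadro FX 3400", "Quadro FX 4000", "Quadro FX 4400",
               "Quadro FX 3450", "Quadro FX 4450"}

def quadro_generation(model):
    """Return which chip family a Quadro FX model belongs to."""
    if model in NV3X_MODELS:
        return "NV3xGL (OpenGL 2.0 partly in software)"
    if model in NV4X_MODELS:
        return "NV4xGL (hardware-accelerated OpenGL 2.0)"
    return "unknown"

print(quadro_generation("Quadro FX 1400"))  # NV4xGL, despite the low number
print(quadro_generation("Quadro FX 2000"))  # NV3xGL, despite the high number
```

Note the product numbers really don't sort by generation: the FX 1400 is the newer NV4xGL chip while the FX 2000 is the older NV3xGL one.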

Earl
09-16-2005, 08:27 AM
Hmmm, that does help. Thanks for the info. I guess it wouldn't make sense for NVIDIA to be upfront about it on the driver information page, but that seems pretty clear.

zapper1998
09-18-2005, 02:01 PM
Ok, I read all the posts and there is some great info here. Thanks, all.

But I have a question:

What about those dual-card setups with the bridge connecting them, the "SLI" technology I read about? I was thinking of trying two of the high-end nVidia cards with the SLI bridge connection... two cards working as one.

Is AGP better or PCI? I have always run AGP.

Are PCI cards better?

KorbenD
09-18-2005, 05:26 PM
Ok, I read all the posts and there is some great info here. Thanks, all.

But I have a question:

What about those dual-card setups with the bridge connecting them, the "SLI" technology I read about? I was thinking of trying two of the high-end nVidia cards with the SLI bridge connection... two cards working as one.

Is AGP better or PCI? I have always run AGP.

Are PCI cards better?

For SLI, you'd have to have a motherboard that has PCI-Express slots. Regular PCI won't do it. AGP won't do it. In fact, the motherboard would have to have *two* 16X PCI-e slots.

PCI-e is better than AGP, and AGP is better (for video) than PCI. Whether any video cards available now will take full advantage of PCI-e is debatable.
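For rough context, the standard peak bandwidth figures for these buses: plain PCI is 133 MB/s (shared among all PCI devices), AGP 8x is about 2.1 GB/s, and a 16-lane PCI Express 1.0 slot is about 4 GB/s in each direction. A back-of-the-envelope sketch:

```python
# Peak theoretical bus bandwidth in MB/s (standard spec figures).
PCI_MBPS = 133        # 33 MHz x 32-bit shared bus
AGP_1X_MBPS = 266     # AGP base rate; 2x/4x/8x scale linearly
PCIE_LANE_MBPS = 250  # PCI Express 1.0, per lane, per direction

def agp_bandwidth(multiplier):
    return AGP_1X_MBPS * multiplier

def pcie_bandwidth(lanes):
    return PCIE_LANE_MBPS * lanes

print(agp_bandwidth(8))    # -> 2128 MB/s
print(pcie_bandwidth(16))  # -> 4000 MB/s, each direction
```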

mattclary
09-19-2005, 06:23 AM
In a nutshell, SLI won't work with LightWave. Search the forums to find out why.

Even if LW worked with SLI, I don't think it would be worth it. Even when 9 comes out and LW takes full advantage of OpenGL, do you really need 100+ frames per second in LW? One nVidia 7800 card should sufficiently kick *** for our purposes.

Now if you plan to game, by all means, I encourage you to do it. It kicks @ss in games.

Karmacop
09-19-2005, 08:38 AM
I bet Lightwave would slow down if you used a lot of procedurals and lights ;)