View Full Version : New GFX card recommendations/dissuasions pls

03-28-2003, 02:08 AM
I'm getting annoyed with the old 'not enough desktop real estate' problem everybody encounters sooner or later, and have decided to go multimon!
My box is quite new, Dual Athlon MP 2000+, 1 GB ddr 2700, and currently running a GF3 ti200.
I've been looking around for the best (and fastest) multiview solutions and have narrowed it down to 3 cards I know of (if there are other choices, pls tell me):
Geforce 4 Ti4800 with nView.
Radeon 9700 Pro Dualhead.
Matrox Parhelia 512 128MB tripleview.

I have to say the Parhelia blew me away the first time I saw one in operation, and I am definitely leaning towards it even with the steepish pricetag.
My PC isn't just for work/hobbies, I also play a lot of games on it (Half-Life, Q3A, UT2K3 etc.), so OpenGL and D3D performance is an area I don't want to skimp on.
In reviews I've seen the Parhelia perform around the Radeon 8500 mark, so it's good, but when I compare it to the GF4 and new Radeon I feel less thrilled.

I'd love to hear from anyone running a Parhelia about how it is, etc. (especially surround mode in games :) ).


03-28-2003, 03:04 AM
If multi-monitor performance is what you need then I'd recommend nVidia cards - that GF4 4800 is a good card, but my personal favourite is the Albatron GF4 Ti 4200 Turbo - a 4200 GPU on a 4600 board - fully clockable to 4600/4800 speeds, well built and nice 'n cheap...

ATI's cards are better performers but lots of people seem to have problems using LightWave on dual monitor setups with ATI boards.

I'd steer well clear of the Parhelia - tripleview is v.cool but I'd be surprised if LightWave works properly on it - Matrox's OpenGL is supposed to be pretty poor and you'll certainly get worse performance than the nVidia/ATI options.

03-28-2003, 05:01 AM
Hm, since I had a G200 I guess I won't ever trust Matrox's OpenGL drivers again :D
(well, the first year there weren't any at all!)

But on the other side, their multi-monitor software is said to be the best out there... and nView... well... not really that great. I recently tried to set up a GF4 Ti with two desktops, one on a projector and one on a monitor, but that stupid thing would always span the taskbar, show popups half on one and half on the other, etc...
The function to use it as two separate video cards was not available in the drivers, and apparently you had to edit the registry to change that *pissed off*

With ATI I read a few times that OpenGL won't work on the second monitor at all, dunno if they fixed that.

03-29-2003, 03:40 AM
Originally posted by Lynx3d
With ATI I read a few times that OpenGL won't work on the second monitor at all, dunno if they fixed that.

Not really.
You can't run Layout on the first and Modeler on the secondary monitor, but you can run the application on one monitor and keep all the windows (Numeric, Surface, Graph Editor...) on the secondary monitor.
That's not exactly what I hoped for when I got my 9700, but it's better than having it all on one monitor.

The display quality of the ATI 9700 is very good too.

If you play games and work with LightWave, consider the 9700 as an option.

03-29-2003, 09:13 AM
Just wondered if anyone has tried SoftFireGL... will that make OpenGL work on both monitors? (If it works at all.)

I wish LW had support for special drivers like 3ds max does - that would smoke! A faster ("professional") video card would finally make the viewports faster.

03-29-2003, 11:02 AM
SoftFireGL (9700 Pro to FireGL X1) doesn't change the OpenGL performance in LW at all, and according to some other guys from another forum (guru3d), I get some glitches that I don't have with the original Radeon driver: the selected grid ??, being unable to see point selections in some viewports, Z-buffer errors in Layout, and a few more... whatever driver I use!

Would be interesting to have some feedback from a real FireGL X1 owner!

05-22-2003, 08:59 AM
...well, you can use nView to send one program to the other (second) monitor... don't know if it's possible to do that automatically, like with a batch file or something...

As long as your second card/output is capable of OpenGL, there should be no problem doing that...

And you can also tell the driver to either "confine" the windows to one monitor if they overlap, so they jump to the side where the biggest part of the window is... or tell the driver to allow overlapping...

I don't know the English drivers, so unfortunately I can't tell you what this wizard is called in English...

I use the nVidia Detonator FX (44.03), so maybe it doesn't work with earlier versions... but 44.03 works fine in all modes...

05-22-2003, 08:10 PM
I've got a GeForce4 Ti 4200 with a CRT on the analog port and an LCD on the digital port. The good news is that it basically works. Both screens are fully functional. I can put Layout on one screen and Modeler on the other. I'm taking Larry Shultz's Intro to LightWave course, so I'm usually watching a QuickTime video on the CRT while playing with Layout or Modeler on the LCD.

The bad news is that there are quirks. Layout and Modeler always start on the primary screen. Pop-up menus sometimes look weird if they're on the secondary monitor. For example, the menu will extend past the screen boundaries, but won't scroll. I have experienced several system crashes since adding the LCD.

The most annoying thing, at least under Windows 98, is that it's difficult to set the primary monitor. I'd like the LCD to be primary, but then the CRT's refresh rate is stuck at 60 Hz.

The ideal solution would probably be two identical LCDs, each on a digital port. I used to think LCDs were over-priced yuppy bait. Now, I can barely stand the CRT. (To be fair, I think there actually is something wrong with the CRT.)

If you like to play games, I would not recommend Matrox. I have heard less than stellar reviews of ATI's drivers. I'm a fairly satisfied two-time NVIDIA customer.

05-23-2003, 01:47 AM
Originally posted by LSlugger
If you like to play games, I would not recommend Matrox. I have heard less than stellar reviews of ATI's drivers. I'm a fairly satisfied two-time NVIDIA customer.

If you like to play games then you choose the NVIDIA or the ATI card - the Matrox card is clearly the absolute last choice there.

You may have heard less than stellar reviews of ATI drivers, but that's long gone (I assume you don't care about websites testing pre-release drivers).

If a user needs good image quality, he should get either an ATI or a Matrox card - both offer unmatched display output quality.

If you don't care about IQ, go with NVIDIA (never the MX line) - their GF4 cards are dirt cheap right now, so nothing is wasted. You can also get a GFFX card, but you invite evil with it.

If I were about to upgrade, I'd wait till fall - new cards will come out showing more than core refreshes.

05-23-2003, 07:10 AM
If you want to work with OpenGL, stay away, and I mean REALLY REALLY FAR away, from Matrox cards. Yes, the Parhelia might have a nice FP internal buffer, but so do all the new nVidia and ATI cards...

ATI has worked really hard on their drivers, and I personally think it's starting to get really tough to choose between nVidia and ATI right now. Go with either of those, whatever makes you more comfortable. (Make sure some retailer doesn't screw you, though: the names of the cards have become really confusing, and the chances that you buy an MX or a downgraded ATI card have become really big, since you can't tell from the name anymore what a card really is. Just check some comparisons and reviews on Tom's Hardware or Overclockers...)

05-23-2003, 08:17 AM
You can also get a GFFX card, but you invite evil with it.

:confused: that's not exactly fair...
The NV30 was never really available, so talking about that is futile.
And I can't see anything evil in NV31 cards (FX 5600 and FX 5600 Ultra): the new Detonator FX finally fixed the anisotropic filtering too, and the new Ultras will be clocked at 400/400 and should be a good alternative to a Ti4200 or 4400, with full DirectX 9 features and far superior FSAA performance.
I also read that the signal quality of the tested cards finally improved over the often blurry GF3/4.
The FX 5200 (NV34) is just a GF4 MX replacement; however, this one has full DirectX 9 features too, so you really get the features the "FX" promises, unlike the GF4 MX, which has pretty much nothing to do with a GF4 Ti and is rather a boosted GF2...

If you want to work with OpenGL, stay away, and I mean REALLY REALLY FAR away, from Matrox cards,

I still remember having a G200 card without ANY OpenGL driver at all for the first year, what fun! Although Matrox even sells their Parhelia and Millennium Px50 cards as entry-level CAD capable, I'd really never try to use them for that... performance- and driver-wise.

05-23-2003, 09:12 AM
Originally posted by Lynx3d
:confused: that's not exactly fair...

Depends on the point of view.
The GFFX cards (all available and unavailable generations) also cover the first PCI slot with their cooler.
Some motherboards are configured so that the first PCI slot is pre-assigned to certain special PCI cards, so you'll be desperately lost with a GFFX.

The signal quality depends on the filters the individual card manufacturers use. It has improved, but it's still far from being the reference for quality.

Raw speed is also questionable with GFFX cards - the latest OpenGL wrapper from some MIT students allows ATI users to run the fancy "NVIDIA only" tech demos, and they turned out equally fast, and in some parts faster, on the ATI 9800 Pro.

Not to send out the wrong impression - I have one rig running an ATI card and a second rig with a GeForce3.
I don't have any complaints about either, but right now I'd recommend ATI (even though it doesn't support OpenGL on two monitors - you can still herd your LW windows there).
I initially wanted to have Layout on one monitor and Modeler on the other, but I found out that I rarely use both at the same time.
What I really use is one monitor for LightWave work and the secondary monitor for reference images, textures, docs, movies...

In the end I think it's best not to buy a new graphics card right now. Both NVIDIA and ATI will either try to get their new generations ready for Christmas, or they'll release the very best they can squeeze out of the current architecture (NV3x or R3xx).

05-23-2003, 08:03 PM
The GFFX cards (all available and unavailable generations) also cover the first PCI slot with their cooler.

Most FX 5200 (Ultra) and 5600 (Ultra) cards don't cover an additional PCI slot. The 5200 reference design actually suggests passive cooling. I haven't seen any 5900 (Ultra) designs yet, but I think there will be cards that won't block the adjacent PCI slot either.

And that Dawn tech demo... I don't really buy my cards based on the framerate achieved in a tech demo... especially one written for a chip that never really made it to the stores.

I'm not really trying to be pro-nVidia; the NV30 definitely was a shot in the foot, but the NV31, NV34 and NV35 don't look too bad to me. Sure, as long as no 5900 cards are available yet, the Radeon 9700 and 9800 are the fastest (for games), and the 5600 Ultras are meant to compete with the 9600 Pro (which is actually slower than a 9500 Pro, despite its name).
Until the Radeon 9600 Pro and the GF FX5600 Ultra and FX5900 (value and Ultra) are available in large quantities, it's hard to judge the price/performance ratio of the FX cards...
But I expect these cards to compete head-to-head:
FX 5200 (Ultra) against Radeon 8500, 9000, 9100 and 9200 cards
FX 5600 against 9500 and 9600
FX 5600 Ultra against 9600 Pro
FX 5900 value against 9700 and 9700 Pro
FX 5900 Ultra against 9800 Pro.

05-23-2003, 08:35 PM
Wildcat VP990 - dual monitor support, 512MB

05-28-2003, 07:54 AM
Right there with you, WizCracker. Going to order a VP 970 next week.

05-28-2003, 09:40 AM
The GFFX cards (all available and unavailable generations) also cover the first PCI slot with their cooler.

To be fair, I've always had problems with interrupt sharing issues if I put a PCI card in that slot anyway, so it's not as big a loss as it sounds. I wouldn't buy an FX myself, but unless you really need that slot the size of the board isn't really a valid reason to look elsewhere.

05-28-2003, 10:48 AM
And here we have the first FX5900 Ultra that won't take up 2 slots...
Asus V9950 (http://www.hartware.de/showpic.php?type=news&id=33188&nr=1&path=http://gfx.turtled.com/n/33000/33188_1b.jpg)
But I guess you'd better put a slim PCI card below that, or it will block the air intake... but that's actually already a problem with many GF4 and Radeon cards... :(
And yes, IRQ sharing is often a problem - luckily not with my 760MPX board so far (20 IRQs, baby :P )

Hm, how are those Wildcat VP990s clocked? 512MB alone won't help you at all unless you do some serious killer texturing... and it's still the same old P10 VPU that can at best compete with a Quadro 950XGL... besides the fact that LightWave's current OpenGL optimizations don't really justify any professional video card IMHO... I HOPE this will change with LW8!

05-28-2003, 12:26 PM
3D Labs doesn't publish clock speeds. Yeah, the VP 970 seems comparable to a Quadro 950, but you can get it for $370.

05-28-2003, 01:20 PM
I have an XFX nVidia GeForce4 Ti 4200 and love it. I run LightWave 7.5b on dual 19" monitors in Windows XP Pro. I have had 3 nVidia cards in various PCs and loved all of them: great quality, great-looking graphics, lots of cool little options in nView, and most of all the best drivers I've seen from any company, period. I've had my share of ATI cards here at work and hate every one of them. I wish they hadn't gotten them, but they did. And these Macs here at work are all dual 1GHz G4s, so it's not the computer itself. I wouldn't recommend anything but nVidia stuff anymore. And www.tigerdirect.com has the XFX GF4 Ti 4200 for $109 right now. :)

05-28-2003, 01:39 PM
Yeah, there's no way in hell that an ATI card was going into one of my machines. The choice was between Nvidia and 3D Labs.

05-28-2003, 01:41 PM

Game cards give you high frame rates at the expense of mis-draws (render problems).

Workstation cards give you higher accuracy at the expense of lower frame rates.

So: do you want a card to play with, to work with, or both?

Good workstation cards are the Wildcat VP line from 3Dlabs. I have to say the best workstation card for $200 is the Wildcat VP560, the entry model. It supports OpenGL on 2 monitors and supports quite a few apps. Any "VP" card is gonna do you justice for work. I benchmarked the Wildcat VP560 in 3DMark03 and came up with a score in the mid 800s (pee-u), but it is, after all, a workstation card.

Game cards for work and play are nVidia. Any of the GeForce4 Ti line is good for a DX8 card, but for a wee bit more money you can get a Radeon 9700, a DX9 card, too! You will probably have to compromise for a do-it-all card. Tom's Hardware and ExtremeTech are good sites for video card reviews from all angles (work & play).

Have fun!

05-28-2003, 02:15 PM
OK, I run a Ti4700 (ka-ching) and it runs Layout and Modeler fine on two monitors (granted, one is 14" and the other 19"!). The drivers are fine, and this is with 98 too. Both are on digital ports, with the option to export to TV - it makes for DVD-watching fun!

Of course, if you happen to have far too much money, get yourself a GeForce Quadro 2000, a snip at around 1000! (total overkill!)

05-28-2003, 07:54 PM
When are all you nVidia heads gonna turn your GF4's into Quadro4's? ;)

I love these debates about pro cards because there is little if anything to distinguish between the pro and consumer cards - to keep costs down more often than not they are built around the same GPU - the only difference is in the drivers:

GF2,3 -> Identical GPU to Quadro2, Quadro DCC
GF4 -> Minor differences to Quadro4 (hardware antialiased lines)

Radeon 9500 Pro -> Identical GPU to FireGL Z1
Radeon 9700 Pro -> Identical GPU to FireGL X1

Admittedly, the Wildcats ARE designed for DCC (at the expense of DirectX performance) but in the tests I've seen, even the VP970/990 Pro are at a pretty similar performance level to the Quadro4 980 XGL - some tests are better on the 3DLabs cards, some on the nVidia.

With a little bit of creative tinkering you can turn either the ATI Radeons or the nVidia GeForce4s into pro-level cards and save yourself a heap of cash at the same time. I'm using an Albatron GF4 Ti4680 Turbo AGP 8x - it's a Ti4200 overclocked to faster-than-Ti4600 speeds and SoftQuadro'ed, making it a near match for the Quadro 980 XGL - cost of the card? 140 (supposedly under $150 in the US).

The Albatrons are designed for overclocking (massive copper coolers and heatsinks on the 3.3ns RAM), and RivaTuner takes care of the rest. Don't believe the hype - these companies make all their money from consumer cards - would they really invest a similar amount of cash developing completely new designs for their niche pro-level cards? Save your hard-earned for LightWave 8 (of course, if work is buying a Quadro4/FireGL/Wildcat for you, that's another thing altogether) :D

05-28-2003, 08:12 PM
And incidentally - for those of you using a SoftQuadro'ed A2 revision GF4, the NV25AALines patch makes a helluva difference in LightWave (even on NV28 boards).

One more thing - www.tomshardware.com has a nasty scandal about falsified benchmarks hanging over it, Digit Life (http://www.digit-life.com) seem pretty good (although I have issues with their LightWave benchmark accuracy too)... *sigh*

05-29-2003, 07:06 AM
Aegis, thanks for that link to Digit Life, I had lost it and it was one of the prime factors in my decision to go with the VP 970.


05-29-2003, 08:06 AM
Glad to be of service Matt :)

For those that are interested, I've just updated my "SoftQuadro for Dummies" guide - you can get the latest version (a zipped .PDF) here (http://www.darksunworld.com/LightWave3D/SoftQuadroGuide.zip)

05-29-2003, 09:27 AM
Sweet guide for performing that softmodding! I'll have to try that tonight. :D I have an XFX GeForce 4 Ti 4200 with 41.90 Detonator drivers for XP Pro.

Will this have any ill effects on my DirectX-based games like Soldier of Fortune 2 or Battlefield 1942? I'm also looking forward to Half-Life 2 in September. I use my PC for both LightWave and games, so I don't wanna ruin the PC for either of them.


05-29-2003, 10:15 AM
The author of RivaTuner, Unwinder, states that installing Quadro drivers will have a negative impact on DirectX performance, as the drivers are optimized to do different things. However, in my experience (playing America's Army and Unreal Tournament 2003, anyway) I've yet to notice any significant drop in performance - all my DirectX stuff seems to run just as fast as it did when my GFX card was just a boring vanilla Ti4200. I suspect you'd have to benchmark with 3DMark 2001 to find any concrete performance differences. So yes - there will be a performance drop in DirectX, but it's likely to be so slim that you'll never notice it...

As a special treat, I've uploaded a SoftQuadro-patched version of the 42.51 Detonator drivers (18MB) - to use them:

1. Download the latest RivaTuner from www.guru3d.com/rivatuner

2. Download the patched drivers from here (http://www.darksunworld.com/LightWave3D/SoftQuadro42.51.zip).

3. Install RivaTuner and unzip the drivers to a folder on your hard disk.

4. Follow the "SoftQuadro for Dummies" guide from step 9 using the patched drivers.

Have fun!

05-29-2003, 10:22 AM
From what I have read, soft-modding a card does affect games, so your frame rates will probably take a hit. What you are doing is enabling more GL extensions to run on the GPU. I did the soft hack on a GeForce2 MX and saw quite a bit more performance in GL-based apps.

Read all the associated notes and you will find out all the particulars on performance. You can always re-install the regular drivers and see which ones work best for you :)

06-07-2003, 02:51 PM
I'm more or less in the same boat... I am running LW on a Mac right now but will probably be building a PC some time this summer to run games and LW.

The only reasons I am going to run LW on the PC are the low price of putting together a fast PC and the available third-party, PC-only plugins...

What I really want to know is how IS the Wildcat card for gaming???? :confused:

06-10-2003, 07:09 AM
LeeK, it probably depends on the game more than anything else. I loaded up Jedi Knight II with my new VP 970, set the resolution to 1280x1024, turned on all the pretty features, and it worked great! Nice, smooth frame rates. I haven't given it a serious workout yet, but based on what I saw in my limited test, I'm confident it will do fine. I also play Neverwinter Nights; I had some slight problems, but after applying game and OS patches, everything is right as rain. FYI, Neverwinter Nights is OpenGL based. The VP comes with a list of optimized drivers for OpenGL apps (all the professional apps, plus a generic "OpenGL games" setting) AND it comes with a list of optimized DirectX GAMES! When I first ran Icewind Dale II it was pretty much unplayable, but after I found the optimized setting specific to it, it worked great! I'm really pleased with my decision to go with the VP 970.

06-10-2003, 08:37 AM
Well, I still suggest an nVidia-based GPU if you play FPS games and also do modeling. For an overall balance of OpenGL and DirectX, go nVidia.

If ya run 3DMark 2003, nVidia cards will probably do better than a 3Dlabs card.

I still have my reservations about ATI for OpenGL. They have gotten better, but I have a sneaking suspicion that nVidia is overall better suited as a work/games card.

Hey Matt, run a 3DMark03 and see what you get for a score :)

06-10-2003, 08:47 AM
Thanks for the info, Matt...
Primarily I am playing Unreal Tournament - the original for now, and UT2003 when I get a PC.

I also hear the next Doom is going to be system taxing as well :)


06-10-2003, 09:41 AM
Darttman, where can I get the benchmark? I've poked around in the past and had a hard time finding it.

LeeK, I think UT will do fine on the VP. I THINK Jedi Knight II uses the Unreal engine. Beware!! Doom III will use DirectX 9 if I remember correctly, which the VP does not support.

06-11-2003, 09:58 AM
Here's the freebie software link for 3DMark03.

3DMark03 is DirectX 9 compatible.

3DMark01 is DX8 compatible.

Bench away!

Most of the newer games are being designed so that if you turn on all the bells and whistles, your machine might well bog down. The reason is they want to extend the shelf life a bit, so technology doesn't outpace the game. Half-Life 2, Doom 3, Galaxies Online and EverQuest II are gonna be those games :)

06-11-2003, 02:38 PM
Man, how circuitous! Finally have the thing downloading after following 10 links! :)

Now, will the '03 require I install DX9? Or will it just run more slowly without it? The VP series does not support DX 9, so ain't no way in hell I'm actually going to install it. Should I not even bother with the '03 benchmark?

06-13-2003, 09:46 AM
3DMark03 needs DX9, so... don't use it :>)

You might be able to benchmark with it without DX9, but you might only be able to run a few of the tests.

I put DX9 on my box, which has a Wildcat VP560 (not an uber card like yours), and 3DMark03 barely worked... I got something like mid 500s. P-U!

have fun

06-13-2003, 11:35 AM
OK, I'll just load the '01 benchmark. Hopefully I can do it this weekend.

06-13-2003, 01:13 PM
A couple of things about the GeForce FX line. Some of the information I've seen doesn't seem to be correct. I've used the FX 5600 256MB for a few weeks now, replacing a GeForce4 440, before that a Radeon 7000, and before that a Matrox G550. The FX 5600 does not cover the first PCI slot. The QuadroFX cards do, I believe, but not the 5600. As for performance, I've been blown away (especially using Discreet Combustion, an OpenGL hog of a program) - LightWave just trucks along. If you really want performance, go Quadro. And about nView: don't use it, use the Windows dual monitor support (in XP at least) - fewer problems, no spanning of the taskbar.