would you recommend this?



skywalker113
01-18-2011, 11:56 PM
I am thinking about getting a new video card. It's a Quadro FX 3800.

I am starting to create more complex scenes with millions of polygons and a lot of textures. Test renders are taking a very long time. I would like to decrease the render times and make my workflow quicker and smoother.

The image shows my computer properties.

Here is the link to the video card.
http://www.nvidia.com/object/product_quadro_fx_3800_us.html

UnCommonGrafx
01-18-2011, 11:59 PM
For the purpose you describe, no - it will do nothing to speed up your renders, as LW uses only the CPU for rendering.

The only way to get what you want is to upgrade your CPU.

Lewis
01-19-2011, 12:56 AM
As Robert already said, your rendering won't be faster if you buy a new GFX card. You have two "easy" options to speed up your renders:

1. Cheap option - overclock your CPU to 4 GHz (most 920s go to 4 GHz easily with a voltage increase) and you'll easily get 40-45% faster renders (a rough scaling sketch follows this list).

2. Expensive option - sell the 920 and buy a six-core 970/980 :).
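For context (not from the thread), here is a minimal back-of-the-envelope sketch of what those two options buy you, assuming render throughput scales roughly linearly with clock speed times core count; the clocks used are the nominal i7 920 / 980X stock figures, and real scenes scale somewhat less than linearly, so treat the results as ballpark numbers only.

# Rough estimate of CPU render speedup, assuming near-linear scaling
# with (clock speed x core count). Real renders fall a bit short of this.

def cpu_speedup(base_ghz: float, base_cores: int, new_ghz: float, new_cores: int) -> float:
    """Estimated render speedup from the ratio of raw CPU throughput."""
    return (new_ghz * new_cores) / (base_ghz * base_cores)

# Stock i7 920: 2.66 GHz, 4 cores (nominal figures, ignoring Turbo/HT).
base = (2.66, 4)

# Option 1: the same 920 overclocked to 4 GHz.
oc = cpu_speedup(*base, 4.0, 4)
print(f"920 overclocked to 4 GHz: ~{oc:.2f}x")    # ~1.50x

# Option 2: six-core 980X at its stock 3.33 GHz.
hexa = cpu_speedup(*base, 3.33, 6)
print(f"Six-core 980X at stock:   ~{hexa:.2f}x")  # ~1.88x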

3DGFXStudios
01-19-2011, 02:32 AM
I would never recommend a Quadro unless you have a lot of money and don't know what to do with it, or you have special software that uses a lot of CUDA cores. Rendering isn't going to change, of course, because it has nothing to do with GPUs. If you have enough money you can buy one, but then I would only buy the fastest one. Buy a GeForce 460 instead.

skywalker113
01-19-2011, 01:32 PM
Newtek recommended two high end SLI gaming cards.

wrench
01-19-2011, 01:49 PM
Two? I guess if you have two monitors you'll get the best performance by giving them a card each, but SLI won't help in the slightest since it's only for non-windowed applications (i.e. games).

B

UnCommonGrafx
01-19-2011, 01:55 PM
That's funny!!

Two cards for your desires, huh? Boy, I wonder what their reasoning for such a suggestion would be, considering you wanted faster renders.

Methinks someone wasn't listening to you.

3DGFXStudios
01-19-2011, 02:32 PM
One is for spare parts :D

mattclary
01-21-2011, 11:35 AM
> Newtek recommended two high end SLI gaming cards.

LightWave (still) doesn't make optimal use of OpenGL, so your video card gives diminishing returns the more you spend on it. SLI is darn sure not going to gain you anything. They are probably recommending that for dual-monitor use, but that's still kind of overkill, since a good card will have dual heads anyway.

skywalker113
01-21-2011, 08:43 PM
I have an overclocked i7 processor with 12 GB of RAM and an Asus P6 motherboard. Everything is working properly. My video cards are two years old; they are Nvidia 8800s. I should be getting LightWave 10 soon, and Newtek told me it takes better advantage of OpenGL.

I'm thinking about getting a pair of Nvidia GTX 570s. I hope I'm doing the right thing. What do you think?

JonW
01-22-2011, 01:35 AM
http://www.videocardbenchmark.net/high_end_gpus.html

I'd go for a GTX 460; it has a good price point. And replace your CPU with a 980 if you have the inclination to spend.

Two GTX 570s are just not going to help with LW. My GTX 280 is just fine driving a Dell 3007. If you want VPR to update quickly, that comes down to the CPU.

CharlieL
01-23-2011, 01:52 PM
I thought that changing the video cards would make a big difference, as I had two very ordinary Nvidia cards. After switching to two GTX 480s with an SLI connection, I must say I feel a little disappointed. They did not give the boost I expected.

There are rendering solutions that use the graphics card, but with 1 GB available versus 16 GB for the processor, I think it will fall short.

As I am on Vista 64, that might affect my experience. That is a really bad product.

Rayek
01-23-2011, 11:08 PM
> I thought that changing the video cards would make a big difference, as I had two very ordinary Nvidia cards. After switching to two GTX 480s with an SLI connection, I must say I feel a little disappointed. They did not give the boost I expected.

Of course not: only games make use of SLI to accelerate the 3D view, and, additionally, the GTX 480s are optimized for making GAMES (DirectX) run fast, not the OpenGL 3D view. This, by now, has been established by both users and Nvidia themselves. The older GTX 280 actually runs most OpenGL much faster than the 4xx line. Nvidia crippled their consumer (read: games-oriented) line of cards in favour of their workstation Quadro cards. Personal experience: I had to return my 480 due to abysmal OpenGL 3D view performance, no matter what I tried. (Now running a soft-modded ATI 5870 --> V8800.)



> There are rendering solutions that use the graphics card, but with 1 GB available versus 16 GB for the processor, I think it will fall short.

Depends on the project, and your workflow. But yes, huge scenes are out of the question.



> As I am on Vista 64, that might affect my experience. That is a really bad product.

No, it is not nearly as bad as it was made out to be at the beginning - you should not believe the negative hype. I have been working on a Vista 64-bit machine for the last 15 months and experienced perhaps 3 or 4 critical crashes (some caused by games, others by my 'experimenting'). It runs fast and stable on my overclocked Core i7 - though Vista 64 loves its RAM (I have 12 GB at the moment). Stability is vastly improved compared to the XP setup I was using before.

Your bad viewport performance is due to the inefficient video card setup you have - SLI does not help, nor does the 4xx line. Sad, but true. They work wonders for Octane, or a similar CUDA renderer, though.

Some people say they solved the performance issue by:
- turning off double-sided poly lighting
- using MSI Afterburner to force the cards not to drop to 50% clock speed in OpenGL
- changing software (some 3D apps aren't nearly as affected as others)

By the way, according to this (http://blenderartists.org/forum/showthread.php?t=191597&page=10) thread on BlenderArtists, the 570 Fermi seems to have similar problems.

Having read just about every article and post about this issue, I am still left confused about the whole Nvidia 4xx/5xx OpenGL mess - Nvidia should send a clear message to everyone regarding these issues.

I would be interested in real LW/CORE numbers on different graphics cards. Is there some kind of (semi-)official LightWave benchmark to test OpenGL performance in Modeler/Layout?

wibly wobly
01-25-2011, 02:01 PM
Does anyone know if doing the BIOS hack to turn cards into Quadros fixes this? I was looking at getting a 460 for my aging Q6600. I'd get a 285 if I could find one in my area. I'm not all that keen on going second-hand if I can help it. I'm starting to get worried about my 8800 dying.

Rayek
01-25-2011, 08:43 PM
You cannot hack a 4xx card into a Quadro. There is no BIOS hack and no soft mod at present, and there probably never will be. The G80 was the last moddable Nvidia card (8800 GTS/GTX).

Hopper
01-25-2011, 09:02 PM
As a small but odd comparison: I replaced my 9800 GX2 with a GTX 480 last month (tired of babysitting the heat), but so far CORE has been significantly smoother with the 9800 GX2. Granted, there's still only a limited set of driver versions that the 480 is compatible with, but still... give it a few more revisions and I'm sure things will change.

By a similar comparison, there's no real noticeable difference between the two in LW 10, 9.x, and 8.x as far as OpenGL speed goes (in both texturing modes).

So far, with limited gaming and general usage of the 480, the only real difference I've seen between the two is that my machine is no longer a nuclear reactor pumping out 105°C heat under my desk. I can set all my games to max detail, resolution, etc. and get the same frame rates from both (granted, the 9800 GX2 is actually two cards).

wibly wobly
01-26-2011, 06:50 AM
CORE is one of the other reasons I was looking at upgrading (besides the high failure rate I keep hearing about with the 8800s; I know two people who have had them die recently). I'm right at the bottom of the video card list for what's compatible with CORE. The old machine is still kicking, and I'd like to just throw a couple of bucks at it to keep it useful as a backup machine when I do upgrade, probably sometime this year.

If the 4xx cards are bad, and potentially the 5xx too, that's pretty frustrating. I have been using Nvidia cards for a long time without any real issue.

Rayek
01-26-2011, 02:32 PM
> CORE is one of the other reasons I was looking at upgrading (besides the high failure rate I keep hearing about with the 8800s; I know two people who have had them die recently). I'm right at the bottom of the video card list for what's compatible with CORE. The old machine is still kicking, and I'd like to just throw a couple of bucks at it to keep it useful as a backup machine when I do upgrade, probably sometime this year.
>
> If the 4xx cards are bad, and potentially the 5xx too, that's pretty frustrating. I have been using Nvidia cards for a long time without any real issue.

Unfortunately, I read on the BlenderArtists forum that the 5xx series seems to suffer from the same problems. And (though I am just guessing here) it is not driver related, but hardware related. Nvidia support told me that the bad performance of my 480 was 'expected behavior' in OpenGL apps (that answer took about a month after the first support ticket). Suffice it to say, I got so frustrated (I could hardly animate in Blender) that I decided to return the 480 and switch to ATI for the first time in 12 years. The soft-modded drivers work rather well!

No more Octane for me, though :-(

wibly wobly
01-27-2011, 09:35 AM
So are we going to start using Intel and Matrox if we don't want to spend 10x the money on Quadros?