Radeon HD 3870 X2 or N8800?



scratch33
03-12-2008, 04:52 AM
Hello,

I want to buy a new graphics card. Which is the better choice?

- Asus extreme N8800GTX/HTDP 768 MB

or

- HIS Radeon HD 3870 X2 1 GB?

Thank you for help.

mattclary
03-12-2008, 06:02 AM
Neither.

8800GT 512

scratch33
03-12-2008, 06:41 AM
The 8800GT 512MB has better performance than the other two?

COBRASoft
03-12-2008, 09:04 AM
Which card(s) would one have to buy to attach two 30" LCD monitors? Can this 8800 with dual DVI handle two 30" monitors at 2560x1600 resolution each?

DragonFist
03-12-2008, 09:13 AM
I have a 3870 and am pretty happy with it. The problem with the X2 is that LightWave (and most 3D apps) don't get any benefit from multi-GPU solutions, which is what the X2 is.

I've never really been a fan of nVidia price/performance-wise. IQ is better on ATI cards, and while the nVidia cards are without a doubt faster this generation, I don't think that speed gain is worth the nearly double price.

For less than $200, the 3870 is not as fast as the $400+ cards, but they aren't twice as fast either.

scratch33
03-12-2008, 09:31 AM
The X2's second GPU is not used??? But in the benchmarks I think both are used. Is that a LightWave particularity? And if I put two graphics cards in CrossFire mode, are they both used?

DragonFist
03-12-2008, 09:41 AM
I am not an expert on this, but part of my understanding is that SLI configurations have problems with changing textures while rendering and snag on this. That doesn't happen much in games, but it definitely happens while, say, you are surfacing.

Anyhow, there are other issues. I don't think it is a problem of "are they used". I think it is more a problem of "are they used effectively".

And to my knowledge, this is not a LW-only thing. 3D apps are not games, and the SLI solutions are aimed at games; not even all games play well with them.

mattclary
03-12-2008, 09:57 AM
I have a 3870 and am pretty happy with it. The problem with the X2 is that LightWave (and most 3D apps) don't get any benefit from multi-GPU solutions, which is what the X2 is.

I've never really been a fan of nVidia price/performance-wise. IQ is better on ATI cards, and while the nVidia cards are without a doubt faster this generation, I don't think that speed gain is worth the nearly double price.

For less than $200, the 3870 is not as fast as the $400+ cards, but they aren't twice as fast either.


I'm seeing a $40 difference. What are you looking at? The 8800GT 512 is currently the sweet spot for price/performance.


http://www.newegg.com/Product/Product.aspx?Item=N82E16814133205

http://www.newegg.com/Product/Product.aspx?Item=N82E16814140087

mattclary
03-12-2008, 10:01 AM
oops

mattclary
03-12-2008, 10:09 AM
Scratch33, go here and look at benchmarks. I'm not sure which of the tested apps use OpenGL (which is what matters for LightWave), but you can get a decent idea of what the various card choices will deliver. Once you have an idea of performance level for each card, you just need to figure out if spending the extra cash is worth it to you.


http://www.anandtech.com/video/showdoc.aspx?i=3209&p=1

Personally, as I already said, the 8800GT is the current sweet spot, and will perform well. If you want to drop the extra cash on something, get a better CPU to speed your renders up.

DragonFist
03-12-2008, 10:12 AM
Fair enough. The price difference was quite a bit bigger when I was shopping for my new card a couple of months ago.

However, the performance difference is not that much: a few frames here and there, and some cases where the 3870 wins. And the IQ is a factor.

And $40 is still $40. But at least that's a closer match for the better frame rates.

mattclary
03-12-2008, 10:29 AM
I need to hold my watch at arm's length to read the date, so image quality for both already exceeds my sensor resolution. ;)

DragonFist
03-12-2008, 10:49 AM
LOL. Understood. Though, for me, I was also taking gaming into account, not just LightWave, and DX10.1 support factored in.

Anyhow, what I can say is that I was not disappointed in the HIS 3870 IceQ I got.

I still wouldn't recommend a multi-GPU part from either company for 3D apps at this time without direct support for the technology by said 3D apps (which may come in the future if the workstation versions start supporting it).

mattclary
03-12-2008, 12:41 PM
Pretty sure the 8800gt supports DX 10.1

DragonFist
03-12-2008, 12:51 PM
Pretty sure the 8800gt supports DX 10.1
http://www.fudzilla.com/index.php?option=com_content&task=view&id=3904&Itemid=1


Gainward Geforce 8800GT is a PCI-Express 2.0 card, fully compatible with PCI-E slots on all motherboards. If you’re aiming at DirectX 10.1 and planning a long term investment, then you should know that this card doesn’t support it. For a normal user that might not be that big a deal, especially knowing that most people do their gaming on Win XP that supports just DirectX9. However, DirectX 10 is slowly gaining ground and the lack of DirectX 10.1 and Shader model 4.1 support is a big minus for Nvidia.

scratch33
03-12-2008, 01:33 PM
Thank you all for the help.


Scratch33, go here and look at benchmarks. I'm not sure which of the tested apps use OpenGL (which is what matters for LightWave), but you can get a decent idea of what the various card choices will deliver. Once you have an idea of performance level for each card, you just need to figure out if spending the extra cash is worth it to you.


http://www.anandtech.com/video/showdoc.aspx?i=3209&p=1

Personally, as I already said, the 8800GT is the current sweet spot, and will perform well. If you want to drop the extra cash on something, get a better CPU to speed your renders up.

We can see in this benchmark that the ATI 3870 is the fastest in single-card mode. Only a pair of 8800GTs in SLI mode are better, and not by much.

I don't know. I think I'm going to try the Radeon, but I'm not sure. I don't like the ATI drivers very much.

DragonFist
03-12-2008, 01:38 PM
I think you are looking at the scores for the 3870 X2 as opposed to a normal 3870.

Though it is one card, it is two GPUs on one card. I don't want you to get one and find that it doesn't perform well in 3D apps. I would investigate this aspect more. Don't go by what I say, even. I only know what I've read on the internet about this, and I think we can all agree that the "internet" should not be considered a reliable source of verified information.

mattclary
03-12-2008, 02:51 PM
I stand corrected on DX 10.1

RedBull
03-12-2008, 03:27 PM
Out of the cards you listed:

The N8800GTX/HTDP 768 MB is much faster than any 8800GT/S or 3870.
When it comes to LW, you should be looking at the memory bandwidth in GB/s to give you an idea of what will perform the best.

Example: actual bandwidth figures:

8800GTX = 86.4 GB/s
ATI 3870 = 72.0 GB/s
8800GTS = 64.0 GB/s
8800GT = 57.6 GB/s

As you can see, with its 384-bit bus and 768 MB of memory the 8800GTX is the fastest. Note that if you were playing games the 8800GTS or 3870 X2 would be the faster video cards, but for LW the 8800GTX is.
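
Those figures fall straight out of bus width times effective memory clock. A quick sketch in Python (the bus widths and memory clocks below are the standard reference specs for each card, not numbers from this thread):

# Bandwidth (GB/s) = bus width in bytes * effective DDR clock in MHz / 1000.
cards = {
    "8800GTX":  (384, 1800),  # 900 MHz GDDR3, 384-bit bus
    "ATI 3870": (256, 2250),  # 1125 MHz GDDR4, 256-bit bus
    "8800GTS":  (320, 1600),  # 800 MHz GDDR3, 320-bit bus (G80 version)
    "8800GT":   (256, 1800),  # 900 MHz GDDR3, 256-bit bus
}
for name, (bus_bits, eff_mhz) in cards.items():
    print(f"{name}: {bus_bits / 8 * eff_mhz / 1000:.1f} GB/s")
# Prints 86.4, 72.0, 64.0 and 57.6 GB/s respectively.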

mattclary
03-12-2008, 06:22 PM
"Much" faster?

http://www.anandtech.com/video/showdoc.aspx?i=3175&p=3

And bandwidth is not going to be a factor with LightWave.

RedBull
03-12-2008, 07:11 PM
"Much" faster?

I'd say 64 GB/s compared to 86.4 is much faster; the 384-bit bus and larger memory make a significant difference.


And bandwidth is not going to be a factor with LightWave.

Why is that? Actually, bandwidth is the only factor that affects LW and the overall performance of a video card; any other numbers are meaningless. You may wish to recheck your facts before correcting mine.

The reviews you linked are for games, which I clearly stated will be faster on the 8800GTS, because it's a G92 core with higher clocks and 128 SPs... LW does not use the shader processors for anything, so Quake benchmarks are totally useless, as are any other gaming benchmarks.

If you play games, the 3870 X2 or 8800GTS would be my picks; if Mudbox and LW are your thing, the GTX is faster and better equipped.

mattclary
03-13-2008, 07:19 AM
Maybe I'm not clear on how bandwidth comes into play. If you are playing a video game with lots of textures and geometry and trying to get as many frames/sec as possible, it seems like that would make more of a demand on memory bandwidth than LightWave does. Granted, the same objectives apply, but it seems like a game would put a much greater stress on the bandwidth than anything we could do in LightWave.

DragonFist
03-13-2008, 12:50 PM
Well, for one thing, games are often specifically written to only move textures into or out of video memory during level loads, not during the OpenGL render.

For another, games also cap the amount of texture memory in use at any given time. A 3D app has as many textures as the artist creates, at whatever size the artist makes them. And the artist isn't performing calculations of "does all this fit in my video card's texture memory?".

A relatively simple LightWave scene could be moving way more texture data around than a game ever would.
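
To put a rough number on that, here is a toy estimate in Python (the texture sizes are hypothetical, just to show how fast uncompressed RGBA adds up):

# Uncompressed RGBA texture memory for a made-up scene.
def texture_mb(edge, bytes_per_pixel=4, mipmaps=True):
    base = edge * edge * bytes_per_pixel
    factor = 4 / 3 if mipmaps else 1  # a full mipmap chain adds about 1/3 extra
    return base * factor / 2**20

edges = [4096, 4096, 2048, 2048, 2048, 1024]  # six square colour/bump maps
print(f"~{sum(texture_mb(e) for e in edges):.0f} MB")  # ~240 MB, nearly half a 512 MB card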

AbnRanger
03-13-2008, 05:33 PM
Pardon me if I'm missing something here, but I distinctly remember reading articles about the 3870 and 8800GT being relatively close in price and performance... THEN ATI released the 3870 X2... and Tom's Hardware (not much of an ATI fan) actually had it as the fastest production card currently on the market. So, if the regular 3870 held its own against the 8800GT, how is it that a card with two 3870 chips on it is inferior to the 8800GT?
I'm missing the logic here.
The fastest card on the current market for a few hundred less than its closest competitor...? Hhhhmmm, that's a tough decision to make. :stumped:....not really :screwy:

This is essentially a CrossFire config on one card, but with better interoperability between the two GPUs. That's really nice for anyone with two monitors.

DragonFist
03-13-2008, 05:53 PM
The X2 is not an inferior card; I never said it was. However, I am pointing out that I have read an article stating that multi-GPU solutions (SLI, CrossFire) don't play well with pro 3D apps for some technical reasons and can perform badly in those environments. That is one of the reasons the pro cards are not multi-GPU.

Now, I could be wrong about this, or my data may be outdated. I am simply warning that this should be investigated before laying out $ on a multi-GPU solution that may perform great in games but badly in LightWave.

I have a 3870 in the machine I am on. I am quite happy with it and have had only one issue: the cooling fan's auto-speed is not at the settings I prefer (it is tuned for silence rather than high-load performance). That's easily handled with ATI Tray Tools or RivaTuner.

Other than that, I can play Source games at 1920 x 1200 and get 167 fps.

Oblivion with all settings on high is as smooth as butter. LightWave is way more responsive than it was with my old card.

I couldn't really ask for more at the price I got it.

And for games, the X2 nearly doubles the experience. But I would check on the multi-GPU thing, that's all.

RedBull
03-13-2008, 06:02 PM
So, if the regular 3870 held its own against the 8800GT, how is it that a card with two 3870 chips on it is inferior to the 8800GT?
I'm missing the logic here.

Well, if we are talking specifically about games and not LW, then a few things come into play. The 3870 was a fair bit slower in most benchmarks, and only in the odd game did it come close. The 3870 X2 typically only adds about a 25% performance increase on average over the single card, and generally only at high resolutions. Games and CrossFire/SLI generally need to be optimized to take full advantage, and many are not. Also keep in mind the RAM is different on the 3870 X2 as well: it is only GDDR3, while the single 3870 uses faster GDDR4, and there are some limitations from being a 1+1 card.
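
As a back-of-the-envelope illustration (the frame rates here are made up; only the ~25% figure comes from the benchmarks):

single_fps = 40.0                # hypothetical single 3870
x2_fps = single_fps * 1.25       # typical average X2 gain over one card
ideal_fps = single_fps * 2.0     # perfect 2x scaling, which CrossFire rarely hits
print(f"single: {single_fps:.0f} fps, X2: {x2_fps:.0f} fps, ideal: {ideal_fps:.0f} fps")
# single: 40 fps, X2: 50 fps, ideal: 80 fps -- you pay for two GPUs and get 1.25x.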

Nvidia really clocked their G92 core cards high to put the nail in the ATI coffin this round. The 8800GT/S cards do usually perform certain functions faster, AA/AF for example.


The fastest card on the current market for a few hundred less than its closest competitor...? Hhhhmmm, that's a tough decision to make. :stumped:....not really :screwy:

I'm not sure which benchmarks make you believe the 3870 X2 is consistently faster. It's certainly nice bang for buck if you are a big games player using high resolutions on a fast-CPU PC, but paying that price for a performance increase in only some games is a bit of a con, and a good way for ATI and Nvidia to design crapper cards and sell us two of them.


This is essentially a CrossFire config on one card, but with better interoperability between the two GPUs. That's really nice for anyone with two monitors.

Well, a single Nvidia or ATI card works well for dual-monitor users. It's only while in CrossFire or SLI mode that dual monitors don't work.