
View Full Version : Lightwave and Geforce 280?



Paul Brunson
01-05-2009, 01:04 AM
Could use some input on this. Currently I'm running a GeForce 7800 GTX and I'm working on a project with really high poly counts; models of 2 million+ polygons, and it's a slow process.

Does anyone have any experience or input on how much of a difference upgrading to the latest GeForce 200 series card would make? I'm just looking to get viewport redraws and OpenGL interactivity up.

(I realize the Quadro FX cards are the ultimate for 3D work, but they are simply way outside my price range.)

AbnRanger
01-05-2009, 08:13 AM
I just recently upgraded my video card, and since I've owned ATIs for the most part with no problems (I'm stunned to hear others complain when I haven't experienced any troubles with them), I was prone to go with the underdog once again. However, I was very tempted to go with NVidia for the CUDA and PhysX capability.

Nevertheless, ATI always seems to offer the most bang for your buck every time I do some research on the cards. This time was no exception.
They just released software (embedded in the latest Catalyst 8.12 driver) that includes a free video encoder (Avivo...NVidia's Badaboom costs $30) that uses its 800 stream processors to cut the task down to a fraction of the time it takes using just the CPU. It enables a host of applications to use the stream processors (on the ATI HD 4000 series)...including Adobe CS4, Microsoft Office, Silverlight, ArcSoft, etc.
http://www.atomicmpc.com.au/News/128373,ati-puts-stream-everywhere.aspx

The 4870 is the sweet spot...so much so that before I made my decision, I noticed the price on them at Newegg (all brands) went up about $30-$50 just after the holidays. The GTX 260 is in the same price range, and is very close to the 280 (make sure to get the 260+, because they recently added more stream processors to help it compete with the 4870...from 192 to 216, and I think it clocks a bit higher), so you'll not likely notice any tangible difference by spending the extra $80-$100 on a 280.

I read that NVidia just put out a bunch of bad chips (I believe it was the 8800/9800 series), and is in damage-control mode with many of its vendors and customers. That had a little bit to do with me sticking with ATI this time. Plus, ATI keeps NVidia on their toes and FORCES them to price their cards more reasonably (the GTX 280 was $600-$700 when ATI released the HD 4870 X2...which currently dominates the market and is priced between $450-$500...now the GTX 280 is between $350-$400).

I also remember hearing that PhysX is being ported to ATI cards:
http://www.tgdaily.com/html_tmp/content-view-38137-135.html

I use Max mostly, and ATI is close to enabling Havok physics (the engine most developers use, and the engine for 3ds Max's Reactor) on its stream processors...so that will be a bonus for me. With OpenCL just recently being ratified, both cards should be able to use GP-GPU processing in a growing number of applications.
http://www.bit-tech.net/news/2008/12/11/amd-exec-says-physx-will-die/1

I wonder if Jay, Chuck or anyone in LW development can enlighten us as to whether they are working with either or both of ATI and NVidia to enable stream processing for LW. Imagine if even LW dynamics were enabled? What if they enabled RENDERING?! You think a few seats might get sold then?

By the way, I went with an ATI 4850. Works like a champ.

IMI
01-05-2009, 11:42 AM
Could use some input on this. Currently I'm running a GeForce 7800 GTX and I'm working on a project with really high poly counts; models of 2 million+ polygons, and it's a slow process.

Does anyone have any experience or input on how much of a difference upgrading to the latest GeForce 200 series card would make? I'm just looking to get viewport redraws and OpenGL interactivity up.

(I realize the Quadro FX cards are the ultimate for 3D work, but they are simply way outside my price range.)

I'm not sure about the 200 series, but I'm using an Nvidia 8800 GTS with 640 MB, and it works great. The downside is that my previous card was a 7950 GTS and when I upgraded I didn't really see much difference in LW OGL. LW's OpenGL is handled more with your CPU than your GPU.

The card is great for games, but I wouldn't upgrade to a 200 series just for LW. You can get an 8800 or 9800 for far less money.

As for ATI, no idea, I've never owned one.

Stooch
01-05-2009, 11:46 AM
I upgraded to an Asus GTX 260 (the super-quiet one) and didn't really see much change from my passively cooled ATI 8800.

The gaming is nice, though.

Mitja
01-06-2009, 08:11 AM
As said, LW handles OGL mainly via the CPU.

Otterman
01-06-2009, 09:53 AM
I've got one of those monster NVIDIA Quadro FX 5600 cards and LightWave still struggles with 2 million+ polys! They are not all they're cracked up to be.

Money better spent elsewhere, I say!

AbnRanger
01-07-2009, 08:48 AM
Lightwave really needs to shift to DirectX because OpenGL is really a dying standard...that's precisely why DirectX performs much better and has more features in 3ds Max than OpenGL.
One of the main benefits of DirectX 10 was its vastly improved communication between the CPU and GPU.

DirectX 11 is supposed to really open things up and help with GP-GPU stream processing too.
http://www.tomshardware.com/reviews/opengl-directx,2019.html

COBRASoft
01-07-2009, 09:09 AM
Although I'm not an Apple user, switching to DirectX would not please Apple users. DirectX -> MS -> Windows only!

avkills
01-07-2009, 09:21 AM
DirectX is not an open standard, so it makes no sense for LW to move to it. It is mainly for games; and as COBRASoft mentioned, it would hose LW's Apple base.

-mark

mattclary
01-07-2009, 10:07 AM
that's precisely why DirectX performs much better and has more features in 3ds Max than OpenGL.

No, OGL sucks in LightWave because it is poorly implemented.

jayroth
01-07-2009, 10:18 AM
Lightwave really needs to shift to DirectX because OpenGL is really a dying standard...that's precisely why DirectX performs much better and has more features in 3ds Max than OpenGL.
One of the main benefits of DirectX 10 was its vastly improved communication between the CPU and GPU.

DirectX 11 is supposed to really open things up and help with GP-GPU stream processing too.
http://www.tomshardware.com/reviews/opengl-directx,2019.html

Eh? OGL is THE standard; DX is available only for Windows. If you check, we also run on the Mac...

AbnRanger
01-07-2009, 10:25 AM
Notice I mentioned that 3ds Max allows the user to choose OpenGL or DirectX...it just so happens that with its footprint in the gaming industry, there are more than a few benefits to using DirectX...such as the ability to convert standard Max materials to DirectX materials so a developer can see right in the viewport how that shader will look in a game engine:

http://download.autodesk.com/media/3dsmax/directx_max8_380k.mov

More DirectX goodness:
http://download.autodesk.com/us/3dsmax/siggraph2007/demos/Review.mp4

AbnRanger
01-07-2009, 10:33 AM
Eh? OGL is THE standard; DX is available only for Windows. If you check, we also run on the Mac...

Is that so?

Quote from previously linked article:
"...Meanwhile, Microsoft was starting from scratch, and the learning curve was steep. So, for several years, Direct3D’s capabilities were beyond the curve, with an interface that many programmers found a lot more confusing than OpenGL’s. But nobody can accuse Microsoft of being easily discouraged. With each new version of Direct3D, it gradually began to catch up with OpenGL. The engineers in Redmond worked very hard to bring performance up to its rival API’s level.

A turning point was reached with DirectX 8, released in 2001. For the first time, Microsoft’s API did more than just copy from SGI. It actually introduced innovations of its own like support for vertex and pixel shaders. SGI, whose main source of revenue was the sale of expensive 3D workstations, was in a bad position, having failed to foresee that the explosion of 3D cards for gamers would prompt ATI and Nvidia to move into the professional market with prices so low (due to economies of scale) that SGI couldn’t keep up. OpenGL’s development was also handicapped by bitter disputes among its proponents. Since the ARB—the group in charge of ratifying the API’s development—included many different, competing companies, it was hard to reach agreement on the features to be added to the API. Instead, each company promoted its own agenda. Conversely, Microsoft was working solely with ATI and Nvidia, using its weight to cast a deciding vote if there was disagreement.

With DirectX 9, Microsoft managed to strike a decisive victory, imposing its API on developers. Only John Carmack and those who insisted on portability remained faithful to OpenGL. But their ranks dwindled. And yet a reversal of fortunes was still possible. It had happened with Web browsers, after all. Even when a company has maneuvered itself into a near monopoly, if it rests on its laurels, it’s not all that rare for a competitor to rise from his ashes. So when the Khronos group took over OpenGL two years ago, many hopes were rekindled with all eyes on the upcoming SIGGRAPH conference that year.

...We were expecting a lot from OpenGL 3, and as you can tell by reading this article, we’re disappointed—both in the API itself (with the disappearance of promised features) and in the way it’s been handled (a year-long delay and a lack of clear communication on the part of the Khronos group). With this version, OpenGL barely keeps up with Direct3D 10, and at a time when Microsoft has chosen to publicize the first details of version 11 of its own API."

jayroth
01-07-2009, 10:38 AM
Yes, it still is, like it or not, Don. A standard, BTW, does not indicate the best or the latest or greatest. LW is a multiplatform application; as DX is limited to Windows, that really prevents us from using it.

hrgiger
01-07-2009, 10:43 AM
Yes, it still is, like it or not, Don. A standard, BTW, does not indicate the best or the latest or greatest. LW is a multiplatform application; as DX is limited to Windows, that really prevents us from using it.

So, we Windows users are being held back because of LightWave support for the Mac? (Going off the assumption that DX would be better than OGL.)
Is it possible we could support both? I remember when I used Animation Master that you could choose either DX or OGL from your display options.

jayroth
01-07-2009, 10:46 AM
Sure. How long would you like to wait for it?

IMI
01-07-2009, 10:49 AM
So, we Windows users are being held back because of LightWave support for the Mac? (Going off the assumption that DX would be better than OGL.)
Is it possible we could support both? I remember when I used Animation Master that you could choose either DX or OGL from your display options.

You can choose between DX and OGL in Max, with DX actually being recommended.
You can also choose between DX and OGL in Deep Exploration (CAD edition). I think DX actually performs better there than OGL, but not for all things.

Well, maybe LW 10 or 11? :)

avkills
01-07-2009, 10:59 AM
I think NewTek would much rather spend their time making rendering faster, updating the various modules that still have old code in them and adding new features to make LW an appealing platform. DirectX support isn't going to be an earth-shattering addition that makes everyone on Windows ditch their current toolset for LW, IMO.

Besides, once all of the old legacy code has been replaced -- it is probably something that would be a lot easier to implement.

-mark

AbnRanger
01-07-2009, 11:03 AM
I think NewTek would much rather spend their time making rendering faster, updating the various modules that still have old code in them and adding new features to make LW an appealing platform. DirectX support isn't going to be an earth-shattering addition that makes everyone on Windows ditch their current toolset for LW, IMO.

Besides, once all of the old legacy code has been replaced -- it is probably something that would be a lot easier to implement.

-mark

Seeing that DX is fast closing the gap between what you see in your viewport and what the final render will look like (and its use of stream processing), I wouldn't underestimate what it can in fact do.

COBRASoft
01-07-2009, 11:18 AM
Jayroth: why not make the support for OGL better instead of implementing DirectX? It would take less time, I guess, and everybody would benefit from it. Put DirectX in as an addition to LW 10.1 :)

jayroth
01-07-2009, 11:24 AM
Hey, that's a great idea! :)

avkills
01-07-2009, 11:36 AM
Seeing that DX is fast closing the gap between what you see in your viewport and what the final render will look like (and its use of stream processing), I wouldn't underestimate what it can in fact do.

I didn't think I was implying what DX can or can't do. I'd eat it up in a heartbeat if Microsoft would license it to Apple to implement in OS X, but I just do not see that happening any time soon.

I guess that is one of the negatives for many of us working in a cross-platform package. But it can also be a positive in many more ways.

-mark

kfinla
01-07-2009, 11:38 AM
The good and bad is that LW's OpenGL has lots of room for improvement; most if not all other 3D apps on the same hardware perform better. Take the same heavy model (OBJ) and tumble it in modo, Maya, LW, etc. and see.

Also of note, AFAIK Microsoft went out of its way in Vista to hinder OpenGL access/performance; I forget the details. But basically it takes more steps to do the same operations. So it is no surprise that DirectX (an MS technology) often performs better on Windows than OpenGL does.

Lightwolf
01-07-2009, 11:44 AM
(and its use of stream processing)
Which really isn't an argument either. DX11 still needs to come, and then one might as well use OpenCL, which is cross-platform and supports a lot more than just GPUs for processing.

Cheers,
Mike
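
To illustrate that last point about OpenCL not being GPU-only, here is a minimal sketch (assuming the standard Khronos cl.h headers and an OpenCL 1.0 runtime; nothing LightWave-specific) that simply lists every compute device the runtime exposes, CPUs included:

// Minimal OpenCL device enumeration: the same API that drives GPUs also
// exposes CPUs (and other accelerators) as compute devices.
#include <CL/cl.h>
#include <cstdio>

int main()
{
    cl_platform_id platforms[8];
    cl_uint numPlatforms = 0;
    clGetPlatformIDs(8, platforms, &numPlatforms);
    if (numPlatforms > 8) numPlatforms = 8;

    for (cl_uint p = 0; p < numPlatforms; ++p)
    {
        cl_device_id devices[16];
        cl_uint numDevices = 0;
        clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL, 16, devices, &numDevices);
        if (numDevices > 16) numDevices = 16;

        for (cl_uint d = 0; d < numDevices; ++d)
        {
            char name[256] = {0};
            cl_device_type type = 0;
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME, sizeof(name), name, NULL);
            clGetDeviceInfo(devices[d], CL_DEVICE_TYPE, sizeof(type), &type, NULL);
            printf("%s (%s)\n", name,
                   (type & CL_DEVICE_TYPE_GPU) ? "GPU" :
                   (type & CL_DEVICE_TYPE_CPU) ? "CPU" : "other");
        }
    }
    return 0;
}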

AbnRanger
01-07-2009, 11:50 AM
Also of note, AFAIK Microsoft went out of its way in Vista to hinder OpenGL access/performance; I forget the details. But basically it takes more steps to do the same operations. So it is no surprise that DirectX (an MS technology) often performs better on Windows than OpenGL does.

From the article, you glean that it's not a matter of Microsoft trying to cripple OpenGL capability, but instead that they WORKED HARD to provide a better API.

When competing companies within the Khronos Group can't agree on what to include or improve, while MS keeps chugging away, you can see how that is a losing cause.

kfinla
01-07-2009, 12:19 PM
I am no expert in GPU and OS interaction. I just recall an article a few years ago that made it sound like there was a definite preference given to DX over OpenGL in Vista. It sounded like DX was deeply rooted and OpenGL was a second-class citizen. I also didn't get the impression this was the result of DX improvement, but of an engineering choice that put OpenGL in another time zone compared to the access DX had to processes.

And yes, as scary as it sounds... all the Windows users who are complaining about OpenGL performance actually have better performance than their Mac counterparts do in OpenGL.

Ivan D. Young
01-07-2009, 12:42 PM
I will add a vote for OpenCL as well. Might as well; it is, after all, a sister API to OpenGL and looks like it will have very similar DX11 functionality.

MrWyatt
01-07-2009, 01:57 PM
I vote for OpenBS.
well never mind
:D

Matt
01-07-2009, 02:10 PM
Can we have a version where LW whips out a pencil and paper and ACTUALLY draws the preview on screen?

No one else has that! :D

avkills
01-07-2009, 02:58 PM
Can we have a version where LW whips out a pencil and paper and ACTUALLY draws the preview on screen?

No one else has that! :D

Now that would ROCK!

-mark

Lightwolf
01-07-2009, 03:08 PM
I vote for OpenBS.
well never mind
:D
We already have that, it's called marketing (and I am not pointing any fingers anywhere...).

Cheers,
Mike

Paul Brunson
01-07-2009, 04:09 PM
lol, didn't realize I'd start a discussion on DX vs OGL.

So if I understand correctly, LightWave doesn't really use the graphics card much, so if I want better OpenGL performance I'm better off saving up for a new processor/system using the new Core i7 processors. Is that the general consensus? (Currently running a dual-processor, dual-core Opteron machine.)

Also, an off-the-cuff question: is there a way to monitor how much LightWave pushes the graphics card, like monitoring CPU utilization in Task Manager? In other words, GPU utilization monitoring?

Stooch
01-07-2009, 05:34 PM
Can we have a version where LW whips out a pencil and paper and ACTUALLY draws the preview on screen?

No one else has that! :D

I wouldn't be surprised if it were faster than the current implementation.

AbnRanger
01-08-2009, 01:44 AM
lol, didn't realize I'd start a discussion on DX vs OGL.

So if I understand correctly, LightWave doesn't really use the graphics card much, so if I want better OpenGL performance I'm better off saving up for a new processor/system using the new Core i7 processors. Is that the general consensus? (Currently running a dual-processor, dual-core Opteron machine.)

Also, an off-the-cuff question: is there a way to monitor how much LightWave pushes the graphics card, like monitoring CPU utilization in Task Manager? In other words, GPU utilization monitoring?

Maybe Chuck, Jay or someone else can weigh in with more specific answers, but I'm afraid you've discovered one of LW's current weaknesses. Mesh editing speed remains an area that is in very sore need of a rewrite, as Modeler didn't get much love in the 9.x cycle. Knowing that LW is going through a modular rewrite, perhaps Modeler's day will come with v10. We'll see, but just using a beefier card or CPU will not fix the problem.

Matt
01-08-2009, 02:22 AM
I wouldn't be surprised if it were faster than the current implementation.

Whoohoo! Dum dum tsssk! :D

Oedo 808
01-08-2009, 05:29 AM
Isn't Nvidia's answer to the 4870 X2, the GTX 295, coming out around now? I wonder what, if anything, that and the 285 will do to the price of existing cards.

I saw that the GTX 280 was going for £225 a few days ago at one retailer; can't see that being the norm, but at that price I probably would have chosen it over the 4870 if I were in the market for a card and had £225 to spend.

If Nvidia do shuffle their prices, I hope ATI don't get caught with their pants down. I remember before the 48xx series were released last year I didn't see anyone recommending an ATI card, and things could go that way very quickly again. And I guess rightly so if the price/performance is off, which it certainly was for the 38xx if you were a gamer who liked to add a bit of AA.

It's dangerous enough having only two providers for such a major product, but can you imagine prices in an Nvidia-only marketplace? :eek:

AbnRanger
01-08-2009, 06:19 AM
Isn't Nvidia's answer to the 4870 X2, the GTX 295, coming out around now? I wonder what, if anything, that and the 285 will do to the price of existing cards.

I saw that the GTX 280 was going for £225 a few days ago at one retailer; can't see that being the norm, but at that price I probably would have chosen it over the 4870 if I were in the market for a card and had £225 to spend.

If Nvidia do shuffle their prices, I hope ATI don't get caught with their pants down. I remember before the 48xx series were released last year I didn't see anyone recommending an ATI card, and things could go that way very quickly again. And I guess rightly so if the price/performance is off, which it certainly was for the 38xx if you were a gamer who liked to add a bit of AA.

It's dangerous enough having only two providers for such a major product, but can you imagine prices in an Nvidia-only marketplace? :eek:
It's my understanding that the X2 cards (dual GPU on a single card) don't benefit programs like LW, since they are essentially a Crossfire or SLI setup...and content creation programs like LW, Max and Maya can't make use of the second GPU.

IMI
01-08-2009, 06:51 AM
It's my understanding that the X2 cards (dual GPU on a single card) don't benefit programs like LW, since they are essentially a Crossfire or SLI setup...and content creation programs like LW, Max and Maya can't make use of the second GPU.

Seems kinda strange...
On one hand, you have the game manufacturers treating us as second-class citizens after the consoles...
And then you have the GPU manufacturers first making SLI and Crossfire technology, then actual dual-GPU cards our apps can't fully use...

Maybe we should just ask for LW to be ported to Playstation or Xbox. :D

Oedo 808
01-08-2009, 11:42 PM
It's my understanding that the X2 cards (dual GPU on a single card) don't benefit programs like LW, since they are essentially a Crossfire or SLI setup...and content creation programs like LW, Max and Maya can't make use of the second GPU.

You could well be correct; I have no idea on that one. I was thinking more about whether the new cards would herald the next step on the ATI or Nvidia pricing roadmap; probably not, though. From what I've read, I kinda thought that LightWave would run out of steam before an X2 would, even with one GPU disabled, or not far off. Or is that completely false?

Jarno
01-09-2009, 01:57 AM
All the viewport graphics is done through OpenGL. Where possible, models are placed on the graphics card and rendered efficiently (note the "Where possible"). If GLSL shaders are enabled, the texturing and lighting is done using GLSL. There isn't much CPU-based drawing involved (in contrast to mesh deformation and animation, which is CPU-heavy in LW).

Switching to DX does not magically make your graphics look prettier. Implementing prettier graphics makes your graphics look prettier.

For what we need in LW, the capabilities of OpenGL and DX are equivalent. They are just different APIs on top of the same hardware. OpenGL has the advantage: it is cross-platform, there is vastly more OpenGL expertise in the developer team than there is in DX, and it's not like we have 10 people available to maintain two different graphics systems full-time.

---JvdL---
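
For readers wondering what "placed on the graphics card and rendered efficiently" looks like in practice, here is a rough, generic sketch of that pattern, not LightWave's actual code (plain OpenGL 2.x via GLEW; the struct, function names and vertex layout are made up for illustration). The mesh is uploaded to buffer objects once, and each viewport redraw only rebinds them and issues a draw call, with a GLSL program doing the texturing/lighting:

// Sketch of "upload once, redraw cheaply": the mesh lives in GPU memory
// (vertex/index buffer objects) and a GLSL program shades it, so a redraw
// does no per-vertex CPU work. Assumes a valid OpenGL 2.x context and an
// initialised GLEW.
#include <GL/glew.h>
#include <vector>

struct GpuMesh
{
    GLuint vbo;          // vertex positions on the card
    GLuint ibo;          // triangle indices on the card
    GLsizei indexCount;
};

// Done once (or whenever the mesh actually changes): copy the data
// into graphics card memory.
GpuMesh uploadMesh(const std::vector<float>& positions,        // x,y,z triples
                   const std::vector<unsigned int>& indices)   // 3 per triangle
{
    GpuMesh m;
    glGenBuffers(1, &m.vbo);
    glBindBuffer(GL_ARRAY_BUFFER, m.vbo);
    glBufferData(GL_ARRAY_BUFFER, positions.size() * sizeof(float),
                 positions.data(), GL_STATIC_DRAW);

    glGenBuffers(1, &m.ibo);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, m.ibo);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, indices.size() * sizeof(unsigned int),
                 indices.data(), GL_STATIC_DRAW);

    m.indexCount = (GLsizei)indices.size();
    return m;
}

// Done every redraw: just bind and draw, shading handled by the GLSL program.
void drawMesh(const GpuMesh& m, GLuint glslProgram)
{
    glUseProgram(glslProgram);

    glBindBuffer(GL_ARRAY_BUFFER, m.vbo);
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, 0);

    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, m.ibo);
    glDrawElements(GL_TRIANGLES, m.indexCount, GL_UNSIGNED_INT, 0);

    glDisableClientState(GL_VERTEX_ARRAY);
    glUseProgram(0);
}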

blueshift
01-09-2009, 06:49 AM
So, we Windows users are being held back because of LightWave support for the Mac? (Going off the assumption that DX would be better than OGL.)
Is it possible we could support both? I remember when I used Animation Master that you could choose either DX or OGL from your display options.

My thoughts exactly. I had nothing but a good experience with the ATI card in my previous machine, but my new machine, an AMD quad-core, came with a low-end Nvidia 8500 GT. I'd like to upgrade to a good ATI card and be able to run LW using the best possible GPU config.

Paul Brunson
01-09-2009, 05:16 PM
All the viewport graphics is done through OpenGL. Where possible, models are placed on the graphics card and rendered efficiently (note the "Where possible")

Think you could shed a little more light on the "where possible" part, Jarno? You've got me thinking: my current GeForce 7800 GTX has 256 MB of graphics RAM. If I had 1 GB of RAM from something like the GeForce 280, would that allow more/larger models to be placed on the graphics card and be accelerated?

Oedo 808
01-09-2009, 07:29 PM
Think you could shed a little more light on the "where possible" part, Jarno? You've got me thinking: my current GeForce 7800 GTX has 256 MB of graphics RAM. If I had 1 GB of RAM from something like the GeForce 280, would that allow more/larger models to be placed on the graphics card and be accelerated?

I'd be interested in the answer to this too. I'm sure I've read people talking about some apps like LightWave being limited internally; an old wives' tale?

Jarno
01-09-2009, 07:39 PM
The amount of memory on the card isn't much of an issue. Even 256MB should be able to accommodate millions of polygons.
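
As a rough back-of-envelope (the per-vertex figures here are illustrative assumptions, not LW's actual storage format): with 32 bytes per vertex for position, normal and one UV, plus 12 bytes of index data per triangle,

2,000,000 vertices x 32 bytes ≈ 64 MB
2,000,000 triangles x 12 bytes ≈ 24 MB

roughly 90 MB in total, which fits comfortably in 256 MB of video memory.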

There are some conditions which currently require the mesh to be regenerated or reshaped on the CPU whenever something changes. For example, you could have a nodal displacement which depends on the object's position, or on the position of another item. Such dependencies are not always tracked with sufficient accuracy to maximize the reuse of the mesh on the gfx card.

In order to ensure that the mesh you see is up to date, we sometimes have to be quite pessimistic about the mesh's constancy. That requires that the mesh be remade on the gfx card when something changes in the scene that *may* affect the mesh, in contrast to *will* affect the mesh.

---JvdL---
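
A toy sketch of that conservative, "may affect" style of invalidation, purely for illustration and not LightWave's internals (all names are made up; assume the expensive step is re-evaluating the mesh on the CPU and re-uploading it to the card):

// Conservative cache invalidation: if we cannot prove a scene change leaves
// the mesh alone, the cached GPU copy is treated as stale and remade.
#include <set>
#include <string>

struct CachedMesh
{
    bool dirty;                       // does the GPU copy need remaking?
    std::set<std::string> dependsOn;  // items whose changes *may* affect this mesh
};

// Called whenever any item in the scene changes.
void onSceneItemChanged(CachedMesh& mesh, const std::string& changedItem,
                        bool dependencyTrackingIsExact)
{
    if (!dependencyTrackingIsExact)
    {
        // Pessimistic path: we don't know enough about the dependencies,
        // so any change invalidates the cached mesh.
        mesh.dirty = true;
        return;
    }
    // Exact path: only invalidate when a known dependency changed.
    if (mesh.dependsOn.count(changedItem))
        mesh.dirty = true;
}

// Called before drawing a frame.
void ensureUpToDate(CachedMesh& mesh)
{
    if (mesh.dirty)
    {
        // Re-evaluate displacements on the CPU and re-upload to the card.
        // This is the expensive step, which is why pessimism hurts interactivity.
        mesh.dirty = false;
    }
}

The more precisely the dependencies are tracked, the more often the cheap path is taken and the mesh can stay on the card between frames.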

RedBull
01-09-2009, 08:38 PM
LW is a multiplatform application; as DX is limited to Windows, that really prevents us from using it.

It didn't once upon a time: LW 5.5 introduced Direct3D 3.0 support way back when (way before Max had a selectable option, LW did), and unfortunately the LW 6.0 ground-up rebuild left many good innovations lost forever, D3D support being one of them.



Switching to DX does not magically make your graphics look prettier.

It certainly won't hurt. It does seem to me that DX has some nicer and fancier options easily exposed that OGL often does not. The gap is certainly wider than it once was.


For what we need in LW, the capabilities of OpenGL and DX are equivalent. They are just different APIs on top of the same hardware.

Except the hardware is optimized for DX/D3D at a driver level, and this is increasingly going to make the difference. DX is utilizing more modern approaches these days than OGL in general, and the drivers reflect this.


OpenGL has the advantage: it is cross-platform, there is vastly more OpenGL expertise in the developer team than there is in DX, and it's not like we have 10 people available to maintain two different graphics systems full-time.

---JvdL---

As mentioned, D3D was added alongside OGL back in the 5.x series, but yes, it makes much more sense to stay with OGL for NT/LW. But I believe much of the 3D industry will move to DX if they aren't looking at the Mac market in the future. It's where Nvidia and Microsoft want us to go, and who is going to argue with them? I also believe the Khronos group have not helped OGL at all since taking the reins, unfortunately, nor have Linux and Apple advocates pressured the industry enough to keep it moving as quickly as the competition. Time will tell.

AbnRanger
01-10-2009, 03:18 AM
Here's a little nugget on the graphic card front:
http://www.fudzilla.com/index.php?option=com_content&task=view&id=9809&Itemid=34

If you own a newer-model ATI (HD 3000 or 4000 series), they released a driver last month (Catalyst 8.12), which also comes with a free video encoder (NVidia makes you pay $30 for its version, Badaboom) that utilizes the new stream-processing technology...similar to NVidia's CUDA.

Photoshop CS4 acceleration works on ATI
Written by Fuad Abazovic
Tuesday, 07 October 2008 10:19



Premiere CS4, as well

We spoke with AMD about ATI's potential ability to accelerate Photoshop CS4 and Premiere CS4 effects on a GPU and we were right about that lead.

ATI has confirmed that Radeon HD 4000 and HD 3000 series will help accelerate many things in Adobe's Creative Suite 4 pack.

ATI Radeon HD 4000 will accelerate Premiere, and in Nvidia's case you need an ultra-expensive Quadro for that. This makes a big difference, and ATI told Fudzilla that the HD 3000 series will help accelerate video transcoding, leading to faster video editing; 3D effects and transitions, something you use regularly in video editing, are also accelerated on the GPU. The ATI HD 4000 series might definitely sell well in this video editing market, as you pay a fraction of the price of a Quadro and still get the acceleration. It will be fun to compare Nvidia vs. ATI on this particular application set.

Photoshop CS4 extended is also accelerated on both Radeon HD 4000 and 3000 series. Both cards will help accelerate image and 3D model previewing including panning, zooming and rotation. ATI can accelerate 3D manipulations on photos (map an image onto a 3D object such as a sphere, for example).

The less widespread, professionally oriented After Effects CS4 is also heavily accelerated on HD 4000 and HD 3000 cards. ATI's GPU will help accelerate 3D effects in films, video and DVD editing in this popular video post-processing package.

We were quite surprised to hear all this, as ATI didn't make a lot of noise about it, and we've heard that even Photoshop CS3 gets accelerated by some ATI cards. After all, it looks like AMD has a very strong relationship with Adobe, but this was something that AMD failed to pass through.

CS4 acceleration on a new Quadro CX (basically what a cheap ATI Radeon 4000 series card gives you)
http://www.nvidia.com/object/builtforadobepros.html

How to use the ATI Stream-accelerated (free) converter (ATI Stream is ATI's version of CUDA):
http://support.ati.com/ics/support/default.asp?deptID=894&task=knowledge&questionID=21793

IMI
01-10-2009, 07:24 AM
Speaking of video cards and drivers, Nvidia just released driver version 181.20 on January 8th, if anyone is interested.