
Time to throw away the chips and just use GPUs?



gerry_g
05-29-2008, 05:20 PM
Like the dude says, a single GPU can be forty times faster than your CPU.

http://www.youtube.com/watch?v=DnIvodB2RzU

COBRASoft
05-29-2008, 05:37 PM
Hehe, and it is from Antwerp, Belgium :)

I didn't even know we could already buy the 9800 here...

AbnRanger
05-29-2008, 05:59 PM
From an article I read a while back, it seems that ATI and NVidia saw the handwriting on the wall with multi-core processors... meaning Intel and/or AMD would soon designate one or more cores strictly for graphics and essentially do away with graphics cards altogether. So the card makers' attempt to remain relevant is to beat the CPU makers to the punch. I think this is where AMD was headed with its acquisition of ATI... to integrate the two... and that is probably how they plan to eventually stick it to Intel. AMD execs said that the next generation of AMD processor architecture would be nothing like its current Barcelona.
Sounds like they may have something dandy up their sleeve after all. For competition's sake... let's hope so.
Can you imagine how much faster LW renderings would be with 2-4 GPUs running the show? Might not need FPrime much.

This is beautiful news for Autodesk haters too, because their cash-cow systems, Flame, Flint, Inferno, Smoke, etc., won't be needed anymore. They only exist to provide real-time compositing, but multi-GPU setups like this (CUDA-type configs) would make them obsolete.

Hopper
05-29-2008, 06:15 PM
It's funny this should come back in this day and age.

This isn't anything new, just new for this generation of systems. For the longest time, GPUs (a la RAMDAC + DMA controller) were well known for being able to process calculations at a much higher rate than the system's CPU, because the graphics subsystem didn't have the instruction-set overhead that the CPU had. It looks like we're moving back in that direction once more.

Way back when, you used to be able to have your CGA, EGA, or VGA graphics card and add a second monochrome card, because it used the B000 address space that the color system never touched. You then forced all your heavy calcs through DOS interrupt 10h and on to the monochrome card, because it beat the snot out of the wimpy 25-33 MHz CPUs.

Ahhhh.. the good ol' days... back when you actually had to know what you were doing to assemble and configure systems.

Lightwolf
05-29-2008, 06:23 PM
This isn't anything new, just new for this generation of systems. For the longest time, GPUs (a la RAMDAC + DMA controller) were well known for being able to process calculations at a much higher rate than the system's CPU, because the graphics subsystem didn't have the instruction-set overhead that the CPU had.
The big difference here is programmability. You can actually run more or less proper code on a GPU, something that wasn't possible way back then (I don't consider display lists or similar pre-compiled, non-branching and non-logical sequences as code).

However, coding for the GPU still requires completely new concepts for a renderer, both in terms of what can be done and making the most of the hardware, as well as making data fit within the very limited confines of a gfx board.
It may be programmable... but still nowhere near as general-purpose as a CPU, and it only lends itself to certain computations. Some of those make sense for rendering... others don't. A production renderer would certainly be an interesting hybrid.
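
To give a rough idea of what "more or less proper code" on a GPU means, here's a minimal CUDA-style sketch (the kernel and all names are purely illustrative assumptions, not taken from any actual renderer): each thread handles one sample and can branch on its own data, which is exactly what the old display-list days couldn't do.

// shade.cu - illustrative sketch only; compile with nvcc
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Each GPU thread shades one sample and makes its own per-thread decision.
__global__ void shade(const float* in, float* out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;                  // real branching, per thread
    float v = in[i];
    out[i] = (v > 0.5f) ? v * v : 0.0f;  // per-sample logic, not a fixed pipeline
}

int main()
{
    const int n = 1 << 20;
    size_t bytes = n * sizeof(float);

    float* h_in  = (float*)malloc(bytes);
    float* h_out = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) h_in[i] = i / (float)n;

    // The data still has to be squeezed into the board's limited memory
    // and copied across the bus - one of the confines mentioned above.
    float *d_in, *d_out;
    cudaMalloc((void**)&d_in, bytes);
    cudaMalloc((void**)&d_out, bytes);
    cudaMemcpy(d_in, h_in, bytes, cudaMemcpyHostToDevice);

    shade<<<(n + 255) / 256, 256>>>(d_in, d_out, n);   // one thread per sample
    cudaMemcpy(h_out, d_out, bytes, cudaMemcpyDeviceToHost);

    printf("out[42] = %f\n", h_out[42]);
    cudaFree(d_in); cudaFree(d_out);
    free(h_in); free(h_out);
    return 0;
}

And that's the catch: the moment the work stops being thousands of independent, data-parallel little jobs like this, the CPU is still the more general tool.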

Cheers,
Mike

Hopper
05-29-2008, 06:53 PM
The big difference here is programmability. You can actually run more or less proper code on a GPU, something that wasn't possible way back then (I don't consider display lists or similar pre-compiled, non-branching and non-logical sequences as code).
Yeah, true.

We got all our A/D conversions and cycle variations done through the DAC instead of converting voltages to float values, which sucked unless you had a math coprocessor at the time. The PIC was a mighty cool thing to play with - now you can't even access it unless you use assembler in a true DOS window. :(


** A true geek can f*ck with co-workers using debug. :)

Hopper
05-29-2008, 07:02 PM
And of course he never mentioned "and we can only use it for 5 minutes at a time because there's no cooling system and we have them stacked on top of one another like Cubans in a bass boat". :D

Really cool project, though. True geekdom. If he had talked with a lisp, I would have thought it was a joke.

AbnRanger
05-29-2008, 07:12 PM
And of course he never mentioned "and we can only use it for 5 minutes at a time because there's no cooling system and we have them stacked on top of one another like Cubans in a bass boat". :D

Really cool project, though. True geekdom. If he had talked with a lisp, I would have thought it was a joke.

I thought I saw a water cooling system attached to that.

Hopper
05-29-2008, 07:26 PM
It's very possible. I didn't see it, but then again, I wasn't looking for it. At best he could only be cooling two of the four. I've got one on my 9800 and it mounts on the side - those things are pretty sandwiched together.

I'll bet they'd do fine at nominal processing speed, but MAN are those things hot. I can't imagine four of them running at once. I'm guessing they keep the case open.