new graphics chips make games seem real



LW3D
06-19-2008, 04:21 AM
"First, the model number of the new family of graphics card, which AMD code-named the RV770. AMD says it will sell two models starting June 25: The ATI Radeon HD 4850 will cost $200 and the more powerful ATI Radeon HD 4870 will cost $300. AMD's Cinema 2.0 site has a few more details."

Check these pictures...

http://farm4.static.flickr.com/3085/2585849022_2dc42296fe_b.jpg

http://farm4.static.flickr.com/3148/2585016023_341219cf32_b.jpg

and these videos

http://www.amd.com/us-en/assets/content_type/DigitalMedia/AMD_Ruby_S04.swf

http://www.gametrailers.com/player/usermovies/231380.html

Mitja
06-19-2008, 05:56 AM
Nice, but I have the feeling that NVidia is always a step ahead of ATI/AMD.

jameswillmott
06-19-2008, 06:20 AM
Who cares how good it looks; it's how it behaves that will make it feel truly real. :)

COBRASoft
06-19-2008, 06:43 AM
I'll soon know more or less where NVidia stands, since I just ordered my 9800GTX :)

sammael
06-19-2008, 07:37 AM
Considering the performance/price ratio, I find these new ATI cards quite interesting. I always preferred ATI stuff until Nvidia started blowing them out of the water.

COBRASoft
06-19-2008, 08:57 AM
Neverko: I only have PCI Express (v1, not v2). I will use it mainly for LightWave (3D) and paint programs (mostly 2D). So, no real reason to buy the top of the top. I will buy this 9800 for less than 195 (without taxes)... I wanted an 8800GTX, but they don't seem to be available anymore around here.

LW3D
06-19-2008, 09:57 AM
Actually, I think these cards are different and there is something new... if you watch the second video I posted (http://www.gametrailers.com/player/usermovies/231380.html), a guy talks about a voxel-based renderer. I think it is different... a voxel-based renderer could make realtime raytracing possible...

http://www.youtube.com/watch?v=_PCEN-RG2lk
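
To make "raytracing voxels" concrete: the basic idea is stepping a ray cell by cell through a 3D grid until it hits a filled voxel (the textbook 3D-DDA traversal). Below is a toy sketch of my own, written in CUDA only to show how the idea maps to the GPU; it has nothing to do with AMD's actual renderer, and the grid size and test scene are made up.

// Toy 3D-DDA voxel ray march: step a ray cell by cell through a grid
// until it hits a filled voxel. My own sketch of the general idea.
#include <cuda_runtime.h>
#include <math.h>
#include <cstdio>

#define GRID 256   // made-up grid resolution

__host__ __device__ int traceVoxelRay(const unsigned char* voxels,
                                      float3 o, float3 d)  // origin, unit direction
{
    int x = (int)o.x, y = (int)o.y, z = (int)o.z;
    int sx = d.x >= 0 ? 1 : -1, sy = d.y >= 0 ? 1 : -1, sz = d.z >= 0 ? 1 : -1;

    // Ray-parameter distance between successive grid planes on each axis.
    float tdx = fabsf(1.0f / d.x), tdy = fabsf(1.0f / d.y), tdz = fabsf(1.0f / d.z);

    // Ray-parameter distance to the first grid plane on each axis.
    float tx = (sx > 0 ? (x + 1 - o.x) : (o.x - x)) * tdx;
    float ty = (sy > 0 ? (y + 1 - o.y) : (o.y - y)) * tdy;
    float tz = (sz > 0 ? (z + 1 - o.z) : (o.z - z)) * tdz;

    while (x >= 0 && x < GRID && y >= 0 && y < GRID && z >= 0 && z < GRID) {
        if (voxels[(z * GRID + y) * GRID + x])      // hit a filled cell
            return (z * GRID + y) * GRID + x;
        if (tx < ty && tx < tz) { x += sx; tx += tdx; }   // advance along x
        else if (ty < tz)       { y += sy; ty += tdy; }   // advance along y
        else                    { z += sz; tz += tdz; }   // advance along z
    }
    return -1;   // ray left the grid without hitting anything
}

int main()
{
    static unsigned char voxels[GRID * GRID * GRID];   // ~16 MB, zeroed
    voxels[(200 * GRID + 128) * GRID + 128] = 1;       // one filled voxel
    float3 o = make_float3(128.5f, 128.5f, 0.5f);
    float3 d = make_float3(0.0f, 0.0f, 1.0f);          // shoot straight along +z
    printf("hit cell index: %d\n", traceVoxelRay(voxels, o, d));
    return 0;
}

On the GPU you would launch one thread per screen pixel and let each thread trace its own ray; that's the sense in which this kind of renderer is "embarrassingly parallel".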

Andrewstopheles
06-19-2008, 10:25 AM
The 280 is especially potent at large resolutions with AA enabled; its performance increases almost exponentially compared to older tech.
It's a must have in my next system!
If Newtek works with NVIDIA to enable the LW renderer to use CUDA, our render times could get a very big boost.
The future's so bright, I gotta wear shades.
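
To spell out what CUDA offloading would look like: instead of the CPU looping over pixels, the renderer launches one GPU thread per pixel and copies the framebuffer back when done. This is a toy sketch of the idea only; the shadePixel() stand-in is made up, and it's nothing like NewTek's actual code.

// Minimal CUDA sketch of GPU-offloaded rendering (hypothetical, not LW's API).
#include <cuda_runtime.h>
#include <cstdio>

__device__ float3 shadePixel(int x, int y, int w, int h)
{
    // Stand-in for real shading work (ray casting, texturing, ...).
    return make_float3((float)x / w, (float)y / h, 0.5f);
}

__global__ void renderKernel(float3* framebuffer, int w, int h)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= w || y >= h) return;                 // guard partial blocks
    framebuffer[y * w + x] = shadePixel(x, y, w, h);
}

int main()
{
    const int w = 1920, h = 1080;
    float3* fb;
    cudaMalloc(&fb, w * h * sizeof(float3));

    dim3 block(16, 16);
    dim3 grid((w + block.x - 1) / block.x, (h + block.y - 1) / block.y);
    renderKernel<<<grid, block>>>(fb, w, h);      // millions of pixels in parallel
    cudaDeviceSynchronize();

    // A real renderer would cudaMemcpy the framebuffer back to host RAM here.
    cudaFree(fb);
    printf("rendered %dx%d on the GPU\n", w, h);
    return 0;
}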

The Dommo
06-19-2008, 11:25 AM
I've got a 7950GT in my home machine and a Quadro FX 5500 here at the studio. Both are awesome, especially the Quadro.

But if LW could use CUDA tech, yes, that would be just STONKING!! :D

Mitja
06-20-2008, 03:23 AM
I found this very interesting, so I'll share it here.
[attachment 60031]
Considering the price difference between the 4850 and the 9800, ATI would be my choice. Now.

SP00
06-20-2008, 09:08 AM
I found this very interesting, so I'll share it here.
[attachment 60031]
Considering the price difference between the 4850 and the 9800, ATI would be my choice. Now.

Yeah, that is basically what I have been seeing. The ATI card gives you 9800 performance for about $50-$100 less.

COBRASoft
06-20-2008, 11:15 AM
Maybe the ATI card gives the same performance, but what about drivers for Vista x64? Have the problems been solved already for both Nvidia and ATI? I have a 6800 Ultra right now, and the drivers work fine on my Server 2008 x64 :)

LW3D
06-23-2008, 02:12 AM
I think all of you missed the point...

No polygons are used in that video!! They are voxels...

Hopper
06-23-2008, 05:19 PM
That's last generation tech... the GTX 280 is where it's at now. :)
Not out yet, is it? That blows... I just started using my 9800GX2 and it's outdated already... go figure. I don't think LW's gonna care much though... :)

LW3D
06-30-2008, 06:50 AM
Update... This topic isn't about performance... it's about something new in the realtime world...

http://www.tgdaily.com/content/view/38145/135/

praa
07-06-2008, 07:38 PM
http://uk.youtube.com/watch?v=Bz7AukqqaDQ

He is going to be bought by... Autodesk :(

IMI
07-06-2008, 08:00 PM
I will probably jump on the GTX 280 once the price drops after the initial wave of sales. Maybe I'll even wait for the inevitable core upgrade, like the G80 to G92 die shrink.


You know, I've been thinking about that too. I'm building a new PC this summer and trying to decide what route to take, GPU wise.
I really don't understand it, but I can't find any reviews saying the 9800 or the GTX 280 is faster or better than the 8800 GTS. They all seem to be saying the 8800 GTS is still a better card.
Maybe as newer drivers come out for the 9800s and the GTXs that will change. I guess we just wait and see, but my current plan is to get a mobo with both PCI Express x16 and PCI Express 2.0 x16 slots and another 8800 GTS.

RTSchramm
07-06-2008, 08:36 PM
The Radeon 4800 series supports a 100% ray-traced solution in real time. No other consumer card does this at this time.

There are pros and cons to this.

I imagine the 3D application would have to incorporate the Radeon ray-tracing driver, resulting in a rewrite of the rendering module. I would also bet that the Radeon's ray-traced solution cuts a lot of corners and is not as accurate as LW's renderer, which means it's most likely used to develop cut scenes for games instead of production CGI.

Nvidia's solution, CUDA, uses the GPU to render like ATI's does, but its renderer is not realtime. CUDA is supposed to be as accurate as any current renderer on the market, and it is supposed to be orders of magnitude quicker than using the motherboard's CPU.

I have used both ATI and Nvidia cards, and I believe that Nvidia has the edge when it comes to stable drivers.

I would also like to point out to any newbies reading this thread that LW cannot use any of the new ATI or Nvidia technologies at this time. I actually dumped my Quadro FX 4400 in favor of a Geforce 8800 GTX because the 8800 GTX was quicker than the Quadro when manipulating objects in LW's GUI. Having a Quadro or Geforce had no effect on actual rendering speed.

IF YOU ARE A NEWBIE KEEP THIS IN MIND:

When rendering, LW doesn't use the video card; it uses the main motherboard processor. So if you want fast render times, you need to buy a computer with the fastest processor you can afford and a lot of memory. Note that the fastest processor doesn't necessarily equate to a higher GHz rating: a 2.0 GHz Intel Core 2 processor will blow the doors off a 4 GHz Pentium Extreme.

Rich

IMI
07-07-2008, 12:14 PM
I would also like to point out to any newbies reading this thread that LW cannot use any of the new ATI or Nvidia technologies at this time. I actually dumped my Quadro FX 4400 in favor of a Geforce 8800 GTX because the 8800 GTX was quicker than the Quadro when manipulating objects in LW's GUI. Having a Quadro or Geforce had no effect on actual rendering speed.



I could have sworn there was a time LW's System Requirements (http://www.newtek.com/lightwave/system_requirements/) specified a Quadro. Maybe it's somewhere else on their LW pages, unless they changed it, but I just know they did at one point not too long ago.
Which always surprised me, because it seems the vast majority of people through the years have been using regular old multi-purpose graphics cards, and I've read many cases of people saying what you did here - that the Quadros aren't any kind of real improvement.
Can someone tell me, then, why the Quadros are so ridiculously expensive? For example, the NVIDIA Quadro FX 3700 is $900.00 or so. What do the Quadros do that the high-end gaming cards can't when it comes to simply using an app like LW: modeling, setting up scenes, and so on?
I found a Quadro FX 4400 page with a Flash animation showing the card in a real-time comparison with an obviously weak card, although there was no mention of what card they used for the comparison. Looking at it, I'd guess it was nothing more powerful than an AGP Nvidia GeForce 6600 or equivalent.

Well, they have the Official Autodesk Stamp of Approval, so I guess that alone is worth $500 to $1000 more than, say, an 8800 Ultra.
Is that what it is, just a scam? Because they call them workstation cards and sell them to large studios?

RTSchramm
07-07-2008, 12:58 PM
The Quadros support high-end features that non-dedicated OpenGL cards do not. I believe the OpenGL language is hard-coded into the chip on the Quadro as well, but I could be wrong on this. Quadros are also designed to handle larger screen resolutions, various types of clipping, and hardware lighting that Geforce cards do not support.

The only 3D application I know of that is supposed to require a Quadro is Maya, but if you go to Autodesk's Area forum, there are a lot of users running Maya on a Geforce card. I did have to turn off 3D threaded optimization to get Maya to work correctly.

Anyway, as far as speed is concerned, at the same resolutions I have experienced no difference between the Quadro and a comparable Geforce.

And, yes, I believe the Quadros are way overpriced.

Rich

IMI
07-07-2008, 02:19 PM
Thanks for the explanation.

I've not run into a situation my 8800 GTS couldn't handle. Once you get something going like LW with FPrime, or modo with its own realtime render preview, I can't see any real reason for the more advanced features. And since those features are just for working and not rendering, I don't see not having a Quadro as any kind of liability at all.

Red_Oddity
07-07-2008, 02:50 PM
http://uk.youtube.com/watch?v=Bz7AukqqaDQ

He is going to be bought by... Autodesk :(

Who, Jules Urbach? And what will he bring to the table (besides OTOY)? I wonder what Autodesk will use that concept for, as they make more money 'selling' their product range on an annual basis the way they have been doing for years.

I guess Jules Urbach programmed/managed the compression and handling of the voxel sets to keep it manageable as a demo.

Cageman
07-08-2008, 02:58 AM
There's nothing really new there. It's the same pre-baked stuff as always. Just a little better... like every time they announce photo realism.

Nothing new, you say?

What about "We are not using any polygons, each pixel has its own depth and can essentially be rendered as voxels"?

Sure, there have been voxel-based engines before, but not at that level.

Mitja
07-08-2008, 03:41 AM
Anyway, speaking about Geforce vs Quadro...
Both are based on the same architecture; the only thing that changes is the drivers. I found a guide some time ago on how to mod your Geforce to become the equivalent Quadro.
The same goes for ATI, from the X series to FireGL.
You have to edit the drivers to enable the "advanced" functions.
I tried to mod my Mobility Radeon X600 myself, with no success though (since the mod for my card wasn't explained in detail).

LW3D
07-08-2008, 04:34 AM
Yeh, voxels = not new.

New stuff looks better and hardware is getting better, but the concept isn't new. My Amiga did voxel based graphics :p

I don't dispute the potential here and it looks nice. But I'd like to see an actual working implementation in a commercial product before getting my proverbial panties wet.

I've watched too many graphics card demos in my life not to have a healthy dose of scepticism concerning anything in those demos.

Yes, everybody knows what a voxel is... We have HyperVoxels :) We actually played voxel-based games (Comanche, etc.).

But this time the new generation of GPUs supports voxels and we have realtime raytracing. I don't think this is old. This will change realtime graphics...
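
For anyone curious how those old games actually did it: Comanche-style "voxel space" terrain is a per-column heightmap raycast. Here's a rough toy version of my own (map size and parameters made up), written as a CUDA kernel just to show how naturally it maps to one GPU thread per screen column:

// Toy Comanche-style "voxel space" terrain renderer, one thread per column.
#include <cuda_runtime.h>

__global__ void voxelColumns(const unsigned char* heightMap,  // 1024x1024 heights
                             const uchar3* colorMap,           // matching colors
                             uchar3* screen, int screenW, int screenH,
                             float camX, float camY, float camZ,
                             float horizon, float scale)
{
    int col = blockIdx.x * blockDim.x + threadIdx.x;
    if (col >= screenW) return;

    // Direction of this column's ray across the map (camera looking along +Y,
    // no rotation, to keep the sketch short).
    float dx = (col - screenW * 0.5f) / (screenW * 0.5f);
    int yTop = screenH;                       // lowest screen row not yet drawn

    // March front to back; nearer terrain occludes farther terrain.
    for (float dist = 1.0f; dist < 600.0f; dist += 1.0f) {
        int mx = ((int)(camX + dx * dist)) & 1023;   // wrap around the map
        int my = ((int)(camY + dist)) & 1023;
        float terrain = heightMap[my * 1024 + mx];

        // Perspective-project this terrain height to a screen row.
        int row = (int)((camZ - terrain) / dist * scale + horizon);
        if (row < 0) row = 0;

        // Fill the newly visible vertical span of this column.
        for (int y = row; y < yTop; ++y)
            screen[y * screenW + col] = colorMap[my * 1024 + mx];
        if (row < yTop) yTop = row;
        if (yTop == 0) break;                 // column fully covered
    }
}

You would launch it with one thread per column, e.g. voxelColumns<<<(screenW + 255) / 256, 256>>>(...). The 1992-era games did exactly this on the CPU with a heightmap and a color map; the "new" part in 2008 is true 3D voxel data plus enough GPU horsepower to trace a ray per pixel.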

Cageman
07-08-2008, 10:49 AM
Yeh, voxels = not new.

New stuff looks better and hardware is getting better, but the concept isn't new. My Amiga did voxel based graphics :p

Sure it did... but not at this level. No way... BTW, imagine this kind of speed in a 3D app (let's say... LW's HyperVoxels).

Huge potential outside the game world as well.

Cageman
07-08-2008, 10:55 AM
I've watched too many graphics card demos in my life not to have a healthy dose of scepticism concerning anything in those demos.

Yeah... so have I... and they seem to deliver. Playing Crysis on anything lower than a GeForce 8800GTS isn't going to impress anyone, but once you have such a card (or better) the game actually delivers very well.

New GFX tech takes a lot of time to reach the "normal" crowd. Many times those demos run on GFX-card tech that will be released 1-3 years later, sometimes even on unfinished hardware...

warmiak
07-08-2008, 03:51 PM
I can count at least 3 video card demos doing realtime -fake- SSS on somewhat high-poly models (for their time) in very specific, optimized scenes, and yet I have seen no games ever do this kind of shading. Not even now that cards are several times faster than they were when those older SSS demos ran.

Note that I'm not denying the ATI demo is nice. I'm simply not easily impressed or quick to get excited about these things.

Wake me up when you see a real-world implementation of this, be it a game or a 3D viewport in one of the major 3D applications.

The clock is ticking from the date that demo was shown. Revive this thread from its tomb once there's something actually voxel-rendered out there.

As far as fake but real-time SSS in a game goes - check out Crysis.
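
And to be clear what I mean by -fake- SSS: those demos typically use wrap lighting plus a baked thickness term, with no actual light transport. A generic toy sketch of my own (not any demo's actual shader, and the constants are arbitrary):

// Fake subsurface scattering: wrap diffuse + thickness-based transmission.
#include <cuda_runtime.h>
#include <math.h>
#include <cstdio>

__host__ __device__ float3 fakeSSS(float3 N, float3 L, float3 skinColor,
                                   float wrap,       // 0 = plain Lambert, ~0.5 = soft
                                   float thickness)  // baked per vertex or texel
{
    float NdotL = N.x * L.x + N.y * L.y + N.z * L.z;

    // Wrap lighting: light "bleeds" past the terminator instead of cutting off.
    float diffuse = fmaxf((NdotL + wrap) / (1.0f + wrap), 0.0f);

    // Thin regions (ears, nostrils) transmit extra reddish light.
    float transmit = (1.0f - thickness) * fmaxf(-NdotL, 0.0f);

    return make_float3(skinColor.x * (diffuse + transmit * 1.2f),
                       skinColor.y * (diffuse + transmit * 0.4f),
                       skinColor.z * (diffuse + transmit * 0.3f));
}

int main()
{
    float3 c = fakeSSS(make_float3(0, 0, 1), make_float3(0, 0, 1),
                       make_float3(0.9f, 0.6f, 0.5f), 0.5f, 0.8f);
    printf("lit color: %.2f %.2f %.2f\n", c.x, c.y, c.z);
    return 0;
}

It looks surprisingly convincing on a face in a controlled demo scene, and that's exactly my point: it's a cheap trick tuned for one model, not a general solution.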

RTSchramm
07-08-2008, 09:07 PM
The problem with the ATI Ruby demos that I have looked at so far is that all of the vertex locations are pre-calculated and placed in a huge table that is optimized for that particular scene. By doing this they free up the GPU and CPU for other graphically intensive eye candy.

If you look at the videos, the ATI demo had over hundreds of millions of data points for the voxels. No normal computer could handle that much data in real time unless the data was precalculated and placed in a table ahead of time.

If it seems too good to be true, it most likely is.
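
To illustrate what I mean by baking: you run the expensive computation once, offline, into a big table, and the "real-time" part is just an indexed read per point per frame. A trivial sketch of my own (the sine-wave "animation" is a stand-in; ATI's actual data layout is unknown):

// Toy pre-baking: positions are computed offline into a table, so
// per-frame "simulation" at runtime is nothing but a lookup.
#include <cuda_runtime.h>
#include <vector>
#include <cmath>

__global__ void drawFrame(const float3* bakedTable, int numPoints,
                          int frame, float3* out)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= numPoints) return;
    out[i] = bakedTable[frame * numPoints + i];   // one fetch, zero simulation
}

int main()
{
    const int numPoints = 10000, numFrames = 600;

    // "Offline" bake: do the expensive work once, up front.
    std::vector<float3> table(size_t(numFrames) * numPoints);
    for (int f = 0; f < numFrames; ++f)
        for (int i = 0; i < numPoints; ++i)
            table[size_t(f) * numPoints + i] =
                make_float3(sinf(0.01f * f + i), cosf(0.01f * f + i), 0.0f);

    float3 *dTable, *dOut;
    cudaMalloc(&dTable, table.size() * sizeof(float3));
    cudaMalloc(&dOut, numPoints * sizeof(float3));
    cudaMemcpy(dTable, table.data(), table.size() * sizeof(float3),
               cudaMemcpyHostToDevice);

    // "Real-time" playback: each frame is just an indexed read.
    for (int f = 0; f < numFrames; ++f)
        drawFrame<<<(numPoints + 255) / 256, 256>>>(dTable, numPoints, f, dOut);

    cudaDeviceSynchronize();
    cudaFree(dTable);
    cudaFree(dOut);
    return 0;
}

The catch is the table size: hundreds of millions of points times hundreds of frames won't fit anywhere, which is why these demos are tuned to one scene.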

Rich

Cageman
07-09-2008, 03:11 AM
Yeah...

But they also talked about interactions, so my guess is that they aren't there yet, but eventually will be.

sammael
07-09-2008, 09:25 PM
Here's an interesting take on the price/performance ratio of the new ATI and Nvidia cards, for anyone who is looking at a new card: http://www.tomshardware.com/reviews/radeon-hd-4870,1964-18.html