New Nvidia superfast DirectX 10 video/physics chips



DonJMyers
11-11-2006, 02:59 PM
Here are a couple of short videos showing off the new chip's interactive capabilities. The DirectX 10 waterfall and the ball of fog being yanked around inside a glass container are amazing!

http://www.nvidia.com/page/geforce8.html

Cageman
11-11-2006, 05:00 PM
Wow... impressive stuff... That waterfall thing is just amazing!

Sarford
11-11-2006, 05:06 PM
What about the chick? Does this clip mean that games can finally start using sub-D characters in-game?

T-Light
11-12-2006, 06:00 AM
Cheers Don, excellent news, been wondering when we were going to see DX10 hardware.

Anyone fancy a guess when we'll see this tech in laptops?

Verlon
11-12-2006, 06:05 AM
Probably next summer or so for laptops. Current power draw is substantial (and it's only being put into the high-end parts right now). Figure TSMC will improve, and possibly go to 65nm some time next year, which should reduce the wattage needed to run one of these beasts and cool it off some as well. Both 8800 cards are two-slot designs, meaning a beefy fan/heatsink is required. Also note that it is something like 11" long.

It may even be another year for laptops, but I imagine Nvidia will move heaven and earth to avoid that if possible.

Captain Obvious
11-12-2006, 06:18 AM
What about the chick? Does this clip mean that games can finally start using sub-D characters in-game?
It probably means you can use it to calculate the physics of boobs.

Verlon
11-12-2006, 06:29 AM
Yes, and we all have been waiting on the physics of boobs for quite some time :)

T-Light
11-12-2006, 06:45 AM
I know I have

T-Light
11-12-2006, 06:48 AM
However, if I'm interpreting Sarford's initial query correctly, we'd like to know if we can calculate the physics of sub-D boobs rather than just poly boobs, which is just soooo DX9.

DogBoy
11-12-2006, 07:21 AM
Probably next summer or so for laptops.

From all accounts it'll be in February. Nvidia is planning to release around nine more models based on the G80 tech then, so it figures that at least some of them are destined for mobile solutions.

http://www.dailytech.com/article.aspx?newsid=4891

creativecontrol
11-12-2006, 08:52 AM
Now, can we just direct some of that power into the final render in LightWave? Just imagine being able to render near-realtime boobs with the LW renderer! :hey:

creativecontrol
11-12-2006, 08:57 AM
Those Quantum Effects could be SO useful in HyperVoxels!!!

Ztreem
11-12-2006, 10:48 AM
Quite funny; soon games will be able to do better smoke and hard-body dynamics in realtime than LW can. I wonder when NewTek will realize that the dynamics engine in LW (ClothFX, HardFX, ParticleFX, SoftFX) sucks.

DonJMyers
11-12-2006, 10:58 AM
Of course games won't support the new cards for years. I just laugh when I hear about people who rush out and buy these new cards when they're expensive, no software takes advantage of them, and they don't work quite right yet.

Of course, to take advantage of such cards for physics calcs, LW (and every other app) would need to be rewritten to offload calculations to the board(s); see the rough sketch at the end of this post.

So that means we might have:

Mac
32-bit - CPU
64-bit - CPU
32-bit - GPU
64-bit - GPU
Windows XP
32-bit - CPU
64-bit - CPU
32-bit - GPU
64-bit - GPU
Vista
32-bit - CPU
64-bit - CPU
32-bit - GPU
64-bit - GPU

Talk about version management!
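
To make that concrete, here is a rough sketch (purely illustrative, not LightWave or NVIDIA code; the shader and function names are invented) of what "offloading a calculation to the board" tends to look like with current programmable shaders: the per-particle step is rewritten as a GLSL fragment shader that reads positions and velocities from textures and writes the new positions into a float render target.

/* Illustrative sketch only: a per-particle Euler step expressed as a GLSL
 * fragment shader, the usual way to push a calculation onto the GPU before
 * dedicated compute APIs.  Assumes a GL 2.0 context is current and that the
 * GL 2.0 entry points are available (e.g. Linux/Mesa; Windows would need an
 * extension loader), and that the position/velocity textures plus a float
 * FBO are set up elsewhere. */
#define GL_GLEXT_PROTOTYPES
#include <GL/gl.h>
#include <GL/glext.h>

static const char *step_fs =
    "uniform sampler2D pos_tex;  /* one particle position per texel */\n"
    "uniform sampler2D vel_tex;  /* matching particle velocities    */\n"
    "uniform float dt;\n"
    "void main() {\n"
    "    vec2 uv = gl_TexCoord[0].xy;\n"
    "    vec3 p  = texture2D(pos_tex, uv).xyz;\n"
    "    vec3 v  = texture2D(vel_tex, uv).xyz;\n"
    "    v.y    -= 9.81 * dt;                  /* gravity           */\n"
    "    gl_FragColor = vec4(p + v * dt, 1.0); /* new position out  */\n"
    "}\n";

/* Compile and link the step shader.  Drawing one full-screen quad into the
 * FBO with this program bound then advances every particle in parallel. */
GLuint build_step_program(void)
{
    GLuint fs   = glCreateShader(GL_FRAGMENT_SHADER);
    GLuint prog = glCreateProgram();

    glShaderSource(fs, 1, &step_fs, NULL);
    glCompileShader(fs);
    glAttachShader(prog, fs);
    glLinkProgram(prog);
    return prog;   /* bind with glUseProgram(prog) before drawing the quad */
}

And that shader path has to live alongside the normal CPU path, which is part of why the version matrix above balloons.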

RedBull
11-12-2006, 12:27 PM
My latest Nvidia PCI-X card is the buggiest, most useless video card ever.
Nvidia is making worse drivers than ever. (grrrrrrr)

They are cutting back OGL features and optimizations and providing DX/D3D optimizations instead. They want you to buy Quadros to use OGL and Gelato.
Microsoft and Nvidia are killing OGL for low-end use.

LightWave 5.6 once supported D3D, so I would like to see it return.
D3D is definitely faster these days; Max 9 looks good with it.
(Because we support Macs, I won't hold my breath.)

But realtime shaders are almost useless for production;
XSI, Maya, and Max all support Cg, GLSL, and D3D, and I have never used one.

JamesCurtis
11-12-2006, 12:37 PM
I tend to think that the apparent pull away from OGL is being done on purpose.

Verlon
11-12-2006, 12:50 PM
The pull away from OGL is almost certainly on purpose. Microsoft has hated OGL for years. I really wouldn't mind everything going D3D (or OGL) as long as the features kept coming.

With programmable shaders, I think you could, at least in theory, direct some of LightWave to the graphics card. That is something they have hinted at doing for years.

RedBull obviously doesn't play Oblivion. That game will cripple any other card (or two cards!) out there if you turn all the features on at a decent resolution, and it's been out for months. Besides, this gets back to the chicken-and-egg problem: gamers won't buy new cards because nothing supports them, and programmers won't write for features that aren't on cards yet. Voilà, we're all playing in 320x200 4-color graphics.

RedBull
11-12-2006, 05:09 PM
Microsoft has extra incentive: the Xbox uses DX, while Sony uses custom OGL (Macs use OGL too). Microsoft will push games to be D3D-based (as has been the case), which will in turn push developers to abandon OGL in time. DX10 and Vista are a push to do this, and Nvidia and ATI will support it fully.

I don't expect OGL to die anytime soon, just to become less attractive and slower than DX is/will be.

I'm really starting to dislike Nvidia as a company, personally. :devil:

I don't play many games anymore, but I was recently playing AoE III: The WarChiefs, which uses DX9 SM3.0, and I'll be ****ed if those realtime shaders aren't pretty (I notice plenty of graphics glitches too). But for LW, I'd rather have faster previews, or FPrime, to view non-realtime shaders in realtime.

I do like what Mental Images are up to with mental mill's programmable shaders.

LW 8.5 partially added GLSL to procedurals, but the problem remains that specific code has to be written for realtime versus rendertime; roughly the situation sketched below.
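
To show what that duplication looks like (purely illustrative, not the actual 8.5 shaders; the names are invented), here is a trivial checker procedural re-expressed as a GLSL fragment shader for the OpenGL preview. It would be compiled and linked the same way as the earlier sketch in this thread, while the render-time version of the same pattern still has to exist as ordinary plugin code.

/* Illustrative only: a checker procedural as it might look once ported to
 * GLSL for the realtime preview.  The render-time implementation of the
 * same pattern still lives in ordinary C plugin code, hence the rewrite. */
static const char *checker_fs =
    "uniform float scale;\n"
    "void main() {\n"
    "    vec2  uv = gl_TexCoord[0].xy * scale;\n"
    "    float c  = mod(floor(uv.x) + floor(uv.y), 2.0);\n"
    "    gl_FragColor = vec4(vec3(c), 1.0);\n"
    "}\n";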