View Full Version : GPU Renderer for Maya



creativecontrol
08-14-2009, 03:00 PM
http://furryball.aaa-studio.cz/

I find this very interesting. 50-300 times faster is their claim. I hope NT has been working toward this for the last several years because it's due.

cresshead
08-14-2009, 03:34 PM
So, out of the woodwork come the other GPU-enabled renderers...it's not only Vray then?
There have been rumors that Mental Ray has GPU assist...maybe we'll see that in Max 2011 or a Maya 2010 extension later in the year.

NewTek..are you seeing this?..you're KNOWN for your renderer...time to dig into GPU rendering, I think, for everyone.

GraphXs
08-14-2009, 03:35 PM
Wow, that's nice. Hey, maybe Mr. Worley can update FPrime to take advantage of the GPU???

How about it!:thumbsup::D

warmiak
08-14-2009, 03:36 PM
http://furryball.aaa-studio.cz/

I find this very interesting. 50-300 times faster is their claim. I hope NT has been working toward this for the last several years because it's due.

Well, a card like the Nvidia GTX 295 has 480 stream processors - if a task can be parallelized and is relatively branchless, these things will outrun any general-purpose CPU by orders of magnitude.
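
For illustration, here's roughly what that kind of task looks like in CUDA - just a minimal sketch, all names made up by me, nothing to do with FurryBall: one thread per pixel, no dependencies between pixels, almost no branching.

// Minimal CUDA sketch: one thread per pixel, branchless per-pixel math.
__global__ void shade(const float *in, float *out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)                       // bounds check only, no real branching
        out[i] = in[i] * 0.5f + 0.25f;
}

// Launch enough threads to cover every pixel, e.g.:
// shade<<<(n + 255) / 256, 256>>>(d_in, d_out, n);

Every one of those 480 stream processors can chew on its own slice of pixels, which is where the orders-of-magnitude speedup comes from.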

creativecontrol
08-14-2009, 03:48 PM
All of this has been well known for years, but I'm amazed at how slow software companies are to respond. Mental Ray will certainly have something soon, and I think anyone who wants to compete will have to get on it rather quickly.

If an animation studio can do it on their own surely a company that writes 3D apps for a living could.

Sensei
08-14-2009, 04:05 PM
If an animation studio can do it on their own surely a company that writes 3D apps for a living could.

A 3D app must have support for all the nodes and shaders. That's the whole problem. If you're making a 3D game, you know that all shaders will be GPU-executed shaders (which don't use ray-tracing and bouncing)..

BTW, the pictures from Mental Ray and the GPU renderer above don't match.. The hair has completely wrong shadows.. It looks like it's cheating like hell just to get speed, without taking care to mimic the original results. Which means it will be useless for final renders..
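
To show the scale of the problem - even one trivial node like Lambert diffuse must be rewritten as GPU code first.. A hypothetical CUDA sketch (names made up by me, not from any real renderer):

// Hypothetical GPU port of one shading node (Lambert diffuse).
// N and L are assumed normalized; kd is the diffuse weight.
__device__ float lambert(float3 N, float3 L, float kd)
{
    float ndotl = N.x * L.x + N.y * L.y + N.z * L.z;
    return kd * fmaxf(ndotl, 0.0f);  // clamp back-facing to zero
}

And that's the easy case.. Nodes that ray-trace or bounce light don't map onto the GPU pipeline like this at all..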

Tobian
08-14-2009, 05:33 PM
GPUs have had fast massively parallel computing engines for ages.... which were only of any use for rendering GPU things until relatively recently. OpenCL and CUDA are relatively new off the block too (and competing protocols are not going to be helpful!)

There is also the major hurdle of sharing the data stream between the CPU and the GPU: a self-contained app running only Cg on the GPU is one thing, but sharing or 'assist' solutions will be very taxing on your bus.
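
Concretely, every 'assist' step ends up looking something like this in CUDA (a rough sketch, not any particular product's code) - two trips over the bus each time:

#include <cuda_runtime.h>

// Rough sketch of one "GPU assist" step: the data crosses the bus
// twice, once each way, before the CPU can touch it again.
void assist_step(float *h_buf, int n)
{
    float *d_buf;
    size_t bytes = n * sizeof(float);
    cudaMalloc(&d_buf, bytes);
    cudaMemcpy(d_buf, h_buf, bytes, cudaMemcpyHostToDevice); // CPU -> GPU
    // ... kernels run here, on data resident in video memory ...
    cudaMemcpy(h_buf, d_buf, bytes, cudaMemcpyDeviceToHost); // GPU -> CPU
    cudaFree(d_buf);
}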

OpenCL could indeed bear interesting fruit in the long term too, because the coprocessor needn't be a graphics card - how about a Tile64 chip or something, LOL. I can certainly see why Apple would push the idea of OpenCL, as a lot of their workstations are sold to do graphics work, so it might make sense to include dedicated coprocessors as well as leveraging computing power from the GPU :)

Impressive demo though! :)

Dexter2999
08-15-2009, 03:08 AM
A 3D app must have support for all the nodes and shaders. That's the whole problem. If you're making a 3D game, you know that all shaders will be GPU-executed shaders (which don't use ray-tracing and bouncing)..

BTW, the pictures from Mental Ray and the GPU renderer above don't match.. The hair has completely wrong shadows.. It looks like it's cheating like hell just to get speed, without taking care to mimic the original results. Which means it will be useless for final renders..

I see your point is valid on one level, but this sort of GPU-intensive computing wouldn't really be a replacement for final renders so much as a replacement for OpenGL previews, wouldn't it? And OpenGL isn't really any more accurate than this demo, is it? I mean, you don't see hair at all in OpenGL, do you?

This is certainly a huge step in previz and game design if nothing else. Isn't it?

creativecontrol
08-16-2009, 11:15 AM
I think there's no doubt GPU rendering will replace CPU rendering. They both crunch the same numbers, so if the algorithms are the same, the visual results are the same.

The reason this example doesn't match is obviously that they've had to rewrite everything from scratch and try to match what has already been done. The Maya renderer and Mental Ray are not exactly open source, so they can't just port them.

All the usual excuses have been eliminated. There are C-style programming interfaces (CUDA, OpenCL), and the GPUs do double-precision math.
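
For example - purely a hypothetical sketch, not from any shipping renderer - a plain C-style CUDA kernel doing native double-precision math, which GT200-class cards already execute in hardware:

// Hypothetical sketch: a C-style CUDA kernel using native doubles
// (hardware double precision arrived with the GT200-class cards).
__global__ void axpy(double a, const double *x, double *y, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}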

Now I hope we can get on with it! Progress in 3D has been painfully slow and very incremental lately. This is a chance to make a real leap forward.

Stooch
08-16-2009, 11:25 AM
Wow, I managed to make it halfway through before the music made me leave the page. Vray's hardware renderer is far superior.