vray RT....frak!



cresshead
08-10-2009, 04:03 AM
http://www.cgarchitect.com/news/SIGGRAPH-2009-CHAOS-GROUP-GPU.shtml

5-7 frames per second on a $350 geforce game card....
makes fprime look like it's stuck in treacle! :D

it uses the GPU of a GeForce gaming card to render and is about 20 times faster than a quad-core with hyperthreading... [8 threads]
so... that's like a 20-CPU render farm [160 threads] in one computer, from a game card... uh..w..ww....wow!
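(quick sanity check on that math; the 20x is the article's figure against that 8-thread CPU, the rest is just multiplication:)

```python
# Back-of-envelope only: the ~20x speedup is the article's rough
# figure for one GeForce vs. a quad-core with hyperthreading.
cpu_threads = 4 * 2                      # quad-core x 2 (hyperthreading)
gpu_speedup = 20                         # article's claimed factor
print(f"~{gpu_speedup} machines' worth = "
      f"{gpu_speedup * cpu_threads} CPU-thread equivalents")
```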

it's an interesting approach... I still think fprime compares... it's like 3 or 4 years old now and they are only just catching up...

"Once a shipping product, the GPU rendering version of V-Ray RT will support distrubuted Rendering
The product will support more than one GPU"

3DGFXStudios
08-10-2009, 04:11 AM
porno :D

CORE wants this!!!

archijam
08-10-2009, 04:43 AM
I'm just surprised that it took so long to get there ...

http://www.newtek.com/forums/showthread.php?t=74936

It looked 'done' in 2007 ..

cresshead
08-10-2009, 04:54 AM
yeah, I'm just giving some folks on CGTalk a battering with fprime...! ...which came out like 4+ years back... DOESN'T use the GPU.. uses the old faithful CPU, which is poop really!

so fprime is really doing some magic...
just can't resist the poke in the ribs... 4 years!... CPU... not GPU... renders deforms, volumetrics, DOF, mo blur...

and STILL vray RT is not fully working within the host app... but it does look FAB and fast!
dang, I'm sure Mr Worley arrived via a monolith on Mars!

cresshead
08-10-2009, 05:31 AM
just to clarify, there are 2 Vray RTs... one based on the CPU, which is integrated within 3dsmax, and the GPU version, which is just a concept thing currently but will become part of the vray renderer soon.

The Dommo
08-10-2009, 06:15 AM
Perhaps this is what FPrime 4 will allow... only time will tell :D

3DGFXStudios
08-10-2009, 06:28 AM
So vote for Vray (if you didn't already) http://www.newtek.com/forums/showthread.php?t=100836&page=4

Matt
08-10-2009, 06:32 AM
Nice!

Matt
08-10-2009, 06:37 AM
Check this out too, Jure put me onto this:

http://www.vraygrass.com/

Now we know what Graham has been up to! No LW version :(

geo_n
08-10-2009, 06:49 AM
Check this out too, Jure put me onto this:

http://www.vraygrass.com/

Now we know what Graham has been up to! No LW version :(

Even Happy Digital is on the AD bandwagon. The price is a bit steep. Grass is already OK in vray with displacement. Almost 300 US just for grass. It should be HD Instance they release for max. I wonder how long that movie took to render.

Intuition
08-10-2009, 08:48 AM
Vlado and co. were here at DD last week. They showed the RT, and also mentioned how it will be integrated into maya before too long. Got to meet Vlado and Peter in person.

Many of us were asking for GPU rendering for years, and Chaos Group (real innovators) brought it to Vray (best render engine imho).

Vray/max and Vray/maya are about to leave a crater in the industry with the addition of GPU rendering.

Core should take a look at this (either GPU rendering OR integrating Vray).

I bet that the Luxology guys will be adding GPU use to modo's engine (another great pbr engine) before too long.

Core should focus on setting up the app's render-side controls for automatic LWF (linear workflow) rendering. DD has custom tools built into LW to force it into LWF, and pipes the renders out to Nuke for proper gamma-space viewing of the EXR files. Core should just adopt this as a standard workflow.
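For anyone who hasn't met LWF: the whole idea is to keep renders in linear light and only apply the display gamma at view time, the way Nuke's viewer does with EXRs. A minimal sketch in Python, assuming the standard sRGB transfer curve (the sample values are made up, and this is not DD's actual tooling):

```python
# Linear workflow in miniature: the renderer works in linear light,
# and only the *viewer* applies the sRGB display curve.
def linear_to_srgb(c: float) -> float:
    # Standard piecewise sRGB encoding: a linear toe near black,
    # then a 1/2.4 power curve scaled and offset to meet it.
    if c <= 0.0031308:
        return 12.92 * c
    return 1.055 * (c ** (1.0 / 2.4)) - 0.055

# Linear-light pixel values as they'd sit in an EXR...
for linear in (0.0, 0.05, 0.18, 0.5, 1.0):
    print(f"linear {linear:.2f} -> display {linear_to_srgb(linear):.3f}")
```

(18% grey lands around 0.46 on a display, which is why un-managed linear renders look so dark.)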

Svenart
08-10-2009, 09:12 AM
http://www.cgarchitect.com/news/SIGGRAPH-2009-CHAOS-GROUP-GPU.shtml

it uses the GPU of a GeForce gaming card to render and is about 20 times faster than a quad-core with hyperthreading... [8 threads]
so... that's like a 20-CPU render farm [160 threads] in one computer, from a game card... uh..w..ww....wow!



yieeehhaaaa.... sounds great so far

Larry_g1s
08-10-2009, 11:03 AM
Check this out too, Jure put me onto this:

http://www.vraygrass.com/

Now we know what Graham has been up to! No LW version :(

Very nice.

I'm not against having Vray for LW/LW Core, but I do hope Kray starts seeing the love it deserves.

archijam
08-10-2009, 11:28 AM
Even Happy Digital is on the AD bandwagon.

Autodesk have a wagon?

Good for him! :)

and the pricing is relative to what max/VRay/3ds plugins usually go for. The question is what people are willing to pay for a solution... I'm sure he's done his homework.

archijam
08-10-2009, 11:31 AM
No LW version :(

.. Of vray? Nope! This is a plugin for VRay more than max...

geo_n
08-10-2009, 11:45 AM
Many of us were asking for GPU rendering for years, and Chaos Group (real innovators) brought it to Vray (best render engine imho).



Did you try GPU rendering already? I tried the latest RT demo, vrayrt_demo_15001, and it was really slow. I couldn't render more than 15 objects, and it has a timer built in for every press of render. The demo experience was not satisfactory.

Intuition
08-10-2009, 11:58 AM
No, we just saw it in action. Vray RT isn't really at f-prime speed, but it's nice to have for placing reflector cards and lights. The GPU version will be what people want in speed and performance.

Andyjaggy
08-10-2009, 12:05 PM
GPU power is going to bring about a revolution. I can't wait.

Matt
08-10-2009, 03:24 PM
Is GPU rendering on the cards for CORE? It does seem things are heading that way.

That said, the open and flexible framework of CORE should mean adding it later is quite possible, I guess.

pixym
08-10-2009, 04:22 PM
Is GPU rendering on the cards for CORE? It does seem things are heading that way.

I think you mean GPU Aware, and I guess the reply is NO. :(
I thought the same the first time I saw "GPU Aware" in the Core feature list, but that only means OpenGL display meshes, not rendering (at least, that's what I understand so far for Core).
I really hope to see this kind of technology in Core. The render farm era is about to end… at least for CPU farms!

jayroth
08-10-2009, 04:57 PM
I think you mean GPU Aware, and I guess the reply is NO. :(
I thought the same the first time I saw "GPU Aware" in the Core feature list, but that only means OpenGL display meshes, not rendering (at least, that's what I understand so far for Core).
I really hope to see this kind of technology in Core. The render farm era is about to end… at least for CPU farms!

Eddy, we will leverage as many GPU resources as we can (GPU SubDs in CORE are the beginning). That will increase as time goes on: tools get better, we get better at writing for the GPU, and so on.

pixym
08-10-2009, 06:52 PM
Eddy, we will leverage as many GPU resources as we can (GPU SubDs in CORE are the beginning). That will increase as time goes on: tools get better, we get better at writing for the GPU, and so on.
Jay, thank you for this good clarification ;)
Héhé, we will finally get GPU rendering, so it is time for me to buy a Tesla card :D

wacom
08-10-2009, 09:19 PM
What can I say? I'm deeply impressed! C4D, Max, and Maya users are lucky to have such an option in the future.

Glad to hear NT is really thinking about this. It's funny how half the developers still seem to dismiss GPUs as if they'd already been there and done the "GPU fad thing"... while the other half are embracing them, and the fruits of their labor show.

If it wasn't such a big deal, Intel wouldn't feel compelled to play catch-up, or "trip up", with their next set of CPUs.

Something tells me that maybe, just maybe, mental images will take the stick out of... um... you get it... now that Nvidia owns them.

Personally I don't care WHO gets the GPU-accelerated products out there first in this wave, because it puts the tech out there for people to see, use and experience, and puts more pressure on dragging the rest of the community out of the CPU-only mindset!

3DC, this vray demo...there is a bright future ahead in terms of GPU acceleration!

I wonder what that Vray demo does with a few Tesla cards installed too...

erikals
08-10-2009, 10:45 PM
...but does Vray RT handle animation?
does the animation flicker?
are animated displacements supported?
how about motion blur, is it any good?

but no doubt, this is the future, i've said so for a loong time :)

hm, wonder what intel thinks about this...

geo_n
08-10-2009, 11:07 PM
...but does Vray RT handle animation?
does the animation flicker?
are animated displacements supported?
how about motion blur, is it any good?

but no doubt, this is the future, i've said so for a loong time :)

hm, wonder what intel thinks about this...

It's a real-time previewer, not a production renderer. It works off the vray software, so I'm not sure they will offer a standalone version that works on its own: even if you have RT, how would you render what you previewed if you don't have vray, vray materials, etc.?

any mesh animation works, like fprime or modo ipr. no particles, no volumetrics. fprime is superior in features for now.

erikals
08-10-2009, 11:19 PM
i guess it will take some time. i saw the modo demo though; its previewer looked very good too, much like FPrime

Matt
08-11-2009, 02:30 AM
I think you mean GPU Aware, and I guess the reply is NO. :(

No I meant GPU rendering! I was asking _if_ it was on the cards, not that it _is_ on the cards, but from Jay's response, never say never!

:)

erikals
08-11-2009, 03:23 AM
that didn't sound like a previewer :)
guess the AA and the motion blur are some of the things that hold it back then...

DiedonD
08-11-2009, 04:20 AM
So 3D apps will slowly be based on graphics cards.

And graphics cards will constantly improve. So the 3D apps will follow. On and on, for better quality and speed. Designed specifically to drain the customer's pocket on a more continuous basis than before!

Just like games ask for certain graphics cards, now so will the 3D apps that make 'em in the first place.

Wahooo :(

cresshead
08-11-2009, 04:25 AM
So 3D apps will slowly be based on graphics cards.

And graphics cards will constantly improve. So the 3D apps will follow. On and on, for better quality and speed. Designed specifically to drain the customer's pocket on a more continuous basis than before!

Just like games ask for certain graphics cards, now so will the 3D apps that make 'em in the first place.

Wahooo :(

the speed of preview rendering might be based on a graphics card, but the 'app' won't be... and then only the occasional cutting-edge renderer... also, the actual renderer will be CPU based until they sort out some limitations of going GPU, such as textures, procedurals, quality etc.

we're not 'there' yet.. not by a long shot! :hey:

pixym
08-11-2009, 04:28 AM
I do not think it is a previewer, because of this: "What separates their solution from all others is that the GPU rendering output MATCHES the production render quality from a CPU rendered frame buffer exactly!"

DiedonD
08-11-2009, 04:36 AM
Previewer only, or renderer as well, per GPU?

pixym
08-11-2009, 05:22 AM
Both of them. In the near future we will not have to wait for renders, and that is amazing.
The gfx card used in the demo was an Nvidia 285, which is far faster than the CPU (I guess an i7 920). Imagine what kind of speed you could get with a Tesla monster :hey:

DiedonD
08-11-2009, 05:42 AM
I'm worried that in the future you can't have an LW 20.x because you don't have the NVIDIA Brilliant Design 4.X Per Quadro Duplo graphics card yet!

And that costs a fortune!

But if you did have it, you could make movies in a week with that LW 20.x!

That's the dreaded position! Similar to games!

erikals
08-11-2009, 05:57 AM
...But if you did have it, you could make movies in a week with that LW 20.x!

heh, well, then there'd be Ultra HD,...
http://en.wikipedia.org/wiki/Super_Hi-Vision

DiedonD
08-11-2009, 06:08 AM
Yes Neverko, I get you.

I just have this image of LW in the future, with brand new capabilities, higher in quality and faster than usual, and attached to that image a small note saying 'NVIDIA Brilliant Design TXV 6 or above', which is the latest card!

Right now, LW can run on quite modest hardware. In the case above, you are cut out! You can't upgrade without buying the hardware as well! I know quite a lot of games that I didn't get to play because they demanded THE latest graphics card! Half-Life was one of them.

Then they'll only need to change the image on the NVIDIA site: instead of the next game to promote their sales, they may use LW, for instance. The two become one, selling each other! At our expense, because ever since 2003 I have never HAD to buy a graphics card for LW. Should they unite, I'll be pushed to do so!

erikals
08-11-2009, 06:29 AM
i think it is inevitable.
we won't be relying much on cpus, but rather gpus.
however, it might be a very good thing; it's much easier to upgrade a graphics card than a cpu.
i mean, on the motherboard i currently have, i'm pretty much stuck as far as a cpu upgrade goes; i'd have to upgrade the whole computer.

then again, standard graphics cards are extremely expensive...

umf, that's how it is i guess...

erikals
08-11-2009, 07:50 AM
absolutely, saw the demo some time back, looks like they improved it.
that'd be a fun experience, playing something like that on a large monitor :]

wacom
08-11-2009, 11:50 AM
I know everyone wants real time, and that's the goal, but I'll gladly take a grainy preview that resolves itself to near-final-quality software rendering over a realtime render that looks like "the best" of 1999.
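That "resolves itself" behaviour is basically a running average of noisy per-pixel samples. A toy sketch (TRUE_VALUE and the noise range are invented; no real renderer is this simple):

```python
# Progressive refinement in one pixel: each pass adds a cheap,
# noisy estimate, and the preview shown is the mean so far.
import random

TRUE_VALUE = 0.42            # stand-in for the pixel's converged shade

def noisy_sample() -> float:
    # One unbiased but noisy estimate of the pixel's value.
    return TRUE_VALUE + random.uniform(-0.3, 0.3)

accum = 0.0
for n in range(1, 1025):
    accum += noisy_sample()
    if n in (1, 4, 16, 64, 256, 1024):
        print(f"pass {n:4d}: preview = {accum / n:.4f}")   # grain fades
```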

If the final render engine is GPU enabled and only takes half the time to get to a final render, then I'm all over that too!

I don't know... those cry engine examples just don't look very accurate/good to me. They look like someone did an FG render with one bounce and a huge radius. Yeah... I see some bounced colors... but I wouldn't say it looks 'real', let alone photoreal. The vray RT GPU stuff, though, looked REALLY good. I have clients that would sign off on that in a heartbeat... the cry engine... maybe if they were making a game...

I think, though, that half of the issue could be that MAYBE a good lighting artist was not at play in these cry engine videos? I wouldn't sneeze at having that engine going in the viewport, that's for sure!

erikals
08-11-2009, 04:34 PM
the quality is pretty much like heavy interpolated mode in LW, so not all that.
but in time i would guess the quality could be improved,
as they improve the code / computers get faster...

wacom
08-11-2009, 05:23 PM
I tried reading a tech paper on their site about it, and while I only understood 1 in 100 words, I could gather that yeah, there is only 1 bounce, and the samples are quite averaged. Still, there were stills around using the tech that were quite impressive; I'd say more so than the video example. This tech was made to run on the Xbox 360 as well, so they took that into consideration when making it.
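For the curious, "one bounce with the samples averaged" boils down to a Monte Carlo gather over the hemisphere. A toy sketch of that single step (the sky_radiance environment is invented for illustration; it is not from their paper):

```python
# Single-bounce diffuse gather: estimate irradiance at a point by
# averaging incoming radiance over cosine-weighted hemisphere samples.
import math, random

def sample_cosine_hemisphere():
    # Cosine-weighted direction about the normal (0, 0, 1);
    # pdf = cos(theta) / pi, which cancels the cosine in the integrand.
    u1, u2 = random.random(), random.random()
    r, phi = math.sqrt(u1), 2.0 * math.pi * u2
    return (r * math.cos(phi), r * math.sin(phi), math.sqrt(1.0 - u1))

def sky_radiance(direction):
    # Hypothetical environment: a bright patch toward +x, standing in
    # for light bounced off one nearby lit surface (the "one bounce").
    x, _, _ = direction
    return 2.0 if x > 0.5 else 0.1

def irradiance(num_samples):
    # With cosine-weighted samples the estimator is just pi times
    # the plain average of the sampled radiance values.
    total = sum(sky_radiance(sample_cosine_hemisphere())
                for _ in range(num_samples))
    return math.pi * total / num_samples

for n in (8, 64, 512):
    print(n, "samples ->", round(irradiance(n), 4))   # fewer = grainier
```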

Impressive to say the least!

geo_n
08-12-2009, 06:49 AM
As a previewer it is far superior to FPrime in both speed and functionality. It does volumetrics, fur and even individual parts of the shading such as reflection amount, reflection color, bump, subsurface amount, subsurface color and a huge list of similar isolated shading previews. It previews the alpha channel interactively, full scene ambient occlusion and oh so much more :) But it doesn't do progressive final frame rendering like FPrime. It is "only" an interactive previewer.

How is the previewer in 402? The problem I had with the demo of 302's IPR was that it kept crashing with high-poly CAD models, and I couldn't resize it to more than half the screen. We were deciding between HyperShot and modo for real-time presentations to clients, so I tested modo for that, but we decided to scrap it and wait for vray RT.

beverins
08-12-2009, 08:03 AM
I don't know... those cry engine examples just don't look very accurate/good to me. They look like someone did an FG render with one bounce and a huge radius. Yeah... I see some bounced colors... but I wouldn't say it looks 'real', let alone photoreal.

Totally agree.. but that said, wouldn't you like CORE's viewport to look that good? When I saw the video, I was looking at it as a "damn, that'd be sweet if CORE had that as a switchable viewport mode".

One can wish.