
View Full Version : Nvidia Gelato



MH9
04-19-2004, 03:58 PM
I'd love to be able to have the GPU harnessed as a third processor at final render time, and now, with the Gelato software from Nvidia, this is possible. Someone just needs to write the plugin to interface with LightWave 3D. Currently it is shipping with final rendering support for Maya.

Let's hear it for LIGHTWAVE!

http://film.nvidia.com/page/gelato.html

Does NT have anything to say about this at this time?

L&R,
S9:Micah

Exper
04-20-2004, 02:08 AM
Come on NT... don't miss this train!

Take a look at the partners page (http://film.nvidia.com/page/partners.html) and you'll see a lot of competitors are already listed there!

Bye.

Nemoid
04-21-2004, 10:04 AM
Totally agree! :)
Gelato may be expensive, but someone will surely use it. I bet it will become a standard part of 3D production soon.

Exper
04-21-2004, 10:49 AM
Good features list... here:
NVIDIA Gelato Features and Benefits
http://film.nvidia.com/object/gelato_features_benefits.html

Take a look here also... page 2:
NVIDIA GELATO 1.0 FEATURES AND SPECIFICATIONS
http://film.nvidia.com/docs/CP/4825/Gelato_product_overview.pdf

Quite impressive for a "1st" release!

Bye.

Gregg "T.Rex"
04-22-2004, 05:53 PM
At work, we'll most likely get a Gelato license for use with Maya and XSI. At this point, Gelato's real-time render features are far beyond LW's software renderer (see: true subpixel displacement).

We'll see the day when we stop doing long overnight renders. We'll create the content and hit "play", just like a non-linear editor, such as Avid's Adrenaline, does...

I'd like to see NewTek support Gelato or similar apps. Soon...

Cheers,

Lightwolf
04-23-2004, 02:57 AM
Originally posted by Gregg "T.Rex"
At this point, Gelato's real-time render features are far beyond LW's software renderer (see: true subpixel displacement).
...except for the fact that Gelato isn't a real-time renderer at all. It is hardware accelerated, but in no way real-time. And I honestly doubt that we'll see a real-time high-quality renderer anytime soon. (Just think video resolution with antialiasing in real time, no way :) ).

Cheers,
Mike

Gregg "T.Rex"
04-23-2004, 03:53 AM
Well...
The term "real time" is subjective. If you get a hardware render in 10 minutes instead of 10 hours, and in film quality, then for me that's real time!


...I honestly doubt that we'll see a real-time high-quality renderer anytime soon.

I never said "anytime soon". I said we'll live the day to see....

Best regards,
___________________
Gregg "T.Rex" Glezakos
3D Artist

Lightwolf
04-23-2004, 04:04 AM
Hi Greg,

Originally posted by Gregg "T.Rex"
The term "real time" is subjective. If you get a hardware render in 10 minutes instead of 10 hours, and in film quality, then for me that's real time!
I know what you mean from our point of view; however, the term "real time" isn't subjective at all. Real time is real time, not "faster than what I'm used to" ;)
So, if you do film, real time is 2K at 24 fps, or a full render in about 41 ms. Now compare that to your 10 minutes and we have a factor of roughly 14,000...
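A quick sanity check of that arithmetic (a sketch, assuming the 24 fps film rate and the 10-minute render from the posts above):

```python
# Frame budget for real-time 24 fps playback vs. a 10-minute render.
fps = 24
frame_budget_ms = 1000 / fps             # time allowed per frame
print(round(frame_budget_ms, 1))         # -> 41.7 (ms)

render_time_ms = 10 * 60 * 1000          # a 10-minute render, in ms
factor = render_time_ms / frame_budget_ms
print(round(factor))                     # -> 14400, i.e. roughly 14,000x too slow
```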


I never said "anytime soon". I said we'll live the day to see....
I never said you did ;)
I hope I'll get that old too (I'm 32 now ;) ).
I just think that, as hardware progresses, we'll always be below real time, because there's always just one more effect to add, one more lighting/shading feature :)

Cheers,
Mike - excuse my nitpicking, I'm not trying to be a pain :D

Nemoid
04-25-2004, 07:46 AM
I dunno if we will see RT rendering, because what we ask of our tools is always beyond what we have. If we can manage huge scenes now, we will want to manage even bigger ones, and so on. But fast rendering is what is needed, so dedicated hardware for it is surely needed.

I think OpenGL will allow us to predict at least 80-90% of how the final render will look, so we will have no need to make huge tests with IPRs or Viper anymore... in that case video cards will play an ever more important role in visualization.

jairb
05-06-2004, 04:24 PM
In computer-land, "real time" is not subjective. Real time means that if the result is late, it's wrong.

What you really mean is something more like "interactive", which is much less rigorous.

nutman
07-10-2006, 08:12 AM
I'd like to add my interest in Gelato support. Pleeeeease NewTek :)

Gregg "T.Rex"
07-10-2006, 02:18 PM
It feels like NT doesn't have the resources ($$$) to pull this off...
Maybe a 3rd-party vendor, or a paid add-on from NT, though I doubt that...

I'd love to have Gelato in LW, though....

Cheers,
Gregg

Captain Obvious
07-10-2006, 02:44 PM
A few things I would like to point out:

- Gelato doesn't use the GPU for everything. I don't know what it does on the GPU, precisely, but there's plenty of work done by the CPU, that's for sure.

- As Lightwolf said, it's not real-time at all, and it will not be for a long while.

- As far as I've seen, it's not really all that fast.

RedBull
07-10-2006, 02:47 PM
Can anyone show some work they've achieved with Gelato?
Because from my own experience, it wasn't even worth the download.

Also, let's not support Nvidia's global dominance of the industry, where they try to force more users to buy overpriced and overhyped Quadro cards to push their crap efforts in GPU rendering.

Gelato is one of those innovations that tries to force you to use Quadros: the professional version of Gelato requires Quadro cards.

Have you people priced the cost and maintenance of Gelato Pro, and then compared its features?
And it's certainly not real-time; Nvidia themselves specify it as a "non-realtime renderer". The basic version is crippled compared to the Pro version, which costs more than double what LW does.

Also, Nvidia were the ones who developed the Max plugin, so if they want to support more programs, Nvidia should have supported LW, C4D and XSI, rather than just concentrating on Autodesk products.

The basic version is not multithreaded, not 64-bit, and does not have interactive lighting (and these are its only good points anyway).

While GPU-based rendering will become increasingly viable and usable in the future... I think Gelato is a nice name for ice cream, and that's about all...
Add the $1500 price and the Quadro cards, and this technology is way overhyped and controlled by people selling hardware/video cards.

Which to me spells "Danger, Danger Will Robinson"

Correct me if I'm wrong, but aren't Max and Maya the only 3D programs to benefit from Nvidia's Quadro video cards?

Jace-BeOS
03-06-2008, 05:51 AM
As of the free version 2, you can use any GeForce chipset, such as the 8800...

I think it is rather silly NOT to make use of as many processors as possible, especially when there are processors present in a system that are designed and optimized for exactly the kinds of calculations 3D rendering requires. It doesn't make sense NOT to use every processing/DSP facility available. It's like the idea of creating a translation layer so VST/AU audio plug-ins can use a graphics card's GPU as a DSP: it's there, and it's not always in use. Why waste it?

RedBull
03-06-2008, 04:14 PM
As of the free version 2, you can use any GeForce chipset, such as the 8800...

I think it is rather silly NOT to make use of as many processors as possible, especially when there are processors present in a system that are designed and optimized for exactly the kinds of calculations 3D rendering requires. It doesn't make sense NOT to use every processing/DSP facility available. It's like the idea of creating a translation layer so VST/AU audio plug-ins can use a graphics card's GPU as a DSP: it's there, and it's not always in use. Why waste it?

You seem to be requesting GPU support. It's not as simple as that: Gelato is a completely different renderer, and it has nothing to do with NT or LW, so it's not like just using another processor...

On the video card side, my 8800GTS has 128 stream processors, which these days lend themselves to certain math operations faster than the CPU alone. They are designed to be used in parallel, and many of the algorithms we use in 3D are parallel.
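The parallelism point can be sketched in plain Python (a toy illustration only, not Gelato or CUDA code; the `shade` function is a made-up stand-in): per-pixel work is independent, so it fans out naturally across many workers, just as it would across stream processors.

```python
# Toy illustration: shading each pixel is independent of every other pixel,
# which is why the work maps so well onto many parallel processors.
from concurrent.futures import ThreadPoolExecutor

def shade(pixel):
    # Stand-in for a per-pixel shading calculation (double and clamp).
    return min(255, pixel * 2)

framebuffer = [10, 100, 200, 30]

# Serial version (one worker):
serial = [shade(p) for p in framebuffer]

# Parallel version: the same independent work, fanned out across workers.
# On a GPU the "workers" would be hundreds of stream processors.
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(shade, framebuffer))

assert serial == parallel == [20, 200, 255, 60]
```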

LW is one of the first 3D applications to see this innovation, with the Ageia PhysX plugin for LightWave, which can use either the GPU or the CPU.

However, CUDA does not currently work under Vista, only XP and Linux, and has only just gained Mac OS X support. The CUDA libraries don't even support 3D textures yet (the next version will), and then some nice things should start to happen.

Hardware assistance will eventually make it into 3D programs, and it has already started to, but don't expect any revolution overnight; this will take time to become a viable, useful technology that NT could combine with their standard raytracing techniques.

It is something NT should be looking at, perhaps adding some GPU stuff in LW10 if enough results show it's worth it... Can you name any major 3D application that currently takes advantage of this kind of stuff?

Titus
03-06-2008, 06:15 PM
Gelato is slow. I've kind of beta tested it and I can't find a reason to install it again.

Jace-BeOS
03-12-2008, 12:42 PM
Can you name any major 3D application that currently takes advantage of this kind of stuff?

3DS Max
Maya
...

Both (and others) use nVidia's Gelato... plus, DAZ 3D's DAZ Studio uses nVidia's OpenGL support for fast renders, too. They look quite nice for not being full, normal renders done by the built-in rendering engine (3Delight).

RedBull
03-12-2008, 03:41 PM
3DS Max
Maya
...

Both (and others) use nVidia's Gelato... plus, DAZ 3D's DAZ Studio uses nVidia's OpenGL support for fast renders, too. They look quite nice for not being full, normal renders done by the built-in rendering engine (3Delight).

Sorry, I meant GPU innovation, not Gelato, which as discussed is not innovative or useful for anything other than Gelato. As far as I know, LW has the only GPU-enhanced PhysX/Ageia particles plugin on the market.

As mentioned, there is already an exporter being worked on for Gelato and LW; I would suggest you contact the author, because it's something that Nvidia and the community of Gelato users need to add, not NewTek.

Ztreem
03-18-2008, 04:19 AM
XSI uses PhysX/Ageia for hard body dynamics.

RedBull
03-18-2008, 06:45 AM
XSI uses PhysX/Ageia for hard body dynamics.

I think you misunderstand the topic of the thread. LightWave has both a CUDA and a non-CUDA version of the Ageia PhysX plugin. XSI only supports software (as opposed to hardware) Ageia, and it's not GPU-enhanced with CUDA like the LW plugin.

Jace-BeOS
03-28-2008, 10:29 AM
I think you misunderstand the topic of the thread. LightWave has both a CUDA and a non-CUDA version of the Ageia PhysX plugin. XSI only supports software (as opposed to hardware) Ageia, and it's not GPU-enhanced with CUDA like the LW plugin.

I'm not familiar with this. Also, who/where is the development being done for a LW Gelato exporter? Is it the RenderMan RIB exporter?

Thanks for the discussion. I'm not really fully into the LW community yet, since I'm still struggling to really get going with it... not LW's fault... my life.

RedBull
03-30-2008, 01:19 AM
Thanks for the discussion. I'm not really fully into the LW community yet, since I'm still struggling to really get going with it... not LW's fault... my life.

This link on the Nvidia forums shows that SoulExertion has started a script/plugin for LW/Gelato called Pistachio:

http://forums.nvidia.com/index.php?showtopic=33857&hl=lightwave

Jace-BeOS
03-31-2008, 06:38 AM
:)

Thanks for the link, I will check it out!