View Full Version : Nvidia's Gelato 2.0 and LW

Ivan D. Young
05-02-2006, 04:00 PM
Nvidia's Gelato 2.0 is now free. I was wondering whether Newtek has said anything about, or contemplated, using such a renderer for LW. It might be a nice alternative if it could work well enough.
Here is the Nvidia link: http://www.nvidia.com/page/gelato.html
Right now it only has support for Max and Maya.

05-02-2006, 11:39 PM
I believe it is possible to create a Lightwave plug-in, considering this from Nvidia:

NVIDIA Gelato does not have a required scene file format. Instead, Gelato has scene format reader plug-ins (DSOs under Linux, or DLLs under Windows). These plug-ins make it easy to directly read any scene format you have a plug-in for. You can use the ones developed by NVIDIA, or write your own.
Gelato comes with a plug-in that reads Pyg, which is Python with embedded Gelato API calls. But this plug-in is not “privileged”—it required no proprietary knowledge of Gelato’s internals to write it, nor is it inherently more efficient than any plug-in you may write. As an example, a third party wrote an RIB-reading plug-in that is just as efficient for getting scene input into Gelato. (Go to http://film.nvidia.com/page/gelato_download.html for a link where you can download this plug-in, with source).

Gelato lets you mix and match formats freely within a single Gelato rendering. For example:

You may have a top-level scene file as Pyg.
That Pyg may read RIB files (using Pyg’s Input() command).
Those RIB files may have a ReadArchive of another Pyg.
That Pyg may Input() an obj file.

And so on….
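The chain above can be sketched in plain Python. This is a toy illustration of the extension-dispatched reader idea, not Gelato's actual plug-in API: the file names, the in-memory "files", and the `read_scene` helper are all hypothetical (real Gelato readers are compiled DSOs/DLLs).

```python
# Toy sketch only: mimics how a host could recursively hand each scene
# reference to a reader chosen by file extension, as described above.
# All names here are made up for illustration.

# Pretend file system: scene files referencing each other across formats.
FILES = {
    "top.pyg":   ["Input('props.rib')"],
    "props.rib": ["ReadArchive('hero.pyg')"],
    "hero.pyg":  ["Input('hero.obj')"],
    "hero.obj":  [],  # leaf geometry, no further references
}

def read_scene(name):
    """Recursively 'read' a scene file, following Input()/ReadArchive()
    references the way format reader plug-ins would, and return the
    order in which files get loaded."""
    order = [name]
    for line in FILES[name]:
        ref = line.split("'")[1]  # crude parse of the quoted file name
        order += read_scene(ref)
    return order

print(read_scene("top.pyg"))
# The chain mirrors the example above: pyg -> rib -> pyg -> obj
```

The point is simply that each reader only needs to know how to hand nested references back to the host; the formats can nest in any order.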

Does your studio use an in-house format (say, for the results of your simulations, or as a common intermediate format)? If so, we encourage you to write your own scene file reader plug-in and have Gelato read it directly, rather than translating to Pyg, RIB, or any other intermediate format.

All of Gelato’s APIs and formats (the C++ API, Pyg, grid dump files, and so on) are fully documented. We have no secret or proprietary formats.

Gelato’s APIs are also truly “open.” The APIs and how they are used, the header files, and all the example scenes, shaders, and source code that we ship with Gelato are covered by the BSD License. With the exception of the trademarks on the names NVIDIA, Gelato, and Mango, you are free to use all the header files and examples, copy them, modify them, redistribute them, and extend them. You can also write, distribute, or sell the readers and writers and compatible tools, including renderers.

05-03-2006, 02:41 AM
There is also RenderMan support: http://www.renderman.org/RMR/Utils/gelato/index.html#ribelato

There is also a LW RenderMan Exporter: http://garagepost.tv/renderman.htm


Should I cross these wires?

Elmar Moelzer
05-03-2006, 12:01 PM
Hmm, is it only me, or are the images in the Gelato gallery less than impressive?
I mean, Gelato renders a bit faster than other renderers, but it is NOT realtime, and the images don't look too great. Considering the amount of extra work needed to make Gelato work in a pipeline, I am wondering about the benefits...
Maybe someone here can explain this to me a bit more?

05-03-2006, 01:03 PM
Considering the amount of extra work needed to make Gelato work in a pipeline, I am wondering about the benefits...
Maybe someone here can explain this to me a bit more?

Gelato is sort of a RenderMan implementation, but it is not RenderMan. Larry Gritz and his group of developers have experience with RM renderers; they developed BMRT and Entropy (now defunct).

Gelato uses the GPU for some heavily parallelized floating-point computations, but you need an expensive Quadro card (I've heard some tweaked GeForce cards can do the job if you use RivaTuner). Nvidia sent me a Quadro card some time ago to test Gelato, and I found it was a fast and very configurable renderer. I was also surprised that Gelato still needs the CPU.

Gelato uses some ideas taken from RenderMan and the REYES algorithm, like programmable shaders and micropolygons, so think of all the features of RenderMan. The funny thing is that the next version of Pixar's RenderMan will adopt Gelato's approach of using the GPU.
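For anyone unfamiliar with the micropolygon idea mentioned above: REYES-style renderers split ("dice") each surface into micropolygons roughly a pixel in size before shading them. Here is a minimal toy sketch of that dicing estimate under a simple pinhole-camera assumption; the numbers and the `dice_count` helper are made up for illustration and are not Gelato's (or PRMan's) actual implementation.

```python
import math

def dice_count(world_size, distance, focal_px):
    """Estimate how many micropolygons a square patch needs per side so
    that each micropolygon projects to <= 1 pixel (pinhole model)."""
    projected_px = world_size * focal_px / distance  # on-screen size in pixels
    return max(1, math.ceil(projected_px))

# A 2-unit patch seen from 10 units away with a 500 px focal length
# projects to ~100 px, so it gets diced into a 100x100 grid.
n = dice_count(world_size=2.0, distance=10.0, focal_px=500.0)
print(n, "x", n, "grid ->", n * n, "micropolygons")
```

The programmable shader is then run once per micropolygon, which is why the workload parallelizes so naturally, and why a GPU looks attractive for it.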

05-03-2006, 02:08 PM
One potential benefit is that it seems to support a hair primitive.

Elmar Moelzer
05-03-2006, 02:09 PM
Yeah, but considering the need for a very expensive graphics card, I am wondering whether some barebones renderslaves (which you can get for a similar price) in a more workflow-optimized pipeline wouldn't be more effective.
In the end, output quality versus time for setup and rendering (to a lesser extent than setup, IMHO) is what counts, right?

05-03-2006, 04:07 PM
It looks like fairly average technology that's been marketed to 3D professionals so they will buy up the Quadro FX in greater numbers.

Gelato works on all Nvidia cards, and Gelato Pro only on Quadros.
Only Maya and Max are supported by Nvidia.
The gallery is indeed poor...

The licence fee for Gelato Pro is $1500.00 for the first year and $300 per year afterwards. (ROFL, LOL)

As for being a totally open API, what a load of crap: it's a commercially licenced, closed API controlled by Nvidia for a price, that nobody else has any influence over, and it will only work with Nvidia hardware (mainly their overpriced, underperforming Quadro line).

"The reason we only certify and support Gelato Pro on NVIDIA Quadro FX hardware is the different ways we manufacture and distribute GeForce and NVIDIA Quadro FX hardware. All NVIDIA Quadro FX hardware is manufactured specifically for NVIDIA and we tightly control the specifications and quality."

That's the most self-indulgent load of utter marketing crap I've read this year.

I'm all for a GPU/CPU hybrid renderer, but if it comes from this crappy company (Nvidia), I'd rather go start a sewing circle as a new hobby instead.


Elmar Moelzer
05-04-2006, 01:23 PM
Well, while I don't see the direct advantage of Gelato, I still think that it is a great effort, and I DON'T think that Nvidia is a crappy company. Their OpenGL drivers are still the best on the market, e.g., and their graphics cards deliver excellent performance in all aspects. I would not spend extra money on a Quadro card, though. They offer too little for the extra price.

Captain Obvious
05-04-2006, 02:24 PM
Judging by what I've heard so far, Gelato is less-than-stellar in just about all regards...

Pablo Argaluza
05-05-2006, 03:32 AM
Is there anything similar for ATI cards in development?

I downloaded the program yesterday from the Nvidia site, but it's primarily aimed at Maya, although it has support for 3ds Max and RenderMan.
It is at an early stage (it seems you have to do some work in a prompt shell), but...

The added power of today's GPUs should in no way be ignored.
I think someone at Newtek should jump on this.
Graphics cards are already used to co-calculate physics math in games through DirectX 9.0c.
We need that power!!

Mix that with a quad-core processor setup (on a single die next year; not to mention that you could set up a quad graphics-card monster today) and you end up with enough computational power to render in near real time most of the time.

It's the way to go!!

I e-mailed Nvidia about Lightwave support, and (in less than half an hour!!!; that's support, man) they responded that they are only working on Maya support, but they encourage anyone to make a plug-in for Lightwave, with all their aid.

As you have seen, there is third-party support for 3ds Max and RenderMan.
No comment. (Newtek needs it too for real success.)

And don't be fooled: the basic version is free and works with any card since the 5200.
The Pro version only works with the Quadro line of graphics cards, but it comes with network render support and a program for real-time light post-processing. It may be very interesting if you work professionally.

Free render time with a card that you probably already own. What's the problem??

Of course it is a self-interested program meant to get you to buy Nvidia cards, but if it works as advertised, where on **** is the problem?

As I've said before: that's the way to go, Newtek; don't miss that train.

05-06-2006, 11:20 PM
I did some rough tests; the visual results seemed on par with 3Delight, at least. Oddly enough, the 3Delight render was much faster. Maybe there was a difference in what was loaded from the generic RIB file I exported, or some other unknown variable -- I don't exactly know what I'm doing. :D

If nothing else, the included iv image viewer is a very handy tool to have.

05-08-2006, 03:49 AM
A question for Albertdup.

When you say that Gelato can mix and match formats and that "Pyg may Input() an obj file", is this reliant on someone writing a plug-in for the obj format? I take it that the Pyg file needs to load a DLL or DSO to accomplish this?

I am new to this and would like to use Gelato, but I don't have any software at the moment that will produce RIB files (or Pyg for that matter.) I keep Googling for converters and plugins, but no joy yet.

Captain Obvious
05-08-2006, 05:28 AM

I'm not sure how relevant this is, since the scene is so darned simple, but... 13 seconds for that? You could probably render the same scene in 13 seconds in Fprime with radiosity turned on...

Edit: the actual images are inaccessible now. Odd. Oh well, it was a simple scene with a few spheres lit by a single light.

Edit again: I rendered a similar scene in less time on my iBook! That is Lightwave's default renderer on a 1.33GHz G4 rendering a similar scene faster than Gelato on a GeForce 7800GT, even with ambient occlusion turned on!

Elmar Moelzer
05-08-2006, 02:36 PM
Well, as I said, Gelato is not realtime, not even close, and anyone expecting that from it completely missed the point. Just because something runs on graphics hardware does not mean that it will run in realtime, or even faster than if it were running on a CPU.
And DON'T start comparing it to game engines; that is something completely different.
That said, putting some tasks on the GPU will become more interesting in the future, I am sure, but until then I think that a good and fast software renderer still can't be beat in daily use.

Captain Obvious
05-08-2006, 06:56 PM
I'm perfectly aware of the fact that Gelato is a final renderer, and not a real-time one. Nobody even mentioned game engines. But if Gelato on a 7800GT is not leaps and bounds faster than Lightwave's default renderer on a 1.33GHz G4, something is really wrong. Just look at the CGsociety thread: two minutes for an ambient occlusion pass on a really simple scene. That's just plain slow.

I guess this proves my theory: A fast renderer on slow hardware is much faster than a slow renderer on fast hardware.

Elmar Moelzer
05-09-2006, 02:38 AM
Sorry, that was not meant as a direct reply to your post, Captain.
I just see so many posts by people who think exactly what I described (runs on GPU = realtime) and assume that Gelato has to be sooo much faster because of that.

Captain Obvious
05-09-2006, 05:27 AM
Ah, okay. Well, using the GPU is supposed to make it faster. Not real-time, perhaps, but faster than just a CPU. But it isn't faster than many renderers that run on just the CPU...

Elmar Moelzer
05-09-2006, 11:05 AM
Yes, it is supposed to do that, and I am sure that in some situations it will be faster; the question is just how often you hit those situations in daily production work.
GPUs are designed to do one thing well and very fast: draw triangles.
Sure, modern GPUs are highly programmable, but this programmability is still rather limited (the number of instructions in the shader code, e.g.) and comes at a price. A couple of GPU generations from now the situation might be better, but by then we will have quad-core CPUs and whatnot, so for me it is still a matter of waiting to see how things develop.
I also want to mention that the fastest Intel CPU today costs less than the fastest Nvidia graphics card...
With the current situation I don't see any reason to hurry with support for Gelato, if you ask me.

05-09-2006, 11:46 AM
GPUs are designed to do one thing well and very fast: draw triangles.

I'm sure a GPU can do more than just draw triangles, like vectorized floating-point operations.

With the current situation I don't see any reason to hurry with support for Gelato, if you ask me.

I agree.