
View Full Version : Nvidia CUDA technology



Pablo Argaluza
11-14-2006, 11:58 AM
OK, LightWave didn't jump on Gelato and that was an important one, but what about this one:
http://developer.nvidia.com/object/cuda.html

It seems easier to develop for.
Sure, you can say it's Nvidia-only software, but it seems that with DirectX 10 it's easier to use the GPU as just another thread. And there is a lot to gain: in brute rendering force we can start to talk about 10x the speed (with only one GPU)...
I know there are new Intel quad processors out right now, but they are nothing next to the power of these current GPUs. And you can add more than one to a home system. Where is the problem?
What is NewTek's official answer to that approach to the render pipeline?
RenderMan already uses it. Isn't render speed important to NewTek? Because as users I think it's at the top of the priority list for all of us. Don't you agree?

MiniFireDragon
11-14-2006, 12:02 PM
Keep in mind that using an Nvidia-specific product could alienate the ATI users of LightWave.

StereoMike
11-14-2006, 12:55 PM
But AMD-ATI has its own view on this: it's called stream processing and sounds very similar to CUDA (in fact it's the other way around; CUDA came second).

http://ati.amd.com/companyinfo/events/StreamComputing/index.html

I think it would be cool to be able to use low-budget render cards: low budget compared to the cost of comparable PCs.

Mike

RedBull
11-14-2006, 01:22 PM
Keep in mind that using an Nvidia-specific product could alienate the ATI users of LightWave.

Considering no Mac support, again not much good for LW.

I would need to see real-world examples to be convinced of its appeal.
But the idea is interesting. Will keep an eye on it.

Pablo Argaluza
11-15-2006, 06:02 AM
Considering no Mac support, again not much good for LW.

OK, but now you have to think about two more things: Macs have Boot Camp now, and second, there are ATI and Nvidia GPUs for Mac too.

If NewTek can make versions of LightWave that render in 32 and 64 bits on Windows and Mac, they can make the same effort with Nvidia and ATI GPUs on both platforms.

Sure, it's easier to talk about it than to make the effort, but we have tons to gain in final render speed.

I think it's the way to go for success. If they can't afford a totally integrated solution now, at least they could do a plugin, or let us know if there is some third-party effort to use that power in the future.
In any case, thanks for the revamped OpenGL in Modeler. (If only we could see those previews looking as good as the latest games...)

lots
11-15-2006, 10:32 AM
With the coming flow of GPUs from both ATI and Nvidia being heavily based on stream processors rather than purely graphics-oriented hardware, there will be many, many tasks a GPU can be put to. GPUs can assist in AI, some forms of physics, rendering, etc.
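For context, here is a minimal sketch of the stream-processing model those GPUs expose, in plain Python rather than real GPU code (the function names are made up for illustration): one small kernel is applied independently to every element of a stream, which is what lets the hardware run thousands of them at once.

```python
# Minimal sketch of the stream-processing model (plain Python stand-in):
# one small "kernel" is applied independently to every element of a stream,
# so a GPU can hand each element to its own stream processor.

def kernel(v):
    # Per-element work, e.g. shading one sample or moving one particle.
    return v * v + 1.0

def run_stream(kernel_fn, stream):
    # On a GPU every element would run in parallel; here we just loop.
    return [kernel_fn(v) for v in stream]

print(run_stream(kernel, [0.0, 1.0, 2.0]))  # -> [1.0, 2.0, 5.0]
```

The point is that AI, physics, and rendering all contain loops of exactly this shape, which is why they are candidates for GPU offload.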

I would be excited to see LW move in that direction. Of course this only appeals to a small portion of the market, for now. Someone with a small budget (freelancers, etc.) will probably not own hardware capable of this for a while longer. From what I hear, Nvidia has released most of its extensions to the OpenGL crowd, so that may open up some uses for the G80's stream processors on the Mac and other non-MS OSes, though I have not looked into this.

But why shouldn't NewTek take advantage of this new tech? It already does with 64-bit LightWave. IMO it's more a problem with Apple than anything else... heh :) And again, you can always install a Windows OS on modern Macs, so it's not a really big issue. You could work in OS X and then render overnight on Windows :)

tyrot
11-15-2006, 12:05 PM
Dear NT (Chuck?),

Are there any plans to support a GPU-based LW?

You know, LW was the first 64-bit 3D application. LW has brought some milestones to 3D history. Why not start a new LW era with this type of integrated solution?

And because you guys are writing a new CORE, you will probably be considering a future-proof LW. Why not start from there? Or did you already?

BEST

RedBull
11-15-2006, 12:18 PM
Considering no Mac support, again not much good for LW.


OK, but now you have to think about two more things: Macs have Boot Camp now, and second, there are ATI and Nvidia GPUs for Mac too.

If NewTek can make versions of LightWave that render in 32 and 64 bits on Windows and Mac, they can make the same effort with Nvidia and ATI GPUs on both platforms.

Sure, it's easier to talk about it than to make the effort, but we have tons to gain in final render speed.

I think it's the way to go for success. If they can't afford a totally integrated solution now, at least they could do a plugin, or let us know if there is some third-party effort to use that power in the future.
In any case, thanks for the revamped OpenGL in Modeler. (If only we could see those previews looking as good as the latest games...)

Nvidia claims Linux/Windows library support (no Macs), meaning NT could not port a Mac version.

I agree the trend will be to use GPUs for co-processing, and I would like to see NT among the first adopters.
Surely LW uses a lot of FFTs, which can run about 2.4x faster on GPUs than on current CPUs.

Here is a similar test with BLAS and FFT with ATI and Nvidia cards.
http://www.rapidmind.net/sc06_hp_rapidmind_cpugpu_summary.php

Performance is pretty good, and I believe the RapidMind approach is cross-platform and not specific to ATI or Nvidia.

They do say ATI/AMD methods can speed up results even further.

Anyway, as I said, I would like NT to evaluate the trends in this area, as it would look innovative in LW10.
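On the FFT point above, a quick sketch of why it maps so well to a GPU: in the plain DFT, every output bin is an independent sum over the input, so each bin could go to its own stream processor. Pure-Python illustration (a real port would use a vendor FFT library; this naive form is O(n^2), unlike a true FFT):

```python
import cmath

def dft(x):
    # Naive DFT: each output bin k is an independent sum over the input,
    # so on a GPU one stream processor could compute each bin. (A real
    # FFT is O(n log n); this O(n^2) form just shows the parallel shape.)
    n = len(x)
    return [
        sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
        for k in range(n)
    ]

bins = dft([1.0, 0.0, 0.0, 0.0])       # an impulse has a flat spectrum
print([round(abs(b), 6) for b in bins])  # -> [1.0, 1.0, 1.0, 1.0]
```

Since no bin depends on any other, the whole transform is "embarrassingly parallel", which is where the quoted speedups come from.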

Verlon
11-15-2006, 05:40 PM
I would think this would be a good move. A vid card has a lot of math processing power these days. Why not take advantage of it?

Speedmonk42
12-02-2006, 12:56 PM
I don't see why the lack of Mac support right now should stop them.

Even if they pick one vendor, a very powerful sub-$1K network render box would be great for everyone.

lots
12-04-2006, 08:03 AM
Makes you wonder about all those SLI motherboards with multiple PEG slots (2, 3, or 4). You could get a seriously powerful system for most types of rendering...

Speedmonk42
12-04-2006, 12:00 PM
Yeah, but the problem is: what will it speed up?

I know it is not the same as a general-purpose processor. I wonder if someone with more programming experience could elaborate on which areas it would speed up, and whether the work/benefit ratio is worth it?
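A rough rule of thumb, sketched in plain Python (illustrative names only, not real GPU code): work where every element is independent maps well to a GPU, while work where each step depends on the previous one does not.

```python
# Parallel-friendly: every pixel is shaded independently, so a GPU could
# assign one thread per pixel. (Names here are purely illustrative.)
def shade(pixel):
    return pixel * 0.5 + 0.1

pixels = [0.0, 0.2, 0.4, 0.8]
shaded = [shade(p) for p in pixels]   # one thread per element on a GPU

# Parallel-hostile: each step needs the previous total, so this chain is
# inherently sequential and gains little from stream processors.
def accumulate(values):
    total = 0.0
    for v in values:
        total = total * 0.9 + v
    return total

print(accumulate([1.0, 1.0, 1.0]))
```

Per-sample shading, particle kernels, and transforms fall in the first bucket; tightly serial logic stays on the CPU.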

tektonik
12-04-2006, 09:05 PM
AMD-ATI will sell boards with no video out, just for their co-processor power...

I hope they have an easy SDK for coders!

creativecontrol
12-04-2006, 09:26 PM
This all looks very promising. Although similar things have faded away before, I sure would like to see GPU-assisted rendering actively pursued by NewTek. It would be nice to be on the leading edge instead of the trailing edge of an important new rendering technology.

I don't think the lack of Mac support should deter progress in this area. Macs have the same GPUs, so it is likely they would support it at some point, and quick progress can't be made by always waiting for the lowest common denominator. It would be nice to have it as an option.

Hardware always seems to be so far ahead of software in development that it would be nice to see timely movement on something that could make a major difference to render speeds. LightWave has a host of wonderful features that are too slow to be used regularly in many production settings, but something like GPU-assisted rendering has the potential to change that.

I vote: do it, NewTek! (Not that anyone cares.) :newtek:

lots
12-04-2006, 09:43 PM
Honestly though, I want a built-in previewer like FPrime that works with all aspects of LightWave. I can always wait on a final render :) But that's just how I work... Perhaps NT could have a GPU-assisted previewer in addition to a final rendering engine. In either case, the renderer in LW still has a lot of work and enhancements ahead of it in terms of speed. But that's the thing :) it's always about faster... heh

Mipmap
12-04-2006, 10:28 PM
What I'd like to see is a board like those SLI ones, but instead of two normal video cards, you'd have one normal video card, and the second slot would hold a PCI Express card with no function other than providing extra GPUs for programs like LightWave to use when they need them.

Mipmap
12-04-2006, 11:08 PM
Also, I remember reading that Sony had hinted at the idea of putting a bunch of Cell processors on a PCI Express card so people could just plug extra processing power into their motherboard, although I wouldn't begin to have a clue how a card like that would compare performance-wise to a card with 2-4 dual-core GPUs on it.

Sensei
12-05-2006, 12:13 AM
BTW, Nvidia chose a good name :) "Cuda" in Polish means miracle...

lots
12-05-2006, 07:23 AM
For what it's worth, I feel the G80 is far superior to Cell. For one thing, Cell only has 7 stream processors (well, something "like" a stream processor anyway :P), while the G80 has 128. Granted, it does not have a general processing unit such as Cell's stripped-down, in-order PPC core, but in a multi-core system this doesn't really matter much ;)

A GeForce 8800 GTX + a display-less G80 (this is possible, since the display unit was removed from the G80 core and put into an external chip) + a quad-core CPU such as Kentsfield, and you could have a very powerful system on your hands. It would be interesting to see if Nvidia does something along these lines.

I'd imagine the G80 would "load balance" all kinds of stream processes and assist both GPU and CPU, depending on need. At least that would be one interesting direction Nvidia could go with the tech...

Also, don't count AMD-ATI out, as they are working on a similar tech, but at the CPU level, called Fusion: basically integrating a stream-processing element onto the CPU die and/or a secondary socket on the motherboard.

I'd imagine it's in everybody's best interest to have a uniform API, so as to help promote adoption of the technology, but of course that may not happen :)

Speedmonk42
12-05-2006, 01:12 PM
I am still curious, though: how do the kinds of calculations in LW translate to these ATI/Nvidia GPU processors?

Does everything benefit? Or do some things benefit more than others, or some a lot and others not at all?