nvidia SIGGRAPH 2018



Zerowaitstate
08-13-2018, 07:55 PM
check it out https://www.facebook.com/NVIDIA/videos/10155983076983253/

Zerowaitstate
08-13-2018, 08:03 PM
Oh my, CPU rendering..... I'm not so sure about its future any more.

TheLexx
08-14-2018, 07:14 AM
Here is an article including the specs table of the new Nvidia RTX GPU range.
https://www.anandtech.com/show/13217/nvidia-announces-turing-powered-quadro-rtx-family

I am a bit lost about what is happening with LW and Octane 4. Will they be compatible, and will these GPUs make the combination faster still?

AMD has a 64 thread Threadripper CPU coming.
https://www.redsharknews.com/technology/item/5714-amds-new-threadripper-now-features-up-to-a-whopping-64-threads-over-32-cores

I don't know what goes on under the hood. Is there any reason why AI can not be incorporated in the CPU at some level?

Qexit
08-14-2018, 08:02 AM
Here is an article including the specs table of the new Nvidia RTX GPU range.
https://www.anandtech.com/show/13217/nvidia-announces-turing-powered-quadro-rtx-family

I am a bit lost about what is happening with LW and Octane 4. Will they be compatible, and will these GPUs make the combination faster still?

AMD has a 64 thread Threadripper CPU coming.
https://www.redsharknews.com/technology/item/5714-amds-new-threadripper-now-features-up-to-a-whopping-64-threads-over-32-cores

I don't know what goes on under the hood. Is there any reason why AI can not be incorporated in the CPU at some level?

You can already buy a 64 thread Threadripper. They were released this week, though I can only find them available for pre-order here in the UK. Very limited stock...plus I would need a lottery win or some similar windfall to be in a position to buy one, including the rest of the kit required. £1,700 is the asking price for the 32-core Ryzen Threadripper 2 2990WX alone at Scan. That might be cheap for what you are getting, but way outside my current budget :oye:

rustythe1
08-14-2018, 10:35 AM
Here is an article including the specs table of the new Nvidia RTX GPU range.
https://www.anandtech.com/show/13217/nvidia-announces-turing-powered-quadro-rtx-family

I am a bit lost about what is happening with LW and Octane 4. Will they be compatible, and will these GPUs make the combination faster still?


Probably not. This is what I was mentioning in another thread ("what happens when Nvidia drops CUDA?"), as they will, and by the look of it soon. The RTX cards have fewer CUDA cores and more Tensor cores than previous models, and if I'm not mistaken the next 11 series cards (at least the high end ones) are also dropping some CUDA cores and adding Tensor cores, so current Octane may be faster on older cards unless Tensor support has been added to the mix already? The problem with that is older cards are already sometimes expensive and getting hard to get hold of thanks to mining.

rustythe1
08-14-2018, 10:38 AM
by the way, that Quadro will probably be in the region of $10,000 and each customer is only allowed to purchase one!

m.d.
08-14-2018, 06:04 PM
Where did you get the idea they are dropping CUDA?

Tensor cores are solely for AI, you realize. They are adding Tensor cores to support AI de-noising and AI lighting....it's not a competing technology with CUDA.

Tensor is a library inside CUDA.....it's not a separate language.

https://devblogs.nvidia.com/programming-tensor-cores-cuda-9/
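
For anyone wondering what "a library inside CUDA" means in practice, the blog post linked above walks through the WMMA API in mma.h. A minimal sketch (assuming Volta/Turing-class hardware and CUDA 9 or later, compiled for sm_70+) of a single warp driving one 16x16x16 half-precision multiply-accumulate on the Tensor Cores:

#include <mma.h>
#include <cuda_fp16.h>
using namespace nvcuda;

// One warp computes D = A*B + C for a single 16x16x16 tile on the Tensor Cores.
// A and B hold FP16 data, the accumulator is FP32. Launch with <<<1, 32>>>.
__global__ void tensor_tile_mma(const half *a, const half *b, float *c)
{
    wmma::fragment<wmma::matrix_a, 16, 16, 16, half, wmma::row_major> a_frag;
    wmma::fragment<wmma::matrix_b, 16, 16, 16, half, wmma::col_major> b_frag;
    wmma::fragment<wmma::accumulator, 16, 16, 16, float> acc_frag;

    wmma::fill_fragment(acc_frag, 0.0f);                  // zero the accumulator
    wmma::load_matrix_sync(a_frag, a, 16);                // load the 16x16 A tile
    wmma::load_matrix_sync(b_frag, b, 16);                // load the 16x16 B tile
    wmma::mma_sync(acc_frag, a_frag, b_frag, acc_frag);   // the Tensor Core op
    wmma::store_matrix_sync(c, acc_frag, 16, wmma::mem_row_major);
}

It compiles, launches and debugs like any other CUDA kernel; the only new part is the wmma fragment types, which is the sense in which Tensor Cores extend CUDA rather than replace it.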

m.d.
08-14-2018, 06:26 PM
Probably not. This is what I was mentioning in another thread ("what happens when Nvidia drops CUDA?"), as they will, and by the look of it soon. The RTX cards have fewer CUDA cores and more Tensor cores than previous models, and if I'm not mistaken the next 11 series cards (at least the high end ones) are also dropping some CUDA cores and adding Tensor cores, so current Octane may be faster on older cards unless Tensor support has been added to the mix already? The problem with that is older cards are already sometimes expensive and getting hard to get hold of thanks to mining.

Already running in-house....along with Redshift, RenderMan, Clarisse, Arnold, Blender, and Thea.

It's still run through CUDA....just like Tensor cores are optimized for AI, RT cores are optimized for raytracing....it's all still CUDA.

From Nvidia's press release:

Otoy OctaneRender: GPU-accelerated, unbiased, physically correct renderer is demonstrating performance improvements of 5-8x with Octane 2019’s path-tracing kernel — running at 3.2 billion rays/second on NVIDIA Quadro RTX 6000, compared with 400 million rays/second on Quadro P6000.
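
(For scale: 3.2 billion against 400 million rays/second works out to 8x, the top end of that quoted 5-8x range.)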

m.d.
08-14-2018, 06:33 PM
by the way, that Quadro will probably be in the region of $10,000 and each customer is only allowed to purchase one!

$2300 for the cheapest, which should be 60% the power of the $10k one....

So like 5x faster than the fastest current Nvidia GPU for about double the price

pixym
08-14-2018, 08:53 PM
I really think CPU rendering will be over in the next few years :-(

m.d.
08-14-2018, 09:15 PM
It'll still be around....but it's hard to compare a general purpose CPU against specialty ray tracing cores.
Octane was a 5x speed boost on top of LW's renderer....now a $2k card is another 5x speed boost on top of Octane. Not sure how a CPU renderer can compete when a single card can outperform a 20+ CPU farm.

Not to mention all the other things a GPU can speed up....I used to have a $5k Red Rocket card for realtime 4k Red....until a single gaming CUDA GPU blew it away in performance.
AI runs faster....physics run faster, fluids run faster...video decoding runs faster.

Ernest
08-14-2018, 11:53 PM
Where did you get the idea they are dropping CUDA?

Tensor cores are solely for AI, you realize. They are adding Tensor cores to support AI de-noising and AI lighting....it's not a competing technology with CUDA.

Tensor is a library inside CUDA.....it's not a separate language.

https://devblogs.nvidia.com/programming-tensor-cores-cuda-9/

I think he's talking about how less and less of the die is, or will be, dedicated to shader/compute cores. I just don't see them ever being eliminated, but RT and Tensor cores took over a whole half of the die in just one generation!


CPU rendering may continue to exist now that there is a core/price arms race in the CPU market again, but it will not be pure CPU rendering anymore. Doing denoising and anti-aliasing on the CPU, at the very least, is quickly becoming a crazy proposition.

I can't help but cringe at myself every time I think of how smugly I would have laughed 3 years ago at anyone who suggested that in just 3 years there would be realtime blurry reflections and caustics.

Now, I almost expect that in 10 years there will be a pure AI renderer that will just read the scene and guess how the final render would look, without calculating any rays.

m.d.
08-15-2018, 01:08 AM
Tensor cores are just matrix calculators. They are optimized and intended for AI, but they can really be put to work on any matrix calculation. It's no reason to panic. Although a larger share of the die is going to RT and Tensor cores, shader units will still likely grow with each generation.
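
A rough illustration of that point (a host-side sketch, error handling omitted, assuming CUDA 9+ cuBLAS and a card with Tensor Cores): you don't even need to write a kernel to use them for ordinary matrix math, since cuBLAS will route a plain GEMM through them if asked.

#include <cublas_v2.h>
#include <cuda_fp16.h>

// Run an ordinary column-major m x n x k matrix multiply through the Tensor Cores
// by letting cuBLAS pick a tensor-op algorithm. A and B are FP16 device buffers,
// C is an FP32 device buffer. No error checking, for brevity.
void gemm_on_tensor_cores(cublasHandle_t handle, int m, int n, int k,
                          const __half *A, const __half *B, float *C)
{
    const float alpha = 1.0f, beta = 0.0f;
    cublasSetMathMode(handle, CUBLAS_TENSOR_OP_MATH);         // allow Tensor Core math
    cublasGemmEx(handle, CUBLAS_OP_N, CUBLAS_OP_N,
                 m, n, k, &alpha,
                 A, CUDA_R_16F, m,                             // FP16 inputs
                 B, CUDA_R_16F, k,
                 &beta,
                 C, CUDA_R_32F, m,                             // FP32 accumulate/output
                 CUDA_R_32F, CUBLAS_GEMM_DEFAULT_TENSOR_OP);   // prefer tensor-op kernels
}

There is nothing AI-specific in that call; it is the same GEMM a renderer, a physics solver or a neural network would issue.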

And really....how fast do we need to run Far Cry at full AA and 4K? Games are changing too....
Soon we will have real-time raytracing in games as well, within a few years. Nvidia knows this, and that's why the hardware needs to be refocused beyond just shader units.

It's strange that people are predicting the death of GPU rendering at the very moment it is breaking out and changing the entire industry. Very soon game engines and traditional 3D renderers will cross paths and be indistinguishable from each other. Then the cost calculation of the rendering side for software companies will go right out the window.

rustythe1
08-16-2018, 06:15 AM
Where did you get the idea they are dropping CUDA?

Tensor cores are solely for AI, you realize. They are adding Tensor cores to support AI de-noising and AI lighting....it's not a competing technology with CUDA.

Tensor is a library inside CUDA.....it's not a separate language.

https://devblogs.nvidia.com/programming-tensor-cores-cuda-9/

It wasn't an idea, it was an observation: the new ranges all have around 1,000 fewer CUDA cores now that Tensor cores are added, so it stands to reason they could be slower than the older cards in CUDA-only apps.

rustythe1
08-16-2018, 06:18 AM
And I'm not saying it will end or die off. I'm saying smaller studios and hobbyists are probably going to have to fork out to stay current, and the software itself is going to have to develop fast to keep up with hardware changes, whereas CPU rendering gives people the option to stand still.

Nicolas Jordan
08-16-2018, 09:27 AM
I would like to get a Threadripper 2 2990WX in the next 6 months or so, but its price of $2400 CDN has me thoroughly investigating GPU rendering again, with Octane as my next step forward. I have never liked the idea of purchasing GPU hardware specifically for rendering in Octane, since it would be totally useless for any native rendering. This is why CPU rendering still has a slight advantage in my mind.