PDA

View Full Version : GPU VS CPU showdown render test!



cresshead
04-12-2014, 10:59 AM
hi

okay it's Blender... don't string me up!
this is a test scene i've uploaded to see how CPU compares to GPU when rendering out a Blender test scene with Cycles,
since you can choose either CPU or GPU from the same renderer.

the render is 1920 x 1080 pixels (note: when you open the zip it's set to 50% on the render tab, just slide it up to 100%)

https://scontent-a-lhr.xx.fbcdn.net/hphotos-ash3/t1.0-9/10173688_10152376035352871_5745285452599506173_n.jpg

so
CPU
1. my OLD quad-core Intel 2.4GHz with no hyperthreading :: 20mins 34 seconds
2. my iMac i5 2.5GHz rendered the HD frame in :: 10mins 01 seconds
3. hex-core hyperthreaded (12 threads) i7-4930K at around 4.1GHz :: 3mins 44 seconds


GPU
1. GTX 760 (1152 CUDA cores, 2GB RAM) :: 3mins 42 seconds
2. GTX 770 (1536 CUDA cores, 4GB RAM) :: 2mins 47 secs
3. Quadro K5000 (1536 CUDA cores but a lower clock of 700MHz, 4GB RAM) :: 4mins 40 seconds


conclusions:
basically, using Cycles as it's hybrid, you can choose CPU or GPU.
a £172 graphics card is as fast as a six-core, 12-bucket i7-4930K chip which costs £413,
with the caveat that you could simply add another card and double your speed,
whereas you'd need another complete PC to get another CPU...
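the price/performance point above works out like this (times and GBP prices are the ones quoted in this thread; the script just does the arithmetic):

```python
# Speed and rough price/performance comparison from the times posted above.
# Same 1920x1080 Cycles scene on both devices; prices in GBP as quoted.

def to_seconds(minutes, seconds):
    """Convert a 'Xmins Ysecs' render time to plain seconds."""
    return minutes * 60 + seconds

cpu_4930k = to_seconds(3, 44)   # hex-core i7-4930K, 12 threads, ~£413
gtx_760   = to_seconds(3, 42)   # GTX 760, 1152 CUDA cores, ~£172

# Near-identical render times...
speedup = cpu_4930k / gtx_760
print(f"GTX 760 vs i7-4930K: {speedup:.2f}x")

# ...at well under half the price (lower time*price product is better).
print(f"i7-4930K time*price: {cpu_4930k * 413}")
print(f"GTX 760 time*price:  {gtx_760 * 172}")
```

the ~2.4x gap in the time*price product is the whole argument for the second card.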

if you want to test the scene yourself let me know


of course the limiting factor is RAM on the GPU.

50one
04-12-2014, 11:23 AM
Please add refractive objects, SSS and a micropoly-displaced ball, otherwise this test is pointless :)

Greenlaw
04-12-2014, 11:41 AM
Wow, that's very interesting. Thanks for posting.

In my case, I can't upgrade the GTX 460 card in my computer because my computer is old and pretty much maxed out at this point. Reports like this have me seriously thinking about getting a new box later this year though.

G.

cresshead
04-12-2014, 12:52 PM
would also be cool to see some render time tests with lightwave CPU vs octane GPU on same geometry scenes.

lighting process would be different i guess.

cresshead
04-14-2014, 11:39 AM
more render times come in..

GPU
1. K4000 (which is effectively a 560 Ti), small 32x32 tiles :: 5mins 00 seconds
2. K4000, 128x128 tiles on the GPU :: 4mins 24 seconds


CPU
16 CPUs (32 threads) HP workstation - small tiles at 32x32 make the CPU render slightly faster :: 2mins 01.69 seconds

Danner
04-14-2014, 02:55 PM
Even if we limit the comparison to just render times, it's very hard to compare GPU vs CPU because every situation is different. LightWave's native renderer can cache the radiosity solution: on my renders it only needs to be computed every 8-12 frames, and even then only the first frame takes any time at all. With a little trick my radiosity pass usually takes less than 20 min for the whole scene. I also cache the shadow-mapped lights. For stills, for animated characters, or any kind of render where caching is not a good idea, GPU renderers would shine.

What I'm saying is that it's not that simple. Comparing rendering speed on a still frame that uses a lot of fur is not a real-world benchmark for GPU vs CPU... unless you do that kind of stuff.

cresshead
04-14-2014, 03:41 PM
Cycles in 2.71 is getting GI baking, so that'll give it the same capability for animation as LightWave then...

for sure cpu and gpu both have plus and minus points...good to have both options though

Rayek
04-14-2014, 11:39 PM
Hi Cresshead - Could I test that scene of yours? I have a GTX590 - might be interesting how it runs on that.

geo_n
04-14-2014, 11:55 PM
Would be interesting to see a lwoctane and cycles preset scene test.

safetyman
04-15-2014, 05:30 AM
Cress, as an aside, try larger tile sizes on the GPU (256x256 and higher).
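in the 2.7x Python API the tile size sits on the render settings, so safetyman's suggestion can be tried from Blender's Python console roughly like this (run inside Blender; assumes a GPU compute device is already set up in user preferences):

```python
import bpy  # only available inside Blender

scene = bpy.context.scene
scene.render.engine = 'CYCLES'
scene.cycles.device = 'GPU'

# Rule of thumb in 2.7x: larger tiles tend to suit the GPU,
# smaller tiles (e.g. 32x32) tend to suit the CPU.
scene.render.tile_x = 256
scene.render.tile_y = 256
```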

cresshead
04-16-2014, 12:38 PM
as you open the file it's set to 50% render size, just slide it up to 100% to render a full HD frame

www.f9render.com/blender/blender_27_fur_test.zip
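if you're scripting the test, the 50% default can also be flipped back from Blender's Python console (2.7x API, run inside Blender):

```python
import bpy  # only available inside Blender

# The uploaded .blend ships at 50%; restore full HD before timing renders.
scene = bpy.context.scene
scene.render.resolution_x = 1920
scene.render.resolution_y = 1080
scene.render.resolution_percentage = 100
```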

cresshead
04-26-2014, 05:35 AM
https://www.youtube.com/watch?v=r3WasjW247U

render test to see if the FUR pops or flickers when the object and camera are animated

erikals
04-26-2014, 06:54 AM
looks great, wonder, is it fur, or is it polygons...?

cresshead
04-26-2014, 02:03 PM
its blender fur

cresshead
04-27-2014, 02:31 AM
new test model for fur

just head n shoulders :)

https://scontent-b-lhr.xx.fbcdn.net/hphotos-ash3/t31.0-8/1617907_10152409940407871_4500388498647060603_o.jpg

eagleeyed
04-28-2014, 07:00 PM
I came in to see the comparison results, but was amazed at the quality of the hair.
Time to download Blender.

JoePoe
04-28-2014, 07:46 PM
new test model for fur

just head n shoulders :)

https://scontent-b-lhr.xx.fbcdn.net/hphotos-ash3/t31.0-8/1617907_10152409940407871_4500388498647060603_o.jpg

If we could just figure out what your inspiration was! :D :D

cresshead
05-17-2014, 12:11 PM
okay new pc arrived today so did the cat render test with it!

i7-4930K CPU - six-core at 3.4GHz (3.7GHz turbo)
CPU render time: 3mins 54 seconds (12 buckets)

GPU is a Gigabyte GTX 780 with 2304 CUDA cores
GPU render time: 1min 48 seconds

this was with Windows 8 64bit, 16GB RAM

erikals
05-17-2014, 04:08 PM
I came in to see the comparison results, but was amazed at the quality of the hair.
Time to download Blender.

i know... it looks king...! :)

cresshead
05-17-2014, 04:53 PM
standard render test scene from blender to compare

https://fbcdn-sphotos-g-a.akamaihd.net/hphotos-ak-frc1/t1.0-9/10298785_10152456012702871_5182718217270296514_n.jpg


update: managed to get the iMac down to 3mins 7 seconds (187 seconds) by changing the bucket size to 32x32 and making it render top to bottom

erikals
05-17-2014, 07:11 PM
the more i look at GPU times the more i wonder when NT will go that route...

i know Jay Roth said NT was gonna implement it in the future, that was 6 years ago or so...

madno
05-18-2014, 12:45 AM
2 x Xeon E5-2687W @ 3.1 GHz (factory default)
1 x Nvidia Quadro K6000 (factory default)

CUDA:
tile size 512x512
00:32:74

openCL:
tile size 512x512
00:37:48

CPU:
tile size 32x32
00:46:60
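madno's times read as mm:ss:hundredths; assuming that interpretation (it's my reading, not stated in the post), the relative speeds come out as:

```python
# Normalise madno's K6000/Xeon results to the fastest run.
# The mm:ss:hundredths reading of the timestamps is an assumption.
times = {
    "CUDA 512x512":   (0, 32, 74),
    "OpenCL 512x512": (0, 37, 48),
    "CPU 32x32":      (0, 46, 60),
}

def to_sec(m, s, hundredths):
    return m * 60 + s + hundredths / 100

secs = {name: to_sec(*t) for name, t in times.items()}
fastest = min(secs.values())
for name, t in secs.items():
    print(f"{name}: {t:.2f}s ({t / fastest:.2f}x the fastest)")
```

so even on a top-end Quadro, CUDA edges out OpenCL by ~15%, and the dual Xeons trail the GPU by ~40%.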

cresshead
05-18-2014, 05:56 AM
2 x Xeon E5-2687W @ 3.1 GHz (factory default)
1 x Nvidia Quadro K6000 (factory default)

CUDA:
tile size 512x512
00:32:74

openCL:
tile size 512x512
00:37:48

CPU:
tile size 32x32
00:46:60

hi, which scene are you rendering out in this test?

if it's the cat scene in Blender you need to drag the render size to 100% - i think i left it at 50% by mistake in that file, but everyone so far has moved it to 100%, so renders should be full HD size at 1920 x 1080.
sorry for any confusion :)

if it's the car scene then you're fine, just hit render, unless you want to set up a different bucket size, as the default bucket size means you'll get a slow render result from the GPU

fazi69
05-18-2014, 07:27 AM
The main reason for GPU rendering, in my opinion, is the easy way to expand the computing power of a workstation. Just add another GPU - most of us have 4 slots for that... how many free slots for an additional CPU do you have?

cresshead
05-18-2014, 08:04 AM
The main reason for GPU rendering, in my opinion, is the easy way to expand the computing power of a workstation. Just add another GPU - most of us have 4 slots for that... how many free slots for an additional CPU do you have?

correct, the only other thing you need is a much larger PSU. for example, a GTX 780 under full load could take around 400w, so adding a second GTX 780 means you'd need a 1000w PSU in your computer,
and you need decent cooling in the box - plenty of decent fans.

nvidia's site says 250w, and toms hardware's test says around 250 also

http://www.geforce.co.uk/hardware/desktop-gpus/geforce-gtx-780/specifications

http://media.bestofmicro.com/W/I/408114/original/Power-Consumption-GTX-780-Windforce-GHz-Edition.png

http://media.bestofmicro.com/W/S/408124/original/05-Power-Consumption-Torture.png


although in this test they say a 780 is 400w
http://www.hardocp.com/article/2014/01/20/gigabyte_gtx_780_ghz_edition_video_card_review/10#.U3jDbFhdVwA

not sure who's right now!

okay more digging

http://www.tomshardware.co.uk/geforce-gtx-760-vs-780-sli,review-32851-8.html

http://media.bestofmicro.com/7/N/411107/original/image019.png
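the conflicting wattage figures above can be turned into a back-of-envelope PSU budget; the 250W per card is NVIDIA's spec-page figure, while the 250W "rest of system" estimate and the 30% headroom factor are my own assumptions for illustration:

```python
# Back-of-envelope PSU sizing for adding a second GPU.
# 250W per GTX 780 is NVIDIA's quoted TDP; the rest-of-system figure
# and headroom multiplier are assumptions, not measurements.
GPU_TDP_W      = 250
CPU_AND_REST_W = 250   # overclocked hex-core + board + drives (estimate)
HEADROOM       = 1.3   # ~30% margin for load spikes and PSU ageing

def psu_needed(num_gpus):
    load = num_gpus * GPU_TDP_W + CPU_AND_REST_W
    return load * HEADROOM

for n in (1, 2):
    print(f"{n} x GTX 780: ~{psu_needed(n):.0f}W PSU recommended")
```

on these assumptions a 750w PSU covers one card comfortably, while two cards push towards the 1000w class.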

fazi69
05-18-2014, 08:17 AM
Sure, the PSU needs to be a quality one. 850W is enough for an overclocked six-core CPU and two GTX 780s. An additional two GTX 780s will consume around 480W. It is easy to add a cheap external PSU just for the cards - in my case (a Chieftec big tower) there is enough space for an additional PSU.
With the new Maxwell-based cards it will be possible to have a 1000w PSU and four cards as fast or faster than a GTX 780.

ianr
05-18-2014, 09:07 AM
Well Cress,
with those GPU results in a free prog (Blender)
I would mail those results to Mr. Powers... tout de suite!
LW 12 must smell that coffee!

cresshead
05-18-2014, 09:33 AM
Well Cress,
with those GPU results in a free prog (Blender)
I would mail those results to Mr. Powers... tout de suite!
LW 12 must smell that coffee!

i really hope NewTek embraces GPU rendering for LightWave 12 - maybe even hybrid, using CPU or GPU, or even both together!

cresshead
05-18-2014, 09:35 AM
Sure, the PSU needs to be a quality one. 850W is enough for an overclocked six-core CPU and two GTX 780s. An additional two GTX 780s will consume around 480W. It is easy to add a cheap external PSU just for the cards - in my case (a Chieftec big tower) there is enough space for an additional PSU.
With the new Maxwell-based cards it will be possible to have a 1000w PSU and four cards as fast or faster than a GTX 780.

i have a 750w psu in my new pc, so it may be okay, or might need throwing out when/if i get a second gpu.

madno
05-19-2014, 08:20 PM
hi, which scene are you rendering out in this test? ...

It was the car scene.

cresshead
05-20-2014, 08:45 PM
oh okay cool thanks for the reply :)

jburford
05-21-2014, 04:44 PM
Damn, must say the Fur looks really great! Hmnn....

cresshead
05-21-2014, 09:30 PM
yup Blender is pretty amazing when it comes to fur, i was mighty surprised.

vonpietro
06-10-2014, 03:09 PM
so if i'm reading this right, you can use gpu to render in blender latest version?

if so, can you tell me where the setting is?

BokadCastle
06-10-2014, 05:00 PM
so if i'm reading this right, you can use gpu to render in blender latest version?

if so, can you tell me where the setting is?

http://wiki.blender.org/index.php/Doc:2.6/Manual/Render/Cycles/GPU_Rendering

Rayek
06-10-2014, 05:34 PM
so if i'm reading this right, you can use gpu to render in blender latest version?

if so, can you tell me where the setting is?

Also do not forget to set the render engine to "Cycles". And any viewport can be set to a GPU rendered one as well, updating in real time when changes are made.
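putting Rayek's and BokadCastle's answers together, the whole setup can also be done from the Python console (2.7x API, run inside Blender; assumes an NVIDIA card for the CUDA path):

```python
import bpy  # only available inside Blender

# Blender 2.7x: pick CUDA as the compute device in user preferences,
# then switch the scene to Cycles and point it at the GPU.
bpy.context.user_preferences.system.compute_device_type = 'CUDA'
bpy.context.scene.render.engine = 'CYCLES'
bpy.context.scene.cycles.device = 'GPU'
```

the same three settings are what the manual page linked above walks you through in the UI.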