How many threads should a scene have?



Mr Maze
08-01-2007, 11:31 AM
When setting up a scene to be rendered by screamernet, how many threads should I have under the multithreading setting in render globals? Right now I have 2 computers: one with a dual core processor, and one with 2 single core processors. Each core/processor is set up as a node in screamernet. Should I be setting the scene to 1 thread?

BloodQuest
08-02-2007, 05:34 PM
One. No more, and (obviously) no less.

As has been discussed in the past, if you have enough RAM you can try running more than one render node per CPU (or core), on the basis that this forces the OS to yield more cycles to the rendering tasks.

Whether the gain is sufficient to offset the increased IO overhead may be fairly marginal, so experiment to see what sort of results you can achieve.

Simon

Mr Maze
08-02-2007, 07:34 PM
Thanks, I have been using one but was unsure... I figured it was better to ask...

Dave Jerrard
08-24-2007, 08:14 PM
One per CPU/core. LightWave 9.x uses extremely efficient thread management that well outperforms the OS's own. If you have two cores, you'll get your best performance with two threads. This is true for a farm as well: if each machine on the farm has two cores, then one node per machine, running with two threads, will be your optimal setting. Running two nodes actually cuts your resources in half per node. Each node would be splitting the available RAM with another node, so the RAM available to each drops. If you have a scene that uses more than a GB to render (a very likely case these days) and you only have 2GB installed, the nodes will not have enough to work with without paging to the hard drive, and performance will be severely impeded.

Likewise, the OS will handle the thread management less efficiently, adding its own overhead into the equation. When LightWave renders now (with the new cameras), each CPU core gets a chunk of the image to render, and when it's done with that chunk, if there's more to do, the remainder gets split amongst the CPUs again. No CPU is left idle. Running more than one single-threaded node on a machine will leave one CPU idle when one node finishes (if affinity is used), or it'll be at the mercy of the OS's management again.

I've done tests here with a large scene on a dual Opteron with 2GB. On a scene that takes about 1.2GB to render, rendering with two single-threaded nodes takes about twice as long as rendering the same two frames on one node with two threads. So, if a frame normally takes about 20 minutes on a single node, dual-threaded, two frames would be done in 40 minutes, while using multiple single-threaded nodes to render the same two frames simultaneously could easily take over an hour.
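Putting those numbers side by side in Python (a rough sketch only; the 1.5x factor below is just a stand-in for the "over an hour" result, since the real paging penalty depends on the scene and the drive):

# Illustrative only: plugs the figures quoted above into a quick comparison
# of one 2-threaded node versus two single-threaded nodes on a 2GB machine.
installed_ram_gb = 2.0
scene_ram_gb     = 1.2        # RAM the scene needs while rendering
frame_min_2thr   = 20.0       # one frame, one node using both cores

# Strategy A: one node, two threads, frames rendered back to back.
# The node sees all of the installed RAM, so there is no paging.
two_frames_a = 2 * frame_min_2thr                    # ~40 minutes

# Strategy B: two single-threaded nodes, one frame each, in parallel.
# A single-threaded frame takes roughly twice as long, and each node
# only gets about half the RAM to work with.
ram_per_node   = installed_ram_gb / 2                # ~1 GB each
paging         = scene_ram_gb > ram_per_node         # True here
frame_min_1thr = 2 * frame_min_2thr                  # ~40 minutes
two_frames_b   = frame_min_1thr * (1.5 if paging else 1.0)   # 60+ minutes

print(two_frames_a, two_frames_b, paging)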

He Who Gives The Renderer As Much RAM As He Can.

papou
08-25-2007, 07:21 AM
I just found that out myself too.

There's no longer any reason to launch one LWSN per CPU/core.
Now we can use one LWSN (multithreaded in lw.cfg) for all CPUs/cores.
That's great news for renderfarmers, like doubling your RAM at no cost!
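For example, where the old approach launched lwsn.exe once per core (job1, job2, ...), a single multithreaded node per machine can be started along these lines. This is a sketch only: the paths and the job/ack command-file names are placeholders for your own setup, using ScreamerNet's usual -2 batch-mode switches (-c config directory, -d content directory).

import subprocess

# Hypothetical paths; substitute your own ScreamerNet setup.
LWSN    = r"C:\LightWave\Programs\lwsn.exe"
CONFIG  = r"C:\LightWave\Configs"        # config directory holding lw.cfg
CONTENT = r"\\server\content"            # shared content directory

# One node per machine (job1/ack1) instead of one per core; the node
# renders with however many threads the scene / config asks for.
subprocess.run([
    LWSN, "-2",
    "-c" + CONFIG,
    "-d" + CONTENT,
    r"\\server\commands\job1",
    r"\\server\commands\ack1",
])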

dmack
08-28-2007, 06:26 AM
Say you have a mix on the network, some dual cpu dual cores (ie 4 cores) and some dual cpu quad cores, would you set the multithreading to 8 (the maximum on the network)? I'm guessing it wouldn't do any harm time wise to the 2x2 core boxes? Would that be the general rule - set the multithreading to the max number of cores that any one machine on the network has and then the others just split the frame more than necessary?

hdace
08-28-2007, 06:30 PM
Say you have a mix on the network, some dual cpu dual cores (ie 4 cores) and some dual cpu quad cores, would you set the multithreading to 8 (the maximum on the network)? I'm guessing it wouldn't do any harm time wise to the 2x2 core boxes? Would that be the general rule - set the multithreading to the max number of cores that any one machine on the network has and then the others just split the frame more than necessary?

I have thought about this too and I think you're right. I know Dave is right because my dual cores have one node each set to 2 threads and they run beautifully. I don't think it would hurt if I added a quad core, set the master config file to 4 threads and let the duals do four. I'm sure it adds a small overhead to the processing but it is probably minuscule.

John the Geek
09-01-2007, 09:53 AM
Say you have a mix on the network, some dual cpu dual cores (ie 4 cores) and some dual cpu quad cores, would you set the multithreading to 8 (the maximum on the network)? I'm guessing it wouldn't do any harm time wise to the 2x2 core boxes? Would that be the general rule - set the multithreading to the max number of cores that any one machine on the network has and then the others just split the frame more than necessary?


When I have multiple configs I make separate config directories. One folder called dual, one called quad, etc. Then, when launching each ScreamerNet instance, I make sure the right config directory gets called.

Telling a dual core machine to run with 8 threads might just choke it.
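Something like this can pick the right config folder automatically at launch time. Sketch only: the folder layout and the octo entry are made up for illustration, and the lwsn.exe arguments are the usual -2 batch-mode ones; adjust to your own setup.

import multiprocessing
import subprocess

# Hypothetical layout: one config folder per machine class, as described above,
# each saved with the matching render-thread setting.
CONFIG_DIRS = {
    2: r"C:\LightWave\Configs\dual",
    4: r"C:\LightWave\Configs\quad",
    8: r"C:\LightWave\Configs\octo",
}

cores = multiprocessing.cpu_count()
# Use the exact match if there is one, otherwise the biggest class this box covers.
best = max((k for k in CONFIG_DIRS if k <= cores), default=min(CONFIG_DIRS))
config = CONFIG_DIRS[best]

subprocess.run([
    r"C:\LightWave\Programs\lwsn.exe", "-2",
    "-c" + config,
    "-d" + r"\\server\content",
    r"\\server\commands\job1",
    r"\\server\commands\ack1",
])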

Dave Jerrard
09-01-2007, 10:13 PM
When I have multiple configs I make separate config directories. One folder called dual, one called quad, etc. Then, when launching each ScreamerNet instance, I make sure the right config directory gets called.

Telling a dual core machine to run with 8 threads might just choke it.
Naw, but it will run slightly slower that way, by about 1-5%, depending on the scene and system.


He Who Has Tried All Threading Options On His Dual Opteron Here.

Mr Maze
09-05-2007, 10:07 AM
Hey Dave - thanks for posting up in this thread. I will be setting up my pseudo-renderfarm properly now!