View Full Version : does LWSN.exe utilize memory or just CPU

08-28-2015, 04:57 AM
I didn't get any bites over on the screamernet forum, so I am trying to repost here in the general forum...

I use Smedge render manager controlling lwsn.exe (v11.6.3) for our network renders and it, by default, renders solely through the CPUs. Smedge has the option to utilize both cpu and memory but I don't know whether lwsn.exe does or can access memory for rendering. From what I've read on this forum (mostly older messages), it appears as though screamernet does utilize memory or at least can utilize it. Does anyone know for sure?

Secondly, does the lwsn.exe have user-adjustable controls to dictate cpu/memory usage for its renders or is it all automatic and internal to its code?

And just another side question that arose as I was reading some of the earlier threads, is screamernet used as the render engine for the routine F9 and F10 renders directly within LW or is it strictly for network rendering? I've always assumed it is the engine that LW taps for its internal renders whether networked or not.


08-28-2015, 05:14 AM
Any program uses both CPU and memory. ;)

You would have to ask the Smedge author what they meant by it.
Maybe they're creating a virtual disk (RAM disk) and loading the entire scene onto it,
then reading from that instead of from the physical disk. That would speed up some scenes.

Any LW render controller executes LWSN.EXE internally after doing its own work (transferring data between the server and clients).
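For reference, a controller typically launches lwsn.exe in one of ScreamerNet's two command-line modes: -2 (waits for commands from a controller via job/ack files) or -3 (renders a scene standalone, no controller). A rough sketch, where the config directory, content directory, and scene paths are placeholders you would substitute for your own setup:

```shell
REM Mode -2: node waits for a render controller (Smedge, ScreamerNet panel, etc.)
REM job1/ack1 are the command and acknowledgment files the controller polls.
lwsn.exe -2 -c"C:\LW\Config" -d"C:\Projects\Content" job1 ack1

REM Mode -3: standalone batch render of frames 1-100 of a scene, no controller needed.
lwsn.exe -3 -c"C:\LW\Config" -d"C:\Projects\Content" "Scenes\MyScene.lws" 1 100
```

Either way, memory limits aren't set on this command line; lwsn.exe allocates what the scene needs (geometry, textures, buffers), and settings like segment memory come from the scene/config files it loads.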