View Full Version : cache crunch

head worm
10-02-2008, 03:31 AM
Using LW9.5 for the first time on this job, I've had a strange issue when using a saved radiosity cache file.

I have a 750 frame sequence which uses Final Gather. If I switch off cached radiosity and render frame 1, it takes 12 minutes. If I bake the radiosity cache of the entire sequence (set to automatic) in 10 frame steps, frame 1 then takes 4 hours!! If I bake just a few cached frames (say 30 in 10 frame steps), frame 1 takes about 13 minutes.

Can anyone explain why baking out a long cached radiosity sequence would cause individual frame times to go through the roof, yet baking in small segments doesn't cause the same problem? I'd really like to know how I should best use baked radiosity cache.

PS: once the cache is baked, I'm setting it to "Locked".

12-24-2008, 01:30 AM
I have the same issue. With the cache turned off, render time is ~6 min per frame; with cached GI, render time can go up to 1.5 hours per frame (this is on a 10-node render farm). I've been exploring this issue and found that the nodes constantly read/write something to the network drive (quite heavy, non-stop traffic in both directions). So my guess was that the network was the bottleneck. I assigned the scene to a single node and it helped: it was slower than with the cache turned off, but not critically so (+1-2 min per frame). But now I have scenes that exhibit the same behavior on ONE node. Render time rises from 5 min to 1 hour with cached GI — say, 1st frame 5 min, 20th frame 45 min, and so on. With the cache turned off, render time is 5-6 min per frame for all frames.
I haven't found any solution yet (except moving back to 9.3). With V-Ray, precaching GI reduces render time, but not with LW. Looks like a serious bug to me.