View Full Version : Lightwave rendering, computing normals?

03-12-2012, 04:47 AM
Hi there!

I started a scene some months ago under LW 9.6 and have since reopened it in both LW 10.1 and 11.0.
I've noticed that when rendering a frame, LW 10 and 11 go through a "computing normals" step that 9.6 didn't show, and it takes a pretty long time.
I just can't figure out what this function is. Any ideas?

Thank you :)

03-12-2012, 05:29 PM
+1 ignorant user here too :)

05-26-2012, 01:35 PM
Bumping this.

I don't feel like I remember seeing this be an issue earlier (pre LW 10/11) either and for my current scene it's the vast majority of my render time.

Any thoughts?

05-26-2012, 03:28 PM
Every renderer computes normals, even the OpenGL/DirectX game/preview renderers, whether or not it shows that step to the user. Normals are needed for shading, so the step itself is normal; only the time it takes is worth investigating.
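To illustrate what that step does, here is a minimal sketch of how a renderer can derive a face normal from triangle geometry. This is a generic illustration of the math, not LightWave's actual code or API; the helper names are made up for the example.

```python
from math import sqrt

def sub(a, b):
    # Edge vector from point b to point a.
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(a, b):
    # Cross product of two 3D vectors.
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    # Scale a vector to unit length (leave zero vectors alone).
    length = sqrt(v[0] ** 2 + v[1] ** 2 + v[2] ** 2)
    return (v[0] / length, v[1] / length, v[2] / length) if length else v

def face_normal(p0, p1, p2):
    # A triangle's normal is the normalized cross product of two of its edges.
    return normalize(cross(sub(p1, p0), sub(p2, p0)))

# A unit triangle lying in the XY plane has a normal pointing along +Z.
print(face_normal((0, 0, 0), (1, 0, 0), (0, 1, 0)))  # (0.0, 0.0, 1.0)
```

A renderer does this (plus averaging face normals into smoothed vertex normals) for every polygon, so on a heavy mesh the step can take noticeable time.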