
Thread: Lightwave rendering, computing normals?

  1. #1

    Lightwave rendering, computing normals?

    Hi there!

    I started a scene some months ago under LW 9.6 and recently reopened it in both LW 10.1 and 11.0.
    I've noticed that when rendering a frame, LW 10 and 11 spend quite a long time on a "Computing normals" step that 9.6 didn't show.
    I just can't find out what this function is?

    Thank you

  2. #2
    Super Duper Member kopperdrake's Avatar
    Join Date
    Mar 2004
    Location
    Derbyshire, UK
    Posts
    3,147
    +1 ignorant user here too
    - web: http://www.albino-igil.co.uk - 2D/3D Design Studio -
    - PC Spec: Intel i9-7940X @3.1GHz | 64Gb | 2 x GeForce GTX 1080 Ti 11Gb | Windows 10 Pro -

  3. #3
    Indie Indie Artist crpcory's Avatar
    Join Date
    Jul 2003
    Location
    Cleveland, OH
    Posts
    81
    Bumping this.

    I don't remember this being an issue earlier (pre LW 10/11) either, and for my current scene it's the vast majority of my render time.

    Any thoughts?
    OS X 10.6
    LW 9.6
    MacPro 2.8 octo
    ...Akron OH, the next hollywood

  4. #4
    TrueArt Support
    Join Date
    Feb 2003
    Location
    Poland
    Posts
    7,993
    Every renderer computes normals, even an OpenGL/DirectX game or preview renderer, even if it doesn't show this to the user.
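
    To illustrate the idea (this is a generic sketch of smooth vertex-normal computation, not LightWave's actual internal code): a renderer typically takes the cross product of two edge vectors per triangle to get a face normal, accumulates those onto each vertex, then normalizes. The function and variable names below are purely illustrative.

    ```python
    # Hypothetical sketch: averaging face normals into smooth vertex normals
    # for a triangle mesh. Not LightWave's implementation.

    def compute_vertex_normals(vertices, triangles):
        """vertices: list of (x, y, z); triangles: list of (i, j, k) index triples."""
        def sub(a, b):
            return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

        def cross(a, b):
            return (a[1]*b[2] - a[2]*b[1],
                    a[2]*b[0] - a[0]*b[2],
                    a[0]*b[1] - a[1]*b[0])

        normals = [(0.0, 0.0, 0.0)] * len(vertices)
        for i, j, k in triangles:
            # Face normal = cross product of two edge vectors of the triangle.
            n = cross(sub(vertices[j], vertices[i]), sub(vertices[k], vertices[i]))
            for idx in (i, j, k):
                normals[idx] = tuple(c + d for c, d in zip(normals[idx], n))

        # Normalize each accumulated vertex normal to unit length.
        result = []
        for n in normals:
            length = (n[0]**2 + n[1]**2 + n[2]**2) ** 0.5 or 1.0
            result.append(tuple(c / length for c in n))
        return result

    # One triangle lying in the XY plane: every vertex normal points along +Z.
    verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]
    tris = [(0, 1, 2)]
    print(compute_vertex_normals(verts, tris))  # each normal is (0.0, 0.0, 1.0)
    ```

    This per-vertex pass has to touch every polygon in the scene, which may be why it shows up as a noticeable step on very heavy meshes.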
