Page 6 of 6
Results 76 to 81 of 81

Thread: GPU Rendering - Anywhere in the development pipeline for Lightwave?

  1. #76
    Registered User ianr's Avatar
    Join Date
    Oct 2006
    Location
    Chiltern Riviera
    Posts
    1,402
    NVIDIA RTX DOES RAYTRACING ACCELERATION

    is that loud enough for yer [email protected] ?

  2. #77

    Quote Originally Posted by ianr View Post
    NVIDIA RTX DOES RAYTRACING ACCELERATION

    is that loud enough for yer [email protected] ?
    no, usually someone has to quote it.  
    LW vidz   DPont donate   LightWiki   RHiggit   IKBooster   My vidz

  3. #78
    Registered User
    Join Date
    Apr 2019
    Location
    Lubbock, Texas
    Posts
    9

    Then perhaps....

    Quote Originally Posted by erikals View Post


    no, usually someone has to quote it.  
    Quoting the quote will help.

  4. #79
    Registered User
    Join Date
    Feb 2019
    Location
    Lancashire, England
    Posts
    9
    I know Nvidia supports ray tracing at the hardware level with RTX, but you have to remember that RTX is proprietary to NVIDIA. Before long, AMD and Intel are going to release graphics cards WITH raytracing capabilities. I do believe that GPU rendering is the future, but I'd hold off until NVIDIA, AMD and Intel have all released their raytracing cards AND an API such as possibly OpenCL supports hardware raytracing. Once an API like OpenCL (or another) supports hardware raytracing and is mature, that would be the time for Lightwave to start implementing it — when both the GPU hardware supports it and an API supports it fully. I don't think hardware raytracing is mature enough yet; it is still in its infancy.

    You also have to remember that most modern GPU memory tops out at about 11GB to 16GB of VRAM (on standard gaming GPUs; I know there is an NVIDIA card with 48GB). Compare that to a workstation that can have 256GB of RAM, or even a server with 2048GB. I don't think the current amount of VRAM is enough for high-end use with very large, complicated scenes and complicated models.

    I do however think that GPU rendering is the future; I just see the lack of video RAM compared to system RAM, the immaturity of hardware raytracing, and the lack of an API to support GPU rendering as reasons not to implement it in 2019. When video RAM is at least roughly 32GB, hardware raytracing is mature, and there is an open API that supports it, then that would be the time to implement GPU rendering inside Lightwave.
    Last edited by MichaelBeckwith; 09-14-2019 at 06:37 PM.
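The VRAM-versus-scene-size point above can be made concrete with some back-of-envelope arithmetic. This is a hypothetical sketch with illustrative numbers only (real renderers compress and stream textures, so actual footprints vary widely):

```python
# Back-of-envelope VRAM estimate for a texture-heavy scene.
# Numbers are illustrative, not measured from any real renderer.
MiB = 2 ** 20

def texture_bytes(width, height, bytes_per_texel=4, mipmaps=True):
    """Uncompressed texture size; a full mip chain adds about one third."""
    base = width * height * bytes_per_texel
    return base * 4 // 3 if mipmaps else base

one_4k = texture_bytes(4096, 4096)        # 8-bit RGBA 4K texture
print(one_4k // MiB)                      # 85 MiB with mips (64 MiB base)
print(200 * one_4k // 2 ** 30)            # ~16 GiB for 200 such textures
```

Two hundred uncompressed 4K textures already exceed an 11GB card before any geometry or acceleration structures are counted, which is the gap the post is pointing at.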

  5. #80
    Quote Originally Posted by MichaelBeckwith View Post

    You also have to remember that most modern GPU memory tops out at about 11GB to 16GB of VRAM (on standard gaming GPUs; I know there is an NVIDIA card with 48GB). Compare that to a workstation that can have 256GB of RAM, or even a server with 2048GB. I don't think the current amount of VRAM is enough for high-end use with very large, complicated scenes and complicated models.
    A lot of rendering is done through render layers and compositing, so not everything you see has to be loaded into memory at the same time, only parts. All of those elements are then put together in comp. So 11GB can be more than adequate to render different layers of a scene. Also, with NVLink on a pair of 2080 Ti cards, renderers that support memory pooling can effectively turn that 11GB into 22GB.
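    The render-layers idea above boils down to rendering each element separately and combining them back-to-front in comp. A minimal sketch, assuming premultiplied-alpha RGBA pixels and the standard Porter–Duff "over" operator (real pipelines do this in tools like Nuke or Fusion):

```python
# Composite separately rendered layers with the "over" operator.
# Each pixel is (r, g, b, a), premultiplied alpha, values in 0..1.

def over(fg, bg):
    """Composite a foreground pixel over a background pixel."""
    fr, fg_g, fb, fa = fg
    br, bg_g, bb, ba = bg
    k = 1.0 - fa
    return (fr + br * k, fg_g + bg_g * k, fb + bb * k, fa + ba * k)

def composite(layers):
    """Layers ordered back-to-front; each layer is a list of RGBA pixels."""
    result = list(layers[0])
    for layer in layers[1:]:
        result = [over(f, b) for f, b in zip(layer, result)]
    return result

# Two one-pixel "layers": an opaque blue background and a
# half-transparent red foreground (premultiplied).
background = [(0.0, 0.0, 1.0, 1.0)]
foreground = [(0.5, 0.0, 0.0, 0.5)]
print(composite([background, foreground]))   # [(0.5, 0.0, 0.5, 1.0)]
```

    The point for memory use is that each layer can be rendered in its own pass, so only that layer's assets need to be resident on the GPU at once.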

  6. #81
    Registered User
    Join Date
    Jun 2014
    Location
    Right here
    Posts
    1,535
    Quote Originally Posted by MichaelBeckwith View Post
    You also have to remember that most modern GPU memory tops out at about 11GB to 16GB of VRAM (on standard gaming GPUs; I know there is an NVIDIA card with 48GB). Compare that to a workstation...
    That was an issue in the past; modern GPU engines support out-of-core memory (geometry/textures) now, and higher-end RTX cards offer NVLink to combine GPU memory.

    You miss out on a lot by not using GPU rendering nowadays. I can render animations, with all the bells and whistles, in a fraction of the time they would have taken before.
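    "Out-of-core" in the post above means the renderer streams resources on demand instead of requiring the whole scene to be resident in VRAM. A toy sketch of the idea, assuming a simple least-recently-used cache of texture tiles (hypothetical class, not any real engine's API):

```python
# Toy out-of-core illustration: keep only a small cache of "texture
# tiles" in (simulated) GPU memory, loading tiles on demand and
# evicting the least recently used tile when the cache is full.
from collections import OrderedDict

class TileCache:
    def __init__(self, capacity):
        self.capacity = capacity      # max tiles resident at once
        self.cache = OrderedDict()    # tile_id -> tile data, LRU order
        self.loads = 0                # count of "disk -> GPU" transfers

    def fetch(self, tile_id):
        if tile_id in self.cache:
            self.cache.move_to_end(tile_id)    # mark as recently used
            return self.cache[tile_id]
        if len(self.cache) >= self.capacity:
            self.cache.popitem(last=False)     # evict least recently used
        self.loads += 1
        tile = f"pixels-of-{tile_id}"          # stand-in for real tile data
        self.cache[tile_id] = tile
        return tile

cache = TileCache(capacity=2)
for t in ["a", "b", "a", "c", "a"]:
    cache.fetch(t)
print(cache.loads)   # 3: "a" and "b" loaded once; "c" evicts "b"
```

    Real engines do this at the driver or renderer level with GPU page migration, but the effect is the same: the working set, not the whole scene, has to fit in VRAM.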

