
Thread: Banding Issues when modeling, Please Help!!

  1. #1
    Super Member jboudreau's Avatar
    Join Date
    May 2003
    Location
    Halifax
    Posts
    935

    Banding Issues when modeling, Please Help!!

    Hi Guys

    I'm wondering if anybody can tell me why there is color banding when I model. I can't figure out how to get rid of it. I thought it was an OpenGL issue, but it's there even when I render the model out. I also thought it might be a screen issue, but it happens on 4 different monitors, and gradients look great in Photoshop but not in LightWave, so I can't see it being a monitor issue.

    Has anybody else run into this problem? Can you see banding in the attached image below?

    I also made sure I was in smooth-shading or texture mode and that the surface has smoothing applied.

    [Attached image: 3D_Model_Banding_Issue.png (ID 123088)]

    Is this just normal and I haven't noticed it before?

    Thanks,
    Jason
    Last edited by jboudreau; 07-17-2014 at 03:35 PM.
    VFX Artist / 3D Animator
    Animatrix Productions

    Dell T7600 Dual XEON E5-2687W 3.8GHz / Dell M6700 i7 -3940XM 3.9GHz
    64GB Ram / 32GB Ram
    Nvidia Quadro K5000 / Nvidia Quadro K5000M
    Windows 7 x64

  2. #2
    Registered User Slartibartfast's Avatar
    Join Date
    Jun 2012
    Location
    Sweden
    Posts
    417
    I'd say OpenGL issue. If it's in the renders too, I would check for bad geometry. Try merging points and unifying polygons. Something appears to be sticking out of the object; do you have internal polys messing with the smoothing?

  3. #3
    Super Member jboudreau's Avatar
    Join Date
    May 2003
    Location
    Halifax
    Posts
    935
    Hi, thanks for the reply.

    It isn't a problem with this particular model; it does it for all models. I can just create a box, subdivide it a few times, then sub-patch it, and I get this darn banding. Very frustrating.

    Aren't you having this issue on your end? Try what I described and let me know if you get banding.

    Thanks,
    Jason
    VFX Artist / 3D Animator
    Animatrix Productions

    Dell T7600 Dual XEON E5-2687W 3.8GHz / Dell M6700 i7 -3940XM 3.9GHz
    64GB Ram / 32GB Ram
    Nvidia Quadro K5000 / Nvidia Quadro K5000M
    Windows 7 x64

  4. #4
    Registered User Slartibartfast's Avatar
    Join Date
    Jun 2012
    Location
    Sweden
    Posts
    417
    I have banding in OpenGL but not in renders. Post a scene and a rendered image and I will check tomorrow, unless someone else comes up with an answer while I sleep.

  5. #5
    Hi!
    Hmm... I think it's the linear OpenGL correction that might do that. It gets worse if you change to sRGB, but it seems to happen only on greyscale colors.

    Sam
    Last edited by Samus; 07-17-2014 at 04:44 PM.

  6. #6
    RETROGRADER prometheus's Avatar
    Join Date
    Aug 2003
    Location
    sweden stockholm
    Posts
    14,677
    Haven't experienced it as much of a problem... not until now.
    I do get that banding too with low diffuse values; higher diffuse will make it better, and so will higher spec values... seems to be some color-space thing, maybe.
    It has nothing to do with subpatch levels or smoothing, at least; none of the settings in OpenGL seem to make it better, so you're better off using a model shading preset with not-too-low diffuse, I reckon.

    I think this is normal though; just make your model surface brighter or more diffuse, and maybe put a little spec on it.
    Another thought... it could be some graphics card settings too.

  7. #7
    Super Member Snosrap's Avatar
    Join Date
    Aug 2004
    Location
    Ohio, USA
    Posts
    4,923
    In the Options panel, just set your color space preset to "disabled" to set everything to linear.
    Last edited by Snosrap; 07-17-2014 at 09:37 PM.

  8. #8
    Registered User
    Join Date
    Jun 2014
    Location
    Right here
    Posts
    1,415

    Banding seems to depend on the light type

    [Attachments 123095, 123096, 123097]

    Banding seems to depend on the light type. An area light looks good.

  9. #9
    Registered User Slartibartfast's Avatar
    Join Date
    Jun 2012
    Location
    Sweden
    Posts
    417
    Good morning,
    Ok, I was looking too quickly at the renders. There IS banding in my renders also, but not as obvious as in OpenGL. My conclusion after some investigation is that it is due to monitor brightness resolution! Explanation below.
    It does not matter what kind of light you have, or linear/sRGB settings; it just moves the banded area, or sometimes disguises it.

    First clue: in the Image Viewer you can left-click on the pixels and the window title shows you the actual pixel values (in percent, like R: 5.28% G: 5.28% B: 10.12% A: 100%). Try it and you will see that within one band the actual pixel value changes.
    To verify LightWave's innocence I made a linear gradient in GIMP. To make possible bands visible they need to be more than a pixel wide, so I made a 600 px gradient from 10% to 20% grey (0-100% is 256 "bands", so this gave me 25 bands stretched over 600 pixels, every band roughly 24 px). TURN OFF DITHERING while testing this. There was clear banding, so I dare say it's a MONITOR limitation. I would love to have one of those 10-bits/channel monitors now...
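[Editor's note: the band-width arithmetic of the GIMP test above can be sketched in a few lines of Python, assuming an 8-bit display (256 grey levels over the full range):]

```python
# Band-width arithmetic for a 10%-20% grey ramp on an 8-bit display.
levels = 256                   # 8 bits per channel -> 256 grey levels full range
low, high = 0.10, 0.20         # ramp endpoints as fractions of full range
width_px = 600                 # ramp width in pixels

bands = int((high - low) * levels)   # distinct grey levels the ramp crosses
band_width = width_px / bands        # pixels per visible band

print(bands)        # 25
print(band_width)   # 24.0
```

At roughly 24 pixels per band, each flat step is wide enough for the eye to pick out, which is why the shallow ramp shows obvious banding.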

    Cheers / Thomas

  10. #10
    Registered User Slartibartfast's Avatar
    Join Date
    Jun 2012
    Location
    Sweden
    Posts
    417
    And I almost forgot:

    Hit Ctrl-F5 (Effects window) and go to the Processing tab. Enable 4x dithering and the banding goes away.
    Last edited by Slartibartfast; 07-18-2014 at 02:05 AM.

  11. #11
    Registered User
    Join Date
    Jun 2014
    Location
    Right here
    Posts
    1,415
    Slartibartfast, thanks a lot, you're absolutely right!

    LW10 help:

    Dither Intensity

    Dithering blends two colors to simulate a third color between them, forming a more realistic blend. Dither Intensity lets you set the amount of color blending used by LightWave when rendering an image. Even with 24-bits of color data, it is possible to perceive color banding where distinct changes in color or brightness occur within otherwise smoothly ramped colors. Off removes all dithering, and you will probably experience some color banding. Normal, the default setting, reduces banding to the point where it nearly disappears. 2x Normal increases the dithering even further, which may be useful for high-end systems that still retain some appearance of banding in the final image. 4x Normal boosts dithering so that the resulting image looks more grainy, like film, which may be a desirable effect (especially when used with Animated Dither, below).

    Animated Dither

    Select Animated Dither to change the dithering pattern used from one frame to the next. This ensures that dithering is randomly placed, so there is no apparent pattern to the dither blend. With a 2x Normal or 4x Normal Dither Intensity, this can be used to approximate the randomness of film grain moving through an image.
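[Editor's note: the effect the manual describes can be sketched outside LightWave. Quantizing a smooth ramp to 8 bits leaves long flat runs (bands); adding sub-level noise before quantizing breaks those runs up. A minimal sketch of random dithering, not LightWave's actual algorithm:]

```python
import random

LEVELS = 256  # 8 bits per channel

def quantize(v):
    """Snap a 0.0-1.0 value to the nearest 8-bit step."""
    return round(v * (LEVELS - 1)) / (LEVELS - 1)

def quantize_dithered(v):
    """Add noise of up to +/- half a step before snapping (random dither)."""
    noise = (random.random() - 0.5) / (LEVELS - 1)
    return quantize(min(max(v + noise, 0.0), 1.0))

def longest_run(xs):
    """Length of the longest run of identical adjacent values."""
    best = cur = 1
    for a, b in zip(xs, xs[1:]):
        cur = cur + 1 if a == b else 1
        best = max(best, cur)
    return best

random.seed(1)  # deterministic for the demo

# A shallow 10%-20% grey ramp, 600 samples wide (the GIMP test from post #9).
ramp = [0.10 + 0.10 * i / 599 for i in range(600)]
plain = [quantize(v) for v in ramp]
dithered = [quantize_dithered(v) for v in ramp]

print(longest_run(plain))     # wide flat bands, roughly 24 px each
print(longest_run(dithered))  # band edges scattered into noise
```

The dithered ramp uses the same 256 levels but no longer has long flat runs, so the eye averages the noise into a smooth gradient instead of seeing steps.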

  12. #12
    Quote Originally Posted by Marander View Post
    Slartibartfast, thanks a lot, you're absolutely right! [...]
    That's sort of hidden deep, and I've never considered it with my renders ever. That function needs to be incorporated into the render globals panel.

  13. #13
    Super Member jboudreau's Avatar
    Join Date
    May 2003
    Location
    Halifax
    Posts
    935
    Quote Originally Posted by Slartibartfast View Post
    My conclusion after some investigation is that it is due to monitor brightness resolution! [...]
    Hi

    I think you are right that it might be a limitation of monitors, but that doesn't explain why it's happening on my Dell U2713H 10-bit screen. In OpenGL the banding is awful, and even in VPR the banding is awful. I don't think this should be happening on a 10-bit display.

    Marander is right: using different lights reduces the banding, though I still see some banding with an area light. The only thing that takes the banding away is to use specular and glossiness with an area, dome or spherical light. (I think this is because those lights add a bit of noise to the image, acting almost like a dither on the gradient.)

    Basically, from what I can see, if you choose a dark grey surface with no specularity or glossiness you get extreme banding. If you change the color to anything other than grey you don't see the banding unless you zoom in close to the object, and then you can see it.

    Thanks
    Jason
    VFX Artist / 3D Animator
    Animatrix Productions

    Dell T7600 Dual XEON E5-2687W 3.8GHz / Dell M6700 i7 -3940XM 3.9GHz
    64GB Ram / 32GB Ram
    Nvidia Quadro K5000 / Nvidia Quadro K5000M
    Windows 7 x64

  14. #14
    Super Member jboudreau's Avatar
    Join Date
    May 2003
    Location
    Halifax
    Posts
    935
    You Guys all Rock!!

    Thanks, Slartibartfast, that dithering option did the trick. I only had to use Normal and the banding was completely gone, and that was with a very low-diffuse surface with no specularity or glossiness. I found 2x and 4x added way too much noise to the image, but the Normal setting worked perfectly!!

    I agree, djwaterman, that is hidden deep. I had no idea it existed, and it should be in the render globals panel.

    Thanks for all your help guys. That fixes the banding in renders, but it still doesn't fix the OpenGL or VPR banding. I thought with a 10-bit display I wouldn't see banding at all. It must be a limitation of OpenGL (probably because there is no dither option for the OpenGL viewport).

    Thanks
    Jason
    VFX Artist / 3D Animator
    Animatrix Productions

    Dell T7600 Dual XEON E5-2687W 3.8GHz / Dell M6700 i7 -3940XM 3.9GHz
    64GB Ram / 32GB Ram
    Nvidia Quadro K5000 / Nvidia Quadro K5000M
    Windows 7 x64

  15. #15
    Registered User Slartibartfast's Avatar
    Join Date
    Jun 2012
    Location
    Sweden
    Posts
    417
    Happy to have helped

    Regarding 10-bit monitors, I've never used one, but maybe you're just feeding it 8 bits?
    Snippet from web:


    By default all operating systems and GPUs output at an 8-bit color depth and expect to be connected to an 8-bit display. In order to drive a 10-bit display from a computer, both the operating system and GPU must be capable of supporting 10-bit color.

    Operating System with 10-bit Support
    Windows 7 or later - Yes
    Mac OS X 10.8 or earlier - No
    Linux - Yes

    GPUs with 10-bit Support
    NVIDIA GeForce on Windows - No
    NVIDIA GeForce on Linux - Yes
    NVIDIA Quadro - Yes
    AMD Radeon - No
    AMD Firepro - Yes
    Intel HD Graphics - No

    While all GeForce cards are capable of 10-bit output, NVIDIA intentionally cripples their Windows drivers to disallow this. This is a business decision by the management of NVIDIA in order to force users to buy their much higher priced Quadro workstation GPUs.
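[Editor's note: the practical difference a true end-to-end 10-bit pipeline would make can be sketched with the same band arithmetic used earlier in the thread. With four times as many grey levels, the same shallow ramp crosses four times as many, four-times-narrower bands (assuming the OS, driver, application, and panel all actually run at 10 bits):]

```python
def bands_in_ramp(low, high, bits):
    """Distinct quantization levels a grey ramp from `low` to `high`
    (as fractions of full range) crosses at `bits` per channel."""
    levels = 2 ** bits
    return int((high - low) * levels)

print(bands_in_ramp(0.10, 0.20, 8))    # 25 bands on an 8-bit panel
print(bands_in_ramp(0.10, 0.20, 10))   # 102 bands on a true 10-bit panel
```

Over a 600 px ramp, 102 bands are under 6 px wide each, which is much harder to see than 24 px steps; that is consistent with banding persisting when any link in the chain silently falls back to 8 bits.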
