Banding Issues when modeling, Please Help!!



jboudreau
07-17-2014, 03:32 PM
Hi Guys

I'm wondering if anybody can tell me why there is color banding when I model. I can't figure out how to get rid of it. I thought it was an OpenGL issue, but it's there even when I render the model out. I also thought it might be a screen issue, but it happens on 4 different monitors, and gradients look great in Photoshop but not in LightWave, so I can't see it being a monitor issue.

Has anybody else run into this problem? Can you see banding in the attached image below?

I also made sure I was in smooth shading or texture mode, and that the surface has smoothing applied.

(attachment 123088)

Or is this just normal and I haven't noticed it before?

Thanks,
Jason

Slartibartfast
07-17-2014, 03:54 PM
I'd say OpenGL issue. If it's in the renders too, I would check for bad geometry. Try merging points and unifying polygons. Something appears to be sticking out of the object. Do you have internal polys messing with smoothing?

jboudreau
07-17-2014, 04:01 PM
Hi thanks for the reply

It isn't a problem with this particular model; it happens with all models. I can just create a box, subdivide it a few times, and then sub-patch it, and I get this darn banding. Very frustrating.

Are you not having this issue on your end? Try what I described and let me know if you get banding too.

Thanks,
Jason

Slartibartfast
07-17-2014, 04:21 PM
I have banding in OpenGL but not in renders. Post a scene and a rendered image and I will check tomorrow, unless someone else comes up with an answer while I sleep :)

Samus
07-17-2014, 04:41 PM
Hi!
Hmm... I think it's the linear OpenGL correction that might cause that. It gets worse if you change to sRGB, but it seems to happen only on greyscale colors.

Sam

prometheus
07-17-2014, 05:24 PM
Haven't experienced it as much of a problem... not until now :)
I do get that banding too with low diffuse values; higher diffuse makes it better, as do higher spec values. Seems to be some color space thing, maybe.
It has nothing to do with subpatch levels or smoothing at least, and none of the OpenGL settings seem to make it better, so you're better off using a model shading preset with not-too-low diffuse, I reckon.

I think this is normal though; just make your model surface brighter or more diffuse and maybe put a little spec on it.
Another thought: it could be some graphics card setting too.

Snosrap
07-17-2014, 09:32 PM
In the options panel just set your color space preset to "disabled" to set everything to linear.

Marander
07-18-2014, 12:20 AM
(attachments 123095, 123096, 123097)

Banding seems to depend on the light type. Area light looks good.

Slartibartfast
07-18-2014, 01:48 AM
Good morning,
Ok, I was looking too quickly at the renders. There IS banding in my renders also, but not as obvious as in OpenGL. My conclusion after some investigation is that it is due to monitor brightness resolution! Explanation below.
It does not matter what kind of light you have, or linear/sRGB settings; that just moves the banded areas or sometimes disguises them.

First clue: in the Image Viewer you can left-click on the pixels and the window title shows you the actual pixel values (in percent, like R:5.28% G:5.28% B:10.12% A:100%). Try it and you will see that within one band the actual pixel value changes.
To verify LightWave's innocence I made a linear gradient in GIMP. To make any bands visible they need to be more than a pixel wide, so I made a 600 px gradient from 10% to 20% grey (0-100% is 256 levels, so this gave me about 25 bands stretched over 600 pixels, each roughly 24 px wide). TURN OFF DITHERING while testing this. There was clear banding, so I dare say it's a MONITOR limitation. I would love to have one of those 10-bits-per-channel monitors now...

Cheers / Thomas
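For anyone who wants to redo that arithmetic without GIMP, here is a minimal Python/NumPy sketch (an illustration only, not LightWave code; it assumes NumPy is installed) of the same 10%-20% ramp quantized to 8 bits per channel:

import numpy as np

# a 600 px linear ramp from 10% to 20% grey, quantized to 8 bits per channel,
# i.e. what a standard 8-bit display path can actually show
width = 600
ramp = np.linspace(0.10, 0.20, width)
quantized = np.round(ramp * 255).astype(np.uint8)

levels = np.unique(quantized)
print("distinct 8-bit levels:", len(levels))                     # about 26
print("average band width:", round(width / len(levels)), "px")   # about 23 px

Roughly 26 flat steps spread over 600 px gives bands around 23-24 px wide, which matches the bands Thomas measured in GIMP.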

Slartibartfast
07-18-2014, 02:01 AM
And I almost forgot:

Hit Ctrl-F5 (Effects window) and go to the Processing tab. Enable dithering at 4x and the banding goes away :thumbsup:

Marander
07-18-2014, 02:12 AM
Slartibartfast, thanks a lot, you're absolutely right!

LW10 help:

Dither Intensity

Dithering blends two colors to simulate a third color between them, forming a more realistic blend. Dither Intensity lets you set the amount of color blending used by LightWave when rendering an image. Even with 24-bits of color data, it is possible to perceive color banding where distinct changes in color or brightness occur within otherwise smoothly ramped colors. Off removes all dithering, and you will probably experience some color banding. Normal, the default setting, reduces banding to the point where it nearly disappears. 2x Normal increases the dithering even further, which may be useful for high-end systems that still retain some appearance of banding in the final image. 4x Normal boosts dithering so that the resulting image looks more grainy, like film, which may be a desirable effect (especially when used with Animated Dither, below).

Animated Dither

Select Animated Dither to change the dithering pattern used from one frame to the next. This ensures that dithering is randomly placed, so there is no apparent pattern to the dither blend. With a 2x Normal or 4x Normal Dither Intensity, this can be used to approximate the randomness of film grain moving through an image.
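To see what that dithering does conceptually, here is a small Python/NumPy sketch (an illustration of the idea only, not LightWave's actual dither algorithm): sub-step noise is added before the values are quantized to 8 bits, so the wide flat plateaus that read as bands break up into fine grain.

import numpy as np

ramp = np.linspace(0.10, 0.20, 600)                  # smooth 10%-20% grey ramp
plain = np.round(ramp * 255).astype(np.uint8)        # hard 8-bit steps -> visible bands

rng = np.random.default_rng(0)
noise = (rng.random(ramp.size) - 0.5) / 255          # +/- half an 8-bit step of noise
dithered = np.round((ramp + noise) * 255).astype(np.uint8)

# the plain ramp changes value only ~25 times (one wide plateau per band);
# the dithered ramp changes value constantly, so no plateau is wide enough to see
print("value changes, plain:   ", np.count_nonzero(np.diff(plain)))
print("value changes, dithered:", np.count_nonzero(np.diff(dithered)))

Locally averaged, which is roughly what the eye does with fine grain, the dithered values still track the original ramp, which is why even the Normal setting is usually enough to hide the bands.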

djwaterman
07-18-2014, 02:31 AM
That's hidden pretty deep, and I've never considered it for my renders. That function needs to be incorporated into the Render Globals panel.

jboudreau
07-18-2014, 02:35 AM
Hi

I think you are right that it might be a limitation of monitors, but that doesn't explain why it's happening on my Dell U2713H 10-bit screen. In OpenGL the banding is awful, and even in VPR the banding is awful. I don't think this should be happening on a 10-bit display.

Marander is right: using different lights reduces the banding, though I still see some banding with an area light. The only thing that takes the banding away is to use specularity and glossiness with an area, dome, or spherical light. (I think this is because those lights add a bit of noise to the image, acting almost like a dither on the gradient.)

Basically, from what I can see, if you choose a dark grey surface with no specularity or glossiness you get extreme banding. If you change the color to anything other than grey you don't see the banding unless you zoom in close to the object; then you can see banding in the image.

Thanks
Jason
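A rough way to see why a dark grey surface bands worse than a coloured one (a hedged Python/NumPy sketch, not LightWave code): with R=G=B all three channels step at exactly the same pixels, while with unequal channel values the steps interleave, so the combined colour changes far more often and each visible band is much narrower.

import numpy as np

shade = np.linspace(0.5, 1.0, 600)                    # shading ramp across the surface

def band_count(base_rgb):
    rgb = np.round(np.outer(shade, base_rgb) * 255)   # 8 bits per channel
    # count the positions where the displayed colour actually changes
    return np.count_nonzero(np.any(np.diff(rgb, axis=0) != 0, axis=1)) + 1

print("dark grey (0.12, 0.12, 0.12):", band_count((0.12, 0.12, 0.12)), "bands")
print("a colour  (0.30, 0.18, 0.09):", band_count((0.30, 0.18, 0.09)), "bands")

More and narrower bands mean each visible step covers fewer pixels, which fits what Jason describes: the coloured surface still bands, but only noticeably when you zoom in.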

jboudreau
07-18-2014, 02:41 AM
You Guys all Rock!!

Thanks Slartibartfast, that dithering option did the trick. I only had to use Normal and the banding was completely gone, and that was with a very low diffuse surface with no specularity or glossiness. I found the 2x and 4x settings added way too much noise to the image, but the Normal setting worked perfectly!!

I agree, djwaterman, that is hidden deep. I had no idea it existed, and it should be in the Render Globals panel.

Thanks for all your help guys. That fixes the rendering issue with banding, but it still doesn't fix the OpenGL or VPR banding. I thought with a 10-bit display I wouldn't see banding at all. It must be a limitation of OpenGL (probably because there is no dither option for the OpenGL viewport).

Thanks
Jason

Slartibartfast
07-18-2014, 03:37 AM
Happy to have helped :)

Regarding 10-bit monitors, I've never used one, but maybe you're just feeding it 8 bits?
Snippet from web:


By default all operating systems and GPUs output at an 8-bit color depth and expect to be connected to an 8-bit display. In order to drive a 10-bit display from a computer, both the operating system and GPU must be capable of supporting 10-bit color.

Operating System with 10-bit Support
Windows 7 or later - Yes
Mac OS X 10.8 or earlier - No
Linux - Yes

GPUs with 10-bit Support
NVIDIA GeForce on Windows - No
NVIDIA GeForce on Linux - Yes
NVIDIA Quadro - Yes
AMD Radeon - No
AMD Firepro - Yes
Intel HD Graphics - No

While all GeForce cards are capable of 10-bit output, NVIDIA intentionally cripples their Windows drivers to disallow this. This is a business decision by the management of NVIDIA in order to force users to buy their much higher priced Quadro workstation GPUs.

BeeVee
07-18-2014, 06:45 AM
And even once you've jumped that hurdle, there's also the fact that LightWave is displaying an 8-bit per channel image. There are articles all over the web about achieving a 10-bit display with Photoshop.

B
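To put numbers on why those extra two bits matter, here is a small Python/NumPy sketch (just quantization arithmetic, nothing LightWave-specific) comparing the same dark grey ramp at 8 and 10 bits per channel:

import numpy as np

ramp = np.linspace(0.10, 0.20, 600)          # the 10%-20% grey ramp from earlier
for bits in (8, 10):
    levels = (1 << bits) - 1                 # 255 or 1023 code values
    steps = len(np.unique(np.round(ramp * levels)))
    print(f"{bits}-bit pipeline: {steps} distinct steps, each band ~{600 // steps} px wide")

At 8 bits the bands are roughly 23 px wide and easy to see; at 10 bits they shrink to about 5-6 px and start to disappear into the pixel grid, but only if every stage from the application through the OS, GPU driver and cable actually carries 10 bits.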

jboudreau
07-18-2014, 10:08 AM
I have all the requirements needed: Windows 7 64-bit, an Nvidia Quadro K5000 video card, and a Dell U2713H PremierColor 10-bit display, or as they like to call it, 8-bit + 2-bit (it says it has a 14-bit LUT).

Oh BeeVee, does that mean LightWave is not capable of displaying more than 8 bits? Because, like you said, Photoshop and After Effects don't have any banding issues. It's only the 3D programs like modo, 3D-Coat, LightWave, Blender etc. that give me banding issues in the viewport.

Thanks
Jason

madno
07-19-2014, 11:28 PM
As far as I have noticed, LW does not send 10 bit data.

By the way, here is a link to a NEC 10-bit test app (runs without needing to be installed).
It generates 3D geometry on the fly and displays it as two side-by-side views on the screen:
one is 8-bit data, the other is 10-bit data.

http://www.necdisplay.com/monitor-software (search for "10 bit Color Depth Demo Application" on the site).

jboudreau
07-22-2014, 07:47 AM
Hi

Thanks for the link. I just tested it, and the 10-bit view has no banding at all, while the 8-bit one has banding. Does anyone know if there are any 3D software packages that use 10-bit data?

Thanks,
Jason

Mr_Q
07-22-2014, 12:05 PM
The development team is aware that their OpenGL code does not make use of the floating-point features of modern video cards. This banding issue is even more noticeable, extremely so, when using color-space modes (sRGB). It is in their database and hopefully will be addressed in the future.