How to view 30 bit color?



Mr Rid
02-18-2014, 09:40 PM
I am testing a new 30-bit display, but I am still seeing banding in 16-bit integer/float image gradients, of the kind typical of 8-bit images.

I am viewing on a 30-bit capable, Asus PA279Q monitor, connected to a 30-bit capable Quadro FX 3800 card by DisplayPort, in Win 7 Pro. My desktop is set to "True Color (32-bit)." I am primarily expecting to see 30-bit color in Fusion 6, while viewing 16 bit exr images generated in Lightwave or Fusion, and I am also viewing the commonly referenced "ramp.psd" file found on the AMD site- http://www.amd.com/us/products/workstation/graphics/software/Pages/adobe-photoshop.aspx I also have a second monitor, displaying 24-bit color, for comparison. I see the same banding in gradients on both the 24-bit and on the 30-bit monitors, in Fusion or Lightwave. My most recent version of Photoshop does not support 30-bit.

In Lightwave, I assume the image viewer will automatically display 30-bit (?).

In Fusion prefs, I have checked "Use 10-10-10-2 framebuffer" (as stated in the manual- http://www.vfxpedia.com/index.php?title=Eyeon:Manual/Fusion_6/Tweaks_Preferences ), and checked "Use float16 textures", and GL display is set to "HiQ."

Any thoughts on what else I might try would be much appreciated. So far, Asus and Nvidia support is unable to offer a solution.

MSherak
02-19-2014, 04:24 AM
I am viewing on a 30-bit capable, Asus PA279Q monitor, connected to a 30-bit capable Quadro FX 3800 card by DisplayPort, in Win 7 Pro. My desktop is set to "True Color (32-bit)."


"True Color (32-bit)" means 24-bit color plus an 8-bit alpha (transparency) channel: 8-8-8-8. To get 30-bit color to work, I believe you have to disable the Aero option in Win7 so the alpha is not used and you can have 10-10-10-2.

http://nvidia.custhelp.com/app/answers/detail/a_id/3049/~/how-to-enable-30-bit-color-on-windows-platforms

-M

Mr Rid
02-19-2014, 06:15 AM
Thanks. As stated in that link, the Quadro driver is supposed to automatically disable Aero, which it does not appear to do. I manually disabled Aero in Win7 Services, but this still does not resolve the issue.

I think the graphics card's control panel should generate a test pattern to determine if 30-bit deep color is working. That would eliminate any possible extra confusion when testing with various graphics apps and their cryptic settings.
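Something along these lines is really all I'm after - a rough C++ sketch of my own (not an actual Nvidia or Asus utility), assuming a legacy/compatibility OpenGL context is already current. It only reports the bit depth of the framebuffer the driver actually handed over, not what the panel at the far end of the chain truly shows, but it would at least confirm whether a 30-bit request is being honored:

#include <windows.h>   // must come before GL/gl.h on Windows
#include <GL/gl.h>
#include <cstdio>

// Query the bits per channel of whatever framebuffer the current GL context got.
// 10/10/10/2 would mean the 30-bit (10-10-10-2) request was honored by the driver.
void reportFramebufferDepth()
{
    GLint r = 0, g = 0, b = 0, a = 0;
    glGetIntegerv(GL_RED_BITS,   &r);
    glGetIntegerv(GL_GREEN_BITS, &g);
    glGetIntegerv(GL_BLUE_BITS,  &b);
    glGetIntegerv(GL_ALPHA_BITS, &a);
    std::printf("Framebuffer: R%d G%d B%d A%d\n", r, g, b, a);
}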

Lightwolf
02-19-2014, 07:29 AM
At least as far as LW is concerned, it'd need to be explicitly coded to support it (from reading the code samples by nVidia).

I can't help any further unfortunately, since I don't have a 30-bit capable display. :(

Cheers,
Mike

Mr Rid
02-19-2014, 09:42 PM
So you don't think LW is displaying 30-bit color?

I always have weird problems that no one on Earth has ever heard of before. It's amazing that I have exhausted web searches, have posted in 4 support forums, and am corresponding with two tech supports daily, and no one has a clue what is going on.

m.d.
02-20-2014, 12:39 AM
I think we had this discussion on the fusion forums.

Very few apps have 10-bit support.
Adobe PPro guaranteed does...by activating the secondary monitor output in the prefs...
Not necessarily in the GUI

m.d.
02-20-2014, 12:54 AM
Try this....
http://www.necdisplay.com/documents/Software/NEC_10_bit_video_Windows_demo.zip

It's for NEC displays, to tell if they are in 10-bit mode....not sure if it works for any others....can't test because I'm on my phone


Here's an explanation of how to enable 10-bit in Photoshop to test:
http://www.ronmartblog.com/2011/07/guest-blog-understanding-10-bit-color.html

m.d.
02-20-2014, 01:59 AM
There is a set of apps mentioned in this whitepaper that can test whether you have a proper 30-bit display chain:
http://www.nvidia.com/docs/IO/40049/TB-04701-001_v02_new.pdf

I used to have a copy somewhere I found on the web....a precompiled .exe...for the life of me I cannot find it on my system
You can search for it....I did but could not find it again....it's like check30bit or set30bit or something

If you know someone who can compile it for you, the source code is here:
http://www.nvidia.com/content/quadro_fx_product_literature/Color30bitsrc.zip

This will tell you for sure if your display chain is 30-bit.

If not, another problem that can occur is that your monitor is not sending out an EDID (Extended Display Identification Data) that the Quadro can properly interpret....being that you have an older Quadro it is possible...and if you look at Nvidia's driver release notes...many times there are EDID fixes

From the Nvidia docs:
At least one EDID 1.04 30-bit color monitor must be attached for the driver to automatically switch to 30-bit mode. The tested monitors are HP Dreamcolor LP 2480zx.

It seems your monitor is EDID 1.3; there was a significant change at 1.3.

From Microsoft:
EDID version 1.0 through 1.2 consists of a single block of data, per the VESA specification.
With EDID version 1.3 or enhanced EDID (E-EDID), manufacturers can specify one or more extension blocks in addition to the primary block.


This document talks about overriding the EDID in the registry:
http://msdn.microsoft.com/en-us/library/windows/hardware/jj133967(v=vs.85).aspx

http://www.extron.com/download/dltrack.aspx?file=EDID_ManagerV1x0.exe&id=38772

This can test and intercept/overwrite your EDID via hardware....about $400 though
http://www.gefen.com/kvm/ext-dp-edidp.jsp?prod_id=10909

If you have problems, you can try phoning Gefen....they know this stuff inside and out and were going to make me an EDID hack cable in the past...

Lightwolf
02-20-2014, 03:08 AM
So you don't think LW is displaying 30-bit color?
Not unless it asks for a 10-bit (10-10-10-2) OpenGL framebuffer (and then you'd still have 8-bit textures). Which I doubt it does, since it'd need to be optional.

I always have weird problems that no one on Earth has ever heard of before. It's amazing that I have exhausted web searches, have posted in 4 support forums, and am corresponding with two tech supports daily, and no one has a clue what is going on.
Yeah, it's odd that it's not used more often, especially considering that the tech is not _that_ new. But then I suppose simple colour management is enough of a headache for most people already. ;)

Cheers,
Mike

Mr Rid
02-20-2014, 09:54 PM
Thanks for the info.

That demo is only for NEC displays.

I don't have a 30-bit version of Pshop. I am testing with Fusion 6.

Have been corresponding with Nvidia and Eyeon to resolve the issue, but have yet to figure it out.

jwiede
02-21-2014, 01:19 PM
So you don't think LW is displaying 30-bit color?

There's some slim chance LW supports 30bit color output, but I tend to agree with Mike, it seems unlikely LW has the needed 30bit color support code.

Lightwolf
02-21-2014, 02:47 PM
...it seems unlikely LW has the needed 30bit color support code.
Mind you, having looked at the sample code, it doesn't seem to be that much work to add it - at least on a basic level (just letting the viewports render to 30-bit) - image texture support at higher bit-depths would be a bit more elaborate though.

Cheers,
Mike

m.d.
02-21-2014, 04:42 PM
It would probably be useful just as a render viewer, to check whether the intended output is 10-bit or higher.

Honestly most of the issues I have are with viewing high bit depth video on a 10bit monitor, and then outputting to 8bit....

Quite often things like skies etc...shot with high bit depth cameras...will look great on 10bit monitors, but when you do an 8 bit output...there can be severe banding...usually there are no surprises going the other way, as if there is potential banding....you will see it on the 8 bit monitor

BTW the solution to the banding issue is to apply a 0.4% noise to the image before converting to 8bit. The value comes from an LSB, or least significant bit, dither....where 1/256 = 0.0039.
That should be as far as you need to go to kill banding.
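Roughly speaking, all that dither amounts to is something like this (a bare-bones C++ sketch of my own, not Fusion's or anybody's actual code): add about half an 8-bit step of random noise per pixel, then quantize.

#include <algorithm>
#include <cstdint>
#include <random>
#include <vector>

// src holds one channel of float pixel values in 0..1
std::vector<uint8_t> quantizeWithDither(const std::vector<float>& src)
{
    std::mt19937 rng(1234);
    // roughly +/- half an 8-bit step, i.e. the ~0.4% (1/256) noise mentioned above
    std::uniform_real_distribution<float> noise(-0.5f / 255.0f, 0.5f / 255.0f);
    std::vector<uint8_t> dst(src.size());
    for (std::size_t i = 0; i < src.size(); ++i) {
        float v = std::clamp(src[i] + noise(rng), 0.0f, 1.0f);  // jitter, then clamp to 0..1
        dst[i] = static_cast<uint8_t>(v * 255.0f + 0.5f);       // round to the nearest 8-bit level
    }
    return dst;
}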

Mr Rid
02-21-2014, 07:07 PM
When compositing, a grain is usually added that kills noticeable banding. But banding is particularly bad in night skies, and I need to be certain about what is really going on in renders.

Mr Rid
02-24-2014, 07:41 PM
So I finally discovered that I needed a newer build to see 30 bit color in Fusion. There was an issue with 30 bit color in the older build. So my Asus is displaying properly. And Lightwave is not displaying 30 bit. I really appreciate the responses. I received no replies on the Fusion or Nvidia forums.

I am spanning two displays, one is 30 bit, and the other 24 bit. So the test gradient now appears bandless in Fusion, on the 30 bit monitor. But it is curious that when I drag the Fusion window over to my 24 bit monitor, there is no banding there either, unless I turn the 30 bit display off!? Then banding suddenly appears on the 24 bit monitor. How can a 24 bit monitor display 30 bit color, just because it spans with a 30 bit monitor?

m.d.
02-24-2014, 10:20 PM
That doesn't sound good....

It cannot do that....
I would suspect Fusion is reading the EDID of the 10bit and activating dithering to simulate 10bit...


Try the Change Depth tool in Fusion: set it to 8 bit and click on the additive noise option, throw the original in the B buffer, and A/B compare the two to compare the noise in the image....

Lightwolf
02-25-2014, 05:00 AM
I'd suspect that it's related to dithering as well... dithering the 10-bit framebuffer down to 8 prior to sending it out to the second display.

Cheers,
Mike

m.d.
02-25-2014, 08:05 AM
The strange thing is it is only doing it while the 10bit monitor is active....
So it's definitely reading the EDID of the 10bit monitor...

Why would you activate dithering when a 10bit monitor is connected?
Why not dither the 10bit framebuffer all the time when going to an 8 bit monitor? Based on the image source....not the monitor connected....

2 possibilities...

1. They purposefully dither the second monitor to match the user experience across the 2 screens
2. There is no 10bit output from Fusion....merely a 'simulated' 10bit output which is activated when the monitor is detected

Lightwolf
02-25-2014, 08:08 AM
2. There is no 10bit output from Fusion....merely a 'simulated' 10bit output which is activated when the monitor is detected
Fusion explicitly mentions a 10-10-10-2 framebuffer, they know what they're doing. ;)


Cheers,
Mike

m.d.
02-25-2014, 09:58 AM
Ruin my conspiracy theory why don't you...:)

Lightwolf
02-25-2014, 10:18 AM
Ruin my conspiracy theory why don't you...:)
*whistles innocently*

Mr Rid
02-27-2014, 03:14 AM
I don't see how Fusion can be auto-dithering anything.

In Fusion, I can add a 'change depth' tool to the 16-bit image, so I also see an 8-bit version, side-by-side with the 16-bit version. On the 30-bit monitor, the 16-bit shows no banding, while the 8-bit version does show banding, as expected. The weird part is that I get the same result on the 24-bit monitor, until I turn the 30-bit monitor off, then both the 16-bit and the 8-bit versions show banding on the 24-bit monitor, as expected. Turn the 30-bit monitor back on, and the 16-bit image snaps back to 'no banding' on the 24-bit monitor.

Lightwolf
02-27-2014, 05:34 AM
I don't see how Fusion can be auto-dithering anything.

Not Fusion but OpenGL/the GPU/gfx driver.
At this point it has a 10-10-10-2 framebuffer to work with - on both monitors. And somehow, that needs to be converted down to 8-8-8 before being sent out to the 24-bit display. That's where I _suspect_ the dithering happens.

Cheers,
Mike

m.d.
02-27-2014, 12:41 PM
found this in some Nvidia docs

When displaying a depth 30 image, the color data may be dithered to lower bit depths, depending on the capabilities of the display device and how it is connected to the GPU. Some devices connected via analog VGA or DisplayPort can display the full 10 bit range of colors. Devices connected via DVI or HDMI, as well as laptop internal panels connected via LVDS, will be dithered to 8 or 6 bits per pixel.

Lightwolf
02-27-2014, 12:55 PM
found this in some Nvidia docs

When displaying a depth 30 image, the color data may be dithered to lower bit depths, depending on the capabilities of the display device and how it is connected to the GPU. Some devices connected via analog VGA or DisplayPort can display the full 10 bit range of colors. Devices connected via DVI or HDMI, as well as laptop internal panels connected via LVDS, will be dithered to 8 or 6 bits per pixel.
Ka-ching... thanks for the research.

The next question is... is there a way to use 10-10-10-2 if no 30-bit display is installed at all? It looks like even that might be useful.

Cheers,
Mike

spherical
02-27-2014, 03:10 PM
OK, let me get this straight.... The "New Tech" of DVI and HDMI, that all manufacturers are moving to, doesn't do as high quality as the old tried and true analog 15-pin D-sub; which isn't on any of the monitors I have here, nor on the KVMP.

m.d.
02-27-2014, 04:28 PM
HDMI can support it....DVI only in dual-link I think.....
Nvidia just doesn't do it....I don't think they do it out of D-Sub either, as I don't ever see that on any new card

m.d.
02-27-2014, 04:34 PM
Ka-ching... thanks for the research.

The next question is... is there a way to use 10-10-10-2 if no 30-bit display is installed at all? It looks like even that might be useful.

Cheers,
Mike

Could be useful for display reasons, and also confusing...
I graded some footage on a 10bit monitor and rendered to an 8-bit codec and just quickly checked the results....only to have it rejected by the director for some banding in the sky. I couldn't see it on my 10 bit monitor.....and the encoder didn't dither.
If you had dithering on an 8-bit monitor you could make the same mistake.
It's nice to get a true WYSIWYG representation.

Mr Rid
02-27-2014, 04:49 PM
Sooo, I think I am back to my original issue: how do I know my 30 bit monitor is actually displaying 30 bit color, and not some dithering trick like it may or may not be applying on the 24 bit monitor?


The thing that finally made the banding go away was a change in the Fusion build. But according to what I think you are suggesting, Fusion tells the Nvidia driver when to dither a 16-bit image displayed on a 24-bit monitor, depending if that 24 bit monitor is spanning with a 30 bit monitor, and whether or not that 30-bit monitor is on or off? Sounds weird.

Lightwolf
02-27-2014, 06:40 PM
But according to what I think you are suggesting, Fusion tells the Nvidia driver when to dither a 16-bit image displayed on a 24-bit monitor, depending if that 24 bit monitor is spanning with a 30 bit monitor, and whether or not that 30-bit monitor is on or off? Sounds weird.
No, not Fusion, but the GFX board/driver.

Fusion just uses a 10-10-10-2 frame buffer - how that is converted to serve the attached displays is none of Fusion's business.

Cheers,
Mike

Lightwolf
02-27-2014, 06:42 PM
Could be useful for display reasons, and also confusing...
I graded some footage on a 10bit monitor and rendered to an 8-bit codec and just quickly checked the results....only to have it rejected by the director for some banding in the sky. I couldn't see it on my 10 bit monitor.....and the encoder didn't dither.
If you had dithering on an 8-bit monitor you could make the same mistake.
It's nice to get a true WYSIWYG representation.
Oh, you certainly need to know what you're doing.
However, some output media are 10-bit or even 12-bit, like the current digital projection specs require.

Cheers,
Mike

Lightwolf
02-27-2014, 06:46 PM
OK, let me get this straight.... The "New Tech" of DVI and HDMI, that all manufacturers are moving to, doesn't do as high quality as the old tried and true analog 15-pin D-sub; which isn't on any of the monitors I have here, nor on the KVMP.
Define "high quality". ;)
What you certainly do have with an analogue connection _and_ display is no quantization of the RGB levels. However, if your framebuffer as well as the DAC (digital-analogue converter) doesn't do more than 8-bit then it's moot.

Cheers,
Mike

Mr Rid
02-27-2014, 06:50 PM
No, not Fusion, but the GFX board/driver.

Fusion just uses a 10-10-10-2 frame buffer - how that is converted to serve the attached displays is none of Fusion's business.

Cheers,
Mike

Nvidia support seems to be informing me that Fusion is responsible.

"I think that this behavior is from within the application just like the original report of banding on the 30 bit display."

Doesn't Fusion still have to be telling the Nvidia driver what to do? Otherwise, it wouldn't matter which build of Fusion I was using.

Why would Fusion or Nvidia bother with trying to fool a 24-bit monitor? Why not dither on the 24-bit monitor all the time?

m.d.
02-27-2014, 07:38 PM
I would download another 10bit capable app to test...

Davinci Resolve lite is free.....and useful

spherical
02-27-2014, 08:24 PM
If your monitors are spanned, creating one desktop, is it possible to set individual bit depths?

m.d.
02-27-2014, 10:02 PM
No, all control is hidden...Nvidia sets it based on the EDID, the desktop itself is 8bit.....very few apps are 10.

spherical
02-27-2014, 10:58 PM
Ah. That's why there is 32-bit, with no other bit depth choice available. All you can choose is resolution.

Lightwolf
02-28-2014, 03:02 AM
Doesn't Fusion still have to be telling the Nvidia driver what to do? Otherwise, it wouldn't matter which build of Fusion I was using.
By looking at the OpenGL code samples from nVidia (and AMD) - all it really needs to do is set the frame buffer to be 10-10-10-2, that's it.
The basic sample source doesn't even address the issue of how many or which displays are attached.
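To give a rough idea of how little is involved, the relevant part boils down to something like this - a paraphrased C++/WGL sketch, not the literal nVidia sample, and it assumes wglChoosePixelFormatARB has already been fetched via wglGetProcAddress() from a throwaway context:

#include <windows.h>
#include <GL/gl.h>
#include "wglext.h"   // WGL_*_ARB tokens and PFNWGLCHOOSEPIXELFORMATARBPROC

// Ask the driver for a 30-bit (10-10-10-2) pixel format for the window's DC.
int pick30BitPixelFormat(HDC hdc, PFNWGLCHOOSEPIXELFORMATARBPROC wglChoosePixelFormatARB)
{
    const int attribs[] = {
        WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
        WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
        WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
        WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_ARB,
        WGL_RED_BITS_ARB,   10,   // 10 bits per colour channel...
        WGL_GREEN_BITS_ARB, 10,
        WGL_BLUE_BITS_ARB,  10,
        WGL_ALPHA_BITS_ARB,  2,   // ...plus 2 bits of alpha = 10-10-10-2
        0                         // terminator
    };
    int format = 0;
    UINT count = 0;
    // The driver is free to refuse (consumer board, no 30-bit display, ...);
    // then count stays 0 and the app falls back to a normal 8-8-8-8 format.
    if (!wglChoosePixelFormatARB(hdc, attribs, nullptr, 1, &format, &count) || count == 0)
        return 0;
    return format;   // hand this to SetPixelFormat() before creating the real context
}

Everything past that point - which display gets real 10-bit and which gets dithered down - is the driver's business, not the application's.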

There are extended source code samples that do indeed look at the capabilities of the GPUs and the attached displays (GPU vendor specific code as well) - but I'd be surprised if Fusion used it.

I'm pretty surprised by the reply from nVidia support - especially in light of the quote m.d. found.

Why would Fusion or Nvidia bother with trying to fool a 24-bit monitor? Why not dither on the 24-bit monitor all the time?
Because it still requires some changes to the OpenGL programming. And yes, I see no reason why they shouldn't allow applications to make that choice regardless of the attached display (and then dither down if needed).
At least when it comes to the GPU capabilities, it is one of those features used to differentiate pro from consumer cards. You need to justify the price hike somehow after all. ;)

Cheers,
Mike

Edit: Man, now I really want a 10-bit capable display - still saving up for that UP3214Q ;)

spherical
02-28-2014, 05:09 AM
By looking at the OpenGL code samples from nVidia (and AMD) - all it really needs to do is set the frame buffer to be 10-10-10-2, that's it.
The basic sample source doesn't even address the issue of how many or which displays are attached.

Which would indicate that a spanned display pair can only do one bit depth at a time, across both monitors, so when the 30-bit display is not part of the equation the driver falls back to 24-bit.

Mr Rid
03-01-2014, 03:12 AM
By looking at the OpenGL code samples from nVidia (and AMD) - all it really needs to do is set the frame buffer to be 10-10-10-2, that's it....

Not sure I follow. So, a 24 bit monitor can display 30 bit color, if the graphics driver tells it to?

m.d.
03-01-2014, 08:24 AM
Which would indicate that a spanned display pair can only do one bit depth at a time, across both monitors, so when the 30-bit display is not part of the equation the driver falls back to 24-bit.

Not really sure why that would indicate that....
Mike is saying all Fusion has to do is set the frame buffer....Nvidia handles the rest

Here is a quote from Nvidia docs:
In the case when a single GPU drives multiple displays, as long as there is one 30-bit compatible display active, 30-bit color is automatically enabled on all displays. At scan out, the driver automatically downgrades the 30-bit output to 24-bit for regular displays.

http://www.nvidia.com/docs/IO/40049/TB-04701-001_v02_new.pdf


Also in the docs is a very important fact about multi-GPU setups (I put this here for someone's Google search in the future):
To enable 30-bit, the primary display MUST be attached to the GPU also driving the 30-bit panel.

m.d.
03-01-2014, 08:31 AM
Not sure I follow. So, a 24 bit monitor can display 30 bit color, if the graphics driver tells it to?

No....he's saying Fusion's part is simple, and hard to screw up...they just have to specify the framebuffer....then it's a hand-off.
The 24 bit monitor will be dithered at best.....as long as the driver is doing its job.

This only holds true for DisplayPort....for DVI each monitor has specific pixel packing and the monitor has to be queried, and the code adapted....hence why it's not widely supported.

Quote again:
On current operating systems, 8 bits per component or a total of 24-bit color is the default for the desktop and GDI rendering. However, applications can use 30-bit color through OpenGL by requesting the appropriate pixel format (explained in the next section). This approach allows for windowed 30-bit OpenGL buffers that are then composited with the GDI components before scan out.

It seems full-screen 30-bit is trivial....it's the combination of GDI, desktop etc. that makes it hard for 'windowed' 30-bit...i.e. your Fusion display window may be 30-bit...but the interface surrounding it is not.

But even that heavy lifting is done by the driver.

m.d.
03-01-2014, 08:45 AM
Interestingly enough....DirectX 10-bit (full screen) is supported on GeForce gaming cards....and has been since 2009

Not sure how much harder that is to code than OpenGL....and obviously Windows only...but that would be an easy way to get 10-bit without Quadros...and perhaps even over HDMI

Lightwolf
03-01-2014, 10:21 AM
Interestingly enough....DirectX 10-bit (full screen) is supported on GeForce gaming cards....and has been since 2009

Not sure how much harder that is to code than OpenGL....and obviously Windows only...but that would be an easy way to get 10-bit without Quadros...and perhaps even over HDMI
Fusion supports a DX full screen image view for a similar reason - consumer boards by nVidia (and AMD until recent driver changes) don't allow for windowed stereo.

But then it only needs to display an image as well.

Other than that, what you wrote, yes. Disregarding the fact that there's plenty of places where dithering can take place (starting at the GPU up to the display) and displays with 12 or more bits per channel being available as well (in greyscale for medical use, in RGB to have enough headroom to allow for hardware calibration within the display).

Cheers,
Mike

m.d.
03-01-2014, 12:45 PM
Fusion supports a DX full screen image view for a similar reason


How do you enable this....I would like to test if possible....

I have a DreamColor display...but my Quadro is sitting on the shelf, replaced by 4 GTXs

EDIT....ah I see you didn't specifically say it was 10 bit DX...

Lightwolf
03-01-2014, 12:56 PM
EDIT....ah I see you didn't specifically say it was 10 bit DX...
No, stereo only.

"3"

Cheers,
Mike

spherical
03-01-2014, 03:37 PM
Not really sure why that would indicate that....

Here is a quote from Nvidia docs:
In the case when a single GPU drives multiple displays, as long as there is one 30-bit compatible display active, 30-bit color is automatically enabled on all displays. At scan out, the driver automatically downgrades the 30-bit output to 24-bit for regular displays.

So the 24-bit is dithered and looks as if it's 30-bit but isn't. When the 30-bit monitor is shut off, the driver then outputs non-dithered 24-bit data, because there is no signal to output 30-bit, and the banding appears.

Mr Rid
03-01-2014, 05:58 PM
I would download another 10bit capable app to test...

Davinci Resolve lite is free.....and useful

I can't figure out how the hell to view an image in Resolve. The media thing is grayed out. I managed to load an image in the gallery, but can't figure out how to view it anywhere.

Lightwolf
03-01-2014, 06:04 PM
So the 24-bit is dithered and looks as if it's 30-bit but isn't. When the 30-bit monitor is shut off, the driver then outputs non-dithered 24-bit data, because there is no signal to output 30-bit, and the banding appears.
Basically, yes, that's what we're currently assuming. The desktop will be 30-bit on both displays, since the (single) frame buffer goes across both displays.

Cheers,
Mike

Mr Rid
03-01-2014, 09:53 PM
No....he's saying Fusion's part is simple, and hard to screw up...they just have to specify the framebuffer....then it's a hand-off.
The 24 bit monitor will be dithered at best.....as long as the driver is doing its job...

Not sure that dithering explains it. The banding in the 8-bit version of the test gradient is too severe to hide with typical dithering. I have to add such an intense noise or grain to hide banding that it overwhelms the image. No matter how closely I zoom into the bandless test image on the 24 bit display, I see absolutely no dithering or noise.

In non-30 bit Fusion, I can change the test image from 16 bit to 8 bit, and checkbox 'error diffusion' which does an excellent job of smoothing out the banding. But if I zoom in closely, I can see the dithering. If I apply the 'error diffusion' after the image is fully converted to 8 bit, it has no practical effect.
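(As I understand it, error diffusion along a scanline amounts to something like this toy C++ sketch - my own illustration, certainly not Fusion's actual code - which is also why running it after the data is already 8 bit does nothing: the leftover error it is supposed to spread around has already been thrown away.)

#include <algorithm>
#include <cstdint>
#include <vector>

// row: one scanline of grayscale floats in 0..1, taken by value because the
// quantization error gets smeared into the not-yet-quantized neighbours.
std::vector<uint8_t> errorDiffuseRow(std::vector<float> row)
{
    std::vector<uint8_t> out(row.size());
    for (std::size_t i = 0; i < row.size(); ++i) {
        float v = std::clamp(row[i], 0.0f, 1.0f);
        uint8_t q = static_cast<uint8_t>(v * 255.0f + 0.5f);  // nearest 8-bit level
        out[i] = q;
        float err = v - q / 255.0f;         // the fraction the 8-bit value couldn't hold
        if (i + 1 < row.size())
            row[i + 1] += err;              // push it onto the next pixel before it is quantized
    }
    return out;
}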

So, Nvidia must be applying truly miraculous dithering!

Btw, I get the same result in CS6. No banding in the 30 bit test image on the 24 bit monitor, until I turn the 30 bit monitor off, and have to restart CS6 to see no banding again.


Not really sure why that would indicate that....
Mike is saying all Fusion has to do is set the frame buffer....Nvidia handles the rest

Here is a quote from Nvidia docs:
In the case when a single GPU drives multiple displays, as long as there is one 30-bit compatible display active, 30-bit color is automatically enabled on all displays. At scan out, the driver automatically downgrades the 30-bit output to 24-bit for regular displays....

This sounds like "30-bit color is automatically enabled on all displays", meaning a 24 bit monitor can show 30 bit color, because it would be included in "all displays," right? And the Nvidia driver is just "downgrading" the image for a 24 bit monitor, IF there is no 30 bit monitor present, correct?

But if it is just dithering on the 24 bit monitor, then why not always do this on 24 bit monitors? I mean, the end result is identical to the 30 bit display. I can't see or read any color value difference between the 30 bit or 24 bit monitors, while both are on. So where is the difference in price justified? It sounds like the only reason to buy a 30 bit monitor is so that the Nvidia driver will activate dithering (?!), since I can't tell that the 30 bit display is any different.

spherical
03-01-2014, 10:09 PM
So, it still sounds like the end result is that I am seeing 30 bit color on my 24 bit display, in the Fusion or CS6 window, which is where it matters most, and therefore a 24 bit monitor can display 30 bit color if the graphics driver 'allows' it, right?

That's not what I'm reading.... Doesn't stand to reason. 24-bit capable hardware cannot suddenly display 10 bits per channel. If it could, the 30-bit monitors wouldn't be so expensive, as we all would already have them. What you are seeing is a 30-bit approximation on the 24-bit hardware, as output by the video card/driver when it sees only a 24-bit device. Or... I've totally misunderstood the situation... again... which is quite possible. :)

EDIT: Might help to read this (http://www.imagescience.com.au/kb/questions/152/10+Bit+Output+Support) and this (http://www.tedlansingphotography.com/blog/?p=287).

Mr Rid
03-01-2014, 10:41 PM
That's not what I'm reading.... Doesn't stand to reason. 24-bit capable hardware cannot suddenly display 10 bits per channel. If it could, the 30-bit monitors wouldn't be so expensive, as we all would already have them. What you are seeing is a 30-bit approximation on the 24-bit hardware, as output by the video card/driver when it sees only a 24-bit device. Or... I've totally misunderstood the situation... again... which is quite possible. :)

I edited my post after you quoted. But I will repeat this, because what Lightwolf and m.d. are posting still sounds too weird. "30-bit color is automatically enabled on all displays", meaning a 24 bit monitor can show 30 bit color, and the Nvidia driver is just "downgrading" the image when a 24 bit monitor is the only monitor present (?). But when a 30 bit monitor is on, there would be no reason to "downgrade" the image for the 24 bit monitor. That sounds really surprising.

And if the image is dithered on the 24 bit, it is amazingly undetectable. If it is just miraculous dithering (I am unable to see or duplicate in Fusion), then why not always do this on 24 bit monitors? I see that the color values read and look the same on both the 24 and 30 bit monitors. So where is the difference in monitor price justified? Dither, or not, this does not make sense.

spherical
03-01-2014, 10:54 PM
Check my recent edit. (I'm researching this as we go along, as I want to know, too.) However, you have said that when the card doesn't see a 10-bit monitor in the chain, it bands on the 8-bit monitor. As I see it, when there is a 10-bit monitor present, the card outputs 10-bit and emulates 10-bit on an 8-bit display. When there is no 10-bit capable device present, the card drops back to 8-bit. The emulation is what you are seeing, not an actual 10-bit output to an 8-bit device displaying AS 10-bit upon it.

Mr Rid
03-01-2014, 11:11 PM
Check my recent edit. (I'm researching this as we go along, as I want to know, too.) However, you have said that when the card doesn't see a 10-bit monitor in the chain, it bands on the 8-bit monitor. As I see it, when there is a 10-bit monitor present, the card outputs 10-bit and emulates 10-bit on an 8-bit display. When there is no 10-bit capable device present, the card drops back to 8-bit. The emulation is what you are seeing, not an actual 10-bit output to an 8-bit device displaying AS 10-bit upon it.

If that is the case, the 24 bit 'emulation' appears to perfectly match the 30 bit 'actual' image. I cannot see any dithering, no matter how closely I zoom into the image. And as I drag the cursor over the 24 bit emulation, or over the 30 bit actual gradient, the color values read and look identical. So why did I buy a 30 bit monitor?

It looks to me that the 24 bit monitor is not emulating or dithering. From that Nvidia quote, "30-bit color is automatically enabled on all displays", that would include 24 bit displays. And if that is the case, the question still stands, why buy an expensive 30 bit display? The video driver could apply this magically imperceptible dithering to all 24 bit displays. Or, the only reason to buy a more expensive 30 bit monitor is to bypass the auto-downgrading that occurs internally when only 24 bit displays are connected.

When I posed this question to Nvidia support, the response was a bit defensive, and they claimed it was Fusion that must be responsible. :stumped: I am still waiting to hear a response about this from Eyeon.

spherical
03-01-2014, 11:13 PM
If that is the case, the 24 bit 'emulation' appears to perfectly match the 30 bit 'actual' image. I cannot see any dithering, no matter how closely I zoom into the image. And as I drag the cursor over the 24 bit emulation, or over the 30 bit actual gradient, the color values read and look identical.

And you probably won't see any dithering difference. As long as the 10-bit monitor is in the chain, the card/driver isn't dithering the image, it's modifying the LUT on the lesser display and outputting those values on a per-pixel basis to the lesser capable monitor.

Check the second edit in that post. (I added another link just found.) Using the test file in the first link, I tested my workstation. Without having a 10-bit monitor and a Quadro and the right drivers (a GTX may do this if you get it to install Quadro drivers, because they're the same GPUs), all I get is 8-bit. The second link has a method for generating your own test file.

Look at it this way: you didn't have to buy two 10-bit monitors. :)

Lightwolf
03-02-2014, 10:56 AM
I cannot see any dithering, no matter how closely I zoom into the image.
Of course not, the dithering is always on a _display_ pixel-by-pixel basis... after your zoom. It is the final desktop frame buffer that's dithered - not the steps before that used to compose it.

And as I drag the cursor over the 24 bit emulation, or over the 30 bit actual gradient, the color values read and look identical.
For the same reason. It's the GPU that dithers as the framebuffer is converted to a signal that's compatible for the respective display. The dithering doesn't exist before that moment - and is thus not "visible" to any software.
Also, Fusion does not return the value below the cursor on the display framebuffer... but the value that corresponds to the part of the image that's visible underneath the cursor.


It looks to me that the 24 bit monitor is not emulating or dithering.
No, unless it is a 30-bit capable display that only uses a 24-bit/8-bit panel (similar to 6-bit panels that accept 8-bit pixel data).


From that Nvidia quote, "30-bit color is automatically enabled on all displays", that would include 24 bit displays. And if that is the case, the question still stands, why buy an expensive 30 bit display? The video driver could apply this magically imperceptible dithering to all 24 bit displays. Or, the only reason to buy a more expensive 30 bit monitor is to bypass the auto-downgrading that occurs internally when only 24 bit displays are connected.
Well, for one, only a 30-bit display will give you a pixel accurate rendering of the 30-bit data.
Why nVidia doesn't allow for 30-bit all the time and then just dithers down to 24-bit regardless of the displays that are attached? Ask nVidia. Technically there's no reason for that.

Just to make this a bit clearer (I hope), there's three components at play: The capabilities of the gfx board (including the encoding of pixel values for output), the capabilities of the transmission of the pixel values to the display (which includes the transmission standard and the decoding hardware in the display) and the capabilities of the display panel itself.

The gfx board can generate 8-bit or 10-bit frame buffers.
It can output 8-bit or 10-bit pixel data - depending on what the interface allows and the display attached to it accepts.
Displays can accept 8-bit or 10-bit data (again, depending on the interface and what their electronics accept)
Display panels can show 6-bit (cheap ones), 8-bit or 10-bit data.

(I'm leaving out other cases that exist).

Dithering may happen if the gfx board needs to output a 10-bit framebuffer to a 8-bit only device.
However, even a display with a 6-bit panel will accept 8-bit pixels (and some displays that have 8-bit panels will also accept 10-bit pixels). In this case, dithering will happen in the display itself.

Those are the only two places where automatic dithering will happen though.

Cheers,
Mike

Mr Rid
03-02-2014, 06:05 PM
Yes, thanks! But I am wondering how to tell if the 30 bit monitor is not just 'dithering?' What is the practical difference? I don't see any.

When the 24 bit monitor is the only monitor active, the gfx board still "needs to output a 10-bit framebuffer to a 8-bit only device" but it is not dithering it. 'Dithering' only while a 30 bit monitor is shared, seems like extra effort for no discernible reason. If Nvidia is bothering to dither, or if dithering is not any extra effort, then why not always dither for 24 bit monitors, regardless if a 30 bit monitor is active? This effectively eliminates banding, and no one would need to spend extra dough for a 30 bit display. Either way, the values read the same as you pick them with a cursor. And they 'look' the same as far as the human eye can detect on a monitor.


Nvidia Boss: "Let's develop a card that displays 30 bit color, to eliminate banding. It'll be all the rage."

Tech raises hand.

Boss: "Yes?"

Tech: "What if a user happens to share the display between 30 bit monitor AND a 24 bit monitor?"

Boss: "Who cares?"

Tech: "Well, the 24 bit display will still show banding."

Boss: "So what? 24 bit monitors always look that way."

Tech: "Well, what if we bother to undetectably dither the display on the 24 bit? It will look nicer and compliment the 30 bit monitor."

Boss: "If we do that, why would anyone have any particular reason to invest in more expensive 30 bit monitors and cards?

Tech: "Well... what if it only works when a 30 bit monitor is connected? That way they still have to buy 30 bit crap."

Boss: "Fine. Spend time implementing that for the twelve people who might have such a setup and notice. No one will question why we didnt just always dither 24 bit displays."

m.d.
03-02-2014, 08:21 PM
Are you testing on a 10bit image?

Here's the way I see it.....Fusion only delivers a 10bit framebuffer....what if you are feeding it a 16bit image?

It stands to reason....that because Nvidia only has a 10bit RGB buffer available (12 greyscale in some cases)....that Nvidia itself will dither higher-than-10bit images.....before sending them down the pipe

So when sending a 16 bit image out....Nvidia likely pre-dithers the 16bit image.....so effectively that even the 8bit monitor looks smooth as well

It is a very small amount of noise that is mathematically needed.....1/256 = 0.0039, i.e. roughly 0.4%...but depending on the bit depth reduction it could be 2 bits dithered....still not even 1% noise

If Nvidia didn't pre-dither a 16bit image before sending it to the framebuffer....you would also see banding on the 10bit monitor guaranteed....since Nvidia will only send out a 10bit signal.

It is very common for the people that use a 10bit display...to want to view higher-than-10bit images....usually 16bit

So Nvidia probably enabled the 'dither everything' approach you mentioned for images 12bit and higher


Just a side note...a lot of so-called '10bit' monitors will incorporate temporal dithering, alternating color values rapidly....
A lot of non-IPS monitors are actually 6bit....

Older Mac laptops were mostly 6bit....led to some lawsuits
http://arstechnica.com/apple/2007/05/lawsuit-over-mac-book-mac-book-pro-displays/

It's a complicated topic....there is dithering from Nvidia drivers from 10-8bit (100% guaranteed)
There is likely dithering from nvidia from 16-10bit.....
And your monitor itself may actually be dithering.....

Whether true 10bit is worth the extra effort is a judgement call....

m.d.
03-02-2014, 08:27 PM
Boss: "If we do that, why would anyone have any particular reason to invest in more expensive 30 bit monitors and cards?

Tech: "Well... what if it only works when a 30 bit monitor is connected? That way they still have to buy 30 bit crap."

Boss: "Fine. Spend time implementing that for the twelve people who might have such a setup and notice. No one will question why we didnt just always dither 24 bit displays."

Sounds about right.....
Followed by....
"Let's disable 10bit out in the driver....to force people to spend $4000 on a Quadro for the equivalent of a $500 GTX"

m.d.
03-02-2014, 08:36 PM
Here's a way to check your monitor
http://www.mediachance.com/pbrush/monitor.html

The lesson learned is next time the kid at Best Buy tells you his TV has a 1,000,000:1 contrast ratio....punch him right in the face.

1024-1 true contrast is a 10bit monitor.....the rest is marketing dither...very few actually are..

(Dolby has a true 40,000-1 contrast monitor...almost 16bit...I've seen footage on it and it looks pretty sweet)

Lightwolf
03-03-2014, 03:38 AM
Here's the way I see it.....Fusion only delivers a 10bit framebuffer....what if you are feeding it a 16bit image?

It stands to reason....that because Nvidia only has a 10bit RGB buffer available (12 greyscale in some cases)....that Nvidia itself will dither higher-than-10bit images.....before sending them down the pipe

So when sending a 16 bit image out....Nvidia likely pre-dithers the 16bit image.....so effectively that even the 8bit monitor looks smooth as well
Nope, it doesn't. That was tried as well.


If Nvidia didn't pre-dither a 16bit image before sending it to the framebuffer....you would also see banding on the 10bit monitor guaranteed....since Nvidia will only send out a 10bit signal.
10 bits is 1024 shades per component... you need to have excellent eyes to see any hint of banding there (assuming the source image has at least as many bits). Even if the 16-bits are only truncated.

Granted, if it's pure greyscale, more than 10 bits makes more sense (that, and huge screens).

Cheers,
Mike

Lightwolf
03-03-2014, 03:42 AM
Yes, thanks! But I am wondering how to tell if the 30 bit monitor is not just 'dithering?' What is the practical difference? I don't see any.
Pixel accuracy - dithering will only be an approximation of the real deal - even if your eyes can't make out the difference.


When the 24 bit monitor is the only monitor active, the gfx board still "needs to output a 10-bit framebuffer to a 8-bit only device" but it is not dithering it.
If it is allowed to use a 10-bit frame buffer that is. Applications request certain attributes for the frame buffer from the gfx drivers - but they are not guaranteed to get them. Such as 10-10-10-2 or quad buffers for stereo - both are optional.

As for why nVidia doesn't allow it for 8-bit displays - I've no idea, I'd suspect it's more related to marketing and market segmentation than anything else.

Cheers,
Mike