
10 Bit Output Support
Article Details

Last Updated
17th of February, 2014

Summary: Explains the theory and practice of achieving 10 bit video output.

Jump To:

  • Theory (What 10 bit video output is, and why it matters)
  • Practice (How to get it working)

The Theory

This is a confusing issue for many people, so this article tries to simplify and explain it. (In places it is deliberately over-simplified for the sake of clarity, rather than getting bogged down in detail.)

The key thing is not to confuse the bit depth of your video card's output signal with the bit depth of your monitor's Look Up Tables (LUTs - 6, 8, 10, 12 or even 14 bit with the very latest Eizo CG and NEC PA monitors). Your digital image files also have a bit depth, which is a separate issue again.

First, what is bit depth in this particular context? It's a measure of how many discrete values the system can do its processing with - and more is better. For example, a 6 bit system has just 64 signal levels to play with, meaning there are only 64 possible adjustments you can make to the signal. Put very simply, you can choose 31 (which might be too red) or 32 (which might be too blue), but there's no concept of 31.5 (which might be just right). With 8 bit systems, you have 256 signal levels to play with - and this is the normal scenario for video card output signals: your computer can only ever output a value between 0 and 255 for each of Red, Green and Blue, which combined form a specific colour. The overwhelming majority of computers and monitors on the planet work this way.
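
A quick way to see the arithmetic - each extra bit doubles the number of available levels. The little Python sketch below is purely illustrative and simply prints the level count for each common bit depth:

    # Number of discrete levels at each bit depth: levels = 2 ** bits
    for bits in (6, 8, 10, 12, 14, 16):
        levels = 2 ** bits
        print(f"{bits} bit: {levels} levels (0 to {levels - 1})")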

Once the signal actually reaches the monitor - let's choose (128, 0, 0), a medium strength red, as an example - the monitor uses its LUTs (look up tables) to choose which colour to actually display for this signal. With, for example, a 6 bit LUT, there are only 64 shades of red available, so the choices for the actual tone the monitor displays are very limited and it's basically impossible for the monitor to pick the correct colour (odds are Red 31 is not right, and neither is Red 32). Move up to an 8 bit monitor and the choice improves somewhat, as there are now 256 reds to play with - Red 127 might be a touch too weak and Red 129 a touch too strong, so Red 128 is chosen as the best option. But odds are this is still not enough finesse to get exactly the right colour. (When you calibrate a monitor, this is what is happening - the calibrator tells the monitor to display Red 127, Red 128, Red 129 and so on, measures them, and builds a table of what actual colours these values produce - that table is then used to know what signal to send the monitor later to get the right colour.)
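
Conceptually, each channel's LUT is just a table that maps an incoming signal level to the level the panel is actually driven at. The sketch below is illustrative Python only - the numbers are made up, not real calibration data - and shows the idea for the red channel:

    # Sketch of a 1D LUT for the red channel: 8 bit input signal (0-255)
    # mapped to a 10 bit panel drive level (0-1023). Values are illustrative.
    red_lut = [round(i * 1023 / 255) for i in range(256)]  # start with a straight mapping

    # Calibration measures what each level really looks like; if signal 128
    # comes out slightly too strong, its table entry is nudged down to the
    # nearest available 10 bit level.
    red_lut[128] = 510  # instead of the "ideal" 514

    def panel_level_for_red(signal):
        """Return the 10 bit level the monitor drives for an 8 bit red signal."""
        return red_lut[signal]

    print(panel_level_for_red(128))  # -> 510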

This is the way it works in almost all normal scenarios - 8 bit video cards and 8 bit LUTs are standard, with the exceptions being quite cheap or quite expensive monitors. Also, most monitor LUTs are one dimensional - that is, they adjust only one channel at a time. Good monitors now have so called 3D LUTs, which adjust R, G and B simultaneously; this helps because colour error is rarely along just one axis.

As monitors get more expensive, the LUTs get better, with 10 and 12 bit being most common in higher end monitors and the best offering 16 bit (over 65,000 levels). This means the amount of signal finessing that can be done goes up - 10 bit means 1024 levels, 12 bit means 4096 and 14 bit means 16,384 - so vastly more precision is available in mapping input tones from the computer to output tones on the monitor (remember, because the video card signal is 8 bit, those input values still range from 0 to 255).

However, the bottleneck in this system is the video card - it can only output 8 bit signals. Three channels (RGB) at 8 bits each gives a total palette of approximately 16.7 million colours, which sounds like a lot, but there are still only 256 pure greys in there. To solve this bottleneck, systems are moving toward 10 bit output from the video card - 1024 possible signals for each of R, G and B, or a palette of over 1 billion colours (with 1024 levels along the pure grey axis).
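
In concrete numbers (again just a quick illustrative calculation):

    # Palette size and number of pure greys for 8 bit vs 10 bit output
    for bits in (8, 10):
        levels = 2 ** bits            # levels per channel
        palette = levels ** 3         # every R, G, B combination
        print(f"{bits} bit: {levels} levels per channel, {palette:,} colours, {levels} pure greys")
    # 8 bit: 256 levels per channel, 16,777,216 colours, 256 pure greys
    # 10 bit: 1024 levels per channel, 1,073,741,824 colours, 1024 pure greys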

So, to look at it holistically - there are several components to the complete video path:

Your actual digital image file -> OS -> Software (e.g. Photoshop) -> Video Card Digital Signal -> Monitor LUTs -> Panel Depth

Each and every one of these can have a different bit depth.

The classic path, and this is true of almost all computers (Mac and PC) before about 2010, is an 8 bit video output signal. With consumer class monitors, processing of these signals is done in the monitor with 6 bit (really bad office monitors) or 8 bit LUTs (almost all consumer monitors). Better monitors, such as those from NEC (the PA series) and Eizo (the CG series), have LUTs that are 10, 12 or even up to 16 bit.

Video cards are now appearing that support 10 bit output, and Windows 7 facilitates it (older OSes do not work reliably with 10 bit output). Photoshop CS5 supports 10 bit output, although the support is still a bit hit and miss. Photoshop CS6 moved to a much more stable 10 bit rendering model based on OpenGL, so we strongly recommend you move to CS6 or above if you want 10 bit display.

The practical result is yet smoother, more accurate colour (particularly on very wide gamut monitors) and a more robust system in general that can cope with calibration to wider, more exotic targets and a greater variety of brightness levels.

Practice - How To Actually Get 10 Bit Output

January 2014 - The best advice currently is to use nVidia Quadro cards, as their drivers are better than the ATI drivers. We recommend the K600 or the faster K2000 based cards, as used in our Photoshop PCs. 10 bit works reliably, and without the Aero issues, thanks to the better nVidia drivers and to Photoshop now using OpenGL rendering.

July 2010 - Here you can find a document from ATI with instructions on setting this up with ATI video cards, some notes from Eizo on the subject, and a file you can use to test your output called '10 bit test ramp.psd' - if you open this in Photoshop you will see banding if you have an 8 bit output path, and complete silky smoothness on a 10 bit path. That said, it is not without glitches in practice - Photoshop will, when using some tools like 'clone' for example, render a small part of the working area around your pointer in 8 bit. Also, Windows 7 loses the Aero transparency features with ATI video cards. So while it works, it is not glitch-free, and while it may be useful in some circumstances, the performance of these screens with 8 bit input is already so good that it's debatable whether the glitches are worth it on a day to day basis right now.
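
If you don't have the test ramp file to hand, you can make a similar smooth grey gradient yourself. The sketch below (Python with NumPy and Pillow - it is not the attached PSD, and the file name is arbitrary) writes a 16 bit greyscale TIFF; opened in Photoshop it should show visible banding on an 8 bit path and a smooth sweep on a working 10 bit path:

    # Generate a full-width 16 bit greyscale ramp for banding tests
    import numpy as np
    from PIL import Image

    width, height = 1920, 400
    row = np.linspace(0, 65535, width, dtype=np.uint16)  # one smooth black-to-white row
    ramp = np.tile(row, (height, 1))                     # repeat it down the image

    Image.fromarray(ramp, mode="I;16").save("grey_ramp_16bit.tif")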

A lot of people think they're getting 10 bit output because their video card or monitor ostensibly support it - however until very recently, almost no one has actually had 10 bit output working in reality.

To get a 10 bit output path working, you must have:
  • Windows 7 or above
  • Note that 10 bit output is currently impossible on the Mac, up to and including Mavericks (10.9). While you may have read a well known article claiming it is possible, that claim was later retracted by the author as incorrect (you can see the author conversing with Adobe engineers here, well after the article was published) - although that site has unfortunately not updated the article to reflect the known error.
  • A video card that has 10 bit output support (common in theory) AND drivers that actually offer this (rare).

    On the PC side of things, the best supported cards in 2014 are the nVidia Quadro cards. We recommend the basic K600 model, or if you have more budget, the K2000 model.

    You can also try the ATI FirePro series of cards (many cheaper ATI cards have the hardware required but no driver support for 10 bit as of May 2010). I have used ATI FirePro 4800/4900 cards with good results.

  • Photoshop CS6 (earlier versions do NOT really reliably support 10 bit output)
  • A monitor that will accept 10 bit video input (only available over DisplayPort connections typically) - The NEC PA series all accept 10 bit inputs, and most Eizo CG models do as well.

Only if you have all of these things in place will 10 bit output be possible - and in most instances you will also need to manually activate it in the video card drivers.

Note that 10 bit output may interfere with games and other software, which is why the drivers let you turn it on and off as required.