10 Bit Output Support

14th September 2015 Colour Accurate Monitors

We discuss the theory and practice of achieving 10 bit video output support, and why (or why not!) it may be important.

First, we present a quick summary of the state of play - below that we go into more detail for those of you looking for the full explanation.

The State of Play 2016

Executive Summary

This summary is all you really need to know about getting 10 bit video output working in 2016 for image making purposes. It's finally relatively easy to achieve this on both Mac and PC as long as you have up to date hardware and applications.

If you want to know why you might want 10 bit, read on below. The short version is that 10 bit is better for colour (smoother gradients, mainly) - but it's definitely not essential for general use. If you're upgrading your equipment anyway, then by all means add hardware for 10 bit support if needed, but it's not really worth upgrading anything specifically for 10 bit.

Minimum Requirements - Mac & PC

Obviously, in all cases, you need a 10 bit capable monitor (such as Eizo ColorEdge and NEC PA monitors), and this must be connected via a 10 bit capable video connection (this means DisplayPort/Thunderbolt, or in some cases HDMI 1.3+, never DVI).

Mac OSX 10 Bit Support

Things finally changed in late 2015:

You must be using El Capitan or above (i.e. OSX 10.11+), and you must be using up to date hardware - something like the 5K iMac; basically, hardware from mid 2015 onwards is generally fine.

There is still very limited actual application support for 10 bit colour, but this is expected to change fairly rapidly now that the OS and Mac hardware support it. You must update to the latest Creative Cloud Adobe apps, for instance.

Windows PC 10 Bit Support

10 bit has been possible for years and years already. You need a 10 bit capable card - generally it's best to get a workstation level card. This means an NVIDIA Quadro card (best drivers) or AMD FirePro (notoriously flakey drivers). Some high end gaming cards (AMD Fury models etc.) also have 10 bit support, but less reliable drivers. If you want a reliable 10 bit solution, go for a workstation level card from NVIDIA if at all possible (the K2200 is perfect).

Actual application support for 10 bit is (for now) much wider on the PC than the Mac. Adobe apps like Photoshop have supported 10 bit for several years now.

The Theory

Bit depth, in this particular context, is a measure of how many discrete values the system can do its processing with - and more is better.

10 Bit video output is a confusing issue for a lot of people. This article tries to simplify and explain the issue. (In places it is deliberately over-simplified for the sake of clarity, rather than getting bogged down in too much detail.)

The key thing is not to confuse the bit depth of your video card's output signal with the bit depth of your monitor's Look Up Tables (LUTs - 6, 8, 10, 12, 14 or even 16 bit with the very latest Eizo CG and NEC PA monitors). Your digital image files also have a bit depth, which is a separate issue again.

First, what is bit depth in this particular context?

Well, it's a measure of how many discrete values the system can do its processing with - and more is better.

For example, a 6 bit system has just 64 signal levels to play with - meaning there are only 64 possible adjustments you can make to that signal. Put very simply, you can choose 31 (might be too red) or 32 (might be too blue), but there's no concept of 31.5 (which might be just right). With 8 bit systems, you have 256 signal levels to play with - and this is the normal scenario for video card output signals: your computer can only ever output a value between 0 and 255 for each of Red, Green and Blue, which combined form a specific colour. 99.99% of computers and monitors on the planet work this way.
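If you like to see the arithmetic, here is a tiny Python sketch of the same idea - the level counts are simply powers of two, and the rounding at the end shows why an 8 bit signal can only ever approximate the value you actually want (the 50.2% target is just an example number):

    # Levels available per channel at a given bit depth: 2 ** bits
    for bits in (6, 8, 10, 12, 14, 16):
        print(f"{bits:2d} bit -> {2 ** bits:>6,} levels per channel")

    # With an 8 bit signal, a desired intensity of e.g. 50.2% has to be rounded
    # to the nearest of 256 steps - there is no "31.5" style in-between value.
    target = 0.502                # the intensity we actually want (0-1)
    level = round(target * 255)   # 128 - the closest an 8 bit card can send
    print(level, level / 255)     # 128, ~0.5020 - a small but unavoidable error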

Once the signal actually reaches the monitor - let's choose, for example, (128, 0, 0), which is a medium strength red - the monitor uses its LUTs (look up tables) to choose which colour to actually display for this signal. With, for example, a 6 bit LUT, there are only 64 shades of red available. So the choices for the actual tone the monitor displays are very, very limited, and it's basically impossible for the monitor to choose a correct colour (as odds are Red 31 is not right, and neither is Red 32).

Move up to an 8 bit monitor, and the choice is improved somewhat, as there are now 256 reds to play with - this increases accuracy: Red 127 might be a touch too weak, Red 129 a touch too strong, and Red 128 the closest, so 128 is chosen as the best option. But odds are this is still not enough finesse to get exactly the right colour. (When you calibrate a monitor, this is what is happening - the calibrator tells the monitor to display Red 127, Red 128, Red 129 etc., measures them, and creates a table of what actual colours these values represent - this table is then used to know what signal to send the monitor to later get the right colour.)
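As a rough illustration of that idea (not how any particular calibrator actually works), here is a minimal Python sketch: the 'measured' values are invented gamma-style numbers standing in for what a probe would read back, and the lookup simply picks the signal level whose measured output lands closest to the colour we want:

    # Hypothetical measurements: what the monitor "really" shows for each
    # requested red level (a made-up gamma 2.2 response stands in for probe data).
    measured = [(level / 255) ** 2.2 for level in range(256)]

    def signal_for(target):
        """Pick the 8 bit level whose measured output is closest to the target (0-1)."""
        return min(range(256), key=lambda lvl: abs(measured[lvl] - target))

    print(signal_for(0.5))  # the level a calibrated system would actually send (~186 here)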

This is the way it works in almost all normal scenarios, except for quite cheap or quite expensive monitors - 8 bit video cards and 8 bit LUTs are the standard. Also, these monitor LUTs are one-dimensional - that is, they only work on one colour at a time. Good monitors now have so-called 3D LUTs, which allow them to adjust R, G and B simultaneously - which helps, as colour error is rarely just along one axis.

As monitors get more expensive, the LUTs get better, with 10 and 12 bit being most common in higher end monitors. This means the signal quality (really, the amount of signal finessing that can be done) moves up - 10 bit means 1024 levels, 12 bit means 4096, 14 bit means 16,384, and the best is 16 bit with over 65,000 levels. Basically, vastly more precision is available in the mapping of input tones from the computer to output tones on the monitor (remember - because the video card signal is 8 bit, those input values still range from 0 to 255).

However, the bottleneck in this system is the video card - it can only output 8 bit signals. Across the three channels (RGB), that means a total palette of approximately 16.7 million colours - which sounds like a lot, but there are still only 256 pure greys in there. To solve this bottleneck, systems are moving toward 10 bit output from the video card, meaning 1024 possible signals for each of R, G and B, or a palette of over 1 billion colours (with 1024 along the pure grey axis).
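The palette numbers are just the per-channel level count cubed; a quick Python check makes the jump obvious:

    # Total palette = per-channel levels cubed (R x G x B); pure greys = per-channel levels.
    for bits in (8, 10):
        levels = 2 ** bits
        print(f"{bits} bit: {levels ** 3:,} colours, {levels} pure greys")
    # 8 bit:  16,777,216 colours, 256 pure greys
    # 10 bit: 1,073,741,824 colours, 1024 pure greys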

So, to look at it holistically - there are several components to the complete video path:

Your actual digital image file -> OS -> Software (e.g. Photoshop) -> Video Card Digital Signal -> Monitor LUTs -> Panel Depth

Each and every one of these can have a different bit depth.
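One way to think about it: the precision you actually see is limited by the weakest link in that chain. The stage depths in this little sketch are purely illustrative numbers, roughly matching the classic path described next, not measurements of any particular system:

    # Illustrative only - the effective depth of the whole path is its weakest link.
    path_bits = {
        "image file": 16,
        "OS / application": 10,
        "video card signal": 8,   # the classic bottleneck
        "monitor LUT": 14,
        "panel": 10,
    }
    print("effective output depth:", min(path_bits.values()), "bit")  # -> 8 bit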

The classic path, and this is true of nearly all computers (Mac and PC) before about 2015, is to have an 8 bit video output signal. With consumer class monitors, processing on these signals is done in the monitor with 6 (really bad office monitors) or 8 bit LUTs (almost all consumer monitors). Better monitors, such as those from NEC (the PA series) and Eizo (the CG series), have LUTs that are 10, 12 or even up to 16 bit.

Video cards are now appearing that support 10 bit output; Windows 7 and later facilitate 10 bit output (older OSes do not reliably work with 10 bit output); OSX 10.11 (El Capitan) and later supports it; and Photoshop CS6 (on PC) and Photoshop CC 2016 and later (on Mac) support it.

The practical result is yet smoother, more accurate colour (particularly on very wide gamut monitors) and a more robust system in general that can cope with calibration to wider, more exotic targets and a greater variety of brightness levels.

How To Actually Get 10 Bit Output


A lot of people think they're getting 10 bit output because their video card or monitor ostensibly support it - however surprisingly few people actually have 10 bit output working in reality.

January 2016 - The best advice currently on the PC is to use NVIDIA Quadro cards - these have much better drivers than the ATI cards. We recommend the K620 or ideally the faster K2200 based cards, as used in our Photoshop PCs. 10 bit is completely reliable with these setups in our experience, up to and including with Windows 10.

Here you can find a file you can use to test your output, called '10 bit test ramp.psd' - if you open this in Photoshop, you will see banding if you have an 8 bit output path, and complete silky smoothness on a proper 10 bit path. That said, 10 bit is not without glitches in practice - Photoshop will, when using some tools like 'clone' for example, render a small part of the working area in 8 bit around your pointer. Also, Windows 7 will lose the Aero transparency features with ATI video cards. So while it works, and while it may be useful in some circumstances, the performance of these screens with 8 bit input is already so good that it's debatable whether the glitches are worth it on a day to day basis right now.
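If you don't have the test file to hand, the reason a ramp exposes the problem is simple arithmetic: at 8 bit a full-width black-to-white gradient has only 256 distinct steps, so each step becomes a visible band several pixels wide. A quick back-of-envelope Python sketch (the 2560 pixel width is just an assumed example):

    # Why a full-width black-to-white ramp shows banding at 8 bit:
    width_px = 2560  # assumed monitor width, for illustration
    for bits in (8, 10):
        steps = 2 ** bits
        print(f"{bits} bit: {steps} steps, each band ~{width_px / steps:.1f} px wide")
    # 8 bit bands are ~10 px wide and clearly visible; 10 bit bands are ~2.5 px and blend away.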

If you're trying to set it up, make sure you actually test your system to be sure it's all operating as expected.

To get a 10 bit output path working:

On The PC:

  • Windows 7 or above, ideally Windows 10
  • A video card that has 10 bit output support (common in theory) AND drivers that actually offer this in a stable manner (much rarer).
  • The best supported cards are the NVIDIA Quadro cards. We recommend the basic K620 model, or if you have more budget and definitely if you're using multiple monitors, go for the K2200 models.
  • You can also try the ATI FirePro series of cards (many cheaper ATI cards have the hardware required but no driver support for 10 bit). I have used ATI FirePro 4800/4900 cards with quite good results, but the drivers are flakey.
  • Photoshop CS6 (earlier versions do NOT really reliably support 10 bit output) - ideally Photoshop CC 2015+.
  • Activation of 10 bit mode in those apps (see e.g. Photoshop performance settings).
  • Only if you have all of these things in place will 10 bit support be possible, and in most instances you will need to manually activate it in the video card drivers.
  • 10 bit output may interfere with games etc., so the drivers do let you turn it on and off as required.

On The Mac:

  • OSX 10.11 El Capitan or higher
  • Very modern hardware (mid 2015 on) with 10 bit video support - such as the 5K iMac or MacBook Pro models, and the tube Mac Pro.
  • Up to date apps - e.g. Photoshop CC 2016+
  • Activation of 10 bit mode in those apps (see e.g. Photoshop performance settings).