
10-Bit Support Now Available For Photoshop CC In OS X

By Kishore Sawh on December 5th 2015


It’s interesting to note that most graphics and imagery professionals I know, and know of, use Apple computers. This isn’t to argue which platform is better, or whether there can even be a definitive answer to that question, but I do find it notable that Windows has offered 10-bit support since Windows 7 (which may or may not be one of the few strong points of that iteration), while the Mac has only relatively recently caught up with OS X El Capitan. It was a point of much criticism from professionals over the past few years, and one Apple resolved rather quietly this year. So OS X now does, in fact, support 10-bit output.

Just as a basic primer, 10-bit output in graphics-processing terms essentially means the software now enables a graphics card to display 1,024 gradations per color channel versus 256. Allow me to tell you right now that if I had to clarify that for you, you probably won’t care, nor notice.
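To make the arithmetic concrete, here's a minimal sketch of where those numbers come from and why fewer levels means visible banding. The function names and the 4K-wide ramp are my own illustration, not anything from Adobe or Apple:

```python
# Sketch: why 10-bit gradients look smoother than 8-bit ones.
# A display quantizes each channel to 2**bits levels; fewer levels
# means wider, more visible "bands" across a smooth ramp.

def distinct_levels(bits: int) -> int:
    """Number of gradations per color channel at a given bit depth."""
    return 2 ** bits

def quantize(value: float, bits: int) -> int:
    """Map a 0.0-1.0 intensity to the nearest representable level."""
    levels = distinct_levels(bits) - 1
    return round(value * levels)

def count_bands(width: int, bits: int) -> int:
    """Count distinct tone bands across a horizontal ramp `width` pixels wide."""
    ramp = [quantize(x / (width - 1), bits) for x in range(width)]
    return len(set(ramp))

print(distinct_levels(8))     # 256 gradations per channel
print(distinct_levels(10))    # 1024 gradations per channel
print(count_bands(4096, 8))   # 256 steps across a 4K-wide ramp (16 px per band)
print(count_bands(4096, 10))  # 1024 steps -- four times finer
```

Across the same 4K-wide gradient, an 8-bit pipeline repeats each tone for 16 pixels before stepping, while 10-bit steps every 4 pixels, which is why the transitions read as smoother.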


This isn’t to say you’re alone, because most people, and even working pros, won’t notice either. If you want to see it most clearly, or at least most easily, it helps to know where to look, and that would probably be in a big gradient of closely related tones – think banding. With 10-bit, you’ll see much less of it, and the transitions between tones will appear smoother.

If you’d like to see for yourself, just Google ’10-bit test pattern’ and you’ll probably find 16-bit or even 24-bit files to compare against 8-bit files – usually just a simple color or gray gradient – and you’ll see the difference. However, make sure you download the files rather than opening them in your browser, because depending on the bit depth, your browser probably won’t render the file fully, defeating the purpose of the exercise. Actually, most programs won’t display 10-bit, and if you’re on a Mac with this new feature, you’ll only see the benefit in Apple apps like Preview and Photos.

This actually brings me neatly to the ‘news’ of this post: even if you have some sick monitor with a 16-bit LUT like the Eizo ColorEdge CS270, Photoshop – your major photo-editing program – didn’t support 10-bit output on the Mac, and neither did Affinity Photo if any of you are using that. Well, Adobe just changed that for those with Photoshop CC. (Yes, it’s been around on Windows for a while.)

Now, if you are working on a computer and monitor with support for 10-bit, you can enable Photoshop to support it with just a few clicks. Here’s how:

Photoshop CC > Preferences > Performance


Click ‘Advanced Settings’ under Graphics Processor Settings, then ensure the ’30 Bit Display’ box is checked. That’s it; you’re done.




Enjoy your new capabilities, if you can discern them.


A photographer and writer based in Miami, he can often be found at dog parks, and airports in London and Toronto. He is also a tremendous fan of flossing and the happiest guy around when the company’s good.

Q&A Discussions


  1. Ben Greaves

That’s great and all, but does the Mac have 10-bit graphics cards to be able to handle it?

  2. Mark Romine

I understand that there are cameras, principally MF, that shoot 16-bit. But how does this benefit them? It doesn’t. PS provides for 16-bit editing. Editing in 10-bit is throwing out color if you capture in 16-bit, so no gain there. Then with DSLR and mirrorless bodies you are capturing in 12- or 14-bit, so to work in 10-bit you are again throwing out color info. Granted, better than 8-bit edits, but 10-bit files seem to be a weird in-between. But if I’m capturing in 16-bit, no way am I going to edit in 10-bit when PS supports 16-bit editing.

  3. Mark Romine

Ok, I’m stumped – why should I care about this? What’s the point? Most DSLRs capture in what, 14-bit color? Most labs will only accept 8-bit files for output. So why should I care about a hidden 10-bit feature?

    • Kishore Sawh

      It’s a good question Mark, and perhaps it’s not something you actually need to care about. There are cameras around that are going to capture 16-bit, and labs that will handle them (to a degree). Most of the benefit for the average person will come from the lack of artifacts present whilst editing/viewing the image on screen. Keep in mind too, that 10-bit here is 10 per channel, so it’s 30 (10 R, 10 G, 10 B). But honestly, this is something that’s going to appeal to a certain group, as the average shooter just won’t care or even have the gear to notice it. I suspect this is why it was quiet news from Apple. That said, there are a fair percentage of our viewers who actually will see this benefit.

    • Rune Abro

Mark Romine, based on your statements, you clearly do not “fully” understand this topic, and that probably means there is a very good chance that you indeed do not need support for a 10-bit monitor :) This does not, however, mean that none of us needs it :D Either way, if you do any type of digital image work beyond being a hobbyist, you really could benefit from understanding it!

Firstly, I think you need to distinguish between the bit depth of the file itself and the bit depth of the representation of said file. Both are relevant to what you see, but they are NOT the same! Equating them is like saying there is no difference between the world being unsharp and you having forgotten to wear your glasses :)

Granted, a RAW/DNG file might be 14 or even 16 bits, and while most delivery will end up in 8 bits, this does not mean that everything in between is wasted (or that it even makes sense to talk about it in those terms). If, for example, you are going to do heavy CC and grading, the extra information in the file is very beneficial.

  4. Joseph Ford

Very strange if this feature was available in Windows and not OS X, because Adobe prides itself on maintaining feature parity between operating systems.
