November 1, 2015

Apple Is Paving the Way for 10-Bit Color with Its Latest OS X Update

[Image: 5K iMac with 10-bit support]
In its latest OS X update, Apple included support for 10-bit color depth, which could be a big deal for filmmakers.

The driver update, which was first noticed and reported by the German publication Mac & i, enables support for 10-bit color depth within OS X El Capitan on certain higher-resolution Macs, including the latest Retina iMacs, the Mac Pro, and some 15-inch MacBook Pros. The significance of 10-bit color is that, once it's fully supported by your display hardware and software, you will be able to see far more detail in the subtle gradations between colors.

To help explain and demonstrate why 10-bit matters so much, here's a brief excerpt from David Torcivia's excellent article about what to look for when purchasing a new monitor for color grading:

8bit means a video codec or display is only capable of 16.7 million colors. That sounds like a lot until you compare it to the 1.07 billion colors a 10bit panel (or codec) is capable of. It's quickly apparent why a 10bit panel is so superior to 8bit. This is especially true when working in codecs that are 10bit (or greater). An 8bit monitor would be incapable of displaying all the colors!

One result of having fewer colors can be the introduction of banding in your images.

[Image: 8-bit vs. 10-bit display gradient comparison]
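To put those numbers in perspective, here's a quick back-of-the-envelope check in Python (my sketch, not something from Torcivia's article): the number of levels per channel is 2 raised to the bit depth, and an RGB pixel multiplies the three channels together.

for bits in (8, 10):
    per_channel = 2 ** bits   # 256 levels per channel at 8-bit, 1024 at 10-bit
    total = per_channel ** 3  # every combination of R, G, and B levels
    print(f"{bits}-bit: {per_channel} levels per channel, {total:,} colors")

# 8-bit:  256 levels per channel, 16,777,216 colors (the 16.7 million quoted above)
# 10-bit: 1024 levels per channel, 1,073,741,824 colors (the 1.07 billion quoted above)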

The downside here is that you likely won't be working in 10-bit color on your Mac any time soon. Because 10-bit has to be supported not only in the hardware and software you're using, but also in your media, chances are that this update won't mean much to you right now unless you have an external 10-bit display, work exclusively in the native OS X Photos app (the only place besides Finder in which 10-bit support is enabled at the moment), and have some high bit-depth media to work with.

With all of that said, now that OS X offers 10-bit compatibility, we should start seeing software developers (Adobe and Blackmagic are at the top of that wish list) incorporate native 10-bit support into their products. Once that happens, you should be able to take full advantage of your higher-bit-depth media, provided that you have a 10-bit display as well. It could also mean that future generations of Apple displays (all of which have traditionally been 8-bit) will make the jump into 10-bit territory.


15 Comments

Come on, pro colour graders! Chip in. You know what you want to say.

November 1, 2015 at 9:46PM

Jonathon Sendall

The monitors would need to reproduce the luminosity range. That's what's really important. That's assuming the graphics cards and monitor circuits support 10-bit. Still images now have the range, not sure about video.

What's entirely over everyone's head is that this is purely luminosity processing and computational accuracy. The human eye can only see between 5-7 million colors.

November 1, 2015 at 10:09PM

Vidrazor

10-bit is great, but all I want is Disk Utility back, Apple.

November 1, 2015 at 11:35PM

matt

lol. it's still there.

November 2, 2015 at 12:41AM, Edited November 2, 12:41AM

Bryan
Cinematographer / Editor

Took Apple long enough, but I would like to add that you still need a graphics card that supports 10-bit and a 10-bit monitor. 10-bit has nothing to do with resolution! The only cards that support 10-bit output are Quadro and FirePro cards, and the only Mac that has a FirePro is the Mac Pro!
https://youtu.be/PhkJLF3oyI8?t=49m24s

November 2, 2015 at 1:45AM

Anthony
Director, DP, Editor, VFX

Paving the way? A slight hyperbole, considering Windows has had this support for a long, long while.

November 2, 2015 at 4:40AM

Ash Tailor
Cinematographer

Agreed. Windows has supported 10-bit since Windows 7. But having the rest of the chain support 10-bit is another story.

November 2, 2015 at 5:51AM

Haroun Souirji
Director / DP and Producer

Yep, I'm using a 10-bit Eizo screen over DisplayPort through a Quadro 4000.
Windows has supported it since 2010.
Photoshop has supported it since at least CS5, and Premiere Pro and AE since CS6 for sure.

So Apple is paving the way for Apple users only ;-)
For Windows it was already a highway :-p

November 2, 2015 at 8:35AM

WalterBrokx
Director, DOP, Writer, Editor, Producer

"The downside here is that you likely won't be working in 10-bit color on your Mac any time soon."

Uhm, well, my monitor is 10-bit (that's not that hard to find), my graphics card supports 10-bit, Resolve supports 10-bit, and my camera shoots 12-bit.

So I wouldn't jump to conclusions about not using it.

November 2, 2015 at 8:12AM


Er, Rob, this is a tad embarrassing. We've had 10-bit cards on Macs for quite a while. We've had 10-bit monitors connected to 10-bit cards for quite a while too. We've had 10-bit footage for almost as long as I can remember. The only thing missing was native displays at 10-bit, which has never been an issue if you had a professional setup. I know this is No Film School, but you didn't need to go to film school for facts as basic as this...

November 2, 2015 at 8:35AM

keith

Apple paving the way? Ha ha... I love it that people seem to think Apple OSX desktops and laptops are so dominant in the world when in reality they've just been a minority year after year. Windows holds almost a massive 90% of the desktop and laptop space while OSX is a tiny 8%. Ref: https://en.wikipedia.org/wiki/Usage_share_of_operating_systems

Apple paving the way for the 8%? The uninformed write articles like this based on emotional bias rather than numerical facts.

Windows has had 10-bit color capability since Windows XP and Photoshop CS4. Ref: http://www.ronmartblog.com/2011/07/guest-blog-understanding-10-bit-color...

November 3, 2015 at 5:53AM

Razor
VFX Colorist

Who came up with the horribly doctored greyscale and colourscale images above?? I'm viewing this on an older iMac that's 8-bit, yet that image shows an "8-bit" and a "10-bit" stripe, depicting the 10-bit one as smooth and the 8-bit one as chunky... on my 8-bit machine. Am I missing something?

Reminds me of ads for UHD that are played on HD televisions.

November 3, 2015 at 6:00PM

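Comparison graphics like the one in the article are usually exaggerated on purpose: the strip labeled "8-bit" tends to be quantized to far fewer levels than a real 8-bit panel can show, which is why the banding is visible on any display. Here is a minimal Python/NumPy sketch of that trick, offered as an assumption about how such images are typically made rather than a claim about this particular one.

import numpy as np

ramp = np.linspace(0.0, 1.0, 1024)                       # a smooth horizontal gradient

smooth_strip = np.round(ramp * 255).astype(np.uint8)     # the strip labeled "10-bit"
crushed = np.round(ramp * 31) / 31                       # crushed to 32 levels (about 5-bit)
banded_strip = np.round(crushed * 255).astype(np.uint8)  # the strip labeled "8-bit"

print("distinct levels in the 'smooth' strip:", len(np.unique(smooth_strip)))  # 256
print("distinct levels in the 'banded' strip:", len(np.unique(banded_strip)))  # 32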

No, you're completely right. I was just about to write the same thing.

Bullshit like this is the reason 80% of consumers want 10-bit TVs, although they probably couldn't see a difference in a proper test.

November 4, 2015 at 3:12PM

Paul-Louis Pietz Pröve
director / dop / editor

But who needs 10-bit on their native displays anyway?

November 5, 2015 at 1:52PM, Edited November 5, 1:56PM


I just saw smooth gradients on an 8-bit color monitor above, as did you. A truly 8-bit monitor will never show any issues, because 8-bit is about all your eyes can see. This is not the same issue as 8-bit acquisition formats, which will band when you grade them.

November 6, 2015 at 1:22AM

Robert Ruffo
Director/DP
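That last point, that 8-bit acquisition formats band once you grade them, is easy to demonstrate. The sketch below (mine, assuming NumPy) quantizes a smooth ramp to a given source bit depth, applies an aggressive gamma-style shadow lift, and then measures what survives on an 8-bit display: the fewer distinct levels and the bigger the largest jump between neighboring values, the more visible the banding.

import numpy as np

def grade_stats(source_bits, gamma=0.45):
    """Quantize a 0..1 ramp to source_bits, apply a strong gamma lift,
    requantize to an 8-bit display, and measure the damage."""
    levels = 2 ** source_bits
    ramp = np.linspace(0.0, 1.0, 4096)                        # smooth source gradient
    stored = np.round(ramp * (levels - 1)) / (levels - 1)     # what the codec recorded
    graded = stored ** gamma                                  # aggressive shadow lift
    displayed = np.round(graded * 255).astype(int)            # what an 8-bit panel shows
    return len(np.unique(displayed)), int(np.diff(displayed).max())

for bits in (8, 10, 12):
    distinct, biggest_jump = grade_stats(bits)
    print(f"{bits}-bit source: {distinct} display levels, largest jump {biggest_jump}")

The higher the source bit depth, the more of the 256 display levels the graded ramp still hits and the smaller the jumps between them, which is exactly why a 10-bit (or 12-bit) source holds up under a grade that makes an 8-bit source fall apart.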