
Recently, the Society of Motion Picture and Television Engineers (SMPTE) organized a meeting to review the standardization of Ultra High-Definition Television (UHDTV). Standards are especially important now that shipments of ultra high-def TV sets are expected to reach four million units by 2017.
Before attending the meeting, I reviewed the committee’s thorough report on their findings thus far. One of the more impressive facts is that the range of colors UHDTV can display encompasses nearly “twice the colors perceived by humans or that can be captured by a camera.”
Two standards are actually being developed, simply called UHDTV1 and UHDTV2; the easiest way to distinguish them is by their frame dimensions. UHDTV1 would have a 4k resolution of 3,840 x 2,160 pixels, whereas UHDTV2 would have a whopping 8k resolution of 7,680 x 4,320 pixels. The standards would support 10- and 12-bit depth, with chroma subsampling options of 4:4:4, 4:2:2 and 4:2:0. 8-bit color, as well as interlacing and fractional framerates, would be discarded. The likely base framerate would be 120 frames per second, due in part to the fact that 120 is evenly divisible by popular framerates such as 24, 30, and 60. At such a high framerate, the display would run well above the “flicker fusion threshold,” the rate at which intermittent light appears continuous, so perceived flicker would be greatly reduced.
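The divisibility argument is easy to check. A quick sketch (my own illustration, not taken from the report):

```python
# Why 120 fps makes a convenient base framerate: the popular rates divide it
# evenly, so existing 24/30/60 fps material maps onto it without judder.
for rate in (24, 30, 60):
    repeats, remainder = divmod(120, rate)
    print(f"{rate} fps material: show each frame {repeats}x (remainder {remainder})")

# 25 fps (PAL) is the notable exception: 120 / 25 leaves a remainder,
# so it cannot be repeated a whole number of times per frame.
print(120 % 25)
```

Each legacy frame is simply repeated a whole number of times, which is why fractional and mismatched rates can be dropped from the standard.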
This all seems like a great step forward. However, at the meeting I attended, it was clear there are numerous issues that confront the emerging technology. I spoke with John “Pliny” Eremic, an active member of the SMPTE Standards Community who now works at HBO. As former post-production manager at Offhollywood and co-owner of the first two shipping RED cameras, he’s been poised at the cutting edge of the video frontier for some time. Pliny says:
UHD is about more than spatial resolution. The areas where [the Standards Community is] looking to push the image are dynamic range, peak luminance, wider color gamut, temporal resolution meaning framerate, and spatial resolution.
To Pliny, the most important of these is dynamic range, and I tend to agree. Increasing the resolution alone does little to improve the image unless these other aspects improve along with it, a detail consumer TV and camera manufacturers often seem to forget. Pliny goes on:
If you want to display more colors, there are certain colors you can’t hit unless you have a higher peak brightness. If you have higher peak brightness overall, the flicker fusion threshold actually changes. So an image that looks constantly illuminated when you are at 100 nits [a unit of measure for luminance], if you crank it up high enough, suddenly that same image looks flickery. Now you have to increase your refresh rate just to maintain the status quo of appearing constantly illuminated. If you have wider dynamic range on the display you’re going to need more bits to cover it to not get banding in things like skies and gradients. So all these things need to move in unison.
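Pliny's point about bits and banding comes down to simple arithmetic: a fixed number of code values has to cover whatever range the display can produce. A back-of-envelope sketch (my own numbers, not his):

```python
# More dynamic range spread over the same number of code values means bigger
# luminance jumps per step, which reads as banding in skies and gradients.
# Doubling the bit depth doesn't double the steps -- it squares them.
for bits in (8, 10, 12):
    steps = 2 ** bits  # distinct code values per channel
    print(f"{bits}-bit: {steps} code values per channel")
```

Going from 8-bit to 10-bit quadruples the number of steps available per channel, which is why bit depth has to rise in lockstep with dynamic range.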
Besides considerations relating to image quality, other issues pertain to the physical cabling that carries the signals. As of now, a single 6G-SDI cable is unable to transport a 4k video signal running at 60 frames per second at 12-bit in 4:4:4; even two of them can’t do it. To bandage the situation, more cables would need to be added to the pipeline, something that SMPTE board member Bill Miller considers unsustainable.
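The cabling shortfall is straightforward to estimate. A rough sketch of the uncompressed data rate (my own arithmetic, ignoring blanking intervals and SDI link overhead):

```python
def raw_rate_gbps(width, height, fps, bit_depth, samples_per_pixel=3):
    """Uncompressed active-picture data rate; ignores blanking and link overhead."""
    return width * height * fps * bit_depth * samples_per_pixel / 1e9

# 4:4:4 means three full-resolution samples per pixel (no chroma subsampling)
rate = raw_rate_gbps(3840, 2160, 60, 12)
print(f"4k @ 60p, 12-bit 4:4:4: {rate:.1f} Gb/s")
print("one 6G-SDI link carries ~6 Gb/s; two carry ~12 Gb/s -- still short")
```

At roughly 18 Gb/s before overhead, the signal outruns not just one 6G-SDI link but a pair of them, which is Miller's point about the pipeline.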
During his presentation at the SMPTE meeting, Miller delved into further detail to clarify some of the points in the report, stating that we need new SDI technology capable of greater data throughput, or an improvement in image compression technology. Higher framerates are necessary, he said, and he illustrated this visually with a high-motion subject shot at 100 frames per second and the same subject shot at 50 frames per second.
The 100 frames per second image is crisp; text in the frame is even legible, since the higher framerate allows a faster shutter speed. The 50 frames per second image looks like the motion-blurred footage we’re accustomed to seeing in a movie clip. Miller’s question: if we’re not going to end up with a crisper image after we increase the resolution, what’s the point?
More frames means more data, and with 8k cameras shooting up to 72 gigabits per second, data management quickly becomes a serious challenge. The race is on for countries like Japan, which wants to broadcast the 2020 Olympics in 8k.
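That 72-gigabit figure checks out if we assume uncompressed 12-bit 4:4:4 capture at 60 frames per second (my assumption about what lies behind the number):

```python
# 8k frame: 7,680 x 4,320 pixels, 3 samples per pixel at 12 bits, 60 fps
bits_per_second = 7680 * 4320 * 3 * 12 * 60
print(f"{bits_per_second / 1e9:.1f} Gb/s")                     # just under 72 Gb/s
print(f"{bits_per_second / 8 * 3600 / 1e12:.1f} TB per hour")  # storage per hour of footage
```

An hour of such footage runs to tens of terabytes before compression, which puts the broadcast ambition in perspective.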
As insurmountable as these issues seem, it’s prudent to consider them now to establish solid standards before hardware is developed and built. It will help ensure that technology is properly implemented and systems are integrated with a complete production pipeline that ends with a greatly enhanced viewing experience.
[UHD images by Flickr member samsungtomorrow]
Tristan Kneschke is a freelance editor and colorist who operates Exit Editorial based in New York City. He has worked with a long list of clients including Victoria’s Secret, Pepsi, Amazon, Nissan, Colgate and Google, and has also enjoyed working with such artists as Beach House, RJD2 and Newvillager.
18 Comments
It's true indeed that image technology moves at a much faster pace than our storage or even our data infrastructure can keep up with.
December 5, 2013 at 8:01PM
I personally see 1080p - 2K being a 16mm film equivalent, with 4K being 35mm and 8K being 70mm. I still think that these are ONLY going to end up being acquisition formats in the long run and that vector-based video codecs will end up being the standard distribution format, as they'll be the only streamlined format that small and large theater venues could afford. Vector-based codecs are the only thing I can imagine becoming a true "standard" of anything, as they would future-proof footage to a degree. Not to mention vector codecs would be a fantastic way to digitally archive films for future generations.
December 5, 2013 at 8:04PM
Once a working codec is developed, that is.
December 5, 2013 at 8:06PM
I wonder how many people with 4k/8k cameras will ever have anything they produce projected in a theater, because UHD resolutions will only make sense on a screen 20 ft or bigger. I have seen an 84" 4k screen played next to a 90" OLED 1080p from 10-12 feet away, and I couldn't tell the difference.
I was more impressed by the size of the screens than by their resolution differences.
I'm sure there will be a 120" 8k screen out but who in New York City can fit one in their living room? <5% of the population?
December 12, 2013 at 1:44PM
8K may not be a consumer product for a long time. In terms of increments, 4K (UHD-1) is already here. The next step is 10-bit 4:2:2 and something closer to 48p or 60p for sports. Then higher DR. It's like when HD sets were first introduced: all they offered was 1080i or 720p with a very low contrast ratio. It gradually improved to the current 1080p, with the DR improving 5-10 times on the better (plasma) models. But even 1080i was a huge step up from 480i.
PS. OLED screens have a higher DR - and a much higher price point - than LCDs. The problem is that OLEDs are limited to Full HD at the moment due to production constraints/yields. Once the costs come down within a year or two, consumers will be able to get 4K HDR OLEDs without feeling gouged. By then the codecs will improve (Daala is promising to go beta in a year), as will the processors and broadband speeds.
December 5, 2013 at 9:19PM
Nice article, sounds like it's just a matter of time. Maybe not as quick as some would hope, but the good news is, it's going forward. Imho the best thing for me regarding all this 4k mumbo jumbo is that we're now getting higher-bit images, more latitude, DR and resolution, with new cameras and recorders popping up. Hopefully manufacturers will phase out 8-bit altogether as more 4k equipment emerges. With that alone I'm happy to welcome 4k sooner than later. Thanks
December 5, 2013 at 9:58PM
4K, 8K. I'm impressed with how fast all this is coming. I didn't think we would be this far along as far as consumer products go. This isn't the 3D fad. I'm actually worried. As a consumer I don't want to invest in HD movies anymore. Will upscaling work as well as it did for DVDs in Blu-ray players?
December 5, 2013 at 10:25PM
I predict that this could spell the end of VHS, or was I a little behind the times already? I'm only half kidding, in that some of my fondest TV shows were shot on interlaced PAL, but better resolution, etc., seems to result in the perception that many things that came before are not "good enough" to view, and I guess analogue libraries will hang in limbo.
December 6, 2013 at 12:12AM
No-one cares what SMPTE says; the 4K evolution will happen anyway.
December 6, 2013 at 1:10AM
Will you be able to tell the difference between 4K and 8K on normal viewing distances?
I see 8K being used in Cinema, but not for TV. 8K movies at the theatre, 4K at home.
But first we need to be able to distribute that kind of data. The internet is already one of the biggest consumers of electricity; what will be the impact of 4K and 8K?
December 6, 2013 at 1:14AM
I'm not all that concerned about 4K or even 8K TVs at the moment, since the only screens I have seen that do UHD 4K look only slightly better than my 720p rear-projection TV at home. In cinemas I have yet to see pixelation on screens smaller than IMAX when projecting 2K. So the home-cinema projector I'm looking at buying is a 1080p version.
Heck, if even multimillion (soon billion?) dollar productions don't bother with 4K when mastering, then what are we supposed to see on all of those pixels?
What entices me, however, is the promise of 10-bit, even 12-bit color. Even 120fps works nicely with the HFR 3D that Cameron is supposed to shoot his Avatar sequels with (60fps for each eye). Although I'll probably stick to 24fps playback unless it's supposed to be shown in anything else. And I'm throwing my hands up in much rejoicing ("yeeeeey!") when I read that they will finally ditch interlacing completely.
But what are we looking at here, if I want a screen that does all of that? A standard that is just about to be ironed out. Development of both screens and the interface to feed them the data will take a few years, and even more for decent versions to get down to consumer prices... I know I'll probably end up getting a >1080p screen eventually. But as it stands, I'm not holding my breath or declaring "normal" Full HD dead anytime soon.
Also, while we may get 4K screens with 120 fps and 10-bit color... TV broadcasting is only just now inching its way to highly compressed 720p quality... That's nine times the number of pixels per frame... at 4 times the framerate... and up to 16 times the color information per channel... uhm... that's 995 328 000 pixels per second at up to 12 bits per channel, 36 bits per pixel; 35 831 808 000 bits per second; 4 478 976 000 bytes... so... roughly 4.48 gigabytes per second of uncompressed, sustained data transfer... just for the image...
Normal HDTV at 720p, for comparison, is 1 327 104 000 bits per second (assuming 720p at 60 fps), or 165 888 000 bytes per second, or about 166 megabytes per second... that's a 27x increase in uncompressed throughput... uhm... they will get there... but not anytime soon.
December 6, 2013 at 4:04AM
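The arithmetic in the comment above is sound and can be reproduced in a few lines (assuming, as the commenter does, 720p at 60 fps with 8 bits per channel for broadcast HD):

```python
# 4k UHD at 120 fps, 12 bits x 3 channels, vs. 720p at 60 fps, 8 bits x 3 channels
uhd_bits_per_sec = 3840 * 2160 * 120 * 36   # 35,831,808,000 bits/s
hd_bits_per_sec = 1280 * 720 * 60 * 24      #  1,327,104,000 bits/s
print(f"{uhd_bits_per_sec / 8 / 1e9:.2f} GB/s uncompressed")  # 4.48 GB/s
print(f"{uhd_bits_per_sec / hd_bits_per_sec:.0f}x increase")  # 27x
```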
A big problem for UHD is motion blur. At 24fps with a 180 degree shutter (which most people like the look of more than any other option for most things anyway) even slight motion really cuts down on the advantage of 4K displays. I recently delivered a 4K project with a few 1080 shots cut in. It was pretty much impossible to tell which shots were which except when the Epic shots were locked off. Then you could see the amazing detail. So if the future of 4K/8K means we're gonna shoot everything at a new base framerate of 120 fps just so we can appreciate the amazing detail, will we have ruined the aesthetic of 24fps?
December 6, 2013 at 9:22AM
FWIW, there are two issues bandied about here - one is the data that comes out of the camera; the other is the data delivered to the consumer. The former can probably be solved by multiple cables/ports. Intel is doing it with dual-channel Thunderbolt 2.0, which gives it 20 Gbps. HDMI 2.0 is also in the ballpark at 18 Gbps. Considering that a current 4K camera like the Z100 compresses down to 600 Mbps, there's plenty of headroom for high-FPS 4K in these cables. The final delivery of content to the home can easily be offered by cable.
Here's a link on the new (and old/present) cable standards - http://arstechnica.com/information-technology/2013/12/why-comcast-and-ot...
Here's one on HDMI and T-bolt http://macthunderboltcables.com/hdmi-2-0-announced-brings-18-gbps-bandwi...
December 7, 2013 at 10:17PM
It seems like the Panasonic GH4 will be coming in under $3K. 4:2:2 10-bit HDMI out, or 200 Mbps All-I or 100 Mbps IPB internally.
December 9, 2013 at 12:30PM
As it stands now, it likely won't have 4K.
December 9, 2013 at 3:58PM
While 4K through 12K are all great from a production acquisition and archival point of view, recent shoppers at Sony retail stores watching a 4K screen from 10 feet away said “Mewww, it’s nice but not that much better than the Sony HD we have at home.”
History holds a lesson: when we went from LPs/vinyl to CDs, the quality difference to the average consumer was a giant leap forward, and the recording and retail industry forced the transition by slowly phasing LPs out of stores.
Newer technologies after the CD? DAT (Digital Audio Tape), the Digital Compact Cassette (DCC), Sony’s MiniDisc (MD) and Super Audio CD (SACD) all failed. Audiophiles and perfect-pitched musicians aside, the average consumer could not hear a difference between the newer formats and their freshly-renewed CD collection, and was not about to start over again as they had just done with their vinyl collection. Further, there was no “force” at the record companies and record stores to replace CDs with the newer tech.
In 1992, High Def could be compressed into a Standard Def 6MHz ‘TV channel.’ But HD’s implementation also needed a force: it took the federal government’s mandate that all broadcasters transmit digitally (SD or HD) to usher HD into the home en masse, circa 2009. Chicken or egg.
The 2K or 4K signal to the home is only slightly better than HD, not significantly better. With no FCC mandate this time around forcing broadcasters to unilaterally send 3D, 2K or 4K, the marketing hype may fall on deaf consumer ears.
December 17, 2013 at 4:17PM
... Before we send 4K to the home, I'd like to see a first priority of moving all HD signals toward less compression and fewer artifacts - macro-blocking and fades/dissolves at the top of the list. Water splashing around during the Olympics looks like bad YouTube, and all of us pros know it doesn't look like that in the Production Control Room, Master Control or editorial post bays. :)
December 17, 2013 at 4:20PM
Agree...exactly, hence my comment below.
December 17, 2013 at 4:23PM