High-definition video used to be defined by two standards, 720p and 1080p, and these remain the primary standards in use today. 1080p offers higher resolution, but as with everything digital there is a trade-off: the file sizes are bigger.
More resolution means more data, and more data means more bandwidth.
At a resolution of 3840 x 2160 pixels, a 4K display packs four times as many pixels as a 1080p display; the name comes from the frame's nearly 4,000-pixel width. At 8.3 million pixels, the image is sharp. But is it sharp enough to justify both the additional bandwidth and the cost of upgrading video infrastructure?
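The pixel arithmetic is easy to check. The short Python sketch below is purely illustrative, using the standard 1080p and 4K UHD frame sizes:

```python
# Back-of-envelope pixel math for 1080p vs. 4K UHD (standard frame sizes).
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "4K UHD": (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in RESOLUTIONS.items()}

for name, count in pixels.items():
    print(f"{name:7s}: {count:,} pixels")
# 1080p  : 2,073,600 pixels
# 4K UHD : 8,294,400 pixels

print(f"Ratio: {pixels['4K UHD'] / pixels['1080p']:.0f}x")  # 4x
```

Four times the pixels means four times the raw image data to move around, which brings us back to whether the upgrade is worth it.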
For huge commercial displays, certainly. But in 2013, CNET reporter Geoffrey Morrison wrote that the idea of having a 4K display in the home is “stupid.”
Morrison laid out a lengthy, technical argument: because the human eye has "finite resolution," viewers won't pick up the subtle differences between 1080p and 4K at reasonable screen sizes and viewing distances. He's got a point.
But...is it STILL stupid?
Since then, 4K has made its way into the home, where economies of scale have started to drive prices down to commodity levels, and Morrison has changed his tune. Like any good tech journalist, he's willing to admit that the ever-morphing videoscape changed the channel on him.
“With prices dropping and picture quality improving, 4K TVs are finally worth considering for mainstream TV buyers,” he wrote in a 2015 article. “Although they didn’t make sense in the past, 4K TVs aren’t stupid anymore.”
That was two years ago, and Morrison now focuses on HDR (High Dynamic Range), a technology only available in 4K sets and another reason why the industry is moving towards 4K.
It takes time for economies of scale in video hardware to be felt in the marketplace: much as in the semiconductor industry, factories must be built that can manufacture the new displays in large quantities before they can be assembled, shipped, and delivered en masse to your local retailer.
Morrison again: “Modern TVs are made from huge sheets of ‘motherglass.’ From this big piece, companies slice up smaller pieces to make televisions. It’s easier (read: cheaper) to make a big piece and cut it into smaller TVs.”
It’s odd to think of a factory making a huge piece of glass, then slicing it into smaller chunks to make TV sets, but that’s how it works. This is why you can get a slew of big-screen 4K televisions for less than $1,000 nowadays.
As 4K slowly becomes the new standard, it’s forcing everything in the video supply chain (content, hardware, delivery methods, and HDR) to follow suit. As streaming increasingly supplants optical discs as the fountainhead for the video cornucopia, how much bandwidth is this new behemoth going to gobble?
Streaming the beast
Live-streaming HD is relatively bandwidth-intensive, but 4K is another beast altogether. According to Wikipedia, “High Efficiency Video Coding [aka H.265] should allow the streaming of content with a 4K resolution with a bandwidth of between 20 to 30 Mbps.” That’s using an optimal codec under ideal circumstances, and a figure of 50-100 Mbps might be more accurate, all factors considered.
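To put those figures in household terms, a simple unit conversion shows how much data a 4K stream consumes per hour. The sketch below is back-of-envelope arithmetic in Python, using the bitrates quoted above rather than any measured values:

```python
# Convert the sustained bitrates quoted above (Mbps) into data consumed
# per hour of streaming. Pure unit conversion, not a measurement.

def gigabytes_per_hour(mbps: float) -> float:
    """Gigabytes consumed per hour at a sustained bitrate of `mbps` megabits/s."""
    bits_per_hour = mbps * 1_000_000 * 3600
    return bits_per_hour / 8 / 1_000_000_000  # bits -> bytes -> gigabytes

for mbps in (20, 30, 50, 100):
    print(f"{mbps:>3} Mbps ~ {gigabytes_per_hour(mbps):5.1f} GB per hour")
# 20 Mbps ~ 9.0 GB/h, 30 ~ 13.5, 50 ~ 22.5, 100 ~ 45.0
```

Even at the optimistic 20-30 Mbps end of the range, a two-hour 4K film works out to roughly 18-27 GB of traffic.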
The codec, the standard used for coding and decoding video, is another factor in determining the requisite bandwidth. H.265 is now widely used, but it faces competition from Google's VP9.
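For readers who want to see the difference in practice, the sketch below encodes the same clip with each codec so the resulting file sizes and bitrates can be compared. It assumes an ffmpeg build with the libx265 and libvpx-vp9 encoders installed; the source file name and quality settings are placeholders, not recommendations.

```python
# Encode one clip with H.265 (libx265) and VP9 (libvpx-vp9) for a rough
# bitrate comparison. Assumes ffmpeg is installed with both encoders;
# "clip_2160p.mp4" is a hypothetical 4K source file.
import subprocess

SOURCE = "clip_2160p.mp4"

# H.265/HEVC in constant-quality mode (lower CRF = higher quality)
subprocess.run(
    ["ffmpeg", "-i", SOURCE, "-c:v", "libx265", "-crf", "28", "out_h265.mp4"],
    check=True,
)

# VP9 in constant-quality mode ("-b:v 0" lets the CRF value drive the encode)
subprocess.run(
    ["ffmpeg", "-i", SOURCE, "-c:v", "libvpx-vp9", "-crf", "31", "-b:v", "0", "out_vp9.webm"],
    check=True,
)
```

Comparing the sizes of the two output files gives a feel for how much the choice of codec, on its own, moves the bandwidth needle.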
Increasingly, consumers prefer streaming video. Cinephiles buy optical discs for extras like audio commentaries and behind-the-scenes features, but the Korean drama/House of Cards crowd cares only about subtitles in their preferred language. Streaming is the way forward, even though 4K video on disc will be introduced and will find some level of adoption.
This article first appeared in Telecom Asia Big Video Insights June 2017 Edition