4K in cinema: where did we come from, where are we now, and where are we going?

November 14, 2018

In 2018, HDR, or High Dynamic Range, has been the talk of the town when it comes to image format and image quality in cinema. With all the attention and buzz of recent months, it’s easy to forget that discussions of new technology like this go through an evolution of their own. Most of us probably still remember the attention for (higher) frame rates that came with the 2016 release of Billy Lynn’s Long Halftime Walk, and even going back to 2011, when James Cameron demonstrated 3D HFR footage. But who still remembers the curve that 3D technology (the days of “full-resolution triple flash”) went through?


In this article we zoom in on the evolution and state of play of another image quality metric: resolution. In what could be called a historical battle between 2K and 4K, resolution has gone through a lifecycle of its own in cinema.


The early days

On June 18, 1999, DLP Cinema projector technology was used for the first digital cinema screening, that of Star Wars Episode I: The Phantom Menace. The projectors had a native resolution of 1280×1024, and the 2.35:1 aspect ratio was achieved through an anamorphic projection lens. That means that when digital cinema was born and a whole industry took its first steps towards a new future, we were not even using 2K resolution. More than that: we were stretching the pixels horizontally to fit the scope format. In the context of those pioneer days, this was perfectly understandable: other, more important issues had to be solved back then, when the whole concept of digital was new. And the 1280×1024 resolution was not at all bad given the display technology standards of the day. In 1999, more than 90% of TV sets sold in the US used CRT technology. [2]
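
As a back-of-the-envelope illustration (ours, not a figure from the article), the short sketch below shows how much horizontal stretch that anamorphic lens had to add:

    # Back-of-the-envelope: how much horizontal stretch did the anamorphic lens add?
    native_w, native_h = 1280, 1024          # native panel of the first DLP Cinema projectors
    native_ar = native_w / native_h          # 1.25:1 native aspect ratio
    scope_ar = 2.35                          # 'scope' presentation aspect ratio

    print(f"native aspect ratio: {native_ar:.2f}:1")
    print(f"horizontal stretch:  {scope_ar / native_ar:.2f}x")  # ~1.88x: each pixel shown nearly twice as wide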


When DCI (Digital Cinema Initiatives, LLC) was formed in 2002, this organization put forward the standards that would define digital cinema as we know it today. Two resolutions were included in the standard: 2K (2048×1080) and 4K (4096×2160). This is mentioned, among other places, in the paragraph on Projection Fundamental Requirements: “The projector is required to display either a native resolution of 4096×2160 or 2048×1080.” [3]
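
A quick calculation makes the gap between the two containers tangible:

    # Pixel counts of the two DCI container resolutions
    pixels_2k = 2048 * 1080       # 2,211,840 pixels (~2.2 MP)
    pixels_4k = 4096 * 2160       # 8,847,360 pixels (~8.8 MP)
    print(pixels_4k / pixels_2k)  # 4.0 -- 4K carries exactly four times the pixels of 2K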


Since that standardization, the cinema market has reached almost 100% digitization. That so-called ‘first wave’ has left the market with most screens and projectors being 2K. No official numbers exist, but an 80/20 split between 2K and 4K is likely. [4] Several drivers have contributed to that. Cost was a first one: 4K technology requires more capacity and bandwidth, and hence more investment, than 2K. This applies on the one hand to content creation: cameras, rendering computers, storage, transport, … were in the early days significantly more expensive in 4K than in 2K. But it also applies on the exhibition side: 4K projectors and media servers carried a price premium when they were introduced to the market.


Another effect was the availability of 4K content. Few titles were released in 4K in the early days of digital cinema; in 2012, for example, only 11 movies were distributed to cinemas in 4K. This meant that the incentive for the industry (and exhibitors specifically) to move to 4K technology was also small.


Today, more than 15 years after the formation of DCI, cinemas are looking into renewing their projection equipment. Zooming in on resolution, what is the outlook for the second wave that lies ahead of us?

External forces

The cinema industry has consistently managed to keep the viewing experience ahead of home entertainment. Today, however, we are on the verge of being leapfrogged by consumer technology. Some graphs will make this clear:

Prices for 4K TVs are decreasing fast, to a level where they are lower than (for larger screens) or competitive with those of Full HD (1920×1080, the consumer equivalent of 2K) sets. This is leading to fast adoption of 4K TV technology in the home: in 2016, more than 50% of all TVs sold in North America were already 4K. For TVs over 60 inches, UHD accounted for 96 percent of shipments; this year it will be 99 percent.


Note that 4K in the home is actually 3840×2160, just below the cinema specification; it is also called ‘4K UHD’ to mark the difference. 4K UHD was commercialized six years ago, in 2012. In 2000 it was HD (High Definition, roughly a million pixels), and in 2006 Full HD (2 million pixels): every six years there has been a generation shift. The message is clear: after a 10-year run, Full HD in the home will soon be obsolete at the high end, just as HD went extinct in 2009. [1]
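
For reference, the pixel counts behind those generation labels, next to the DCI cinema container (a minimal sketch, assuming the common 1280×720 panel for the HD generation):

    # Consumer display generations next to the DCI cinema container
    formats = {
        "HD (720p)": (1280, 720),    # ~0.9 MP, the 'million pixels' generation
        "Full HD":   (1920, 1080),   # ~2.1 MP
        "4K UHD":    (3840, 2160),   # ~8.3 MP
        "DCI 4K":    (4096, 2160),   # ~8.8 MP, the cinema specification
    }
    for name, (w, h) in formats.items():
        print(f"{name:10s} {w} x {h} = {w * h / 1e6:.1f} MP")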


4K movie releases for the home are picking up fast. The graph above only considers 4K UHD Blu-ray releases (not streamed content) and shows that the number of titles grew almost tenfold since 2012. The growth in 4K cinema releases is also clear, though not to the same extent as releases for the home. Still, in just four years’ time, the number of titles more than tripled.
As the graph on the left shows, Netflix alone has more than 120 titles available in 4K, of which 18% are movies.

The second wave

Going into the second wave of digital projection in cinema, the stars are aligned differently than they were ten to fifteen years ago. Moviegoers find 4K normal now. But can it also become the new normal in cinema? Will patrons even accept and understand it when their preferred movie medium lags behind their TV on a certain specification?


Let’s look at the first driver: the availability of content for the cinema. As more titles are distributed and promoted in their 4K format, exhibitors will be incentivized to embrace and adopt 4K projection technology. Looking at the availability of titles for the home mentioned above, you see that content creators are organizing their workflows around 4K. This does not mean that there will be a one-to-one spillover to cinema, nor that 4K will reach 100% market share just like that. But there is an undeniable trend emerging.

This trend should be combined with another one: the dropping cost of computing power and network bandwidth. Stimulated among other things by overarching trends like cloud computing, the cost of rendering and processing 4K has dropped to levels close to those of 2K. And the complexity and cost of transferring 4K content (which is larger in size than lower-resolution content) are no longer limited by the cost of internal and worldwide network capacity: local networks inside a cinema, postproduction facility or studio lot are stepping up from Mbps to Gbps (gigabit per second) capacity.
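
To make the bandwidth point concrete, here is a rough back-of-the-envelope sketch; the package size is a hypothetical figure chosen for illustration, not a number from this article:

    # Transfer time for a 4K feature at different network speeds.
    # The package size is an assumption for the sake of the example.
    dcp_size_gb = 500                    # assumed size of a 4K feature-length DCP
    size_bits = dcp_size_gb * 8e9        # gigabytes -> bits

    for label, rate_bps in [("100 Mbps", 100e6), ("1 Gbps", 1e9), ("10 Gbps", 10e9)]:
        hours = size_bits / rate_bps / 3600
        print(f"{label:>8}: {hours:5.1f} h")  # ~11.1 h, ~1.1 h, ~0.1 h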


The second driver for exhibitors will be the price difference between 2K projectors and servers and their 4K alternatives. Initially this premium was higher than 10%, but it is decreasing as well, driven by standard price erosion on the one hand and by the introduction of new technology on the other. Miniaturization and the availability of more powerful processing chips will soon make it possible to offer next-generation 4K products at the price point of first-generation 2K products.


Conclusion

Historically, 2K always had the edge over 4K in cinema. The first digitization wave left the market at an 80/20 split. Since then, however, forces in the consumer space have made 4K the new normal. The disappearing cost and complexity hurdles in cinema could make it the new normal for the second wave as well.


What’s your prognosis? Do you think exhibitors can afford not to choose 4K when making their technology choices for the next 10 to 15 years?


[1] https://www.zdnet.com/article/the-year-of-8k-is-here/
[2] https://www.nytimes.com/1999/01/14/technology/flat-tv-s-still-for-the-fat-wallet-set-improve-as-prices-fall.html 
[3] http://dcimovies.com/specification/DCI_DCSS_v12_with_errata_2012-1010.pdf 
[4] https://www.screendaily.com/features/the-resolution-war-is-cinema-falling-behind-home-entertainment-on-innovation/5124023.article