MediaKind: Are we HDR ready?

October 19, 2018

By Matthew Goldman, SVP Technology, MediaKind

The media industry has long championed the merits of Ultra-High Definition (UHD) content with High Dynamic Range (HDR) for its delivery of the all-important immersive, next generation viewing experience that consumers are demanding. According to a May 2018 Futuresource Consulting report, more than 100 million 4K UHD televisions will be sold worldwide this year, of which around 60 percent will include HDR.

The more people see of HDR, the more they want it – and it’s a view held by broadcasters, content owners, service providers and consumers alike. However, while there is a desire to provide, deliver and receive content in this format, there are still a number of technical challenges to overcome before we see widespread adoption.

For clarity, an HDR system refers to the combination of an HDR transfer function, Wide Color Gamut (WCG) and 10-bit sample depth (quantization).
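To make the "10-bit sample depth" part concrete, here is a minimal sketch of narrow-range ("video range") 10-bit quantization as specified in ITU-R BT.2100, which maps a normalized non-linear signal E′ in [0, 1] onto integer code values, with black at 64 and nominal peak at 940:

```python
def quantize_10bit_narrow(e_prime: float) -> int:
    """Map a normalized non-linear signal E' in [0, 1] to a 10-bit
    narrow-range code value per ITU-R BT.2100: black maps to code 64,
    nominal peak white to code 940."""
    e_prime = min(max(e_prime, 0.0), 1.0)  # clamp to the nominal range
    return round(876 * e_prime + 64)

print(quantize_10bit_narrow(0.0))  # 64  (black)
print(quantize_10bit_narrow(1.0))  # 940 (nominal peak)
```

The extra head- and foot-room codes outside 64–940 are reserved for overshoots; full-range coding (0–1023) also exists, but narrow range is the norm in broadcast workflows.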

The main HDR formats

While significant strides have been made this year, there are still too many HDR formats in existence. This has caused industry confusion about how to proceed and hampered deployment. They include:

HDR10

This was the first major format, initially championed by the Blu-ray Disc Association and the Consumer Technology Association. This format uses the perceptual quantization (PQ) transfer function along with static mastering display color volume metadata (SMPTE ST 2086 standard plus additions). It’s the most widely used HDR format for motion picture and TV episodic productions.
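The PQ curve that HDR10 relies on is fully specified in SMPTE ST 2084 and ITU-R BT.2100. As a minimal sketch, the PQ encoding (inverse EOTF) maps absolute luminance in cd/m² onto a normalized signal:

```python
# PQ (SMPTE ST 2084) constants, expressed as the exact rationals
# given in the standard.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_encode(luminance_nits: float) -> float:
    """Map absolute luminance (0..10,000 cd/m^2) to a PQ signal in [0, 1]."""
    y = min(max(luminance_nits / 10000.0, 0.0), 1.0)
    return ((C1 + C2 * y ** M1) / (1 + C3 * y ** M1)) ** M2

print(pq_encode(10000))  # 1.0: the 10,000-nit peak of the PQ range
print(pq_encode(100))    # ~0.508: SDR peak white sits near mid-signal
```

Because the curve is modeled on the perceptual response of the eye, it spends roughly half of the code range below 100 nits, which is exactly where SDR content lives.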

Dolby Vision

Dolby Vision uses the PQ HDR transfer function along with dynamic content-dependent metadata (SMPTE ST 2094-10) that can change on a frame-by-frame basis. Dolby Vision is claimed to have superior operating characteristics and to deliver a richer consumer experience versus the other HDR formats. However, some manufacturers have balked at the per device license fee, especially at the lower end of the consumer electronics market.

HDR10+

HDR10+, another PQ-based system with dynamic metadata (SMPTE ST 2094-40) was subsequently introduced by Samsung and has since garnered support from Panasonic. Content producers and distributors such as Amazon, Warner Brothers and 20th Century Fox have also shown their commitment to this format.

Hybrid Log-Gamma (HLG)

Jointly developed by the BBC in the UK and NHK in Japan, HLG is an alternative HDR transfer function that is also royalty free. HLG10 has emerged as an early favorite among the HDR formats for live TV production, partly because it does not use metadata.
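The HLG curve, also specified in ITU-R BT.2100, is what gives the format its "hybrid" name: the lower half is a conventional square-root (gamma-like) segment, so the signal remains broadly watchable on SDR displays, while the upper half is logarithmic to carry highlights. A minimal sketch of the HLG OETF:

```python
import math

# HLG (ITU-R BT.2100) OETF constants.
A = 0.17883277
B = 1 - 4 * A                  # 0.28466892
C = 0.5 - A * math.log(4 * A)  # 0.55991073

def hlg_oetf(e: float) -> float:
    """Map normalized scene-linear light E in [0, 1] to an HLG signal
    in [0, 1]. Below the crossover the curve is a square root; above it,
    logarithmic, compressing highlights into the top half of the signal."""
    if e <= 1.0 / 12.0:
        return math.sqrt(3.0 * e)
    return A * math.log(12.0 * e - B) + C

print(hlg_oetf(1 / 12))  # 0.5: the crossover between the two segments
print(hlg_oetf(1.0))     # ~1.0 at nominal peak
```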

PQ10

The baseline of all PQ-based HDR systems, PQ10 consists of the PQ transfer function, WCG and 10-bit sample depth, with no metadata.

SL-HDR1

This system, developed by Philips and Technicolor, uses a completely different workflow from the others. The original source HDR stream (using either PQ or HLG transfer function) is converted to standard dynamic range (SDR) plus “HDR reconstruction” metadata.
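The actual SL-HDR1 metadata is defined in ETSI TS 103 433 and is considerably richer than any toy model. Purely as a conceptual sketch of the workflow (the function names and the single-parameter "metadata" are illustrative assumptions), the decompose/reconstruct round trip looks like this: an SDR-only receiver displays the SDR signal and ignores the metadata, while an HDR-capable receiver inverts the mapping:

```python
def decompose(l_hdr: float, hdr_peak: float = 1000.0,
              sdr_peak: float = 100.0):
    """Split an HDR luminance value (cd/m^2) into an SDR value plus the
    metadata needed to reconstruct it. Here the 'metadata' is just the
    knee parameter k of a Reinhard-style curve chosen so the HDR peak
    lands on the SDR peak -- a stand-in for SL-HDR1's real parameters."""
    k = hdr_peak * sdr_peak / (hdr_peak - sdr_peak)
    return l_hdr / (1.0 + l_hdr / k), k

def reconstruct(l_sdr: float, k: float) -> float:
    """Invert the curve on an HDR-capable receiver."""
    return l_sdr / (1.0 - l_sdr / k)

l_sdr, k = decompose(500.0)
print(l_sdr)               # ~90.9: a 500-nit highlight fits the SDR range
print(reconstruct(l_sdr, k))  # ~500.0: the HDR value is recovered
```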

The situation today – HDR systems

The above are the six most common HDR systems being discussed in the industry. Most of these systems have been or will soon be specified by the major regional industry associations, including the ATSC, DVB, and SCTE/ISBE.

So here we are: six different HDR systems. While the Ultra HD Forum has worked diligently to create and publish guidelines on how to implement these systems, having six major options for HDR has created industry confusion nonetheless and, in fact, has delayed HDR deployments. Key questions remain:

  • Which format is the best to use for the use case being considered?
  • Which format will survive in the long run?
  • Which format produces the most compelling HDR quality? Or are they all relatively the same?

Live production and delivery challenges

The live mixing of content produced in SDR with content produced in the various HDR systems seriously complicates live production. While pre-produced 4K content is already increasing in availability, live content requires everything to work flawlessly in real time: there is no retransmission option, not to mention the obvious bandwidth challenge of producing and delivering 4K at its 3840×2160 resolution and 50 or 60 frames/s progressive. The situation is exacerbated when HDR is included. Especially challenging is mixing live content with pre-produced content that was mastered at different light levels.
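The scale of the bandwidth challenge is easy to quantify with a back-of-the-envelope calculation of uncompressed bit rates, assuming 10-bit 4:2:2 sampling as is typical for production interfaces:

```python
def uncompressed_gbps(width: int, height: int, fps: float,
                      bits_per_sample: int = 10,
                      samples_per_pixel: float = 2.0) -> float:
    """Raw video bit rate in Gbit/s. 4:2:2 chroma subsampling averages
    two samples per pixel (one luma plus half of each chroma channel)."""
    return width * height * fps * bits_per_sample * samples_per_pixel / 1e9

# 2160p60 needs four times the raw bandwidth of 1080p60:
print(round(uncompressed_gbps(3840, 2160, 60), 2))  # 9.95 Gbit/s
print(round(uncompressed_gbps(1920, 1080, 60), 2))  # 2.49 Gbit/s
```

Roughly 10 Gbit/s per uncompressed 2160p60 feed is why 12G-SDI infrastructure (or compressed mezzanine formats) is needed where a single 3G-SDI link once sufficed for 1080p60.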

How can we get ready for adoption?

There are a number of solutions entering the market to help solve some of these challenges. For example, our Encoding Live video processing platform, part of the Cygnus and Aquila family of solutions in the MediaKind Universe, enables live real-time 'color volume' mapping, converting SDR to HDR (inverse tone mapping), HDR to SDR, and between different light levels of HDR. Live tone mapping (inverse or forward) can be used to mix live and pre-produced content, enabling the industry to deliver it in UHD, end-to-end. Content producers can determine the native format, while broadcasters are able to perform conversion as necessary. Converting all content to a common 'house format' removes many of the barriers caused by having multiple formats.
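The mapping algorithms in commercial products are proprietary, but to illustrate what forward tone mapping involves, here is a deliberately simplified sketch that compresses HDR linear light into the SDR range with a Reinhard-style curve (the function name and the 1,000-nit source peak are assumptions for the example):

```python
def tone_map_hdr_to_sdr(l_hdr: float, hdr_peak: float = 1000.0,
                        sdr_peak: float = 100.0) -> float:
    """Compress HDR linear luminance (cd/m^2) into the SDR range using a
    Reinhard-style curve L / (1 + L/k), with k chosen so the HDR peak
    lands exactly on the SDR peak. Shadows pass through almost unchanged;
    highlights roll off smoothly instead of clipping."""
    k = hdr_peak * sdr_peak / (hdr_peak - sdr_peak)
    return l_hdr / (1.0 + l_hdr / k)

print(round(tone_map_hdr_to_sdr(1000.0), 1))  # 100.0: HDR peak -> SDR peak
print(round(tone_map_hdr_to_sdr(10.0), 1))    # 9.2: shadows nearly preserved
```

A production-grade mapper works in all three color dimensions (hence 'color volume' mapping rather than tone mapping alone) and adapts to the mastering light levels of the content, but the basic trade-off is the one shown: preserve intent at low luminance, compress gracefully at the top.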

Earlier this year, the Ultra HD Forum released its UHD Phase B Guidelines (1.0), which aim to help the industry address some of the remaining challenges, including the addition of dynamic HDR metadata systems and high frame rate (HFR) systems. In the near term, it is likely that both 4K (2160p50/60) and full HD (1080p50/60) with the more basic HDR systems will continue to dominate, as broadcasters, service providers and operators need to offer services that deliver the most immersive viewing experiences for end-users with the simplest implementation.

Yet HDR can be monetized successfully because the impact on the viewing experience is very noticeable, often more so than the difference between 1080p and 2160p (4K) spatial resolution. We are already seeing some broadcasters delivering content in 1080p50/60 HDR. 4K displays do a wonderful job of upconverting this to 2160p, and assuming the 4K display supports HDR, we can enable a 'wow' factor that in many consumer viewing environments is very close to native 4K HDR viewing. By greatly reducing the bandwidth requirements while offering richer, more lifelike picture quality than standard HD SDR, the industry can leverage 1080p HDR, a format that I dub "the best bang for the bit."

What next?

The next step should be to define a small number of 'universal' profiles and to agree on the best methodology to use. As more UHD/HDR services come to market, a more aligned approach will lead the industry towards best practices and recommendations. In doing so, we can finally draw closer to realizing the full potential of HDR and deliver next generation immersive viewing experiences for everyone, everywhere.