By John Mailhot, CTO of Infrastructure, Imagine Communications.
With its significant step up in picture quality, HD video was instrumental in helping broadcasters captivate and retain audiences during the late ‘90s and early ‘00s. Fast forward to today, and high dynamic range (HDR) is playing a similar role, offering visual quality that is superior even to a casual viewer’s eye.
But despite the introduction of HDR-capable televisions as early as 2016, the mass adoption of live HDR broadcasting has been slow to materialise. One significant factor in the hold-up was the Covid-19 pandemic. Amid lockdowns, the focus for broadcasters shifted toward adapting workflows to fully remote setups, effectively putting HDR on the back burner.
However, with the return of in-person operations, the industry has once again set its sights on delivering more impressive programming with HDR technology. And despite facing multiple hurdles, the progress toward this goal has been substantial.
Shedding some light
HDR broadcasting aims to deliver a very wide range of brightness to the viewer. In the real world, we encounter an enormous spectrum of brightness, extending from the darkest shadow in a corner to the overwhelming brilliance of the sun. In traditional “standard” dynamic range (SDR), utilised by the industry for the past seven decades, the output range of cameras was quite restricted: the variance between darkness and light was capped at approximately nine f-stops, which was also about the limit of the best CRT (and early flat-panel) TVs.
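To put those f-stop figures in concrete terms: each f-stop represents a doubling of light, so the contrast ratio spanned by n stops is 2^n. A quick sketch (the 14-stop figure below is purely illustrative, not a number from this article):

```python
# Each f-stop doubles the amount of light, so the contrast ratio
# (brightest:darkest) spanned by a number of stops is 2**stops.
def contrast_ratio(stops: int) -> int:
    """Contrast ratio covered by the given number of f-stops."""
    return 2 ** stops

print(contrast_ratio(9))   # SDR's roughly nine stops -> 512:1
print(contrast_ratio(14))  # an illustrative wider HDR range -> 16384:1
```

Nine stops works out to a contrast ratio of about 512:1, which helps explain why modern sensors and displays, capable of far more stops, quickly outgrew the SDR envelope.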
Modern image sensors, along with LED-lit and other active display technologies, offer a far more expansive range, which the HDR system is designed to support. However, this leap in brightness range introduced a significant problem: transporting the increased signal range through the existing television production and distribution infrastructure, particularly its ubiquitous 10-bit signal-processing pipelines.
As with many problems in the broadcast industry, two technologies were developed to tackle the same issue: the hybrid log-gamma (HLG) and perceptual quantization (PQ) formats. These formats are both documented in ITU-R BT.2100 but were created by two groups that approached the issue from different perspectives.
The HLG group concentrated on the scene light, focusing on the image’s brightness and its transformation into the representation on the wire. The team behind PQ took the reverse approach: they began with a reference display and examined how much detail the human visual system can perceive at varying brightness levels. Both formats are widely utilised, with HLG typically being employed during shooting and production and PQ more commonly used during distribution. Distribution systems such as HDR10 and Dolby Vision are based on the PQ standard.
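The difference between the two philosophies shows up directly in their transfer functions as published in ITU-R BT.2100: PQ’s EOTF maps a signal value to absolute luminance on a 10,000 cd/m² reference display, while HLG’s OETF maps relative scene light to a signal. A minimal Python sketch, with the constants taken from BT.2100 (the function names are ours, for illustration only):

```python
import math

# PQ (perceptual quantization) EOTF per ITU-R BT.2100: maps a
# non-linear signal value in [0, 1] to displayed luminance on a
# 0-10,000 cd/m^2 reference display (absolute, display-referred).
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal: float) -> float:
    """Luminance in cd/m^2 for a normalised PQ signal value."""
    e = signal ** (1 / M2)
    y = max(e - C1, 0.0) / (C2 - C3 * e)
    return 10000.0 * y ** (1 / M1)

# HLG OETF per ITU-R BT.2100: maps normalised scene light in [0, 1]
# to a signal value (relative, scene-referred). Square-root law in
# the dark region, logarithmic above 1/12 -- hence "hybrid log-gamma".
A = 0.17883277
B = 1 - 4 * A
C = 0.5 - A * math.log(4 * A)

def hlg_oetf(scene_light: float) -> float:
    """Non-linear HLG signal value for normalised scene light."""
    if scene_light <= 1 / 12:
        return math.sqrt(3 * scene_light)
    return A * math.log(12 * scene_light - B) + C

print(round(pq_eotf(1.0)))         # peak signal -> 10000 cd/m^2
print(round(hlg_oetf(1 / 12), 3))  # 0.5 at the branch crossover
```

Note how PQ’s output is an absolute luminance in cd/m², whereas HLG’s output is a unitless signal, which is why HLG adapts naturally to displays of different peak brightness while PQ targets a known reference display.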
Final hurdles
Another challenge on the road to widespread live HDR production has been the absence of standardised workflows that ensure consistent results, particularly for shooting an event that must deliver both HDR and SDR versions to the client, and that can be replicated with ease across different shows and live events. To better grasp the issue, it’s useful to again reflect on the switch to HD.
When the industry first began the migration from SD, two separate trucks were used: one for each format. In a similar vein, the initial test HDR productions required separate trucks, crews, and cameras. In both cases, it quickly became apparent that for economic reasons, broadcasters needed to consolidate their operations into a one-truck setup that would produce both HDR and SDR versions. If executed well, the SDR version would also offer better image quality than it previously did. This approach is known as a single-master workflow, and every broadcaster recognises it as their ultimate goal.
Over the past few years, one of the more significant advancements in HDR has been the publication of a single-master workflow by NBCUniversal, which they utilise for high-profile events. The set of documents for this workflow is freely available to the industry.
A final hurdle in live HDR production involves delivering content seamlessly to viewers. While a majority of TVs sold in the past few years are both UHD and HDR-compatible, terrestrial television delivery of HDR (via ATSC 3.0 and DVB-T2) is still in the trial phase. For cable and satellite operators, HDR content can be delivered to consumers’ homes, but it requires specialised set-top boxes and enough consumer interest to offer them as a service.
Over-the-top (OTT) delivery services like Amazon Prime and Apple TV+ are in the best position to deliver HDR content. This is due to their ability to separately deliver HDR and SDR signals via the internet, without incurring significant additional costs. Conversely, satellite and cable operators would need to allocate separate transponder bandwidth for those two signals. In addition, OTT services can offer VOD-based HDR content for consumers to select from.
Ready for showtime
With the advances that have taken place since the pandemic, the 2024 Paris Summer Games have the potential to be a watershed moment for live HDR productions. As broadcasters strive to deliver more modern and interesting content, the upcoming games represent a crucial opportunity to demonstrate the capabilities of this technology.
UHD and HDR signals for certain events will be available from the host, and each rights-holder is considering how to integrate HDR and UHD into its operations. UHD and HDR features are readily available in current-generation video processing equipment. In addition, most cameras sold in the last two or three years come equipped with an HDR feature, which can be easily activated by purchasing a licence. Everything is in place for the Paris games to mark a pivotal point in HDR’s journey, potentially sparking consumer demand for more HDR content delivery in the future.