HDTV Magazine

HDR: What’s It All About?

At ISE 2017, we heard talk about “HDR compatibility” for a wide variety of products. It’s fast becoming a buzzword in the AV community (even though most people don’t really understand what the term involves) and now, it appears, the marketing guys are once again getting out their messages ahead of common sense.

High dynamic range (HDR) is indeed a truly significant development in visual imaging. Consider that the human eye has a dynamic contrast ratio approaching 1,000,000:1, but the average LCD display is limited to about 300:1 contrast, and you can see where the ability to reproduce a wider range of tonal values electronically is a game-changer.

There are several different ways to achieve high dynamic range. A conventional standard dynamic range (SDR) camera might be able to capture 9 to 11 f-stops of light, and those can be reproduced by that 300:1 LCD display we just mentioned. Or, we can use an organic light-emitting diode (OLED) display, with its much lower ‘black’ levels, to reproduce 11 luminance steps from black to 100% white.

HDR changes the equation, as some HDR-equipped cameras can capture a dynamic range of as many as 22 stops of light! It stands to reason that we’ll need more horsepower at the bright end, so we’re now seeing Ultra HDTVs and 4K displays coming to market with a new backlight technology – quantum dots (QD).
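Since each f-stop doubles the captured light, the contrast range implied by a stop count grows exponentially; the jump from 10 stops to 22 is enormous. A quick back-of-the-envelope sketch (Python, purely for illustration):

```python
def stops_to_contrast(stops: float) -> float:
    """Each photographic stop doubles the light, so N stops of
    dynamic range correspond to a contrast ratio of 2**N : 1."""
    return 2.0 ** stops

# ~10 stops (SDR capture) is roughly a 1,000:1 range, while
# ~20 stops approaches the 1,000,000:1 figure quoted for the eye.
print(stops_to_contrast(10))   # 1024.0
print(stops_to_contrast(20))   # 1048576.0
```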

A Samsung 65-inch Ultra HDTV with HDR capabilities at CES 2017

These tiny particles of semiconductor compounds emit intense, saturated color light when stimulated by photons. To achieve red, green, and blue (RGB) imaging, a QD-equipped display uses a backlight of blue light-emitting diodes (LEDs) to excite a special optical film containing red and green quantum dots; the blue LEDs themselves also provide the blue for color mixing.

The result? Televisions and monitors that reproduce a peak white luminance of 2,000 nits and a whole bunch of tonal values below that. But that peak value isn’t meant for a full screen of diffuse white: Rather, it’s reserved for specular highlights. Think of how the sun dances and reflects off moving water, or how intense sunlight appears through a window when standing in a dark room. With HDR, you see everything from the deep shadows to intense beams of light reflecting off a water glass.

You’re probably wondering what, exactly, all of this has to do with interfacing signals. Consider this: To reproduce the additional steps of luminance in an HDR image, we will need far more than the usual 8 bits per color that have become standard in everything from Blu-ray discs to cable television.

No, we need more bits – perhaps 10 bits per color, and maybe even 12, to correctly reproduce an HDR signal. And as you’ll know if you’ve been paying attention to previous posts, those additional bits increase the payload traveling through our interfaces and over AV/IT connections.
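Those extra bits buy a lot of tonal resolution. Here’s a small sketch of how the per-channel steps and total color count scale with bit depth:

```python
def tonal_range(bits_per_channel: int):
    """Return (steps per color channel, total RGB combinations)."""
    levels = 2 ** bits_per_channel
    return levels, levels ** 3

for bits in (8, 10, 12):
    levels, colors = tonal_range(bits)
    print(f"{bits}-bit: {levels:>5} steps/channel, {colors:,} colors")
# 8-bit yields ~16.8 million colors; 10-bit ~1.07 billion;
# 12-bit ~68.7 billion -- "billions, not millions."
```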

A demonstration of a 65-inch HDR OLED UHDTV


Indeed, the basic standard for high dynamic range – HDR10 – uses static metadata and 10-bit color to define an HDR signal. More advanced HDR systems like Dolby Vision and Samsung’s proposed dynamic tone mapping may require 12-bit color for mastering, dithered down to 10 bits for delivery to an HDR display.
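What does “dithered down” look like in practice? A minimal sketch of dithered re-quantization (illustrative only; real mastering tools use more sophisticated noise shaping):

```python
import random

def dither_12_to_10(samples, seed=0):
    """Reduce 12-bit code values (0-4095) to 10 bits (0-1023).

    Adding sub-LSB random noise before truncating trades visible
    banding for fine grain, which is why mastering workflows dither
    rather than simply chop off the two low bits.
    """
    rng = random.Random(seed)
    out = []
    for s in samples:
        # One 10-bit step spans four 12-bit codes, so dither over [0, 4)
        out.append(min(int(s + rng.uniform(0.0, 4.0)) // 4, 1023))
    return out
```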

So how much of a difference does HDR make? For an Ultra HD signal (3840×2160 pixels, which becomes a 4400×2250 raster once blanking intervals are included), the data rate with a 60 Hz refresh rate and 8-bit RGB color – carried as 10 bits per channel on the wire, thanks to HDMI’s TMDS coding overhead – is calculated as –


(4400×2250) x (60) x (3) x (10) = 17.82 gigabits per second (Gb/s)


With 10-bit color (12 bits on the wire), the data rate rises to –


(4400×2250) x (60) x (3) x (12) = 21.4 gigabits per second (Gb/s)


That’s too fast for HDMI 2.0, which has a capped data rate of 18 Gb/s. To get around that obstacle, content mastered at 10 bits per pixel has to be presented with reduced color resolution (4:2:0) –


(4400×2250) x (60) x (1.5) x (12) = 10.7 gigabits per second (Gb/s)


That data rate can easily pass through an HDMI 2.0 connection. 4:2:0 color is recognized by every consumer television and computer monitor, as is the RGB (4:4:4) format. And in fact, the Ultra HD HDR Blu-ray format uses 4:2:0 10-bit color precisely for that reason.
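The three calculations above are easy to wrap in a helper function, taking the full raster (blanking included), the frame rate, the samples per pixel, and the bits carried per sample on the wire:

```python
def hdmi_rate_gbps(h_total, v_total, fps, samples_per_px, wire_bits):
    """Raw interface bit rate in Gb/s: full raster (blanking
    included) x frame rate x samples per pixel x wire bits."""
    return h_total * v_total * fps * samples_per_px * wire_bits / 1e9

print(hdmi_rate_gbps(4400, 2250, 60, 3, 10))    # 17.82  (8-bit RGB)
print(hdmi_rate_gbps(4400, 2250, 60, 3, 12))    # 21.384 (10-bit RGB)
print(hdmi_rate_gbps(4400, 2250, 60, 1.5, 12))  # 10.692 (10-bit 4:2:0)
```

Only the middle figure exceeds the 18 Gb/s ceiling of HDMI 2.0, which is exactly why 4:2:0 becomes the workaround.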

High dynamic range isn’t just about a wider range of luminance values. It also means a much wider color gamut than before; one that is characterized by the International Telecommunication Union (ITU) as Recommendation BT.2020 (Rec. 2020). This color space is much wider than the Rec. BT.709 color space currently used for video content (which, believe it or not, is based on the range of colors a CRT display can show, and which all current models of TVs and displays can easily reproduce).

The color volume in BT.2020 is so much larger that the green locus might only be reached by a laser-powered imaging system. And in fact, we’re starting to see laser-powered cinema projectors come to market for precisely that reason. Quantum dot-equipped LCD TVs can cover a good portion of this wider color space, as can the latest generation of OLED TVs and LED videowalls.
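How much bigger is it? Using the published (x, y) chromaticity coordinates of each standard’s primaries, the two gamut triangles on the CIE 1931 diagram can be compared directly (a rough sketch; area in xy space actually understates the perceptual difference):

```python
def gamut_area(primaries):
    """Triangle area spanned by the R, G, B chromaticities,
    via the shoelace formula."""
    (xr, yr), (xg, yg), (xb, yb) = primaries
    return abs(xr * (yg - yb) + xg * (yb - yr) + xb * (yr - yg)) / 2

BT709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
BT2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

ratio = gamut_area(BT2020) / gamut_area(BT709)
print(f"BT.2020 covers about {ratio:.1f}x the xy area of BT.709")
```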

A comparison of the ITU BT.2020 and BT.709 color spaces

Add it all up – 20+ stops of luminance, plus billions (not millions) of color shades – and that’s quite a payload of data we’re jamming through a display interface or compressing for delivery through an IT network. We do have some tools at our disposal to make the job easier: Display Stream Compression (DSC), introduced by VESA in 2014, allows us to apply light, entropy-based compression to HDMI 2.1 and DisplayPort 1.4 signals to reduce the bit rate. 2:1 is easy to do, and 3:1 is practical.

For AV over IT, we have a difficult choice to make. Do we favor light compression (up to 4:1 with Motion JPEG2000) for minimal latency, or do we look for greater transmission efficiency with higher compression using the H.265 HEVC codec? Our 10-bit RGB 4K signal cited above, with a nominal data rate of 21 Gb/s, can be packed down to 5.25 Gb/s using 4:1 M-JPEG2000 – but that’s still too fast for a 1 Gb/s Ethernet switch.

By using 4:2:0 color resolution, we can reduce the bit rate to 2.7 Gb/s, but that’s still too fast for a 1 Gb/s switch. How about lowering the frame rate to 30 Hz? Now, we’re operating at 1.34 Gb/s – still too fast for that switch. Looks like we should probably design a network around a 10 Gb/s switch (which we should be doing anyway, as they’re cheap enough) so we don’t run into an HDR bottleneck.
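Running those numbers step by step (rates in Gb/s, taken from the calculations earlier in this post):

```python
GBE_LINK = 1.0                   # a 1 Gb/s Ethernet switch port

rate_420_60 = 10.692             # 10-bit 4:2:0 UHD at 60 Hz, Gb/s
after_mj2k  = rate_420_60 / 4    # 4:1 Motion JPEG2000 -> ~2.7 Gb/s
at_30hz     = after_mj2k / 2     # halve the frame rate -> ~1.34 Gb/s

for label, rate in (("4:1 M-JPEG2000, 60 Hz", after_mj2k),
                    ("...dropped to 30 Hz", at_30hz)):
    verdict = "fits 1 GbE" if rate <= GBE_LINK else "needs 10 GbE"
    print(f"{label}: {rate:.2f} Gb/s -> {verdict}")
```

Every step still lands above 1 Gb/s, which is the arithmetic behind designing around a 10 Gb/s switch.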

We’re not aware of any AV manufacturers supporting HDMI or DisplayPort with DSC, but that would certainly make our lives a lot simpler – our 21 Gb/s 4K RGB signal could be transported at 10.5 Gb/s, and in fact we could easily step up to 12-bit color (12.5 Gb/s) if need be. So there are ways to lighten the payload, so to speak.

Notice in this discussion how we’ve avoided any mention of 4:2:2 color, which is a standard for broadcast and media production. The reason is that many consumer displays either don’t recognize this color mode at all, or handle it incorrectly. The result is that a 4:2:2 signal might be mapped to 4:2:0, producing some very strange colors on the display. Even so, an HDR signal with 4:2:2 color could easily be accommodated by the methods just described.
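The payload of each subsampling scheme falls out of the J:a:b notation itself; a small sketch of that arithmetic (valid for the three common schemes):

```python
def samples_per_pixel(j: int, a: int, b: int) -> float:
    """Average samples per pixel for J:a:b chroma subsampling:
    one luma sample per pixel, plus (a + b) / J chroma samples
    (Cb and Cr combined) spread over a J-wide, two-row block."""
    return 1 + (a + b) / j

print(samples_per_pixel(4, 4, 4))  # 3.0 -- full chroma (RGB-equivalent)
print(samples_per_pixel(4, 2, 2))  # 2.0 -- the broadcast standard
print(samples_per_pixel(4, 2, 0))  # 1.5 -- the factor used above
```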

The important thing to take away here is that “HDR compatible” or “HDR ready” simply means that the signal management equipment is fast enough to handle the required clock rate. That’s it. No special sauce or magic is required – it’s all about speed, and nothing more.

     (Left) a demonstration of HDR through HDMI 2.1 and (right) through DisplayPort 1.4

So how do we signal that HDR content is present? The HDMI 2.0 and DisplayPort 1.4 interface specifications allow for InfoFrames containing static or dynamic metadata that tells the display how to format HDR images. Without this data, the display will simply render HDR images incorrectly – either as overly bright, washed-out SDR images, or as very dark images with little detail.
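As a rough illustration of what that static metadata carries, here’s a sketch modeled on the HDR10 fields defined in SMPTE ST 2086 and CTA-861.3. The field names are our own shorthand; the actual on-the-wire InfoFrame byte packing is specified by CTA-861.3:

```python
from dataclasses import dataclass

@dataclass
class Hdr10StaticMetadata:
    # Mastering display description (SMPTE ST 2086)
    primaries: tuple        # R, G, B (x, y) chromaticities
    white_point: tuple      # (x, y)
    max_luminance: float    # mastering display peak, in nits
    min_luminance: float    # mastering display black level, in nits
    # Content light levels (CTA-861.3)
    max_cll: int            # brightest single pixel in the content
    max_fall: int           # brightest frame-average light level

# Plausible values for a title mastered on a 1,000-nit display:
meta = Hdr10StaticMetadata(
    primaries=((0.708, 0.292), (0.170, 0.797), (0.131, 0.046)),
    white_point=(0.3127, 0.3290),
    max_luminance=1000.0, min_luminance=0.0001,
    max_cll=1000, max_fall=400)
print(meta.max_cll, meta.max_fall)
```

The display tone-maps the signal to its own capabilities using these values; strip or mangle them in a switcher and you get exactly the washed-out or crushed picture described above.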

For static HDR data, the extension to the HDMI specification is a single letter. HDMI 2.0a in an interface means it will transport and recognize HDR10 static metadata, while HDMI 2.0b adds support for the metadata-less hybrid log gamma (HLG) format proposed for HDR broadcasting. (Dynamic metadata systems like Dolby Vision carry their data within the video signal itself.)

As long as a distribution amplifier, matrix switcher, or signal extender is fast enough to handle an HDMI 2.0 signal (or DP 1.4) and passes through static and dynamic HDR metadata without altering it, that interface is “HDR ready” or “HDR compatible” (or “HDR friendly,” or “HDR BFF,” or whatever the heck you want to call it.)  The devil is truly in the details!

For the time being, HDMI 2.0 will be the dominant interface for transporting HDR content. Look for computer manufacturers to adopt DisplayPort 1.4 for the HDR desktop monitors that are coming to market. As for distribution over IT networks, Motion JPEG2000 can be used, but a 10 Gb/s network will be required to handle HDR. H.265 HEVC is a better choice if additional latency can be tolerated – UHD signals with HDR can be transported at 25 – 35 Mb/s over standard networks. (UHD Blu-ray streams at 100 – 120 Mb/s with 2160p/60 4:2:0 HDR content.)
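The practical upshot, in round numbers:

```python
link_mbps = 1000        # a standard 1 Gb/s Ethernet port
hevc_mbps = 35          # UHD HDR over HEVC, top of the cited range
mj2k_mbps = 2673        # ~2.7 Gb/s: 4:1 M-JPEG2000, 4:2:0, 60 Hz

print(link_mbps // hevc_mbps)    # 28 -- dozens of HEVC streams fit
print(mj2k_mbps <= link_mbps)    # False -- M-JPEG2000 needs 10 GbE
```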

The post HDR: What’s It All About, and How Does It Affect Interfacing? appeared first on HDTVexpert.

Posted by Pete Putman, February 28, 2017 9:11 AM


About Pete Putman

Peter Putman is the president of ROAM Consulting L.L.C. His company provides training, marketing communications, and product testing/development services to manufacturers, dealers, and end-users of displays, display interfaces, and related products.

Pete edits and publishes HDTVexpert.com, a Web blog focused on digital TV, HDTV, and display technologies. He is also a columnist for Pro AV magazine, the leading trade publication for commercial AV systems integrators.