High Dynamic Range/HDR: What is the imaging tech, and how does it benefit products from TVs to smartphones?
HDR originated in the world of photography, but has since become a fixture on TVs and smartphones from the most expensive tellies right down to the cheapest ones. But what exactly is it, what does it do and how does it benefit what you watch?
Related: Best TVs
What is HDR?
HDR stands for High Dynamic Range and refers to contrast (or the difference) between the brightest and darkest parts of an image.
The idea is that your eyes can perceive whites that are brighter and blacks that are darker than traditional SDR TVs had been able to display. HDR content preserves details in the darkest and brightest areas of a picture, details that are often lost using old imaging standards such as Rec.709.
HDR10 is the standardised version of the format that TVs and content are required to support. There are more advanced versions available in Dolby Vision and HDR10+, which are dynamic forms of HDR. They optimise an image either scene-by-scene or shot-by-shot for a more exacting HDR performance.
The standardised HDR10 format simply sets brightness for the entirety of a programme, so bright and dark aspects of an image are the same throughout.
What makes an HDR TV?
There are two things that define an HDR TV. One is contrast and the other is the range of colours the set can display.
Contrast is one of the most important factors in how good a TV picture looks and a key part of what makes an HDR TV. It refers to the difference between light and dark parts of an image. The greater the difference, the greater the ‘contrast’.
Another is peak brightness, which refers to how bright a TV can go and is measured in ‘nits’. Think of one nit as roughly equivalent to the brightness of a single candle.
The other measurement is black level. Similar to peak brightness, black level refers to how dark an image can appear and is also measured in nits. So, for example, a TV could have a peak brightness of 400 nits and a black level of 0.4 nits. The ratio between the peak brightness and black level is known as the contrast ratio. The brighter the light emitted by an HDR TV, the higher the luminance levels and the larger the contrast ratio will be.
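As an illustrative sketch (the figures are just the example numbers above, not a spec), the contrast ratio follows from dividing peak brightness by black level:

```python
# Sketch: contrast ratio from peak brightness and black level, both in nits.
# A 400-nit peak with a 0.4-nit black level gives a 1000:1 contrast ratio.
def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    """Ratio between the brightest and darkest luminance a panel can show."""
    return peak_nits / black_nits

ratio = contrast_ratio(400.0, 0.4)
print(f"Contrast ratio: {ratio:.0f}:1")  # → Contrast ratio: 1000:1
```

The same arithmetic shows why deep blacks matter as much as bright highlights: halving the black level doubles the contrast ratio without the panel getting any brighter.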
When it comes to colour, a TV must be able to process what’s known as 10-bit or ‘deep’ colour. 10-bit equates to a signal that includes over a billion possible colours.
In comparison, Blu-ray uses 8-bit colour, which amounts to around 16.7 million possible colours. With 10-bit colour, HDR TVs can produce a vastly expanded range of colours, reducing obvious banding between shades. Subtle shading helps to make a scene look far more realistic.
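To see where those colour counts come from, here's a quick sketch: each of the three RGB channels gets 2 to the power of the bit depth levels, and the total palette is that number cubed.

```python
# Sketch: total displayable colours for a given per-channel bit depth.
# Each RGB channel has 2**bits levels; three channels multiply together.
def colour_count(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

print(f"8-bit:  {colour_count(8):,} colours")   # ~16.7 million
print(f"10-bit: {colour_count(10):,} colours")  # ~1.07 billion
```

Jumping from 8-bit to 10-bit multiplies the palette by 64, which is why gradients such as skies and shadows look far smoother.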
However, a TV doesn’t need to be able to display all the colours in a 10-bit signal. It just has to be able to process the signal and produce an image based on that information. TVs from Hisense, for example, feature 8-bit panels with FRC (frame rate control) to deliver the extra colours.
An HDR TV must reproduce a certain amount of what’s known as ‘P3’ colour or the DCI-P3 colour standard. DCI-P3 is a colour space most commonly known for its use in digital film projection. In general terms, it refers to the range of the colour spectrum, and in order to display HDR images well, a TV display must cover as much of the P3 colour space as it can. The UHD Premium specification says that more than 90% of the P3 colour space must be covered. The DCI-P3 colour space is larger than what SDR TVs use – Rec.709 – which means it can cover more colours.
Simply put, an HDR TV that can cover a wider space within the DCI-P3 colour spectrum will be able to display more colours and be more effective with HDR content.
Related: Best 4K TVs
How do I know if a TV is HDR compatible?
Most TVs available on the market now support HDR. Usually it’ll say so in the description of the TV or with a logo on the box. If you’re still unsure, check online to see what forms of HDR the TV supports. There is the UHD Premium standard to consider too, but not all manufacturers have decided to carry a badge that says they adhere to the standard.
Cheaper sets are only capable of limited brightness, and TVs such as the Hisense Roku TV and Toshiba U29 can only hit a peak brightness of around 350 nits. Highlights (often the brightest parts of an image) aren’t as bright, and black levels aren’t as deep either.
The further up the chain you go, the brighter a TV can get. More expensive efforts can hit a peak brightness of around 1000 nits, while Samsung’s QLEDs deliver some of the brightest images on the market, hitting 1500 nits or more. If you want to see the benefits of HDR in action, we’d suggest a TV capable of at least 600 nits or so.
What is HDR10?
HDR10 is the industry standard for consumer sets and all TVs must support it to be considered HDR compatible. Announced back in 2015, it uses the wide Rec.2020 colour gamut and a bit-depth of 10-bits.
It’s an open format, and requires at least an HDMI 2.0a connection to carry the signal from an external source. HDR10 content is mastered at 1000 nits, and unlike other forms of HDR it’s not dynamic, so brightness and darkness levels will stay the same across the running time.
As HDR10 allows for greater contrast than SDR content, picture quality should be improved even on cheaper sets. HDR10 is considered to include WCG (Wide Colour Gamut), which depicts a more diverse range of colours. Not all TVs support WCG though, especially budget sets.
What is HDR10+?
HDR10+ is an open standard adopted by Samsung and other manufacturers and video streaming services, the biggest being Prime Video. It improves on HDR10 by using dynamic metadata instead of static metadata.
As a dynamic HDR variant, it adapts the brightness of individual scenes or frames. For example, if a scene was meant to be shown at lower brightness, HDR10+’s dynamic approach would reduce the brightness level in real-time to match the creative intent. While HDR10+ achieves the same objective as Dolby Vision, it leaves a little more freedom for different displays to apply their own strengths and processing to the content.
Compared to Dolby Vision, there isn’t as much content available in the format, but it is growing and video services such as Prime Video and Rakuten TV support it.
What is Dolby Vision?
Dolby Vision is a variant of HDR and like HDR10+ it adds a layer of dynamic metadata to the core HDR signal. This dynamic metadata carries scene-by-scene or frame-by-frame instructions as to how a TV should present its HDR images. In doing so it improves everything from brightness to contrast, detailing and colour reproduction.
As it’s used by Hollywood creators to grade their films during production, as well as in cinemas, it’s more widespread than its nearest rival, with more manufacturers and streaming services delivering content in Dolby Vision HDR. The only significant holdout is Samsung, which pushes HDR10+.
Like HDR10+, content in Dolby Vision looks even better than standard HDR10 content with better luminance levels, more visible detail and brighter highlights. Dolby Vision’s value is more noticeable with cheaper sets, as its presence helps to tone-map (adapt the picture to the display) better. Dolby has also brought out Dolby Vision IQ, which optimises HDR content to suit the changeable lighting conditions of a normal living room. The result is that the viewer can still see detail in the darker parts of an image that would otherwise be washed out by ambient light.
What is HLG?
HLG stands for Hybrid Log Gamma and this is the broadcast version of HDR10. It performs the same functions in adding data for luminance levels so content can achieve greater brightness and darkness, but this version is one you’ll sample on services such as BBC iPlayer and Sky Q. In the US, HLG does not appear to be used.
It was developed by the BBC and NHK (the Japanese national broadcaster) to display a wide dynamic range while still being compatible with SDR transmission standards, so even people without an HDR TV can receive the same feed, as the content will simply downsample to be compatible with SDR displays. This makes HLG a backwards-compatible, more cost-effective version of the format for content producers, who won’t have to produce two feeds to take into account different TVs.
What about HDR on smartphones?
HDR technology has filtered to other products such as smartphones. Phones and tablets are increasingly bumping up their screens’ peak brightness, and there’s been a concerted effort to support the HDR-compatible versions of streaming services such as Netflix or Prime Video.
HDR on phones doesn’t have the same impact or subtlety as it does on a good TV, but it can still make a difference, with punchier colours and better contrast.
The number of HDR-compatible smartphones is increasing each year with even more affordable efforts joining in on the fun. Samsung’s S20 range (unsurprisingly) supports HDR10+, as does the Motorola Edge. Apple’s iPhones feature HDR and Dolby Vision, somewhat mirroring the split that’s visible in the TV world.
Should you buy an HDR TV?
Virtually all TVs sold are HDR-compatible, so it’d be harder to not buy one. However, the quality varies, as cheaper TVs aren’t bright enough to do HDR justice. The further up the ladder you go, the better your HDR experience will be.
So it’s worth doing some research on the product before you buy, to ensure you’re getting the specs you need for a true HDR experience. In a number of our reviews we include the peak brightness (in nits) that an HDR TV is capable of, to give you an impression of how bright it can go.
HDR TVs keep getting better and cheaper, and there’s never been a better time to get a 4K HDR TV to take advantage of the format.