What is high dynamic range, and how does it benefit displays from TVs to smartphones?
HDR originated in the world of photography, where it is best described as the merging of several photos at different exposures into one high dynamic range photo. It has become a fixture on TVs and smartphones, from the most expensive tellies right down to the cheapest ones, though it works differently to how it does in photography.
So what exactly is HDR, what does it do and how does it benefit what you watch?
What is HDR?
HDR stands for High Dynamic Range and refers to contrast (or the difference) between the brightest and darkest parts in an image.
The idea is that your eyes can perceive whites that are brighter and blacks that are darker than traditional SDR (Standard Dynamic Range) displays can show. HDR content preserves details in the darkest and brightest areas of a picture, details often lost under older imaging standards such as Rec.709.
HDR10 is the industry standard format, and all HDR devices and content are required to support it. HDR10 sets the brightness for the entirety of a programme, so the levels of the bright and dark parts stay the same throughout.
There are more advanced versions available in Dolby Vision and HDR10+, which are dynamic forms of HDR. They use metadata to optimise an image either scene-by-scene or shot-by-shot for a more faithful HDR performance.
What makes an HDR TV?
Contrast is one of the most important factors in how good a TV picture looks and a key aspect of HDR TVs. It refers to the difference between the brightest and darkest parts of an image. The greater the difference between these two poles, the greater the contrast.
Another is brightness (or luminance), which refers to how bright a TV can go and is measured in ‘nits’. One nit is equivalent to one candela (one candlepower) per square metre (1cd/m2). Around 500 – 600 nits of peak brightness is considered to be enough to convey the effects of HDR for a TV.
The other measurement is black level, which refers to how dark an image can appear and is also measured in nits. The ratio between peak brightness and black level is known as the contrast ratio: a TV with a peak brightness of 400 nits and a black level of 0.4 nits, for example, has a contrast ratio of 1000:1. Deep blacks and high luminance combine for a bigger contrast ratio, resulting in improved picture performance. Put simply, a higher contrast ratio makes for a stronger-looking image.
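The contrast ratio arithmetic above is simple division. As a quick sketch, using the hypothetical 400-nit/0.4-nit TV from the example:

```python
def contrast_ratio(peak_nits: float, black_level_nits: float) -> float:
    """Contrast ratio is peak brightness divided by black level, both in nits."""
    return peak_nits / black_level_nits

# The hypothetical TV from the example: 400-nit peak, 0.4-nit black level
ratio = contrast_ratio(400, 0.4)
print(f"{ratio:.0f}:1")  # prints "1000:1"
```

This is also why the deep blacks of OLED matter so much: halving the black level doubles the contrast ratio just as surely as doubling the peak brightness does.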
When it comes to colour, a TV must be able to process what’s known as 10-bit or ‘deep’ colour for HDR content. A 10-bit signal includes over a billion possible colours. In comparison, SDR (Standard Dynamic Range) Blu-ray uses 8-bit colour, which amounts to around 16.7 million possible colours. With 10-bit colour, HDR TVs can produce a vastly expanded range of colours, reducing the obvious banding between shades. Subtle shading helps to make a scene look more realistic.
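Those colour counts fall straight out of the bit depth: each of the three channels (red, green and blue) gets two-to-the-power-of-the-bit-depth levels, and the totals are the cube of that. A minimal sketch:

```python
def colours_for_bit_depth(bits_per_channel: int) -> int:
    """Total colours = (levels per channel) cubed, for the R, G and B channels."""
    levels = 2 ** bits_per_channel
    return levels ** 3

print(f"8-bit:  {colours_for_bit_depth(8):,} colours")   # 16,777,216 (~16.7 million)
print(f"10-bit: {colours_for_bit_depth(10):,} colours")  # 1,073,741,824 (over a billion)
```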
However, a TV doesn’t need to be able to display all the colours in a 10-bit signal. It just has to be able to process the signal and produce an image based on that information. TVs from Hisense, for example, feature 8-bit panels with FRC (frame rate control) to deliver the extra colours.
An HDR TV must reproduce a certain amount of what’s known as ‘P3’ colour, or the DCI-P3 colour standard. DCI-P3 is a colour space best known for its use in digital film projection. The DCI-P3 colour space is larger than the one SDR TVs use – Rec.709 – which means it can display more colours.
In general terms, colour gamut refers to the range of the colour spectrum a display can reproduce, and to display HDR images well, a TV must cover as much of the P3 colour space as it can. The guidance for the UHD Premium format says more than 90% of the P3 colour space must be covered.
An HDR TV that covers this much of the DCI-P3 colour spectrum will be able to display more colours and show HDR content more effectively. Manufacturers don’t always reveal this information, but you can find results of DCI-P3 coverage in reviews online.
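As a rough illustration of how gamut sizes compare, each colour space can be treated as a triangle in CIE 1931 xy chromaticity coordinates and the areas compared with the shoelace formula. This is only a sketch, not how reviewers measure coverage: real figures are measured against a panel's actual primaries, and the standard primaries below are assumptions baked into the example (though Rec.709's triangle does sit entirely inside DCI-P3's, so the area ratio is a fair comparison here):

```python
def triangle_area(points):
    """Shoelace formula for the area of a triangle given as (x, y) pairs."""
    (x1, y1), (x2, y2), (x3, y3) = points
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# Standard CIE 1931 xy chromaticities for the red, green and blue primaries
DCI_P3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]
REC_709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]

# Rec.709's triangle lies inside P3's, so the area ratio is its P3 coverage
coverage = triangle_area(REC_709) / triangle_area(DCI_P3)
print(f"Rec.709 covers roughly {coverage:.0%} of DCI-P3")  # roughly 74%
```

This makes the gap concrete: an SDR-only gamut reaches only about three-quarters of the colour space HDR content is graded against, which is why the 90%+ P3 guidance matters.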
How do I know if a TV is HDR compatible?
Virtually all TVs on the market are HDR-enabled; it’s rare to find one that isn’t. It’ll usually say so in the description of the TV or on the box with a logo. If you’re still unsure, check online to see what forms of HDR the TV supports or read one of our reviews.
There is the UHD Premium standard to consider too, but not all TV brands carry a badge that says they adhere to that standard.
Cheaper sets are only capable of limited brightness, with TVs such as the Hisense Roku TV and Toshiba UL21 able to hit a peak brightness of around 350 nits. Highlights (often the brightest parts of an image) aren’t as bright, and black levels aren’t as deep or as consistent either.
The further up the chain you go, the brighter a TV can get. More expensive efforts can hit a peak brightness of around 1000 nits, while Samsung’s Mini LED TVs deliver some of the brightest images on the market, hitting 2000 nits or more. If you want to see the benefits of HDR in action, we’d suggest a TV capable of at least 600 nits or so.
What is HDR10?
HDR10 is the industry standard HDR format for consumer sets, and all TVs must support it to be considered HDR compatible. Announced in 2015, it uses the wide-gamut Rec.2020 colour space and a bit depth of 10 bits.
It’s an open format and requires at least an HDMI 2.0a connection to carry the signal from an external source (like a streaming stick or 4K Blu-ray player). HDR10 content is typically mastered at 1000 nits, and unlike other forms of HDR it’s not dynamic, so brightness and darkness levels stay the same across the running time.
But HDR10 allows for greater contrast than SDR content, and picture quality should be improved even on cheaper sets. HDR10 includes WCG (Wide Colour Gamut), which depicts a more diverse range of colours, although not all TVs support WCG, especially budget sets.
What is HDR10+?
HDR10+ is an open standard adopted by several TV manufacturers (such as Samsung) and video streaming services such as Amazon Prime Video. It differs from HDR10 by using dynamic metadata instead of static metadata.
As a dynamic HDR variant, it can adapt the brightness of individual scenes or frames. For example, if a scene was meant to be shown at lower brightness, HDR10+’s dynamic approach would reduce the brightness level in real-time to match the creative intent. While HDR10+ achieves the same objective as Dolby Vision, it gives displays more freedom to use their own strengths and processing, whereas Dolby Vision is stricter in its instructions.
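To make the static-versus-dynamic distinction concrete, here's a toy sketch, not a real HDR pipeline, with all figures invented: a 600-nit display tone-maps three scenes using either one film-wide peak value (as static HDR10 metadata works) or a per-scene value (as the dynamic formats work):

```python
DISPLAY_PEAK_NITS = 600
scene_peaks = [180, 950, 400]  # hypothetical per-scene peak brightness, in nits

def tone_map(brightness, metadata_peak):
    """Naive linear scaling: compress only as much as the metadata demands."""
    scale = min(1.0, DISPLAY_PEAK_NITS / metadata_peak)
    return round(brightness * scale)

# Static metadata (HDR10-style): one value for the whole film, so even
# scenes that already fit the display get dimmed by the worst-case scene.
film_peak = max(scene_peaks)
print([tone_map(p, film_peak) for p in scene_peaks])  # [114, 600, 253]

# Dynamic metadata (HDR10+/Dolby Vision-style): per-scene values, so
# scenes the display can already show are left untouched.
print([tone_map(p, p) for p in scene_peaks])  # [180, 600, 400]
```

In the static case the dim 180-nit scene is needlessly dragged down to 114 nits because the metadata only describes the brightest scene in the film; with per-scene metadata it plays back as intended.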
Compared to Dolby Vision, there is not as much content available in HDR10+, though Prime Video and Rakuten TV support it and a number of 4K Blu-rays carry the format, too.
What is Dolby Vision?
Dolby Vision is a variant of HDR and like HDR10+ it adds a layer of dynamic metadata to the core HDR signal. This dynamic metadata carries scene-by-scene or frame-by-frame instructions as to how a TV should present its HDR performance. In doing so it improves everything from brightness to contrast, detailing and colour reproduction.
As it is used by Hollywood creators to grade their films, as well as in cinemas, it’s more widespread than HDR10+, with more manufacturers and streaming services delivering content in Dolby Vision HDR. The only significant holdout is Samsung, which pushes HDR10+.
Dolby Vision’s value is most noticeable on cheaper sets, as its presence improves tone-mapping, essentially the way the TV adapts the image to the capabilities of its display.
Dolby has also brought out Dolby Vision IQ, which optimises HDR content to suit the changeable lighting conditions of a normal living room. The result is the viewer can still see detail in the darker parts of an image that would otherwise be washed out by ambient light.
What is HLG?
HLG stands for Hybrid Log Gamma and is an HDR format used by broadcast services. It performs the same function, boosting luminance levels so content can achieve greater contrast and truer colours, but does so through its hybrid transfer curve rather than added metadata.
HLG was developed by the BBC and NHK (the Japanese national broadcaster) to display a wide dynamic range while also remaining compatible with SDR transmission standards. What that means is people without an HDR TV can receive the same broadcast feed instead of a separate one, as the content automatically displays correctly on SDR screens.
This makes HLG a backwards-compatible, more cost-effective HDR format for content creators, who won’t have to produce two feeds for different TVs. BBC’s iPlayer uses it, as does Sky for sports such as football and F1.
What about HDR on smartphones?
Phones and tablets are increasingly bumping up their peak brightness, and there’s been a concerted effort to support the HDR-compatible versions of streaming services such as Netflix or Prime Video.
HDR on phones doesn’t have the same impact or subtlety as it does on a good TV, but it can make a difference with punchier colours and better contrast.
The number of HDR-compatible smartphones is increasing each year, with ever more affordable efforts joining in the fun. Samsung’s Galaxy S22 range supports HDR10+, as does the Motorola Edge 20 Lite, while Apple’s iPhones feature Dolby Vision, to name but a few smartphone brands’ allegiances.
Should you buy an HDR TV?
Virtually all TVs sold today are HDR-compatible, so it’d be harder not to buy one. However, the quality of the HDR performance varies, as cheaper TVs aren’t bright enough to do HDR justice. The further up the ladder you go, the better the HDR experience will be.
It’s worth doing research on a product before you buy, to ensure you’re getting the specs you need for a true HDR experience. In our reviews we mention the peak brightness (in nits) an HDR TV is capable of, to give an indication of how bright it can go.
HDR TVs are getting better each year with new technologies and more capable performance. If you’re looking to upgrade to a 4K TV, then HDR has become a vital aspect of the performance you need to consider.