Though these two HDR formats are different, the goal is the same: to improve the picture quality on your TV.
So how do they go about it, and which one is better suited to your TV? Here’s a comparison of HDR10 and Dolby Vision.
What are HDR10 and Dolby Vision?
First things first. An explanation of what HDR10 and Dolby Vision are.
HDR10 was announced in 2015. HDR (High Dynamic Range) refers to the contrast, the difference between the brightest and darkest parts of an image. HDR content can preserve detail in the darkest and brightest parts of an image, so you see more than you would with an SDR (Standard Dynamic Range) image.
You also benefit from getting brighter whites and deeper blacks to aid contrast, making for an image that can often look three-dimensional. HDR10 is the industry standard, mandated to be supported by all HDR devices to ensure compatibility across the board.
Dolby Vision differs from HDR10 in that it uses dynamic metadata rather than HDR10’s static metadata. Dolby Vision can adapt the picture’s light/dark values either scene-by-scene or frame-by-frame, carrying instructions to the TV that tell it how bright or dark the image should be at any given moment, improving contrast, detail and colour reproduction.
Dolby Vision HDR is optional and not required on all devices. It also isn’t free to use: supporting it on a TV, mobile device and so on requires a licence from Dolby.
What’s the difference?
HDR10 uses a WCG (Wide Colour Gamut), which expands the range of colours a screen can depict, specifically the Rec.2020 colour space, and carries a bit-depth of 10 bits. Bit-depth refers to how many colours can be shown; in this instance it’s 1024 shades of each primary colour, or over a billion colours in total.
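The bit-depth arithmetic is easy to check. A quick sketch in Python, assuming a standard three-channel RGB pixel:

```python
# Bit-depth determines how many shades each primary colour channel can take.
# A 10-bit HDR10 signal allows 2^10 values per channel, and an RGB pixel
# combines three channels (red, green, blue).
shades_per_channel = 2 ** 10              # 1024 shades of each primary
total_colours = shades_per_channel ** 3   # combinations across R, G and B

print(shades_per_channel)  # 1024
print(total_colours)       # 1073741824 (just over a billion)
```

For comparison, 8-bit SDR gives 256 shades per channel, around 16.7 million colours in total.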
Devices that support HDR need at the very least an HDMI 2.0a connection to carry the HDR signal from one device to another. Content is typically mastered at 1000 nits of brightness, though technically it can be mastered at a maximum of 10,000 nits. You’re more likely to find HDR10 content mastered at 1000 or 4000 nits.
However, not all TVs can produce 1000 or 4000 nits on a constant basis, and unlike Dolby Vision, HDR10 is a static form of HDR. What this means is that the brightness and darkness levels stay the same for the duration of a video, so black and white levels never change. There is metadata within the HDR10 signal that provides information to the display to help adjust how it appears, but this is not the same as ‘instructing’ it on how the image should look. Some TVs ignore this information altogether.
It often comes down to the TV’s tone-mapping: how the display adjusts brightness and contrast at a lower peak brightness to preserve the detail within the image. Every manufacturer’s tone-mapping differs, so the same content can look different on different TVs.
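To make the idea concrete, here is a deliberately simplified tone-mapping curve in Python. The function, knee point and roll-off are illustrative assumptions only; real TVs use proprietary, far more sophisticated curves.

```python
def tone_map(nits_in, display_peak=600.0, knee=0.75):
    """Map mastered brightness (in nits) onto a display with a lower peak.

    Illustrative sketch: below the knee point the signal passes through
    unchanged; above it, highlights are rolled off smoothly into the
    remaining headroom instead of being clipped to the panel's maximum.
    """
    knee_nits = display_peak * knee
    if nits_in <= knee_nits:
        return nits_in
    # Compress everything above the knee into the headroom left on the panel.
    headroom = display_peak - knee_nits
    excess = nits_in - knee_nits
    return knee_nits + headroom * (excess / (excess + headroom))

# A 4000-nit mastered highlight is squeezed under the 600-nit panel peak
# rather than clipping, at the cost of some highlight separation.
print(tone_map(100.0))   # dim content passes through untouched
print(tone_map(4000.0))  # bright highlight rolled off below 600 nits
```

A brighter panel would use a higher knee and compress less, which is why the same HDR10 master can look noticeably different from one TV to the next.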
Dolby Vision was introduced in 2014, and is used in TVs, mobile phones and in theatres through Dolby Cinema. It’s compatible with resolutions up to 8K, supports 12-bit video (4096 shades of each primary colour) to show a wider range of colours than HDR10, and can be mastered up to a peak brightness of 10,000 nits, though mastering at that level of brightness is rare.
Where it significantly deviates from HDR10 is that Dolby Vision adds a layer of dynamic metadata to the core HDR signal. As noted in the previous section, this instructs the display on everything from brightness, contrast and detail to colour reproduction and sharpness.
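The static-versus-dynamic distinction can be sketched as a data structure. The field names below are hypothetical, not the actual SMPTE or Dolby metadata formats; they only illustrate the shape of the difference.

```python
# Static metadata (HDR10-style): one record describes the entire film,
# so the TV applies the same mapping from the first scene to the last.
static_metadata = {
    "max_luminance_nits": 4000,   # brightest value in the whole master
    "min_luminance_nits": 0.005,  # darkest value in the whole master
}

# Dynamic metadata (Dolby Vision-style): per-scene (or per-frame)
# instructions, so the TV can re-map brightness as the content changes.
dynamic_metadata = [
    {"scene": 1, "max_luminance_nits": 120},   # dim interior scene
    {"scene": 2, "max_luminance_nits": 4000},  # sunlit exterior scene
]

for record in dynamic_metadata:
    print(record["scene"], record["max_luminance_nits"])
```

With the static record, a dim interior scene is tone-mapped against the 4000-nit ceiling of the whole film; with the dynamic records, the TV knows scene 1 never exceeds 120 nits and can preserve far more of its shadow detail.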
It also uses tone-mapping to preserve detail, avoiding clipping (the loss of detail in brighter areas) and improving shadow detail (detail in the darkest parts of the image), though how well this comes across depends on the TV’s capabilities. Expensive TVs are more capable in this respect than affordable ones.
Who supports Dolby Vision and HDR10?
As we mentioned earlier, HDR10 is the industry standard HDR format, so any HDR capable device should support it.
As Dolby Vision HDR is optional, it’s up to the manufacturer to include it. However, Dolby Vision has become widespread on TVs from affordable brands to more expensive sets, and Apple includes the format on its iPhones and iPads. It’s rarer to find Dolby Vision on Android devices, which have tended to adopt HDR10+, the rival to Dolby Vision, though some Android phones do support it, such as the Xiaomi 13 Pro and OnePlus 11.
There’s also plenty of content available in Dolby Vision, from 4K Blu-rays to all the major video streaming services. If your display is not Dolby Vision compatible, then you’ll see the HDR10 version instead.
Which is better?
If you want the best performance from HDR content, Dolby Vision is the better format, as its dynamic metadata adapts the TV’s brightness and colour performance to the content.
That doesn’t always mean that what you see will be the most accurate presentation, because that depends on how good your TV is at reproducing the original signal, but it will help to preserve the original intent of the creators as best it can.
In our experience, colours look better and there’s better contrast when viewing content in Dolby Vision. The image does tend to look darker than HDR10, but the increased sense of contrast helps produce more pronounced highlights (the brightest parts of the image) and gives the picture greater visual impact. HDR10 can be very good, especially on TVs that can display images at a peak brightness of 1000 nits, but Dolby Vision is more accommodating, especially for budget TVs, and ultimately results in a better image.