There are two main HDR formats, HDR10 and Dolby Vision (DV), and they have different approaches to HDR (see HDR vs SDR). Here are the different ways these formats deal with the key aspects of HDR.
| | HDR10 | Dolby Vision |
| --- | --- | --- |
| Tone Mapping | Varies per manufacturer | Better |
| Metadata | Static, for now | Dynamic |
| Content Availability | Average, but growing fast | Limited |
Bit depth describes the number of gradations of color in an image. SDR content is typically mastered at 8 bit, which allows for 16.7 million colors; in the world of HDR, this is changing. For more information, have a look at our article on gradients.
Dolby Vision content allows for up to 12-bit color; HDR10 is limited to 10 bit. It might not sound like a big difference, but those 2 extra bits are the difference between 1.07 billion colors and 68.7 billion. This means much smoother gradations between colors and no color banding in skies.
12 bit is simply better than 10 bit. That said, don't think a higher bit depth makes content more colorful; its importance lies in displaying different tones of the same color in a gradient. The higher the bit depth, the smoother the gradient will be.
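The color counts above follow directly from the bit depth. A quick sketch of the arithmetic (three channels, so the per-channel values are cubed):

```python
# Number of representable colors for a given per-channel bit depth.
# Three channels (R, G, B), so the per-channel count is cubed.
def color_count(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

print(f"{color_count(8):,}")   # 8 bit (SDR):          16,777,216  (~16.7 million)
print(f"{color_count(10):,}")  # 10 bit (HDR10):       1,073,741,824  (~1.07 billion)
print(f"{color_count(12):,}")  # 12 bit (Dolby Vision): 68,719,476,736  (~68.7 billion)
```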
Winner: Dolby Vision. However, even if Dolby Vision is capable of 12 bit, today's TV panels are a maximum of 10 bit. You would be hard-pressed to see a difference in current TVs. HDR10 will probably be updated to 12 bit by the time a TV that supports it appears.
Brightness and black level are crucial to a good image because contrast (the difference between them) is the most important part of a high-quality picture.
DV content is mastered at 4000 cd/m2; HDR10 content is mastered at a variety of levels from 1000 to 4000 cd/m2 depending on the title.
Both standards allow for images of up to 10,000 cd/m2, although no display can currently reach that level. In practice, there is no real difference between the formats here, since content for both currently tops out at 4000 cd/m2.
Winner: Dolby Vision. It wins by a small margin. Dolby Vision content will have more consistent mastering since HDR10 isn't as specific in its requirements. Keep in mind, though, few 2016 TVs even go above 1000 cd/m2, so it doesn't matter for now.
It is crucial how a TV with a relatively low peak brightness deals with a film that has been mastered on a much brighter display. If you have a TV with a maximum brightness of 1400 cd/m2, how does it deal with the highlights of a film mastered at 4000 cd/m2?
The easiest approach is clipping. In our example of a 1400 cd/m2 TV, everything from 1400 to 4000 cd/m2 would be clipped. What does this mean? There would be no visible detail and no discernible colors in that range of brightness, simply because it is above the TV's maximum output. At this point in time, some manufacturers clip highlights which are above their maximum brightness.
The alternative is tone mapping. On a 1400 cd/m2 TV, the highlights from 1400 to 4000 are remapped to fall below 1400 cd/m2. In practice, this means that there is some gentle roll off of color in the highlights starting around 1000 cd/m2.
This would mean that a TV that uses tone mapping would appear slightly dimmer than the same TV which employs clipping. While this is inevitable, a tone-mapped picture would show a lot more detail in the highlights than one which is clipped.
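The difference between the two approaches can be sketched in a few lines. This is an illustrative simplification (a linear roll-off above an assumed 1000 cd/m2 knee), not any manufacturer's actual algorithm:

```python
# Illustrative sketch: how a 1400 cd/m2 TV might handle content
# mastered at 4000 cd/m2. The knee point and linear roll-off are
# assumptions for the example, not a real tone-mapping curve.
TV_PEAK = 1400.0      # TV's maximum brightness (cd/m2)
MASTER_PEAK = 4000.0  # mastering display's peak (cd/m2)
KNEE = 1000.0         # assumed point where the roll-off starts

def clip(nits: float) -> float:
    # Clipping: everything above the TV's peak collapses to one level,
    # so detail between 1400 and 4000 cd/m2 is lost.
    return min(nits, TV_PEAK)

def tone_map(nits: float) -> float:
    # Tone mapping: below the knee, pass through unchanged; above it,
    # compress the 1000-4000 cd/m2 range into 1000-1400 cd/m2 so
    # highlight detail is preserved, at the cost of some brightness.
    if nits <= KNEE:
        return nits
    scale = (TV_PEAK - KNEE) / (MASTER_PEAK - KNEE)
    return KNEE + (nits - KNEE) * scale

for highlight in (800.0, 2000.0, 4000.0):
    print(highlight, clip(highlight), tone_map(highlight))
```

Note that `clip(2000)` and `clip(4000)` both return 1400, so those two highlights become indistinguishable, while `tone_map` keeps them apart (about 1133 vs 1400), which is exactly the extra highlight detail described above.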
With DV, a Dolby chip checks the TV's model and applies tone mapping using the TV's limitations as a reference. With HDR10, tone mapping is entirely the manufacturer's choice, which can lead to inconsistency.
Winner: Dolby Vision.
Metadata is used to describe various facets of the content; it is contained alongside the series or film and helps the display deal with the content in the most effective way.
The way the two formats differ is in their use of dynamic metadata. HDR10 only asks for static metadata. Dolby, on the other hand, has dynamic metadata which allows it to give information on a frame by frame basis. What does this change? Well, with static metadata, the boundaries in brightness are set for the entirety of the movie.
For example: if you set the boundary at 0 to 1000 cd/m2, the moment you enter a very dark scene where none of the colors reaches above 50 cd/m2, only 5% of the bit depth is available for use, since the 1.07B colors are spread over the full 1000 cd/m2 range.
With dynamic metadata, the boundary can be adapted to the scene. In the same scenario, the dark scene would have the full 10 bits distributed over the much smaller range it needs. To put it into perspective, the HDR10 scene would use just over 50M colors, which, while much better than SDR, is a lot less than the 1.07B that Dolby Vision would have.
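The 50M figure above is back-of-the-envelope arithmetic, and it can be sketched like this (using the article's simplification that the usable color count scales with the fraction of the brightness range the scene occupies):

```python
# Simplified sketch of the example above: how much of the 10-bit color
# volume a dark scene can use under static vs dynamic metadata.
TOTAL_COLORS = (2 ** 10) ** 3  # ~1.07 billion colors at 10 bit

def usable_colors(scene_peak_nits: float, metadata_peak_nits: float) -> int:
    # With code values spread over metadata_peak_nits, a scene that only
    # reaches scene_peak_nits can use the corresponding fraction of them.
    fraction = min(scene_peak_nits / metadata_peak_nits, 1.0)
    return int(TOTAL_COLORS * fraction)

# Static metadata (HDR10): boundary fixed at 1000 cd/m2 for the whole film.
print(f"{usable_colors(50, 1000):,}")  # ~53.7 million colors (5% of 1.07B)
# Dynamic metadata (Dolby Vision): boundary adapted to the 50 cd/m2 scene.
print(f"{usable_colors(50, 50):,}")    # the full ~1.07 billion colors
```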
Winner: Dolby Vision. It's better at adapting to scenes that have very different lighting. This is probably short-lived, though, since TV manufacturers will presumably add their own dynamic metadata very soon.
| | HDR10 | Dolby Vision |
| --- | --- | --- |
| Xbox One S | Yes | Later in 2018 |
| NVIDIA GTX 900 Series and up | Yes* | Yes |
| AMD Radeon RX and up | Yes* | No |
Dolby Vision content is only available via streaming services at the time of writing; both Netflix and Amazon have several films and series encoded with DV, and Vudu also has a limited amount of content for Vizio owners. If you want to watch Dolby Vision content via a streaming service, you either need a TV that supports it or you have to buy a Chromecast Ultra. It's important to note, however, that if the TV itself doesn't support Dolby Vision, using an external source won't make a difference.
HDR10, on the other hand, is supported by virtually every platform that supports Dolby Vision. In addition, a slew of Blu-ray discs and players are readily available. Find out where to find HDR content here.
Recent NVIDIA and AMD graphics cards support HDR10, but only with full-screen DirectX games. Some applications have added support for a 10-bit pixel format, including Adobe Photoshop, but these require professional graphics cards such as the NVIDIA Quadro Series or AMD Radeon Pro/FireGL cards.
Winner: HDR10.
In the US, only a handful of TVs from Vizio and LG support Dolby Vision. In 2018, Sony added Dolby Vision support to most of its higher-end TVs, but external devices will require a firmware update to be compatible with Sony TVs. All Dolby Vision TVs also support HDR10, as do many TVs from the other major manufacturers.
You shouldn't expect the cheaper HDR TVs to make use of all the extra capabilities of the formats. For most of them, you won't even be able to see a difference. Only high-end TVs can take advantage of it.
Winner: HDR10.
Samsung has shown an enhancement to HDR10 which can make use of dynamic metadata by upgrading the TV's firmware. This development addresses one of the fundamental differences between DV and HDR10 although it is not currently implemented in any TVs.
There is also HLG, or hybrid log gamma, which has been developed by the BBC and NHK for live broadcasts. LG has shown an E6 OLED TV running custom firmware that can decode an HLG stream. Since HLG doesn't make use of metadata, every TV that supports HDR today should be able to support it via a simple software update.
Dolby Vision can be considered the more advanced HDR format, but the lack of content and supported TVs is holding it back at the moment. HDR10 has the distinct advantage of having more content available and being supported on TVs with a higher peak brightness, effectively giving a better result in the end.
Ultimately, the difference between the two formats isn't that important. The quality of the TV itself has a much bigger impact on HDR (see our recommendations for the best HDR TVs). It’s still quite early days for HDR. Both formats have the ability to produce much more dynamic images than we are seeing on the best TVs today. The limitation is down to both the TV technology and the way the titles are mastered. We can’t yet reach the 10,000 cd/m2 maximum peak brightness and the expanded 12 bit color range.