HDR10 vs Dolby Vision: Which is better?

There are two main HDR formats, HDR10 and Dolby Vision (DV), and they have different approaches to HDR (see HDR vs SDR). Here are the different ways these formats deal with the key aspects of HDR.


HDR10

What it is: Open standard for HDR.

Dolby Vision

What it is: Proprietary standard for HDR made by Dolby.

                          HDR10                        Dolby Vision
Bit depth                 Good                         Great
Peak brightness           Great                        Great
Tone mapping              Varies per manufacturer      Better
Metadata                  Static, for now              Dynamic
TV support                Great                        Limited
Content availability      Average, but growing fast    Limited

Bit Depth


HDR10

  • 10 bit
  • 1.07 billion colors

Dolby Vision

  • 12 bit
  • 68.7 billion colors

Bit depth describes the number of gradations of color in an image. SDR content is typically mastered at 8-bit, which allows for 16.7 million colors; in the world of HDR, this is changing. For more information, have a look at our article on gradients.

Dolby Vision content allows for up to 12-bit color; HDR10 is limited to 10-bit. A difference of 2 bits might not sound like much, but it is the difference between 1.07 billion and 68.7 billion colors. This means much smoother gradations between colors and far less visible banding in areas like skies.

12-bit is simply better than 10-bit. That said, don't think a higher bit depth makes content more colorful. Its importance lies in displaying different tones of the same color in a gradient: the higher the bit depth, the smoother the gradient will be.
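The color counts above follow directly from the per-channel bit depth; as a quick sketch of the arithmetic (levels per channel, cubed across R, G, and B):

```python
# Each of the R, G, and B channels gets 2**bits levels, so the total
# palette is (2**bits) ** 3 distinct colors.
def total_colors(bits_per_channel: int) -> int:
    levels = 2 ** bits_per_channel
    return levels ** 3

print(total_colors(8))    # 16,777,216     -> "16.7 million" (SDR)
print(total_colors(10))   # 1,073,741,824  -> "1.07 billion" (HDR10)
print(total_colors(12))   # 68,719,476,736 -> "68.7 billion" (Dolby Vision)
```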

Winner: Dolby Vision. However, even if Dolby Vision is capable of 12 bit, today's TV panels are a maximum of 10 bit. You would be hard-pressed to see a difference in current TVs. HDR10 will probably be updated to 12 bit by the time a TV that supports it appears.

Peak Brightness


HDR10

  • Mastered from 1000 to 4000 cd/m2

Dolby Vision

  • Always mastered at 4000 cd/m2

Brightness and black level matter because contrast (the difference between them) is the most important part of a high-quality picture.

DV content is mastered at 4000 cd/m2; HDR10 content is mastered at a variety of levels from 1000 to 4000 cd/m2 depending on the title.

Both standards cater for images of up to 10,000 cd/m2, although no display can currently reach that level. Therefore there is no real difference between the formats as they both top out at 4000 cd/m2.

Winner: Dolby Vision. It wins by a small margin. Dolby Vision content will have more consistent mastering since HDR10 isn't as specific in its requirements. Keep in mind, though, few 2016 TVs even go above 1000 cd/m2, so it doesn't matter for now.

Tone Mapping


HDR10

  • Tones that extend past the TV's range are mapped using the PQ transfer function.

Dolby Vision

  • Tones that extend past the TV's range are mapped by a Dolby chip using the PQ transfer function.

How a TV with relatively low peak brightness deals with a film that was mastered on a much brighter display is crucial. If your TV has a maximum brightness of 1400 cd/m2, how does it handle the highlights of a film mastered at 4000 cd/m2?

The easiest way is clipping. On our example 1400 cd/m2 TV, everything from 1400 to 4000 cd/m2 would be clipped: no detail and no discernible colors would be visible in that range of brightness, simply because it is above the TV's maximum output. At the time of writing, some manufacturers clip highlights that fall above their TV's maximum brightness.

The alternative is tone mapping. On a 1400 cd/m2 TV, the highlights from 1400 to 4000 are remapped to fall below 1400 cd/m2. In practice, this means that there is some gentle roll off of color in the highlights starting around 1000 cd/m2.

This would mean that a TV that uses tone mapping would appear slightly dimmer than the same TV which employs clipping. While this is inevitable, a tone-mapped picture would show a lot more detail in the highlights than one which is clipped.
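A minimal sketch of the two approaches (the 1000 cd/m2 knee and the linear roll-off are illustrative assumptions; real TVs use their own proprietary curves):

```python
# Two ways a 1400 cd/m2 TV can handle content mastered up to 4000 cd/m2.

DISPLAY_PEAK = 1400.0   # cd/m2, the TV's maximum output
CONTENT_PEAK = 4000.0   # cd/m2, the mastering peak
KNEE = 1000.0           # hypothetical point where roll-off begins

def clip(luminance: float) -> float:
    """Everything above the display's peak is simply lost."""
    return min(luminance, DISPLAY_PEAK)

def tone_map(luminance: float) -> float:
    """Gentle roll-off: compress 1000-4000 into 1000-1400 so highlight
    detail survives, at the cost of some brightness."""
    if luminance <= KNEE:
        return luminance
    # Linearly remap the remaining range (a real TV would use a curve).
    fraction = (luminance - KNEE) / (CONTENT_PEAK - KNEE)
    return KNEE + fraction * (DISPLAY_PEAK - KNEE)

for nits in (500, 1400, 2500, 4000):
    print(nits, clip(nits), round(tone_map(nits)))
```

Note that clipping maps both 2500 and 4000 cd/m2 to the same 1400 cd/m2 output (the detail between them is gone), while tone mapping keeps them distinct at the cost of showing them dimmer.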

With DV, a Dolby chip checks the TV's model and applies tone mapping using the TV's limitations as a reference. With HDR10, tone mapping is entirely the manufacturer's choice, which can lead to inconsistency.

Winner: Dolby Vision.


Metadata

Metadata is used to describe various facets of the content; it is delivered alongside the series or film and helps the display deal with the content in the most effective way.

The way the two formats differ is in their use of dynamic metadata. HDR10 only asks for static metadata. Dolby, on the other hand, has dynamic metadata which allows it to give information on a frame by frame basis. What does this change? Well, with static metadata, the boundaries in brightness are set for the entirety of the movie.
For example: if the boundary is set at 0 to 1000 cd/m2, then in a very dark scene where no color exceeds 50 cd/m2, only 5% of the available color values can actually be used, since the 1.07B colors are spread across the full 0 to 1000 cd/m2 range.

With dynamic metadata, the boundary can be adapted scene by scene. In the same scenario, the dark scene would have the full 10 bits distributed over the much smaller range it needs. To put it into perspective, the HDR10 scene would use just over 50M colors, which, while much better than SDR, is a lot less than the 1.07B Dolby Vision would have.
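The arithmetic behind that example, assuming for simplicity that color values are spread linearly over the brightness range (real HDR uses the non-linear PQ curve, so treat this purely as an illustration):

```python
# 10-bit RGB gives (2**10)**3 = 2**30 total colors, ~1.07 billion.
TOTAL_COLORS = 2 ** 30

def usable_colors(scene_max: float, metadata_max: float) -> int:
    """Colors available for a scene that only reaches scene_max cd/m2
    when the metadata reserves the range up to metadata_max cd/m2."""
    fraction = min(scene_max / metadata_max, 1.0)
    return int(TOTAL_COLORS * fraction)

# Dark scene peaking at 50 cd/m2:
static  = usable_colors(50, 1000)   # static metadata fixed at 0-1000
dynamic = usable_colors(50, 50)     # dynamic metadata adapts per scene
print(f"{static:,} vs {dynamic:,}")
```

With the static boundary, only about 53.7 million colors fall inside the scene's range (the "just over 50M" figure above), versus the full 1.07 billion when the boundary adapts.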

Winner: Dolby Vision. It's better at adapting to scenes that have very different lighting. This is probably short-lived, though, since TV manufacturers will presumably add their own dynamic metadata very soon.

Available Content & Playback Hardware

                               HDR10   DV
UHD Blu-ray                    Yes     Planned for 2017
Netflix                        Yes     Yes
Amazon Video                   Yes     Yes
Vudu                           Yes     Yes
PS4/PS4 Pro                    Yes     No
Xbox One S                     Yes     No
Samsung UBD-K8500              Yes     No
Panasonic DMP-UB900            Yes     No
Philips BDP-7501               Yes     No
Nvidia Shield                  Yes     No
Chromecast Ultra               Yes     Yes
Nvidia GTX 900 Series and up   Yes     Yes
AMD Radeon RX and up           Yes     No

Dolby Vision content is only available via streaming services at the time of writing; both Netflix and Amazon have several films and series encoded with DV, and Vudu also has a limited amount of content for Vizio owners. To watch Dolby Vision content from a streaming service, you need a TV that supports it; the only external device that does is the Chromecast Ultra, and even then the TV itself must also support Dolby Vision, or the external source won't make a difference.

For HDR10, on the other hand, virtually every platform that supports Dolby Vision also supports it, and a slew of Blu-ray discs and players is readily available. Find out where to find HDR content here.

Winner: HDR10.

Supported TVs

In the US, only a handful of TVs from Vizio and LG support Dolby Vision. All Dolby Vision TVs also support HDR10, as do many TVs from the other major manufacturers.

You shouldn't expect cheaper HDR TVs to make use of all the extra capabilities of these formats; on most of them, you won't even be able to see a difference. Only high-end TVs can take full advantage.

Winner: HDR10.

Further Developments

Samsung has shown an enhancement to HDR10 which can make use of dynamic metadata by upgrading the TV's firmware. This development addresses one of the fundamental differences between DV and HDR10 although it is not currently implemented in any TVs.

There is also HLG or hybrid log gamma. This has been developed by the BBC and NHK for live broadcasts. LG have shown an E6 OLED TV running custom firmware able to decode the HLG stream. Since it doesn't make use of metadata, every TV that supports HDR today should be able to support it via a simple software update on the TV.


Conclusion

Dolby Vision can be considered the more advanced HDR format, but the lack of content and supported TVs is holding it back at the moment. HDR10 has the distinct advantage of having more content available and being supported on TVs with a higher peak brightness, effectively giving a better result in the end.

Ultimately, the difference between the two formats isn't that important. The quality of the TV itself has a much bigger impact on HDR (see our recommendations for the best HDR TVs). It’s still quite early days for HDR. Both formats have the ability to produce much more dynamic images than we are seeing on the best TVs today. The limitation is down to both the TV technology and the way the titles are mastered. We can’t yet reach the 10,000 cd/m2 maximum peak brightness and the expanded 12 bit color range.



Questions & Answers

Thanks for all the reviews! I have been considering the Samsung KS8000 or the Vizio P. I understand that the KS8000 is a better TV for the same price however I am really concerned that it does not support Dolby Vision (I don't want a useless TV in 2 years). I have heard different stories, mostly that the Samsung KS8000 actually has the chip for Dolby Vision and can be supported via a firmware update. Is this true? And what would you recommend considering this?
Both HDR10 and Dolby Vision currently exceed the capabilities of today's TVs, so you won't get worse picture quality by using HDR10. For future proofing, it's extremely unlikely that any content will be made for Dolby Vision but not HDR10, so you should be able to play all HDR content for the foreseeable future.

The KS8000 is better than the Vizio P in a bright room due to its higher peak brightness and amazing handling of reflections, but the Vizio P is better in a dark room due to its great local dimming. It is also better for gaming due to its low input lag and it can be better as a PC monitor because it can receive a 120 Hz input.

I have the EF9500 OLED. If I purchase an external streaming device that supports Dolby Vision, will the TV recognize the format?
No. A specific piece of hardware is needed in the TV to decode the Dolby Vision format.
Do you know if the LG EF9500 incorporates tone mapping or clipping for hdr10?
The LG EF9500 uses tone mapping up to a certain point, but after that it clips. Nearly all TVs have this behaviour.
Your HDR10 vs DV comparison shows that Nvidia cards support HDR10 but not Dolby Vision, but with a recent Driver update Nvidia does indeed support Dolby Vision! Source (Anandtech)
Thank you for the heads up! The article has been updated.
Hey, I just ordered an LG OLED 65 inch and I'm just wondering what are the maximum peak brightness levels of HDR or Dolby Vision content currently available on Netflix and Amazon Video? I understand OLEDs are known for their dimmer peak brightness but I'd imagine this shouldn't be an issue since I (and most people) only really watch TV or game at night, but if the LG OLED65E6P only has a peak brightness of around 650 cd/m2, then how bright do the brightest scenes in Netflix and Amazon Video even get?
Most HDR content is mastered for either 1000 cd/m2 or 4000 cd/m2 peak brightness, so the 2016 OLEDs will not be able to get bright enough for the really bright highlights in HDR content. This is an issue even in dark rooms because HDR content is intended to be shown at maximum brightness on the TV, even in a dark room (though you can turn it down if you find it too bright).

However in practice the 2016 OLEDs are some of the best TVs for producing bright highlights in HDR content. Our HDR Real Scene test is a good measure of how well a TV will be able to brighten highlights in HDR, and only one 2016 TV scored better than the OLEDs, the Sony X930D. Few TVs can beat the OLEDs because even though many LED TVs advertise very high peak brightness, they can only reach that brightness in very ideal cases, such as our 2% and 10% white window tests, and not when watching most HDR content.
