HDR10 vs HDR10+ vs Dolby Vision
Which is better?

There are three main HDR formats: HDR10, HDR10+, and Dolby Vision. They take different approaches to HDR (see HDR vs SDR), each with its own advantages. When shopping for a new TV, you shouldn't worry too much about which formats it supports, as they all deliver a great experience with the proper equipment. If you do want to get the most out of your favorite content, here are the different ways these formats deal with the key aspects of HDR.

HDR10

What it is: Open standard for HDR.

HDR10+

What it is: Royalty-free standard for HDR.

Dolby Vision

What it is: Proprietary standard for HDR made by Dolby.

                      HDR10                    HDR10+    Dolby Vision
Bit depth             Good                     Good      Great
Peak brightness       Great                    Great     Excellent
Tone mapping          Varies per manufacturer  Better    Better
Metadata              Static                   Dynamic   Dynamic
TV support            Great                    Limited   Limited
Content availability  Good                     Limited   Limited, but growing

Bit Depth

HDR10

  • 10 bit
  • 1.07 billion colors

HDR10+

  • 10 bit
  • 1.07 billion colors

Dolby Vision

  • 12 bit
  • 68.7 billion colors

Bit depth describes the number of gradations of color in an image. SDR content is typically mastered at 8-bit, which allows for 16.7 million colors. HDR content is usually mastered at 10-bit, which allows for up to 1.07 billion colors. The more colors a TV can display, the more realistic the image appears, with less banding and smoother transitions in areas of similar color. For more information, have a look at our article on gradients.

Dolby Vision content allows for up to 12-bit color; HDR10 and HDR10+ are limited to 10-bit. This might not sound like a big difference, but 10-bit color equals 1.07 billion colors, whereas 12-bit increases that to an impressive 68.7 billion colors. This allows for much finer control over gradations, resulting in a more lifelike image with virtually no banding.
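
The figures above are just arithmetic: each extra bit per channel doubles the number of shades of red, green, and blue, and the total color count is the per-channel count cubed. A quick sketch in Python reproduces the numbers used in this section:

```python
# Total displayable colors for a given per-channel bit depth:
# (2 ** bits) shades per channel, cubed across the R, G, and B channels.
for bits in (8, 10, 12):
    shades_per_channel = 2 ** bits
    total_colors = shades_per_channel ** 3
    print(f"{bits}-bit: {shades_per_channel} shades/channel, {total_colors:,} colors")

# 8-bit:  256 shades/channel,  16,777,216 colors     (~16.7 million, typical SDR)
# 10-bit: 1024 shades/channel, 1,073,741,824 colors  (~1.07 billion, HDR10/HDR10+)
# 12-bit: 4096 shades/channel, 68,719,476,736 colors (~68.7 billion, Dolby Vision)
```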

Winner: Dolby Vision, technically. Even though Dolby Vision supports 12-bit color, there are almost no devices out there that can take advantage of it. Hopefully, 12-bit panels will start to hit the market in the coming years.

Peak Brightness

HDR10

  • Mastered from 1000 to 4000 cd/m²

HDR10+

  • Mastered from 1000 to 4000 cd/m²

Dolby Vision

  • Currently mastered at 4000 cd/m², but supports up to 10000 cd/m²

Most Dolby Vision content is currently mastered at 4000 cd/m²; HDR10 and HDR10+ content are mastered at a variety of levels from 1000 to 4000 cd/m² depending on the title.

All three standards support signals of up to 10,000 cd/m², although no consumer display can currently reach that level. In practice, there is little real difference between the formats here, as content for all of them currently tops out at 4000 cd/m².
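
For reference, that 10,000 cd/m² ceiling comes from the PQ transfer function (SMPTE ST 2084) that HDR10, HDR10+, and Dolby Vision are all built on. A minimal sketch of the PQ EOTF, which converts a normalized signal value back into absolute luminance, shows why: the curve tops out at exactly 10,000 cd/m².

```python
def pq_eotf(signal: float) -> float:
    """SMPTE ST 2084 (PQ) EOTF: normalized signal (0..1) -> luminance in cd/m².

    PQ is an absolute encoding: a signal value of 1.0 always decodes to
    10,000 cd/m², which is why every PQ-based HDR format shares that ceiling.
    """
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

print(round(pq_eotf(1.0)))   # 10000 -- the format's absolute maximum
print(round(pq_eotf(0.75)))  # ~983  -- a 0.75 signal is already around 1000 cd/m²
```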

Winner: Dolby Vision. Dolby Vision content will have more consistent mastering since HDR10 and HDR10+ are not as specific in their requirements. Keep in mind, though, that few 2018 TVs even go above 1000 cd/m², so it doesn't matter much for now.

Tone Mapping

HDR10

  • Tones that extend past the TV's range are mapped using the PQ transfer function.

HDR10+

  • Tones that extend past the TV's range are mapped using the PQ transfer function.

Dolby Vision

  • Tones that extend past the TV's range are mapped by a Dolby chip* using the PQ transfer function.

If you have a TV with a maximum brightness of 1400 cd/m², how does it deal with highlights in a film that was mastered at 4000 cd/m²? How a TV with relatively low peak brightness handles content mastered at a much higher peak brightness is crucial to the final image.

The easiest way is clipping. In our example of a 1400 cd/m² TV, everything from 1400 to 4000 cd/m² would be clipped: no detail and no discernible colors would be visible in that range, simply because the TV cannot reproduce anything above its maximum output. Thankfully, most decent TVs don't use this method anymore.

The alternative is tone mapping. On a 1400 cd/m² TV, the highlights from 1400 to 4000 cd/m² are remapped to fall below 1400 cd/m², which in practice means a gentle roll-off in the highlights starting at around 1000 cd/m².

This means that some highlights on a TV that uses tone mapping appear slightly dimmer than they would if the same TV used clipping. While this is unavoidable, a tone-mapped picture shows a lot more detail in the highlights than a clipped one.
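
To make the difference concrete, here is a toy sketch of both approaches (not any manufacturer's actual algorithm) for the 1400 cd/m² TV in the example above, working in linear nits rather than the PQ-encoded signal a real TV processes. Clipping simply caps everything at the panel's peak, while the simple tone-mapping curve leaves everything below a knee point (around 1000 cd/m² here) untouched and compresses the rest of the 4000 cd/m² mastering range into the remaining headroom.

```python
def clip(nits: float, display_peak: float = 1400.0) -> float:
    """Hard clipping: everything brighter than the panel's peak is simply lost."""
    return min(nits, display_peak)

def tone_map(nits: float, display_peak: float = 1400.0,
             knee: float = 1000.0, master_peak: float = 4000.0) -> float:
    """Toy roll-off: untouched up to the knee, then the mastered range
    (knee..master_peak) is squeezed into the remaining headroom
    (knee..display_peak), so highlight detail is kept but dimmed."""
    if nits <= knee:
        return nits
    headroom = display_peak - knee           # 400 nits of output range left
    excess = min(nits, master_peak) - knee   # how far above the knee we are
    return knee + headroom * excess / (master_peak - knee)

for sample in (500, 1200, 2000, 4000):
    print(f"{sample:>4} nits -> clipped: {clip(sample):>4.0f}, "
          f"tone mapped: {tone_map(sample):>4.0f}")
#  500 nits -> clipped:  500, tone mapped:  500
# 1200 nits -> clipped: 1200, tone mapped: 1027   (slightly dimmer, detail kept)
# 2000 nits -> clipped: 1400, tone mapped: 1133
# 4000 nits -> clipped: 1400, tone mapped: 1400   (clipping loses everything above 1400)
```

Real TVs apply much smoother curves to the PQ-encoded signal, but the trade-off is the same: tone mapping gives up a little highlight brightness in exchange for preserved detail.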

*The first devices that supported Dolby Vision needed a proprietary Dolby chip that checks the TV's model and applies tone mapping using the TV's limitations as a reference. Dolby has since relaxed that requirement, and some TVs, including Sony models, do this via software instead. With HDR10 and HDR10+, tone mapping is entirely the manufacturer's choice, which can lead to inconsistency.

Winner: Dolby Vision.

Metadata

Metadata can be thought of as a sort of instruction manual that describes various facets of the content; it is carried alongside the series or film and helps the display handle the content in the most effective way.

One of the ways the three formats differ is in their use of metadata. HDR10 only requires static metadata. With static metadata, the brightness boundaries are set once for the entire movie or show, and are determined by the brightness range of the brightest scene. Dolby Vision and HDR10+ improve on this by using dynamic metadata, which lets them tell the TV how to apply tone mapping on a scene-by-scene, or even frame-by-frame, basis.
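
As a rough illustration of the difference (the structures and field names below are made up for clarity, not the actual SMPTE metadata syntax), static metadata amounts to a single set of values describing the whole title, while dynamic metadata supplies fresh values for every scene or frame that the TV's tone mapping can react to:

```python
from dataclasses import dataclass

@dataclass
class StaticMetadata:
    # One set of values for the entire title (HDR10-style): the TV has to pick
    # a single tone-mapping strategy and stick with it from start to finish.
    max_content_light: int        # brightest pixel anywhere in the title, cd/m²
    max_frame_average_light: int  # brightest frame-average level, cd/m²

@dataclass
class SceneMetadata:
    # Per-scene values (HDR10+/Dolby Vision-style): the TV can re-derive its
    # tone-mapping curve whenever the content changes.
    start_frame: int
    scene_peak: int               # brightest pixel in this scene, cd/m²
    scene_average: int            # average light level of this scene, cd/m²

static = StaticMetadata(max_content_light=4000, max_frame_average_light=700)

dynamic = [
    SceneMetadata(start_frame=0,    scene_peak=300,  scene_average=80),   # dim interior
    SceneMetadata(start_frame=2400, scene_peak=4000, scene_average=600),  # bright exterior
]
```

With only the static values, the dim interior scene gets tone mapped with the same curve chosen to handle the 4000 cd/m² highlights; with the per-scene values, the TV knows it can show that scene without compressing anything.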

Winner: Dolby Vision and HDR10+. They are better at adapting to scenes that have very different lighting.

Availability

Supported Devices

                                HDR10   HDR10+   DV
UHD Blu-ray                     Yes     Yes      Yes
Netflix                         Yes     No       Yes
Amazon Video                    Yes     Yes      Yes
Vudu                            Yes     No       Yes
NVIDIA Shield                   Yes     No       No
Fire TV Stick 4k                Yes     Yes      Yes
Apple TV 4k                     Yes     No       Yes
Chromecast Ultra                Yes     No       Yes
NVIDIA GTX 900 Series and up    Yes     No       Yes
AMD Radeon RX and up            Yes     No       No

Availability of the new HDR formats has drastically improved in recent years. Both Dolby Vision and HDR10 can now be found on UHD Blu-rays and are supported by most external devices, although there are still only a few movies available on disc that support Dolby Vision. The first HDR10+ movies have been released on UHD Blu-ray, but most external playback devices don't support it yet. It's important to note, however, that if the TV itself doesn't support Dolby Vision, using an external source won't make a difference. Find out where to find HDR content here.

Winner: HDR10.

Supported TVs

While the vast majority of TVs support HDR10 and many models support at least one of the more advanced formats, none of the models available in the U.S. support all formats. Sony, LG, Vizio, Hisense, and TCL support HDR10 and Dolby Vision on most of their models, whereas Samsung remains invested in HDR10+, and does not currently support Dolby Vision on any of their models. Outside of the U.S., some Panasonic TVs support all three formats.

You shouldn't expect cheaper HDR TVs to make use of all the extra capabilities of these formats; on most of them, you won't even be able to see a difference. Only high-end TVs can take full advantage of them.

Winner: HDR10.

Mobile Devices

Although HDR support on mobile devices is still limited, most high-end phones now support it. Most high-end Samsung phones support HDR10+, including the new S10, S10e, and S10+. Apple iPhones have supported Dolby Vision since the iPhone 8, but only the iPhone X, XS, and XS Max feature true HDR screens.

              HDR10   HDR10+   DV
iPhone 8      Yes     No       Yes
iPhone X      Yes     No       Yes
iPhone XS     Yes     No       Yes
LG G6         Yes     No       Yes
Galaxy S10+   Yes     Yes      No
Huawei P20    Yes     No       No

Winner: HDR10. As you can't really connect an external player to a phone, HDR on phones is mainly limited to streaming services. While most streaming services offer some content in Dolby Vision and HDR10+, the majority of it is still HDR10.

Gaming

                  HDR10   HDR10+   DV
PS4/PS4 Pro       Yes     No       No
Xbox One          Yes     No       Yes
Nintendo Switch   No      No       No
PC                Yes     No       Yes

Although HDR was initially designed for movies, the advantages for gaming are undeniable. Most modern consoles support HDR10, including the original PlayStation 4, the PlayStation 4 Pro, and newer Xbox One models. The Xbox One also supports Dolby Vision, although not all Dolby Vision TVs are compatible with it, as it uses a newer 'low-latency' Dolby Vision format that ensures the extra processing doesn't add too much latency for gaming.

As with movies, game developers have to enable support for HDR in their games. Unfortunately, HDR10 isn't always implemented properly, so the actual results may vary. There are a handful of Dolby Vision games available now, including Assassin's Creed Origins, Battlefield 1, and Overwatch.

See our recommendations for the Best 4k HDR Gaming TVs.

Winner: HDR10.

Monitors

Monitors have been very slow to adopt HDR. That is starting to change, though, with more and more monitors now supporting HDR10. The VESA standards group has been pushing new standards for HDR on monitors, which typically can't hit the brightness levels necessary for a great HDR experience. Despite this, monitors are still a few years behind TVs: the few that deliver a decent HDR experience are very expensive and still fall short of the HDR experience on TVs. No monitors support HDR10+ or Dolby Vision, although a few laptops have been released with Dolby Vision support.

Winner: HDR10.

HLG

There is also HLG, or hybrid log-gamma, which is currently supported by most major TV manufacturers. HLG aims to simplify things by combining SDR and HDR into one signal, which is ideal for live broadcasts, as the same signal can be played by any device receiving it. If the device supports HDR, it displays in HDR; if it doesn't, the SDR portion of the signal is played. As it is intended for live broadcasts, there is very little HLG content available.
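
For the curious, the 'hybrid' in the name refers to the shape of HLG's transfer function as defined in ITU-R BT.2100: the lower half of the signal range uses a simple square-root (gamma-like) segment that looks reasonable on an SDR display, while the upper half switches to a logarithmic segment that carries the HDR highlights. A minimal sketch of the HLG OETF (scene light in, signal out; display-side processing such as the system gamma is left out):

```python
import math

def hlg_oetf(scene_light: float) -> float:
    """ITU-R BT.2100 HLG OETF: normalized scene light (0..1) -> signal value (0..1).

    Below 1/12 of peak the curve is a plain square root, close to an SDR gamma
    curve; above that it goes logarithmic to squeeze in the HDR highlights.
    """
    a = 0.17883277
    b = 1 - 4 * a
    c = 0.5 - a * math.log(4 * a)
    if scene_light <= 1 / 12:
        return math.sqrt(3 * scene_light)
    return a * math.log(12 * scene_light - b) + c

print(round(hlg_oetf(1 / 12), 3))  # 0.5 -- the SDR-like segment ends at half the signal range
print(round(hlg_oetf(1.0), 3))     # 1.0 -- peak scene light maps to full signal
```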

Conclusion

Dolby Vision is arguably the most advanced HDR format from a technical standpoint, but although its content availability has improved significantly, the relative lack of content still holds it back a bit. HDR10 has the distinct advantage of having more content available and being supported on the majority of TVs. HDR10+ almost matches the capabilities of Dolby Vision, but is extremely lacking in content and, in the U.S. at least, is only supported on Samsung TVs.

Ultimately, the difference between these formats isn't that important. The quality of the TV itself has a much bigger impact on HDR (see our recommendations for the best HDR TVs). Although the technology has improved significantly in recent years, it's still quite early days for HDR. All of these formats can produce much more dynamic images than we are seeing on the best TVs today; the limitation comes down to both the TV technology and the way titles are mastered. We can't yet reach the 10,000 cd/m² maximum peak brightness or the expanded 12-bit color range.
