HDR vs SDR: Is it worth upgrading to a new TV?

When comparing today's TVs to the models of yesteryear, the biggest difference between the two is the widespread ability to read an HDR signal. The question then arises: how big is the impact of HDR? Does it make previous TVs obsolete?

Unlike the upgrade from 1080p to 4k, the difference between SDR and HDR doesn't depend on the environment the TV is watched in. The biggest factor is the picture quality of the TV itself, since HDR is essentially a new, more precise way to describe what the TV needs to display.


What is HDR?

HDR is an initialism for High Dynamic Range. The term has been around for a long time, but nowadays, when talking about HDR video, it is entirely about metadata. What is metadata, then? HDR metadata is simply additional information sent alongside the video signal: a complete description of the colors and brightness the TV should display. This metadata is delivered through two different standards, HDR10 and Dolby Vision. While reading it is the only parameter required to support HDR, TVs also need to meet other requirements to make use of this additional information and show a visible difference.
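To make the idea of metadata concrete, here is a rough sketch (in Python, with illustrative values, not taken from any particular stream) of the kind of static metadata an HDR10 signal carries alongside the video, following the mastering-display and MaxCLL/MaxFALL conventions:

```python
# Illustrative sketch of HDR10 static metadata: mastering display
# information plus content light levels. Field names and values here
# are for illustration only, not a literal bitstream structure.
hdr10_static_metadata = {
    # Chromaticity coordinates (x, y) of the mastering display's primaries
    "display_primaries": {
        "red":   (0.680, 0.320),
        "green": (0.265, 0.690),
        "blue":  (0.150, 0.060),
    },
    "white_point": (0.3127, 0.3290),           # D65 white point
    "max_display_mastering_luminance": 1000,   # cd/m2 (nits)
    "min_display_mastering_luminance": 0.005,  # cd/m2
    "max_content_light_level": 1000,           # MaxCLL: brightest pixel in the content
    "max_frame_average_light_level": 400,      # MaxFALL: brightest frame average
}

# A TV reads these values and tone-maps the signal to its own capabilities.
print(hdr10_static_metadata["max_content_light_level"])
```

The TV compares these targets against its own peak brightness and gamut, which is exactly why two HDR TVs can show the same stream very differently.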

Think of it this way: with SDR, a car would be ordered to apply "full throttle" or "50% throttle." An HDR car would instead be told to "go to 120 mph" or "go to 40 mph." Some vehicles would handle that task better than others, and many might not succeed at all. TVs are the same. In the past, the signal specified a level of power, while with HDR, it specifies an exact target.

These targets are set beyond what current TVs can achieve, requiring levels of brightness and color the display might not be able to reproduce. With HDR, TVs with higher peak brightness and wider color gamuts gain a purpose: flagship models get closer to the targets and offer a more accurate picture than lower-end models with limited performance in these areas.
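The "exact target" idea comes from the signal's transfer function. As a hedged illustration (this detail is not from the article itself): HDR10 uses the SMPTE ST 2084 "PQ" curve, which maps each code value to an absolute luminance in nits, whereas SDR's gamma curve is only relative to whatever the TV's maximum happens to be.

```python
# Sketch of the SMPTE ST 2084 (PQ) EOTF used by HDR10: it maps a
# normalized code value (0.0-1.0) to an absolute luminance in cd/m2
# (nits), which is what makes HDR an "absolute" signal.
M1 = 2610 / 16384          # PQ curve constants from ST 2084
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(code: float) -> float:
    """Convert a normalized PQ code value to absolute luminance in nits."""
    p = code ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

print(pq_to_nits(0.0))  # black
print(pq_to_nits(1.0))  # the 10,000-nit ceiling of the PQ curve
```

A code value of 0.5 lands around 92 nits, so most of the signal's range is reserved for highlights far brighter than any current TV can reach, matching the article's point that the targets sit beyond today's hardware.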

For the purpose of this test, we compared two different TVs across three scenarios: a high-end TV fed an HDR signal, the same TV sent an SDR signal, and a mid-range HDR TV in HDR mode. For the SDR signal, we used an HDFury Linker to replace the TV's EDID with one that hid its HDR support, effectively making the player think it was connected to an SDR TV.

Wide Color Gamut

High-end TV with HDR
Coverage: 88.08% of DCI-P3
High-end TV with SDR
Coverage: <80% of DCI-P3
Low-end TV with HDR
Coverage: 75.30% of DCI-P3

A TV that supports a wider color gamut can display a more saturated palette of colors than a standard TV. While this isn't strictly a requirement of HDR, the two go hand in hand. In the past, even if a TV supported a wider gamut, almost all content was produced to fit into a smaller one.

As you can see in this comparison, there is a noticeable difference between HDR and SDR, mostly in the greens and reds, which is where most of the expansion went. However, you can also see that this isn't inherently a product of HDR: there is little difference in color saturation between the SDR picture and the low-end HDR one. Since a lot of budget HDR TVs lack a wide color gamut, they see no benefit from this aspect of HDR.

Winner: HDR, but only on a TV with a wide color gamut.

Color Depth

High-end TV with HDR
Bit Depth: 10-bit
High-end TV with SDR
Bit Depth: 8-bit
Low-end TV with HDR
Bit Depth: 10-bit

Much like color gamut, color depth refers to the range of colors a TV can display, and the difference between the two can be a bit confusing. Color gamut describes how saturated the colors can get, while color depth describes how many distinct colors the TV can show within that palette. A limited color gamut would stop a TV from displaying the red of an apple accurately, while limited color depth would make the red gradients on that apple look uneven, with visible steps.

What is commonly called an 8-bit TV can show 256 shades each of red, green, and blue, or about 16.7 million colors in total. That seems like quite a small amount compared to 10-bit TVs, which have 1,024 shades per channel, or about 1.07 billion colors.
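The shade counts above follow directly from the bit depth: each channel gets 2^bits levels, and the total palette is that number cubed. A quick sketch:

```python
# Derive per-channel shades and total colors from a panel's bit depth.
def color_counts(bits: int) -> tuple:
    shades_per_channel = 2 ** bits        # levels per color channel
    total_colors = shades_per_channel ** 3  # every R/G/B combination
    return shades_per_channel, total_colors

print(color_counts(8))   # (256, 16777216)    -> ~16.7 million colors
print(color_counts(10))  # (1024, 1073741824) -> ~1.07 billion colors
```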

Color depth affects gradients the most: a TV with a lower bit depth has to spread each gradient over far fewer steps. This can lead to blockiness and uneven transitions, which you can often see on skies like in the SDR picture above. This is one of the rare cases where the HDR-related features of lower-end TVs find a purpose, since most of them now have a 10-bit panel. Unfortunately, it's the least visually impactful one.

Winner: HDR

Dynamic Range

High-end TV with HDR
Contrast: 2843:1
Peak Brightness: 1054 cd/m²
High-end TV with SDR
Contrast: 2843:1
Peak Brightness: 1054 cd/m²
Low-end TV with HDR
Contrast: 992:1
Peak Brightness: 496 cd/m²

Finally, we take a look at dynamic range. This is where an HDR TV shows the biggest difference. HDR content makes use of a TV's higher brightness capabilities to show lifelike highlights. A TV with limited dynamic range can display bright highlights only by crushing the dark elements of the picture, and vice versa; a TV with a higher dynamic range can display more of both at the same time. Peak brightness, contrast, and the quality of the tone mapping have the biggest impact on this aspect.
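The interplay between contrast and peak brightness can be sketched with the measurements above: on an LCD, the black level is roughly the peak brightness divided by the native contrast ratio, so raising the backlight for HDR highlights also raises the blacks unless contrast (or local dimming) keeps up.

```python
# Rough sketch: black level of an LCD as peak brightness over contrast.
# Figures are the measurements from this comparison.
def black_level(peak_nits: float, contrast_ratio: float) -> float:
    return peak_nits / contrast_ratio

high_end = black_level(1054, 2843)  # ~0.37 cd/m2
low_end = black_level(496, 992)     # 0.50 cd/m2

print(round(high_end, 2), round(low_end, 2))
```

Even though the low-end TV is half as bright, its weaker contrast leaves it with a higher black level, so it loses on both ends of the range at once.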

The difference is visible on the X930D, one of the brightest TVs we've tested this year. The amount of detail resolved in the sky and the mountain in the background is better than in the other two examples, all while maintaining detail in the shadows under the cars. The difference can still seem minimal, though, both because it is impossible to fully capture with a camera and because even the best TVs of today barely scratch the surface of what HDR brings forward.

Winner: HDR, but only if peak brightness is high enough to be noticeable.


In the grand scheme of things, HDR is a great advancement in the world of TVs, but it has only just started to gain traction. Current TVs, even the flagship ones, don't make full use of it. You won't gain much visually from sending an HDR signal to a mid-range TV, since any picture quality enhancements rely on the capabilities of the set itself. High-end models do see a benefit, but a limited one (see our recommended HDR TVs). If you bought a good TV recently and were thinking about upgrading for HDR support, it's not worth it just yet.



Questions & Answers

Hi - First of all, I really appreciate your reviews. You guys are doing the best TV reviewing on the internet, and have largely solved the problem of helping people choose what TV to buy with your excellent tests and data presentation. Regarding HDR - thank you for the clarification article here. I'm wondering if these images you've shot have been tonemapped in any way? If so, were the edits done to the images exactly the same for each example? It *seems* like the 'High-end TV with HDR' image has had its shadows lifted a bit more than the 'High-end TV with SDR', which would make a direct perceptual comparison impossible between these images. Also, were the exposures the same for all these images? Basically, I'm just wondering how comparable these examples (of the mountain with SUVs) are for comparing overall brightness and contrast, or if the shots have been shot with different exposures and then processed differently afterward. Many thanks in advance!
Thank you for your excellent feedback. These two photos were taken with the same camera, camera settings and processing.
In the SDR photo the backlight isn't at max, but is reduced to produce the same average brightness (APL), resulting in deeper blacks. When viewing HDR content it is necessary to use the maximum backlight setting to produce the bright highlights. Due to limitations on the native contrast ratio of any LCD TV, this results in a slightly raised black level (visible in the picture). Ideally, good local dimming should counteract this effect.
My question involves the Samsung KS8000 TV and the LG B6 OLED. I really want the LG, but OLED is still kinda prohibitively expensive for me, so I'm leaning more toward the KS8000 but one reservation I have is that I've read that with edge-lit televisions, it compromises the quality of the HDR. Something about how when you have really bright and really dark portions on the same screen, the picture will become washed out? In your testing, is the difference in HDR quality between the OLEDs and the KS8000 really that pronounced, or is it negligible? This will be my first 4K/HDR TV and I really want the best PQ I can get, so I want to figure out whether it's in my best interest to go for the KS8000 or be patient and save a little longer for an OLED. Thank you very much, and I just have to say that I love your website! It's been immeasurably helpful in my search for a new TV.
We're glad you like the site!

While local dimming with an edge-lit backlight typically isn't as good as local dimming with a full-array backlight, and is certainly not as good as the perfect "local dimming" of an OLED, some edge-lit TVs like the X930D still have good local dimming. That said, the edge-lit local dimming of the KS8000 isn't very good. When a very bright area is shown, the black space above and below it also brightens, producing a bright column from the top to the bottom of the screen. This is visible in our local dimming video for the KS8000. However, this "blooming" effect isn't as noticeable during typical movie scenes, and we still recommend leaving local dimming on. The very high native contrast ratio of the KS8000 also helps mitigate this problem.

Overall the KS8000 still provides very good picture quality for HDR content. The LG B6 is a little better but it's not a huge difference.

I am considering purchasing the Vizio P series; however, I noticed it was very far down the list regarding color gamut. How does HDR look on the P series? Even though its color gamut is not as wide as the competition, does HDR still impress the viewer? How does the overall clarity of the P series' image compare to other high-end TVs such as the Samsung KS8000 and the LG B6? I would only be watching movies and TV shows on the P series in a dark room. I do not play video games.
The Vizio P series 2016 looks great when showing HDR content, one of the best from 2016. It has excellent local dimming that really improves the HDR experience, and it has great performance in all the other tests in our HDR movies score except color gamut. Its local dimming gives it a better HDR experience than the KS8000, despite the KS8000's wider color gamut. The LG B6 is better than the Vizio P for HDR content, but it is also much more expensive.