Is it worth upgrading to a new TV?

When comparing today's TVs to the models of yesteryear, the biggest difference between the two is the widespread ability to read an HDR signal. The question then arises: how big is the impact of HDR? Does it make previous TVs obsolete?

Unlike the upgrade from 1080p to 4k, the difference HDR makes doesn't depend on the environment the TV is watched in. The biggest factor is the picture quality of the TV itself, since HDR is essentially a new, more precise way of describing what the TV needs to display.


What is HDR?

HDR is an initialism for High Dynamic Range. The term has been around for a long time, but when talking about HDR video today, it is entirely about metadata. So what is metadata? HDR metadata is simply additional information sent along with the video signal: complete color descriptions that the TV can read and then display precisely. This metadata is distributed through two different standards, HDR10 and Dolby Vision. While reading this metadata is the only requirement to support HDR, TVs also need to meet other criteria to make use of the additional information and show a visible difference.

Think of it this way: with SDR, a car would be ordered to apply "full throttle" or "50% throttle." The HDR car would instead be asked to "go to 120 mph" or "go to 40 mph." Some cars would get closer to these targets than others, and many wouldn't reach them at all. TVs are the same. In the past, the signal specified a level of power, while with HDR it specifies an exact goal.

These targets are often set beyond what current TVs are capable of, requiring levels of brightness and color the display might not be able to reproduce. With HDR, TVs with higher peak brightness and wider color gamuts gain a purpose: flagship models get closer to the targets and offer a more accurate picture than lower-end models with limited performance in these areas.
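To make the car analogy concrete, here is a minimal sketch of the two signaling models. The numbers are made up for illustration; real SDR signals also pass through a gamma curve, and real HDR TVs tone-map rather than hard-clip, so treat this as a simplification of the idea, not a description of any standard.

```python
# Sketch: SDR describes output relative to the panel; HDR requests an
# absolute brightness. Illustrative numbers only (gamma and tone mapping
# are deliberately ignored).

def sdr_output_nits(signal_fraction, panel_max_nits):
    """SDR: the signal is a fraction of whatever the panel can do."""
    return signal_fraction * panel_max_nits

def hdr_output_nits(target_nits, panel_max_nits):
    """HDR: the signal asks for an absolute brightness; the panel gets
    as close as it can (naive clipping stands in for tone mapping)."""
    return min(target_nits, panel_max_nits)

# The same "75% signal" means different things on different SDR panels...
print(sdr_output_nits(0.75, 300))   # 225.0 nits on a dim panel
print(sdr_output_nits(0.75, 1000))  # 750.0 nits on a bright panel

# ...while an HDR request for 600 nits is the same target everywhere:
print(hdr_output_nits(600, 300))    # 300 - the dim panel falls short
print(hdr_output_nits(600, 1000))   # 600 - the bright panel hits it
```

This is why panel capability matters so much more under HDR: the signal no longer adapts itself to the display, so the display's shortfall becomes visible.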

For the purpose of this test, we compared two different TVs across three scenarios: a high-end TV fed an HDR signal, the same TV sent an SDR signal, and a mid-range HDR TV in HDR mode. For the SDR signal, we used an HDFury Linker and replaced the TV's EDID with one that hid HDR support, effectively making the player think it was connected to an SDR TV.

Wide Color Gamut

High-end TV with HDR
Coverage: 88.08% of DCI-P3

High-end TV with SDR
Coverage: <80% of DCI-P3

Low-end TV with HDR
Coverage: 75.30% of DCI-P3

A TV that supports a wider color gamut can display a palette of colors with more saturation than a standard TV. While a wide gamut isn't a necessity for HDR, the two go hand in hand. In the past, even if a TV supported a wider gamut, almost all content was produced to fit into a smaller one.

As you can see in this comparison, there is a noticeable difference between HDR and SDR, mostly in the greens and reds, which is where most of the gamut expansion went. However, you can also see that this isn't inherently a property of HDR: there is little difference in color saturation between the SDR picture and the low-end HDR one. Since many budget HDR TVs lack a wide color gamut, they see no benefit from this aspect of HDR.

Winner: HDR, but only on a TV with a wide color gamut.

Color Depth

High-end TV with HDR
Bit Depth: 10-bit

High-end TV with SDR
Bit Depth: 8-bit

Low-end TV with HDR
Bit Depth: 10-bit

Much like color gamut, color depth relates to the colors a TV can display, and the difference between the two can be a bit confusing. Color gamut refers to the level of saturation the TV can reach, while color depth refers to the number of distinct colors the TV can show within that palette. A limited color gamut would stop a TV from displaying the red of an apple accurately, while limited color depth would make the red gradients on that apple look uneven, with visible steps.

What is commonly called an 8-bit TV has 256 shades each of red, green, and blue, or about 16.7 million colors in total. This seems like quite a small amount compared to 10-bit TVs, which have 1,024 shades of each channel, or 1.07 billion colors.

Color depth affects gradients the most: a TV with a lower bit depth has to spread the same gradient over far fewer steps. A limited bit depth can lead to blockiness and uneven gradients, which you can often see in skies, like in the SDR picture above. This is one of the rare cases where the HDR-related features of lower-end TVs find a purpose, since most of them now have a 10-bit panel. Unfortunately, it's the least visually impactful one.
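The shade counts above, and why gradients suffer at 8 bits, can be checked with a few lines of arithmetic. The 3840-column ramp is just an illustrative example (a full-screen gradient on a 4k panel), not a measurement from our tests:

```python
# Shades per channel at each bit depth
shades_8bit = 2 ** 8    # 256 shades each of R, G, B
shades_10bit = 2 ** 10  # 1,024 shades of each channel

# Total palette sizes
total_8bit = shades_8bit ** 3    # 16,777,216 - about 16.7 million colors
total_10bit = shades_10bit ** 3  # 1,073,741,824 - about 1.07 billion

# A smooth black-to-white ramp across a 4k-wide screen (3840 columns)
# must be built from these discrete levels. With 8 bits, each shade
# spans many columns, producing visible steps; with 10 bits the steps
# are four times finer.
columns_per_step_8bit = 3840 / shades_8bit    # 15.0 columns per shade
columns_per_step_10bit = 3840 / shades_10bit  # 3.75 columns per shade
```

Fifteen-pixel-wide bands of identical color are easy to spot in a slowly changing sky, which is exactly the blockiness described above.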

Winner: HDR

Dynamic Range

High-end TV with HDR
Contrast: 2843:1
Peak Brightness: 1054 cd/m2

High-end TV with SDR
Contrast: 2843:1
Peak Brightness: 1054 cd/m2

Low-end TV with HDR
Contrast: 992:1
Peak Brightness: 496 cd/m2

Finally, we take a look at dynamic range. This is where HDR TVs show the biggest difference. HDR content makes use of their higher brightness capabilities to display lifelike highlights. When a TV has a limited dynamic range, it can only display highlights by crushing the dark elements, and vice versa. A TV with a higher dynamic range can display more of both at the same time. Peak brightness, contrast, and the quality of the tone mapping have the biggest impact on this aspect.
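A quick back-of-the-envelope sketch shows what the contrast and peak brightness figures above imply. Expressing dynamic range in photographic "stops" (doublings of brightness) is our framing for illustration, not a figure from the measurements:

```python
import math

def black_level(peak_nits, contrast_ratio):
    """Darkest level the panel can show while at this peak brightness."""
    return peak_nits / contrast_ratio

def stops(contrast_ratio):
    """Dynamic range expressed in stops (doublings of brightness)."""
    return math.log2(contrast_ratio)

# High-end TV: 1054 cd/m2 peak, 2843:1 contrast
print(round(black_level(1054, 2843), 3))  # 0.371 cd/m2 black level
print(round(stops(2843), 1))              # 11.5 stops

# Low-end TV: 496 cd/m2 peak, 992:1 contrast
print(round(black_level(496, 992), 3))    # 0.5 cd/m2 black level
print(round(stops(992), 1))               # 10.0 stops
```

So the high-end set isn't just twice as bright; it also holds a lower black level at that brightness, which is why it can show bright highlights and shadow detail in the same frame.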

The difference is visible on the X930D, one of the brightest TVs we've tested this year. The amount of detail resolved in the sky and the mountain in the background is better than in the other two examples, all while maintaining detail in the shadows under the cars. The difference can still seem minimal, though, both because a camera can't fully capture it and because even the best TVs of today barely scratch the surface of what HDR brings forward.

Winner: HDR, but only if peak brightness is high enough to be noticeable.


In the grand scheme of things, HDR is a great advancement in the world of TVs, but it has only just started to gain traction. Current TVs, even flagship models, don't make full use of it. You won't gain much visually from sending an HDR signal to a mid-range TV, since any picture quality improvement depends on the capabilities of the set itself. High-end models do see a benefit, but a limited one (see our recommended HDR TVs). If you bought a good TV recently and were thinking about upgrading for HDR support, it's not worth it just yet.

Questions & Answers

Hi - First of all, I really appreciate your reviews. You guys are doing the best TV reviewing on the internet, and have largely solved the problem of helping people choose what TV to buy with your excellent tests and data presentation. Regarding HDR - thank you for the clarification article here. I'm wondering if these images you've shot have been tonemapped in any way? If so, were the edits done to the images exactly the same for each example? It *seems* like the 'High-end TV with HDR' image has had its shadows lifted a bit more than the 'High-end TV with SDR', which would make a direct perceptual comparison impossible between these images. Also, were the exposures the same for all these images? Basically, I'm just wondering how comparable these examples (of the mountain with SUVs) are for comparing overall brightness and contrast, or if the shots have been shot with different exposures and then processed differently afterward. Many thanks in advance!
Thank you for your excellent feedback. These two photos were taken with the same camera, camera settings and processing.
In the SDR photo the backlight isn't at max, but is reduced to produce the same average brightness (APL), resulting in deeper blacks. When viewing HDR content it is necessary to use the maximum backlight setting to produce the bright highlights. Due to limitations on the native contrast ratio of any LCD TV, this results in a slightly raised black level (visible in the picture). Ideally, good local dimming should counteract this effect.
I am considering purchasing the Vizio P series; however, I noticed it was very far down the list regarding color gamut. How does HDR look on the P series? Even though its color gamut is not as wide as the competition, does HDR still impress the viewer? How does the overall clarity of the P series' image compare to other high-end TVs such as the Samsung KS8000 and the LG B6? I would only be watching movies and TV shows on the P series in a dark room. I do not play video games.
The 2016 Vizio P Series looks great when showing HDR content, one of the best we tested that year. It has excellent local dimming that really improves the HDR experience, and it performs well in all the other tests in our HDR movies score except color gamut. Its local dimming gives it a better HDR experience than the KS8000, despite the KS8000's wider color gamut. The LG B6 is better than the Vizio P for HDR content, but it is also much more expensive.
My question involves the Samsung KS8000 TV and the LG B6 OLED. I really want the LG, but OLED is still kinda prohibitively expensive for me, so I'm leaning more toward the KS8000 but one reservation I have is that I've read that with edge-lit televisions, it compromises the quality of the HDR. Something about how when you have really bright and really dark portions on the same screen, the picture will become washed out? In your testing, is the difference in HDR quality between the OLEDs and the KS8000 really that pronounced, or is it negligible? This will be my first 4K/HDR TV and I really want the best PQ I can get, so I want to figure out whether it's in my best interest to go for the KS8000 or be patient and save a little longer for an OLED. Thank you very much, and I just have to say that I love your website! It's been immeasurably helpful in my search for a new TV.
We're glad you like the site!

While local dimming with an edge-lit backlight typically isn't as good as local dimming with a full-array backlight, and is certainly not as good as the perfect "local dimming" of an OLED, some edge-lit TVs like the X930D still have good local dimming. That said, the edge-lit local dimming of the KS8000 isn't very good. When a very bright area is shown, the black space above and below it is also brightened, producing a bright column from the top to the bottom of the screen. This is visible in our local dimming video for the KS8000. However, this "blooming" effect isn't as noticeable during typical movie scenes, and we still recommend leaving local dimming on. The very high native contrast ratio of the KS8000 also helps mitigate this problem.

Overall the KS8000 still provides very good picture quality for HDR content. The LG B6 is a little better but it's not a huge difference.

Hello! I love this site, and it has helped me in purchasing a television. My concern now is, I just bought a Samsung MU8000 55" to go in the living room of my new apartment. After reading this, did I make a bad decision, given that you recommend waiting on HDR? Should I return the TV and purchase a cheaper 1080p set for the living room and wait two years for HDR?

Your site helped me purchase the MU8000 as it was on sale at BestBuy and it fit my budget perfectly.

The MU8000 is still fairly good for HDR, with its great contrast ratio, decent brightness and wide color gamut. It's a matter of opinion whether it's better to keep it or go with a cheaper 4k TV (like the TCL P607 or S405) and save up for later. 1080p TVs aren't recommended if you have the budget, as there are now many cheap 4k TVs like the TCL S405.

HDR TVs still have a few areas where they can improve. Local dimming has been steadily improving year over year and should be even better in two years. Another remaining problem with HDR (except HLG) is that it's mastered for a dark room, so it's often too dim when viewed in a bright room. So far only Vizio has given users the ability to drastically brighten everything for bright-room viewing, but it's possible other manufacturers will do so in the future. Overall, the MU8000 will likely still be a decent HDR TV in two years, so it's a toss-up whether it's best to keep it or buy cheaper and save up for future improvements.

There's a Sam's Club sale this Saturday on the Samsung UN58MU6070. Does this TV have effective HDR, limited HDR, or no-effect HDR? The sale is $648 and comes with a Samsung 2.1 Bluetooth sound bar with a wireless subwoofer. Picture quality is the most important thing to me. Samsung and LG are my preferred brands.
The 58MU6070 is a variant of the 58MU6100, which we reviewed here. We expect it to have similar performance. It does have full HDR support, but it doesn't have a wide color gamut. However, unless the sound bar is important to you, you'd be better off getting the $650 TCL P607 or the $700 Vizio M Series 2017. Both had much better performance than the MU6100.

Dear team, since you mentioned, most probably correctly, that you consider an 8-bit display with FRC (dithering) to be on par with a 10-bit one in terms of HDR performance, I want to bring your attention to the contradictory statement you make in this article when it comes to color depth.

While the observation that lower bit depths may lead to "blockiness and uneven gradients" is certainly not wrong and is easy to witness in practice, it gives the average reader the distorted impression that this is a direct and necessary consequence of the bit depth. That it is not can be shown with dithering, where in theory there isn't any banding even when using only a single bit per color channel. Of course, the image becomes extremely noisy, but banding won't be an issue.

Thus, from my understanding, assuming an ideal dither implementation, a given bit depth doesn't limit the number of possible colors, the color space, or the gamut, but "only" the SNR. The lower the bit depth, the higher the noise floor will be, effectively limiting the dynamic range if one defines it as the ratio of the maximum not-yet-clipping brightness to the darkest parts before they are masked by noise.

At least for audio this is definitely true (there, the bit depth doesn't determine the loudness levels but only the noise floor when dithering is applied), and since a video signal in its analog form is nothing other than an audio signal with (much) higher bandwidth, it must apply to video in general as well.

Hi and thanks for contacting us.

Since we only look at the final display performance (the way the gradient looks), we don't distinguish between a 10-bit panel and a good 8-bit panel with dithering. If a display can show a smooth 10-bit gradient using an 8-bit panel with FRC, we classify it the same as a native 10-bit panel. Sometimes dithered 8-bit panels even look better than native 10-bit panels when displaying a gradient.

You're correct that theoretically dithering will raise the noise floor slightly. However, in practice we haven't seen any implementation on TVs bad enough for the dithering to be noticeable, likely because it happens at a high frequency and between two very similar colors.
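The trade-off described in this exchange can be sketched in a few lines. This is a toy model with deliberately tiny, made-up parameters (2 bits, random rather than temporal dithering), not how any actual TV implements FRC:

```python
import random

LEVELS = 4  # a deliberately tiny 2-bit depth, to exaggerate the effect

def quantize(x):
    """Map x in [0, 1] to the nearest of LEVELS output levels."""
    return round(x * (LEVELS - 1)) / (LEVELS - 1)

def dithered(x, rng):
    """Add quantization-step-sized noise before quantizing, standing in
    for temporal dithering averaged over many frames."""
    step = 1 / (LEVELS - 1)
    noisy = x + rng.uniform(-step / 2, step / 2)
    return quantize(min(max(noisy, 0.0), 1.0))

rng = random.Random(0)
x = 0.40  # a gray that falls between two quantization levels

plain = quantize(x)  # without dithering: stuck on the nearest level (1/3)
avg = sum(dithered(x, rng) for _ in range(10000)) / 10000

print(plain)          # 0.333... - a visible step in a gradient
print(round(avg, 2))  # averages out close to the true 0.40
```

Plain quantization snaps every in-between value to the same level, which is banding; dithering flickers between the two neighboring levels so the average tracks the true value, trading the step for noise, exactly as the question argues.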
