When comparing recent TVs to older models, one of the biggest differences is the widespread ability to read an HDR signal. How important is HDR, though, and is it worth upgrading for?
Unlike the upgrade from 1080p to 4k, the difference between SDR and HDR has little to do with the environment the TV is watched in. The biggest factor is the picture quality of the TV itself, since HDR is a new, more precise way of describing what the TV needs to display.
HDR stands for High Dynamic Range. The term has been around for a long time, but when talking about HDR video, it is all about metadata. So what is metadata? HDR metadata is simply additional information sent with the video signal that tells the TV how to display the content properly. This metadata is delivered through one of several standards, including HDR10, HDR10+, and Dolby Vision. Although metadata is one important factor in HDR, the TV also needs to be able to display the content it's being asked to display.
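For the technically curious, HDR10's static metadata boils down to a handful of numbers describing the mastering display and the content's light levels (MaxCLL and MaxFALL). Here is a rough sketch in Python of how a TV might use them; the dictionary keys and the `tv_peak_nits` value are illustrative assumptions, not an actual API.

```python
# Illustrative sketch of the kind of static metadata an HDR10 stream
# carries (mastering display luminance, plus MaxCLL/MaxFALL).
# Key names are descriptive placeholders, not a real standard's fields.
hdr10_metadata = {
    "max_mastering_luminance_nits": 1000,       # peak of the mastering display
    "min_mastering_luminance_nits": 0.005,      # black level of the mastering display
    "max_content_light_level_nits": 800,        # MaxCLL: brightest single pixel in the content
    "max_frame_average_light_level_nits": 400,  # MaxFALL: brightest frame average
}

# A TV compares these values against its own capabilities to decide how
# to display the content. If the content was mastered brighter than the
# TV can go, it has to compress (tone map) the highlights.
tv_peak_nits = 600  # assumed capability of a hypothetical mid-range TV
needs_tone_mapping = hdr10_metadata["max_content_light_level_nits"] > tv_peak_nits
print(needs_tone_mapping)  # True: this 600-nit TV can't reach the content's 800-nit peak
```

This is the key difference from SDR: the signal describes absolute light levels, and each TV decides how close it can get to them.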
Think of it this way: with SDR, a car would be ordered to apply "full throttle" or "50% throttle." With HDR, the car would instead be asked to "go to 120 mph" or "go to 40 mph." Some vehicles would handle that request better than others, and many would fail to reach the target at all. TVs are the same. In the past, the signal specified a level of power; with HDR, it specifies a precise goal.
This "goal" varies depending on the content, but in almost all cases, there are very few TVs that can achieve it. This is where high-end TVs really shine, as they are able to produce wider color gamuts and brighter highlights, inching ever-closer to the content creator's intent.
For the purpose of this test, we compared two different TVs across three scenarios: a high-end TV fed an HDR signal, the same TV sent an SDR signal, and a mid-range HDR TV in HDR mode. For the SDR signal, we used an HDFury Linker and replaced the TV's EDID with one that hid HDR support, effectively making the player think it was connected to an SDR TV.
A TV that supports a wider color gamut can display a palette of more saturated colors than a standard TV. While this isn't a necessity for HDR, the two go hand in hand. In the past, even if a TV was capable of displaying a wider gamut, almost all content was produced to fit into a smaller one.
As you can see in this comparison, there is a noticeable difference between HDR and SDR, mostly in the greens and reds, which is where most of the gamut expansion occurred. However, you can also see that this isn't inherently a feature of HDR. There is little difference in color saturation between the SDR picture and the low-end HDR one. Since many budget HDR TVs lack a wide color gamut, they see no benefit from this aspect of HDR.
Winner: HDR, but only on a TV with a wide color gamut.
Much like color gamut, color depth refers to the different colors a TV can display. The difference between the two can be a bit confusing. Color gamut refers to the level of saturation the TV can display, while color depth refers to the number of colors the TV is capable of showing within that palette. A limited color gamut would stop a TV from displaying the red of an apple accurately, while a limitation in color depth would make the red gradients on that apple look uneven, with visible steps.
What is commonly called an 8-bit TV can display 256 shades each of red, green, and blue, or about 16.7 million colors in total. This seems like quite a small amount compared to 10-bit TVs, which have 1,024 shades of each channel, or about 1.07 billion colors.
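Those totals come straight from the bit depth: each channel gets 2 raised to the bit depth shades, and the three channels multiply together. A quick sketch of the arithmetic:

```python
# Shades per channel for a given bit depth: 2 ** bits
def shades(bits: int) -> int:
    return 2 ** bits

# Total displayable colors: three channels (R, G, B) combined
def total_colors(bits: int) -> int:
    return shades(bits) ** 3

print(shades(8), total_colors(8))    # 256 16777216   (~16.7 million)
print(shades(10), total_colors(10))  # 1024 1073741824 (~1.07 billion)
```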
Color depth affects gradients the most: a TV with a lower bit depth has to spread the same gradient over far fewer steps. A limited bit depth can lead to blockiness and uneven gradients, which you can often see in skies, like in the SDR picture above. This is one of the rare cases where the HDR-related features on lower-end TVs serve a purpose, since most of them now have a 10-bit panel. Unfortunately, it's the least visually impactful one.
Finally, we take a look at dynamic range. This is where HDR TVs show the biggest difference. HDR content makes use of a TV's higher brightness capabilities to show lifelike highlights. When a TV has a limited dynamic range, it can only display bright highlights by crushing the dark elements, or vice versa. A TV with a higher dynamic range can display more of both at the same time. Peak brightness, contrast, and the quality of the tone mapping have the biggest impact on this aspect.
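Tone mapping is what happens when the content asks for more brightness than the TV has. A toy sketch of the idea in Python: a naive TV simply clips everything above its peak, losing highlight detail, while a better one rolls highlights off smoothly. Real TVs use proprietary curves; the knee-based `roll_off` function below is only an illustration under assumed values.

```python
# Hard clipping: everything above the display's peak becomes the same
# flat white, so highlight detail is lost.
def clip(nits: float, display_peak: float) -> float:
    return min(nits, display_peak)

# A simple knee-style roll-off (illustrative, not any TV's real curve):
# reproduce values 1:1 up to a knee point, then compress everything
# above it into the remaining headroom so it never quite hits the peak.
def roll_off(nits: float, display_peak: float, knee: float = 0.75) -> float:
    start = knee * display_peak  # below this point, show the content as-is
    if nits <= start:
        return nits
    headroom = display_peak - start
    excess = nits - start
    return start + headroom * (excess / (excess + headroom))

display_peak = 600.0  # assumed peak of a hypothetical TV
print(clip(1000.0, display_peak))               # 600.0: a 1000-nit highlight is flattened
print(round(roll_off(1000.0, display_peak), 1)) # 567.9: compressed, but detail survives
```

With the roll-off curve, two highlights mastered at 900 and 1000 nits still end up at different output levels, so the texture between them is preserved instead of merging into a single white patch.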
The difference is visible on the X930D. The amount of detail resolved in the sky and the mountain in the background is better than in the other two examples, all while maintaining detail in the shadows under the cars. The difference can still seem minimal, though, both because it is nearly impossible to capture with a camera and because even the best TVs of today barely scratch the surface of what HDR brings forward.
Winner: HDR, but only if peak brightness is high enough to be noticeable.
In the grand scheme of things, HDR is arguably one of the greatest improvements in recent TVs. As technology has improved in recent years, high-end TVs are getting much brighter and can display much wider color gamuts than ever before. You won't gain much visually from having an HDR signal sent to a mid-range TV since any picture quality enhancements are reliant on the capabilities of the set itself. High-end models see a significant benefit (see our recommended HDR TVs).