A color gradient is a stretch of gradually changing color. For example, the far left side might be a dark green that progressively changes to a lighter green on the right. If a TV is able to display a gradient smoothly, that means it is able to capture small differences in color and is therefore good at reproducing details in color. More detailed color is one of the promises of HDR media, so it’s worth getting a TV that performs well on this test if you’re interested in HDR.
For this test, we display an image with multiple horizontal color gradients and evaluate how smoothly the TV is able to reproduce all of them. We use this result to determine the color bit depth of the TV’s panel.
Performance with gradients illustrates how well-equipped a TV is to reproduce fine details in color. In particular, a good performer on this test should have much less of the banding that can often be seen on wide spaces of gradually changing color. Compare a TV that performed well in this area (above-left) with a TV that performed poorly in this area (above-right). Since detailed color is meant to be one of the benefits of HDR video, the results of this test are quite important for people interested in that kind of media.
Poor performance with gradients is unlikely to be a deal-breaker for most, so this isn’t the most important category we test. However, as you can see from the side-by-side comparison above, there is a noticeable benefit to a TV that does well with capturing details in color, so if you want to watch HDR video, it’s still worthwhile to get a TV that performs well.
Our picture test captures the appearance of gradients on a TV’s screen. This is meant to give you an idea of how well the TV can display slight differences in color, with worse reproduction taking the form of bands of color in the image. Note that because this photo's appearance is limited by the color capabilities of your computer, screen, browser, and even the type of file used to save the image, banding that is noticeable in person may not be as apparent in the image. Above, you can compare good gradient reproduction (left) with worse reproduction (right). If you look closely, you can see more obvious banding in the right image (particularly in the green gradients).
To evaluate gradient reproduction, we take a photo of our gradient pattern in a pitch black room, with the following camera settings: F4.0, ISO-200, 1/15 sec shutter time. The image file we use is a ‘.tiff,’ as most typical image files (JPG, PNG, etc.) either don’t support 10-bit color or don’t support it well.
To display the image, we connect our test PC to the TV via HDMI, with the signal output via an NVIDIA GTX 1060 6GB graphics card. We display our gradient test image via the Nvidia ‘High Dynamic Range Display SDK’ program, as it is able to output a 1080p @ 60 Hz @ 10-bit signal, bypassing the Microsoft Windows environment, which is limited to 8-bit output.
After determining the highest possible color depth of the TV (see test below), we take a photo of the TV’s screen while it is displaying our gradient test image at that color depth.
Color depth is the number of bits of information used to tell a pixel which color to display. 10-bit color depth means a TV uses 10 bits for each of the three subpixels of every pixel, compared to the standard 8 bits. This allows 10-bit panels to display many more colors: 8-bit TVs can display 2^(8*3) colors (16.7 million colors), versus 10-bit’s 2^(10*3) (1.07 billion colors). The images below provide an idea of what difference this makes.
10-bit color is capable of capturing more nuance in the colors being displayed because there is less of a ‘leap’ from one distinct color to the next.
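The arithmetic above can be sketched in a few lines; this is just an illustration of the 2^(bits*3) relationship, not part of our test procedure.

```python
# Sketch: total displayable colors at a given per-channel bit depth.
# Each pixel has three subpixels (red, green, blue), so the total
# number of combinations is 2^(bits * 3).

def color_count(bits_per_channel: int) -> int:
    """Total distinct colors for an RGB pixel at this bit depth."""
    return 2 ** (bits_per_channel * 3)

print(f"8-bit:  {color_count(8):,} colors")   # 16,777,216 (~16.7 million)
print(f"10-bit: {color_count(10):,} colors")  # 1,073,741,824 (~1.07 billion)
```

The 64-fold jump in available colors is what allows a 10-bit panel to make each step between neighboring shades much smaller.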
A 10-bit display is only useful if you are watching 10-bit content, which is still fairly rare. Currently, almost everything is 8-bit, including Windows and game consoles. HDR takes advantage of 10-bit color, so getting a TV that supports it is only important if you intend to watch HDR videos.
We verify color depth while performing our picture test. Using the Nvidia ‘High Dynamic Range Display SDK’ program, we output a 1080p @ 60 Hz @ 12-bit signal, display our 16-bit gradient test image, and analyze the displayed image for any sign of 8-bit banding. If we do not see any 8-bit banding, it means the TV supports 10-bit color. We do not differentiate between native 10-bit color and 8-bit color + dithering, because we score the end result: how smooth the gradient is.
Note: Current TVs max out at 10-bit color, but sending a 12-bit signal leaves headroom so that processing (like white balance adjustments) can be enabled without adding banding.
Our gradient score is based on a subjective analysis of the visible banding in the gradient test picture. The picture is divided into 24 distinct images, each of which is visually analyzed by one of our staff and given a score depending on whether there is visible banding (including banding from an 8-bit panel). The final gradient rating is calculated from these individual scores. The maximum rating for an 8-bit TV is 9.0, since a 1.0 point penalty is automatically applied to any 8-bit TV showing visible 8-bit banding.
Generally, a TV supporting 10-bit color should score higher than a TV that only supports 8-bit color, but this is not always true. Some 10-bit TVs struggle to display gradients smoothly, and some 8-bit TVs are very good.
Two things happen with ‘banding’: colors that are meant to be only slightly different are made to look very dissimilar, and very similar colors that are meant to be reproduced distinctly are grouped together and made to look the same. This combination results in the appearance of bands of color on the screen. Image processing can also create banding.
Therefore, if you see lots of banding in a gradient, it means one of three things: the content has a low bit depth, the TV’s panel has a low bit depth, or a processing feature is introducing banding.
With a high bit-depth signal played on a TV that supports it (and minimal processing enabled), more information is being used to determine which colors are displayed. This allows the TV to differentiate between similar colors more easily, and thereby minimize banding.
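The effect of bit depth on banding can be illustrated with a small sketch: quantizing the same smooth gradient at different bit depths shows how many neighboring shades collapse into a single output level. This is only a toy model of the behavior, not our actual test tooling.

```python
# Sketch: quantizing a smooth gradient to a given bit depth. Fewer bits
# means nearby shades collapse into the same output level, which is what
# appears on screen as a visible band.

def quantize(value: float, bits: int) -> int:
    """Map a shade in [0.0, 1.0] to the nearest level at this bit depth."""
    levels = 2 ** bits - 1
    return round(value * levels)

# A 1000-step gradient from black to full brightness:
gradient = [i / 999 for i in range(1000)]

for bits in (6, 8, 10):
    distinct = len({quantize(v, bits) for v in gradient})
    print(f"{bits}-bit: {distinct} distinct levels across the gradient")
```

At 10 bits, every one of the 1000 sample shades maps to its own level, while at lower bit depths long runs of the gradient share a level, producing the banding described above.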
There are two kinds of dithering, both of which can simulate colors beyond a panel’s native bit depth: spatial dithering, which varies the colors of neighboring pixels, and temporal dithering (also called frame rate control, or FRC), which rapidly alternates a pixel’s color over time.
With good dithering, the result can look very much like higher bit-depth, and many TVs use this process to smooth out gradients onscreen. 8-bit TVs can use dithering to generate a picture that looks very much like it has 10-bit color depth.
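As a rough illustration of how temporal dithering can make 8-bit look like 10-bit, the sketch below alternates between the two nearest 8-bit levels so their average over a few frames matches a 10-bit target. The function name and frame count are illustrative assumptions, not taken from any TV's firmware.

```python
# Sketch: temporal dithering (frame rate control). An 8-bit panel can
# approximate a 10-bit shade by alternating between the two nearest
# 8-bit levels so their average over several frames matches the target.
# All names here are illustrative, not from any real TV implementation.

def dither_frames(target_10bit: int, n_frames: int = 4) -> list[int]:
    """Return 8-bit levels whose average approximates a 10-bit value."""
    exact = target_10bit / 4              # 10-bit has 4x the levels of 8-bit
    low = min(int(exact), 255)            # nearest 8-bit level at or below
    high = min(low + 1, 255)              # next level up, clamped to 8-bit
    frac = exact - low                    # how far the target sits above 'low'
    frames = []
    error = 0.0
    for _ in range(n_frames):
        error += frac                     # accumulate the shortfall and emit
        if error >= 0.5:                  # the higher level just often enough
            frames.append(high)           # for the average to come out right
            error -= 1.0
        else:
            frames.append(low)
    return frames

# A 10-bit value of 513 sits between 8-bit levels 128 and 129:
frames = dither_frames(513)
print(frames, "average:", sum(frames) / len(frames))
```

Averaged over the four frames, the pixel's apparent brightness lands between the two 8-bit levels, which is why a well-dithered 8-bit panel can render a gradient almost as smoothly as a native 10-bit one.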
Both the screen and the signal need a high bit depth to get more detailed color, which means that for minimal banding, you must watch a 10-bit media source on a TV with a 10-bit panel.
When watching HDR media from an external device, like a UHD Blu-ray player, you should also make sure that the enhanced signal format setting is enabled for the input in question. Leaving this disabled can result in banding.
If you have followed these steps and still see banding, try disabling any processing features that are enabled. Settings like ‘Dynamic Contrast’ and 2 pt./10 pt. white balance calibration can introduce banding in the image.
A TV’s reproduction of a color gradient indicates how well it can display details in color. It’s an important part of HDR pictures, so if you want something that will handle HDR well, you should make sure to get a TV that does well on this test. For this test, we determine a TV’s maximum color depth, photograph a gradient test image displayed at that color depth, and then assign a score based on how well the test image was reproduced.
For best results with color depth, you should get a TV that is capable of displaying 10-bit color, and then play HDR media on that TV. If you meet those requirements and still experience banding, try disabling any processing features that you still have turned on, as those can lead to banding as well.