
Gradient handling is worst when set to 10-bit or 12-bit

Hey RTINGS team!

Today I received my unit and I've been doing some testing. I'm seeing some odd behavior, particularly with gradient handling.

When HDR is enabled, gradient handling is significantly better in 8-bit mode than in 10-bit or 12-bit. When I check my display settings in Windows, the output set to 8-bit is actually running as "8-bit with dithering" (FRC). Is it actually possible that 8-bit + FRC looks better than native 10-bit or even 12-bit?
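To make clear what I mean by FRC: instead of outputting the exact level, the driver alternates between the two nearest 8-bit codes over successive frames so the time-average lands on the in-between value, which can hide banding. A toy sketch of the idea (Python/NumPy, purely my own illustration, not anything RTINGS or VESA actually uses):

```python
import numpy as np

# Toy illustration of FRC / temporal dithering: a 10-bit code that falls
# between two 8-bit codes is alternated across frames so that the
# time-average approximates the original level.
def frc_frames(code_10bit, n_frames=4):
    """Return n_frames of 8-bit codes whose average approximates code_10bit / 4."""
    target = code_10bit / 4.0                    # ideal (fractional) 8-bit level
    low = int(np.floor(target))                  # nearest lower 8-bit code
    frac = target - low                          # fraction of frames needing the higher code
    frames = np.full(n_frames, low, dtype=np.uint8)
    frames[: int(round(frac * n_frames))] = min(low + 1, 255)
    return frames

# 10-bit code 514 has no exact 8-bit equivalent (514 / 4 = 128.5),
# but alternating 128 and 129 averages out to 128.5 over time.
frames = frc_frames(514)
print(frames, frames.mean())                     # e.g. [129 129 128 128] 128.5
```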

When HDR is disabled, there is no difference at all in gradient handling between 8-bit, 10-bit, and 12-bit with my unit and test pattern. The Windows display settings then report either 8-bit or 10/12-bit, but not "with dithering".

In short: by far the best gradient handling is in HDR at 8-bit, and the worst is in HDR at 10-bit or 12-bit. Without HDR it sits roughly in the middle, with no difference between bit depths.

To test this, I'm using the "7. Bit-Depth/Precision" test pattern in the official VESA DisplayHDR app.
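In case it's useful for comparison, a plain grayscale ramp is a rough stand-in for this kind of check (Python sketch assuming NumPy and Pillow; it is not the official VESA pattern, just a gradient that makes banding steps easy to spot if the pipeline truncates bits somewhere):

```python
import numpy as np
from PIL import Image

# Rough stand-in for a bit-depth / banding check (NOT the official VESA pattern):
# a full-width grayscale ramp. If the display pipeline truncates to fewer
# effective bits, the ramp shows visible steps instead of a smooth gradient.
WIDTH, HEIGHT = 3840, 400                        # assumed 4K-wide strip, adjust as needed
ramp = np.linspace(0, 255, WIDTH)                # 8-bit ramp; a 10-bit source would span 0..1023
img = np.tile(np.rint(ramp).astype(np.uint8), (HEIGHT, 1))
Image.fromarray(img, mode="L").save("gray_ramp_8bit.png")
```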

What settings did you use (both on the TV and on the PC) to evaluate your gradient score?

Could you maybe try to reproduce this and confirm that I'm not crazy, blind, or both?

On another matter: can you confirm whether Calman's Autocal already works with this generation of TVs?

Thanks in advance!
