When you say "4K @ 60 Hz + HDR" in the input lag tests, does that mean the signal is using 4:4:4 chroma subsampling? I know HDMI 2.0 doesn't have the bandwidth for 4K @ 60 Hz with 10-bit color and 4:4:4, but perhaps some TVs accept it as a nonstandard mode. Those of us who want to use a 4K TV as a PC monitor might read that entry as meaning the TV supports 4K @ 60 Hz + HDR + 4:4:4, and buy an expensive TV only to end up disappointed because we are stuck with 8-bit color even with a high-end graphics card.

A 4K TV can replace four 1080p monitors, and with a product like the HDHomeRun Prime, one can watch cable TV (including HBO) in one window while working on an app next to it.

If it matters, the TV I am interested in is the latest Samsung QLED Q9FN. I have an Nvidia GeForce GTX 980 Ti, which supports HDMI 2.0 / DisplayPort 1.2. I don't really need 60 Hz; I could live with 30 Hz, as long as I get 10-bit color with 4:4:4 chroma.

I hope you add a new test entry: "Maximum supported refresh rate for 4K + HDR + 4:4:4". Photographers and video editors who want the best color accuracy without any compression or subsampling would be very interested in this. Thank you.
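For anyone who wants to check the math behind this, here is a rough back-of-the-envelope sketch (my own assumption: standard CTA-861 4K total timings of 4400 x 2250 and the 8b/10b TMDS encoding overhead of HDMI 2.0's 18 Gbps link; treat the numbers as illustrative, not a spec quote) showing why 10-bit 4:4:4 doesn't fit at 60 Hz but does at 30 Hz:

```python
# HDMI 2.0 link: 18 Gbps raw, but TMDS 8b/10b encoding leaves
# ~14.4 Gbps of usable video data rate (assumption noted above).
HDMI_2_0_DATA_RATE = 18.0e9 * 8 / 10  # 14.4 Gbps

def required_rate(h_total, v_total, refresh_hz, bits_per_component, components=3):
    """Uncompressed RGB / 4:4:4 video data rate in bits per second."""
    pixel_clock = h_total * v_total * refresh_hz  # total timing, not just active pixels
    return pixel_clock * bits_per_component * components

# 4K @ 60 Hz, 10-bit, 4:4:4 -> ~17.8 Gbps: exceeds 14.4 Gbps, won't fit
print(required_rate(4400, 2250, 60, 10) / 1e9)
# 4K @ 60 Hz, 8-bit, 4:4:4  -> ~14.3 Gbps: just barely fits
print(required_rate(4400, 2250, 60, 8) / 1e9)
# 4K @ 30 Hz, 10-bit, 4:4:4 -> ~8.9 Gbps: fits with plenty of headroom
print(required_rate(4400, 2250, 30, 10) / 1e9)
```

Which is exactly why 4K60 HDR over HDMI 2.0 falls back to 4:2:2 or 4:2:0, while dropping to 30 Hz keeps 10-bit 4:4:4 well within the link budget.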