Input lag is the amount of time it takes for your TV to display a signal on the screen from when the source sends it. It's especially important for playing reaction-based video games, where you want the lowest input lag possible for a responsive gaming experience. Low input lag tends to come at the cost of image processing, which is why TVs have dedicated Game Modes that cut processing to reduce lag. Although TVs still aren't as good as monitors in this regard, the technology is slowly catching up.
We measure the input lag using a specialized tool, and we test for it at different resolutions using different settings.
Input lag matters the most for playing video games, either on a console or on a PC. With fast-paced games like fighting or FPS games, quick reflexes are key. Lower input lag can mean the difference between a well-timed reaction for a win or a late response that results in a loss. It doesn’t matter for watching movies, though, so unless you’re a gamer, you have nothing to worry about. You might notice some delay when scrolling through your smart TV's menu, but it's rarely an issue for most people.
Before we get into the details of how we test, let's first talk about the causes of input lag. Three main factors contribute to the input lag on the TV: acquiring the source image, processing the image, and displaying it.
The more time it takes for the TV to receive the source image, the more input lag there will be. With modern digital TVs, using an HDMI cable minimizes the acquisition time, as the signal transfers from the source to the TV almost instantly. This phase is rarely an issue on modern TVs; it mattered more in the past with analog connections, like those from early gaming consoles.
Once the image is in a format understandable by the video processor, the TV will apply at least some processing to alter the image in some way. A few examples include upscaling lower-resolution content, motion interpolation, and color or brightness adjustments.
The time this step takes is affected by the speed of the video processor and the amount of processing needed. Though you can't control the speed of the processor, you can exercise some control over how many operations it needs to perform by enabling and disabling settings. Usually, only more demanding video processing settings, like motion interpolation, add input lag, while simpler ones, like brightness adjustments, won't.
Once the television has processed the image, the processor sends the video to the screen to be displayed. However, the screen can't make it appear instantly, and how long it takes depends on the panel technology. Unfortunately, there's no way to improve or control the amount of time needed in this part, as it varies from TV to TV. Note that this is different from the response time, which is the amount of time it takes for the pixels to change colors and affects motion.
Now, let's talk about how we measure the input lag. It's a rather simple test because everything is done by our dedicated photodiode tool and special software. We use this same tool for our response time tests, but there it measures something different. For the input lag, we place the photodiode tool at the center of the screen so that it records the data in the middle of the refresh rate cycle, and doesn't skew the results toward the beginning or end of the cycle. We connect our test PC to the tool and the TV. The tool flashes a white square on the screen and records the amount of time until the pixels in that square start to change; this is one input lag measurement. Because the measurement stops the moment the pixels start to change color, we don't include the response time in our testing. The tool records multiple data points, and our software computes an average of all the measurements, excluding any outliers.
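The averaging step can be sketched as follows. This is a minimal illustration of averaging repeated samples while discarding outliers; the function name and the outlier rule (a simple standard-deviation cutoff) are assumptions for illustration, not our actual software's implementation.

```python
from statistics import mean, stdev

def average_excluding_outliers(samples_ms, z_threshold=2.0):
    """Average repeated input lag samples, discarding outliers.

    A sample is treated as an outlier when it falls more than
    z_threshold standard deviations from the mean. The threshold
    is an illustrative assumption, not the tool's actual rule.
    """
    if len(samples_ms) < 3:
        return mean(samples_ms)
    m, s = mean(samples_ms), stdev(samples_ms)
    if s == 0:
        return m
    kept = [x for x in samples_ms if abs(x - m) <= z_threshold * s]
    return mean(kept)

# Example: one spurious 40 ms reading among ~10 ms samples is dropped.
samples = [10.1, 9.8, 10.3, 40.0, 10.0, 9.9]
print(round(average_excluding_outliers(samples), 2))  # → 10.02
```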
When a TV displays a new image, it draws it progressively from top to bottom, so the image first appears at the top. Since the photodiode tool is placed in the middle of the screen, it records the image halfway through the refresh cycle. A 120Hz TV displays 120 images every second, so each image takes 8.33 ms to be drawn on the screen, and it takes 4.17 ms to reach the middle; this is the minimum input lag we can measure on a 120Hz TV. If we measure an input lag of 5.17 ms, the TV is really only adding an extra millisecond of lag for the image to appear on the screen. For a 60Hz TV, the minimum is 8.33 ms.
We always measure the input lag in the TV's Game Mode unless a specific test calls for different settings.
Some people may confuse our response time and input lag tests. For input lag, we measure the time from when the photodiode tool sends the signal to when it appears on-screen. We use flashing white squares, and the tool stops the measurement the moment the screen starts to change color, so it doesn't include the response time. For the response time test, we use grayscale slides and measure the time it takes to make a full transition from one gray shade to the next. In simple terms, the input lag measurement stops when the color on the screen starts to change, and the response time measurement starts when it does.
This test measures the input lag of 1080p signals with a 60Hz refresh rate. It's especially important for older consoles (like the PS4 or Xbox One) or for PC gamers who play at a lower resolution at 60Hz. As with other tests, this is done in Game Mode, and unless otherwise stated, our tests are done in SDR.
We repeat the same process but with Game Mode disabled, to show the difference between being in and out of Game Mode. This can matter if you scroll a lot through your TV's smart OS and easily notice delay; if the lag is too high and bothers you, simply switch to Game Mode when you need to scroll through menus.
This result can also be important if you want to play video games with the TV's full image processing. You might consider this if you're playing a non-reaction-based game.
This result is important if you play 1440p games, like from an Xbox or a PC. However, 1440p gaming is still considered niche, and not all TVs support this resolution, so we can't measure the 1440p input lag on those.
The 4k @ 60Hz input lag is probably the most important result for most console gamers. Along with 1080p @ 60Hz input lag, it carries the most weight in the final scoring since most gamers are playing at this resolution. We expect this input lag to be lower than the 4k @ 60Hz with HDR, chroma 4:4:4, or motion interpolation results because it requires the least amount of image processing.
With the PC sending a 4k @ 60Hz signal, we use an HDFury Linker to add an HDR signal. This is important if you play HDR games, and while it may add some extra lag, it's still low for most TVs.
This test is important for people wanting to use the TV as a PC monitor. Chroma 4:4:4 is a video signal format that doesn't use any image compression, which is necessary if you want proper text clarity. We want to know how much delay is added, but for nearly all of our TVs, it doesn't add any delay at all compared to the 4k @ 60Hz input lag.
Like with 1080p @ 60Hz Outside Game Mode, we measure the input lag outside of Game Mode in 4k. Since most TVs have a native 4k resolution, this number is more important than the 1080p lag while you're scrolling through the menus.
Motion interpolation is an image processing technique that increases the frame rate of the content, for example from 30 fps up to 60 fps. However, on most TVs, you need to disable Game Mode to enable the motion interpolation setting; only Samsung offers motion interpolation in Game Mode. As such, most TVs have high input lag with motion interpolation. We measure this with the motion interpolation settings at their highest because we want to see how much the input lag increases at its strongest, as a worst-case scenario.
This test is only important if you have an 8k TV, and your graphics card can output 8k content at 60 fps.
We repeat most of the same tests but with 120 fps signals instead. This is especially important for gaming on some gaming consoles, like the Xbox Series X or Xbox One X, as some other devices don't output signals at 120 fps. The 120Hz input lag should be around half the 60Hz input lag, but it's not going to be exactly half.
Once again, this result is only important for PC and Xbox gamers because they use 1440p signals. Not all TVs support this resolution either, so we can't always test for it.
This test is important if you're a gamer with an HDMI 2.1 graphics card or console. Since most 4k @ 120Hz signals require HDMI 2.1 bandwidth, you don't have to worry about this if your TV or gaming console is limited to HDMI 2.0. For this test, we use our HDMI 2.1 PC with an NVIDIA RTX 3070 graphics card because we need an HDMI 2.1 source to test it.
We also measure the input lag with any variable refresh rate (VRR) support enabled, if the TV has it. VRR is a feature gamers use to match the TV's refresh rate with the frame rate of the game, even if the frame rate drops. Enabling VRR could potentially add lag, so that's why we measure it, but most TVs don't have any issue with this. We measure this test by setting the TV to its maximum refresh rate and enabling one of its VRR formats, like FreeSync or G-SYNC.
We repeat the VRR testing with 1440p signals. If the TV doesn't support 1440p, we skip this test.
Like with 1440p and 1080p, we measure the 4k input lag with VRR enabled. Once again, this is important for gamers.
On 8k TVs, we measure the input lag with an 8k signal and VRR enabled.
Input lag is not an official spec advertised by most TV companies because it depends on two varying factors: the type of source and the settings of the television. The easiest way to measure it yourself is by connecting a computer to the TV and displaying the same timer on both screens. If you then take a picture of both screens, the time difference is your input lag. This is only an approximation, however, because your computer doesn't necessarily output both signals at the same time. In this example image, the timers indicate an input lag of 40 ms (1:06:260 – 1:06:220). Our tests are a lot more accurate than that thanks to our dedicated tool.
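The subtraction behind the 40 ms figure is simple clock arithmetic, sketched below. The timestamp format ("minutes:seconds:milliseconds") matches the example image's timer readout; the function name is illustrative.

```python
def timer_to_ms(stamp):
    """Convert an on-screen 'M:SS:mmm' timer reading to milliseconds."""
    minutes, seconds, millis = (int(part) for part in stamp.split(":"))
    return (minutes * 60 + seconds) * 1000 + millis

# The TV's timer trails the PC's timer by the input lag.
pc_reading = timer_to_ms("1:06:260")  # what the PC monitor shows
tv_reading = timer_to_ms("1:06:220")  # what the TV shows at the same instant
print(pc_reading - tv_reading)  # → 40 (ms of input lag)
```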
Most people will only notice delays when the TV is out of Game Mode, but some gamers might be more sensitive to input lag even in Game Mode. Keep in mind that the input lag of the TV isn't the absolute lag of your entire setup; there's still your PC/console and your keyboard/controller. Every device adds a bit of delay, and the TV is just one piece in a chain of electronics we use while gaming. If you want to know how much lag you're sensitive to, check out this input lag simulator. You can simulate what it's like to add a certain amount of lag, but keep in mind this tool is relative to your current setup's lag, so even if you set it to 0 ms, your setup's baseline delay remains.
Input lag is the time it takes a TV to display an image on the screen from when it first receives the signal. It's important to have low input lag for gaming, and while high input lag may be noticeable when scrolling through Netflix or other apps, it's not as important for that use. We test for input lag using a special tool, measuring at different resolutions and refresh rates, and with different settings enabled, to see how changing the signal type affects the input lag.