A TV's refresh rate is how many times the screen refreshes itself every second. It's different from frames per second (fps), which defines how many frames the video source displays every second. The refresh rate is important for motion handling, as a higher refresh rate generally results in smoother motion, but that isn't always the case. It's also important not to be confused by marketing terms that often inflate the refresh rate.
This article explains the differences between a 60Hz and 120Hz refresh rate, how it affects the image, and how companies will try to trick you into thinking the TV has a higher refresh rate than it actually has.
Even though we don't see it, our TV is constantly refreshing itself. The refresh rate defines how many times per second the screen draws a new image, and it's expressed in Hertz (Hz). A 60Hz refresh rate means the screen refreshes itself 60 times every second, and at 120Hz, it refreshes itself 120 times every second. This is different from the frame rate, which is how many times per second the source sends a new frame. So if you have a source displaying 60 frames per second, you want your TV to refresh itself 60 times per second so that the refresh rate and frame rate match up; otherwise, motion may look blurry.
The refresh rate affects motion handling: the more times the display can draw a new image each second, the better it is for fast-moving content. Modern TVs have either a 60Hz or a 120Hz refresh rate. Most high-end TVs have a 120Hz refresh rate, but that doesn't mean they're inherently better at motion handling. The response time determines how clear motion looks: a quick response time means motion looks crisp, while a slow response time leads to motion blur. Response time and refresh rate are only indirectly related; a 120Hz panel is expected to have a better response time than a 60Hz panel, but it's not a guarantee.
Since not all content automatically matches your TV's refresh rate, TVs also have ways to increase the frame rate to match the refresh rate, improving the appearance of motion.
A 60 fps video played on a 120Hz TV should look almost identical to the same content played on a 60Hz TV. In a case like this, the TV either adjusts itself to match the refresh rate of the source, which effectively turns it into a 60Hz TV, or it simply doubles every frame.
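The frame-doubling behavior described above comes down to simple arithmetic: the repeat count is the ratio of the refresh rate to the frame rate. A minimal sketch (the function name is our own, purely illustrative):

```python
def repeats_per_frame(refresh_hz: int, fps: int) -> int:
    """How many times each source frame must be shown so the frame
    rate evenly fills the refresh rate."""
    if refresh_hz % fps != 0:
        raise ValueError("refresh rate is not a multiple of the frame rate")
    return refresh_hz // fps

print(repeats_per_frame(120, 60))  # 2: each 60 fps frame is shown twice
print(repeats_per_frame(60, 60))   # 1: the TV behaves like a 60Hz set
```

Because each frame is still drawn from the same 60 fps source, the result on screen looks essentially identical either way.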
As you can see from the picture above, a TV with a higher refresh rate doesn't necessarily produce less motion blur. Since both of these TVs have a very similar response time, 60 fps content results in an almost identical picture.
To better showcase these differences, we compared two TVs side-by-side: a 60Hz model and a 120Hz model with similar response times. We filmed them in slow motion to easily compare each individual frame.
While a 120Hz TV doesn't inherently produce better motion, it offers a few advantages over standard 60Hz TVs. One of the most important is the ability to play back content meant to be displayed at 24 fps, which is common for movies. Most TVs can simply lower their refresh rate to 24Hz when the content is 24 fps, but some sources, such as the Chromecast, output video at 60 fps even when the content is 24 fps. The TV's refresh rate then remains at 60Hz, and motion won't appear smooth, an effect called judder. A 60Hz TV has trouble removing 24 fps judder because 60 isn't a multiple of 24. To display this type of content, a technique known as a "3:2 pulldown" is used: 12 of the 24 frames repeat three times, while the other 12 repeat twice, totaling 60 frames. Not everybody notices this, but it causes some scenes, notably panning shots, to appear juddery. 120Hz TVs have an advantage here because they can simply display each frame five times, since 120 is a multiple of 24.
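The 3:2 pulldown arithmetic above can be sketched as a small routine (names are illustrative, not any real API):

```python
def pulldown_3_2(frames):
    """Expand a 24 fps frame sequence to 60 frames using 3:2 pulldown:
    alternating frames are repeated three times and two times. The uneven
    repetition is what makes panning shots look juddery."""
    out = []
    for i, frame in enumerate(frames):
        out.extend([frame] * (3 if i % 2 == 0 else 2))
    return out

one_second = list(range(24))          # one second of 24 fps source frames
print(len(pulldown_3_2(one_second)))  # 60: 12 frames x3 + 12 frames x2
```

A 120Hz TV avoids the uneven cadence entirely: every frame repeats exactly five times (120 / 24 = 5), so each frame stays on screen for the same duration.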
There are a few sources that display 120 fps, such as the Xbox Series X or the PS5, and having a 120Hz TV helps display this content at its max frame rate. While it's rare to find content other than games with this frame rate, displaying 120 fps has a significant impact on the perceived motion. As you can see in the picture below, content looks much smoother at 120 fps than at 60 fps on a 120Hz TV.
With the release of HDMI 2.1, there may be more 120 fps sources available in the coming years. This new HDMI standard allows TVs to display 4k images up to 120 fps, whereas HDMI 2.0 allows up to 60 fps. This means that 120Hz TVs may slowly become the norm.
Another place where 120Hz is useful is if you enjoy the motion interpolation feature found on TVs (also known as the Soap Opera Effect). It allows the TV to generate frames between existing ones, increasing the frame rate to match the refresh rate. Most TVs have this feature; a 60Hz TV can interpolate 30 fps content, while a 120Hz TV can interpolate both 30 and 60 fps content. This is one advantage a 120Hz TV has over a 60Hz one, since it can interpolate more types of content.
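To illustrate the idea of generating in-between frames, here's a deliberately simplified sketch that doubles a frame rate by blending neighboring frames. Real TVs use motion estimation and compensation, not a plain average; this is only a toy model of the concept:

```python
def interpolate_midframes(frames):
    """Double the frame rate by inserting the average of each pair of
    neighboring frames. Frames are modeled as lists of pixel brightness
    values; a real TV would estimate motion vectors instead of blending."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append([(x + y) / 2 for x, y in zip(a, b)])
    out.append(frames[-1])
    return out

# two tiny 1-D "frames" of pixel brightness
doubled = interpolate_midframes([[0, 100], [100, 0]])
print(doubled)  # [[0, 100], [50.0, 50.0], [100, 0]]
```

This also shows why a 120Hz panel matters here: the interpolated sequence has twice as many frames, and the TV needs enough refreshes per second to display them all.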
There are other ways to produce a similarly clear image as a 120Hz refresh rate. Many TVs these days have a feature called Black Frame Insertion. Essentially, the TV displays a black screen between each frame; most people can't see it, but it can make the screen dimmer. On most LED TVs, this is achieved by adjusting the flicker frequency of the backlight, which results in the backlight being turned off for half of each frame. On OLED TVs, which don't have a backlight, this is done by inserting a black frame between each frame.
Persistence blur occurs because your eyes keep moving as they track motion, while each frame is held statically on screen. With Black Frame Insertion, each static frame is lit for a shorter duration, so the persistence blur is shorter. Unfortunately, not everyone can stand the flickering, and some people may find it tiring after a while.
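The persistence benefit is easy to quantify: the time each frame stays lit is the frame duration multiplied by the fraction of the frame the image is actually on. A small sketch under that assumption (the 50% on-time matches the half-frame backlight behavior described above):

```python
def persistence_ms(refresh_hz: float, on_fraction: float = 1.0) -> float:
    """How long each frame stays lit, in milliseconds. Black Frame
    Insertion shortens this by keeping the image on for only part of
    the frame (on_fraction)."""
    return 1000.0 / refresh_hz * on_fraction

print(round(persistence_ms(60), 1))       # 16.7 ms: full-persistence 60Hz frame
print(round(persistence_ms(60, 0.5), 1))  # 8.3 ms with BFI on half the frame
```

Halving the on-time roughly halves the length of the perceived blur trail, which is why BFI on a 60Hz TV can look about as clear as a 120Hz panel.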
A TV is only as good as the content you are playing, and unfortunately, very little 120 fps content actually exists. With the new HDMI 2.1 standard, gaming consoles like the Xbox Series X and PS5 support 120 fps, but there isn't much online content available at such a high frame rate. We've compiled a couple of lists of common entertainment sources as well as their respective refresh rates.
| Content source | Frame rate |
|---|---|
| Netflix | 24 fps to 60 fps |
| Amazon Video | 24 fps to 60 fps |
| Blu-ray movies | 24 fps |
| YouTube | 30 or 60 fps |
| Cable/Broadcast TV (NTSC) | 30 or 60 fps |
| Cable/Broadcast TV (PAL) | 25 or 50 fps |

| Device | Output refresh rate |
|---|---|
| Xbox One S/X | 24Hz to 120Hz |
| Xbox Series X | 24Hz to 120Hz |
| PS4/PS4 Pro | 24Hz to 60Hz |
| PS5 | 24Hz to 120Hz |
| Blu-ray players | 24Hz to 60Hz |
| PC | Up to 240Hz |
| Apple TV | 24Hz to 60Hz |
A source's frame rate isn't always constant, especially in games. When it drops, the mismatch between the game's frame rate and your TV's refresh rate can cause screen tearing. A feature called variable refresh rate (VRR) aims to match the refresh rate and frame rate on the fly: if the game's frame rate drops, the TV automatically lowers its refresh rate as well. This is only possible if both the TV and the source support VRR.
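Conceptually, VRR clamps the panel's refresh rate to the incoming frame rate within a supported window. The sketch below assumes an illustrative 48–120Hz VRR range and a frame-repeating fallback below it (similar in spirit to what FreeSync calls Low Framerate Compensation); the exact range and fallback vary by TV:

```python
def vrr_refresh(frame_rate: float, vrr_min: float = 48, vrr_max: float = 120) -> float:
    """Pick the refresh rate a VRR display would use for a given frame rate.
    The 48-120Hz window is an illustrative assumption, not a real spec."""
    if vrr_min <= frame_rate <= vrr_max:
        return frame_rate  # perfect match: no tearing or stutter
    if frame_rate < vrr_min:
        # below the window, repeat frames until a multiple lands inside it
        multiple = frame_rate
        while multiple < vrr_min:
            multiple += frame_rate
        return min(multiple, vrr_max)
    return vrr_max  # source is faster than the panel can refresh

print(vrr_refresh(90))  # 90: refresh follows the frame rate exactly
print(vrr_refresh(30))  # 60: each frame shown twice to stay in range
```

The key point is that the refresh rate changes continuously with the content, so a dip from 60 fps to 52 fps no longer causes tearing.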
There are different formats of VRR, with AMD's FreeSync, NVIDIA's G-SYNC, and HDMI Forum VRR being the three most common types. G-SYNC is usually reserved for monitors, but some TVs are compatible with it. Higher-end Samsung and LG TVs have FreeSync, and support for HDMI Forum VRR is starting to grow on HDMI 2.1 TVs. As for compatible devices, the Xbox Series X supports FreeSync and HDMI Forum VRR, while the PS5 should receive an update in 2021 for HDMI Forum VRR.
TV companies often market their refresh rates in ways that make them seem higher than they actually are. Samsung, for example, uses the term 'Motion Rate': the Motion Rate of a 60Hz TV is 120, while a 120Hz model has a Motion Rate of 240; they effectively double the refresh rate to come up with this number, with no real explanation as to why it's marketed that way. LG uses 'TruMotion', Vizio has 'Effective Refresh Rate', and Sony has two terms: 'MotionFlow XR' and 'X-Motion Clarity'. These marketing numbers don't really mean anything, and you need to check the TV's specs to find the real refresh rate.
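For Samsung's 'Motion Rate' specifically, the pattern described above is a simple doubling, which gives a rough rule of thumb for decoding it. This is only a heuristic drawn from the examples in this article, not a published formula, so always verify against the actual spec sheet:

```python
MARKETING_MULTIPLIER = 2  # observed pattern: 'Motion Rate' doubles the native rate

def native_refresh_from_motion_rate(motion_rate: int) -> int:
    """Heuristic only: a 'Motion Rate' of 120 usually means a native 60Hz
    panel, and 240 means 120Hz. Not an official Samsung formula."""
    return motion_rate // MARKETING_MULTIPLIER

print(native_refresh_from_motion_rate(240))  # 120
print(native_refresh_from_motion_rate(120))  # 60
```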
LCD TVs are lit by LEDs, and most use Pulse Width Modulation (PWM) to dim the backlight. This means the backlight rapidly turns on and off many times per second; the longer it stays off in each cycle, the dimmer the screen appears. The flickering isn't visible to the human eye because the frequency is so high. Like the refresh rate, the flicker frequency is measured in Hz, since it describes how many times the backlight flickers every second.
If the flicker frequency doesn't match the refresh or frame rate, it can create image duplication. As you can see in the images below, motion on the LG UN8500 shows image duplication because its backlight flickers at 120Hz, double the rate of the 60 fps source. The Sony X800H, however, has a flicker-free backlight, so there's no image duplication; its motion blur is caused by a slower response time.
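The duplication count falls out of the same kind of ratio as before: each backlight flash effectively stamps a copy of the current frame, so a flicker frequency above the frame rate shows each frame as multiple distinct images. A quick illustrative check:

```python
def duplications_per_frame(flicker_hz: float, fps: float) -> float:
    """How many times the backlight flashes during each source frame.
    Values above 1 mean each frame appears as multiple overlapping copies."""
    return flicker_hz / fps

print(duplications_per_frame(120, 60))  # 2.0: two flashes per frame -> doubled edges
print(duplications_per_frame(60, 60))   # 1.0: one flash per frame, no duplication
```

This is why a 120Hz flicker on a 60 fps source (as on the LG UN8500 above) produces a visible double image, while a flicker frequency matching the frame rate, or a flicker-free backlight, does not.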
A refresh rate defines how many times the screen refreshes itself every second. Although we can't see it, the TV draws a new image from the source every few milliseconds. A higher refresh rate generally results in better motion handling, but not always, as other factors also come into play. It's important that your source's frame rate and the TV's refresh rate match in order to create smooth, stutter-free motion. For most people, a TV with a 60Hz refresh rate is good enough, since there isn't much 4k content that goes past 60 fps. However, 120Hz TVs with HDMI 2.1 support are beneficial to gamers, as they allow for higher frame rates.