The benefit of 4k/UHD over 1080p seems obvious on paper – 4k has four times as many pixels as 1080p, which means it should have a clearer picture – but a UHD TV only improves the picture quality if:
- You are watching native 4k content
- You sit close enough to notice the difference
A 4k UHD TV doesn't improve the picture quality of lower quality content like 1080p Blu-rays.
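The "four times as many pixels" claim is simple arithmetic from the resolutions discussed below, and is easy to verify:

```python
# Pixel counts for each resolution (width x height)
uhd_pixels = 3840 * 2160   # 8,294,400 pixels
fhd_pixels = 1920 * 1080   # 2,073,600 pixels

print(uhd_pixels / fhd_pixels)  # → 4.0
```

Four times the pixels, however, only translates into visible detail under the two conditions above.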
What it is: A TV with 2160 rows and 3840 columns of pixels.
Who should buy it: Most people buying TVs today.
What it is: A TV with 1080 rows and 1920 columns of pixels.
Who should buy it: Those who sit far away, have a small screen, or are only looking for a budget TV.
| | 4k | 1080p |
| :--- | :--- | :--- |
| Height in pixels | 2160 | 1080 |
| Width in pixels | 3840 | 1920 |
| Content availability | Average, but growing fast | Excellent |
| Sizes available | Great (40" to 100"+) | Great in the past, limited today (32" to 55") |
Native 4k vs Native 1080p
Input: 4k Resolution
The two photos above illustrate an identical image at different native resolutions, which means the file resolution and the TV resolution are exactly the same. The first photo is a 4k image displayed on a Vizio M Series 4k TV, and the second is a 1080p image displayed on a Vizio E Series 1080p TV.
The 4k image is smoother and has more detail than the 1080p image. Look closely and you’ll see that the edges around objects in the 1080p picture are noticeably more jagged. The difference is because the higher pixel count of a 4k screen allows for a more natural representation of the picture, with smoother outlines for distinct objects and added detail in the image.
If you’re underwhelmed by the difference, it’s because there are diminishing returns at higher resolutions. With 4k, you do get more detail than with 1080p, but the upgrade isn’t as staggering as the one between SD and HD.
It’s important to note that this comparison uses a real 4k image. 4k content is more widespread now, but most of what you watch will probably be lower-resolution content upscaled to UHD, which will look different from native 4k and 1080p.
1080p Upscaled to 4k vs Native 1080p
Input: 1080p Resolution
To present lower-resolution material on a 4k TV, the TV has to perform a process called upscaling. This process increases the pixel count of a lower-resolution image, allowing a picture meant for a screen with fewer pixels to fit a screen with many more. It’s important to remember that since the amount of information in the signal doesn’t change, there won’t be more detail present.
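To make the "no added detail" point concrete, here is a minimal sketch of the simplest possible upscaling method, nearest-neighbor. Real TVs use far more sophisticated filters, but the principle is the same: pixels are duplicated or interpolated, not invented.

```python
def upscale_nearest(image, factor=2):
    """Nearest-neighbor upscale: each source pixel becomes a factor x factor
    block. No new detail is created -- existing pixels are only duplicated."""
    return [
        [row[x // factor] for x in range(len(row) * factor)]
        for row in image
        for _ in range(factor)
    ]

# A 2x2 "image" upscaled to 4x4: same information, four times the pixels.
small = [[1, 2],
         [3, 4]]
print(upscale_nearest(small))
# → [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

Every value in the output already existed in the input, which is why an upscaled 1080p image can look smoother but never more detailed than the original.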
The first image is a 1080p picture upscaled to 4k on the Vizio M, and the second is a native 1080p image on the Vizio E.
The Vizio M’s upscaling resulted in a bit of added smoothness, but overall the two images look very similar. There isn’t any more detail in the upscaled picture than you can see in the native 1080p picture, so whether or not it looks better is entirely subjective.
Not every TV upscales the same way, though. Some 4k TVs might produce an image that is too soft. This doesn't mean 4k is inherently worse, since most TVs don't have this problem, but it's worth making sure the model you're buying handles upscaling well before going through with the purchase.
HDR is a new format for transmitting video signals to televisions that enhances the dynamic range of content and allows a wider range of colors. When done well, this has quite an impact on picture quality, making content look a lot more life-like.
While 4k isn't a requirement for HDR support (see HDR vs SDR), we've yet to see a 1080p TV with HDR support. Because of this, if you're interested in upgrading to HDR, 4k UHD is the only option.
Having a 4k TV and genuine 4k content isn’t enough. There are limits to what the eye can perceive, so if you sit too far from your TV (the distance depending on the TV’s size), you won't be able to see all the detail in the image. That means that if you sit too far away from a 4k TV, the picture will look like what you’d get on a TV with a lower resolution screen.
This chart illustrates the dividing line for normal 20/20 vision. To use the chart, check your viewing distance on the vertical axis and the size of the TV on the horizontal one. If the resulting position is above the line, you probably won't see a major difference between a 1080p and a 4k TV.
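The chart's dividing line follows from a standard angular-resolution calculation: 20/20 vision resolves roughly one arcminute of visual angle, so a pixel smaller than that is invisible. Below is a rough sketch of that calculation (the function name and the 16:9 geometry assumption are ours; the actual chart may use slightly different assumptions):

```python
import math

def max_distance_inches(diagonal_in, horizontal_pixels):
    """Farthest viewing distance (inches) at which one pixel still subtends
    one arcminute -- roughly the resolving limit of 20/20 vision."""
    # Width of a 16:9 screen from its diagonal: width = diag * 16 / sqrt(16^2 + 9^2)
    width = diagonal_in * 16 / math.hypot(16, 9)
    pixel_pitch = width / horizontal_pixels
    one_arcminute = math.radians(1 / 60)
    return pixel_pitch / math.tan(one_arcminute)

for size in (40, 55, 65, 75):
    d_4k = max_distance_inches(size, 3840)
    d_1080p = max_distance_inches(size, 1920)
    print(f'{size}": 4k pixels resolvable within ~{d_4k / 12:.1f} ft, '
          f'1080p within ~{d_1080p / 12:.1f} ft')
```

For a 55" screen this gives roughly 3.6 ft for 4k versus about 7.2 ft for 1080p: sit farther than the 1080p figure and the two resolutions become hard to tell apart.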
That doesn’t mean you won’t see any difference at all – it just means it won’t be significant. You should also know that this chart assumes lossless media. Nowadays, retail stores only use such media on their 4k TVs, while their 1080p TVs display highly-compressed media. This makes in-person comparison hard and is done to boost sales of pricier sets.
Winner: 4k. While 4k won't give you much of a benefit after a certain distance, it will always be better from up close. Because it gives you a lot of freedom, it is the safer choice.
720p vs 1080i Broadcast Signal: What Is the Difference?
In the US, there are two standard resolutions for TV broadcasts: 720p and 1080i. Much like 1080p, the number refers to the vertical resolution of the signal: 720 or 1080 pixels. The letter refers to either progressive scan or interlaced scan. Every TV sold today uses progressive scan, but that doesn't mean they are incompatible with a 1080i signal.
In an interlaced video signal, the image is separated into even and odd horizontal lines. Alternating frames resolve even and odd lines, meaning that each individual frame of the signal (a video being a series of frames in quick succession) is only half of the image. Progressive scan, on the other hand, resolves the entirety of the image on every frame, so it is a bit more costly to distribute.
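The process a progressive-scan TV uses to rebuild a frame from two interlaced fields can be sketched as simple row interleaving ("weaving"; the function name here is ours, and real deinterlacers are more elaborate):

```python
def weave_fields(even_field, odd_field):
    """Reconstruct a full frame by interleaving two half-height fields.
    If the scene moved between the two fields, adjacent rows no longer
    line up, which appears as the 'combing' artifact on fast motion."""
    frame = []
    for even_row, odd_row in zip(even_field, odd_field):
        frame.append(even_row)  # lines 0, 2, 4, ...
        frame.append(odd_row)   # lines 1, 3, 5, ...
    return frame

# A static 4-line image split into its two fields and woven back together.
even = [[0, 0], [2, 2]]   # lines 0 and 2
odd  = [[1, 1], [3, 3]]   # lines 1 and 3
print(weave_fields(even, odd))  # → [[0, 0], [1, 1], [2, 2], [3, 3]]
```

On a static image the weave is perfect, which is why interlacing works well for still content; the artifacts described next only appear when the picture changes between fields.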
In the end, 1080i and 720p use about the same amount of bandwidth, even though 1080i covers over twice as many pixels. This means that still images look sharper on 1080i, but it isn't perfect. As you can see in the pictures above, 720p looks much clearer with motion. This is why sports channels use 720p: fast-moving content on 1080i will often look striped or appear to vibrate vertically, which is distracting. Because the underlying principle is similar, interlaced video also suffers from artifacts comparable to those caused by chroma subsampling.
A frequently updated list of HD US channels with their respective resolution can be found here.
Winner: 720p for sports, 1080i for still images.
4k vs 8k
8k prototype TVs have appeared at conventions over the past few years, and questions about their usefulness come up often. Reasonably so: even at a field-of-view-filling distance of 6 feet, a very large 75-inch TV will not show a discernible difference between 4k and 8k, and that is over 40% closer than the recommended seating distance for that size!
8k does have value in other applications, such as virtual reality headsets that put your eyes an inch away from the screen, as well as computer monitors in uses where screen real estate is important. Some movies are now filmed in 8k as well, but that is mostly for freedom during production; none of them will be published in that format. It's very hard, then, to justify its value for TVs. Unless there is a strong shift in how content is produced that makes you sit closer to your TV than you currently do, there is no need to wait for 8k to arrive.
Winner: 4k. While 8k is technically superior, the difference with 4k is minor for a TV.
If you're shopping for a TV today, a 4k TV is worth buying over a 1080p TV, provided you sit close enough to see the extra detail and watch native UHD content. If you only watch 1080p or lower-resolution content, a 4k TV won't give you a boost in quality. If you already own a good 1080p TV and don't sit close enough to notice the pixels, the upgrade isn't worth the money unless you also pay for premium features such as local dimming, OLED, or HDR.
Nowadays, though, it is difficult to find anything other than a 4k TV. 1080p is usually reserved for cheap budget options, and all of the better TVs have a UHD resolution. The premium features named above are no longer found on lower-resolution sets.