Hey, that’s a great question! 4:4:4 is really only needed for desktop PC use; it’s not needed for gaming, even from a PC. For console gaming it won’t make any difference, as consoles generally don’t output 4:4:4 even if you try. On a PC you might see a difference when gaming with it on, but most game engines render in 4:2:0 anyway, so it’s not really needed there, either.
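For what it’s worth, chroma subsampling itself is easy to sketch. This is a toy illustration (my own example, not any engine’s or console’s actual pipeline) of how a full-resolution 4:4:4 chroma plane is reduced to 4:2:0, keeping only a quarter of the color samples while luma stays untouched:

```python
# Minimal 4:2:0 chroma subsampling sketch: each 2x2 block of the
# full-resolution chroma plane is averaged into one sample, so a
# 4:4:4 -> 4:2:0 conversion keeps 1/4 of the chroma information.

def subsample_420(chroma):
    """Average each 2x2 block of a full-resolution chroma plane."""
    h, w = len(chroma), len(chroma[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            block = [chroma[y][x], chroma[y][x + 1],
                     chroma[y + 1][x], chroma[y + 1][x + 1]]
            row.append(sum(block) / 4)
        out.append(row)
    return out

# A 4x4 chroma plane with a hard color edge on the top row:
full = [[10, 10, 200, 200],
        [10, 10, 200, 200],
        [50, 50, 50, 50],
        [50, 50, 50, 50]]

sub = subsample_420(full)
print(sub)  # [[10.0, 200.0], [50.0, 50.0]] -- 4 samples instead of 16
```

Fine single-pixel color detail (like colored text fringes) is what gets lost here, which is why it matters for desktop text much more than for rendered game content.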
Hi, thanks for the answer! I have another question if I may ask.
I was under the impression that next-gen consoles could output 4:4:4, as they have HDMI 2.1. Wouldn’t it make a difference in that case? Or is there something on the game-engine side that causes this not to make a difference?
Hi, I have a question about how SDR Peak brightness is measured.
You mention you are using a PC to measure peak brightness on a TV.
Do you measure the 109% peak white (RGB value 255) or the video level at 100% (RGB value 235)? I believe these may differ slightly on some TVs regardless of the video range setting.
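For context on where the 109% figure comes from (this is just standard 8-bit video levels, not anything specific to one TV): in limited range, black is code 16 and reference white is code 235, and codes above 235 carry "whiter than white":

```python
# Limited (video) range in 8-bit: black = code 16, reference white =
# code 235, with codes 236..255 reserved for over-range "super white".
# Code 255 therefore sits at roughly 109% of reference white.

black, ref_white, max_code = 16, 235, 255
overshoot = (max_code - black) / (ref_white - black)
print(f"{overshoot:.1%}")  # 109.1%
```

That ratio is why a full-range RGB 255 test pattern can read slightly brighter than a video-level 235 pattern, depending on how the TV maps the range.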
This is just an issue with the TV.
8-bit with dithering does improve some of the posterization issues in Windows, but it does not fix the HDR issues inherent to the set.
I’ve tried everything from 8-bit RGB and 4:4:4 to 10-bit 4:4:4 and RGB, and 12-bit with both, in both limited and full range.
10-bit 4:4:4 is the correct setting for this TV for gaming in SDR and HDR, with the least amount of processing.
There are only three games that output Dolby Vision, so the PC is just sending HDR10 and the TV will fall back to HDR10. Setting the TV to DV PC mode looks even worse.
The newest Windows Insider dev build has a Dolby Vision toggle now, so that could prove interesting if it works.
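As a side note on the dithering point above, here’s a toy sketch (my own example, nothing to do with the TV’s or GPU’s actual processing) of why adding noise before quantizing trades hard banding for fine noise:

```python
import random

# A shade that falls between two coarse output levels either snaps
# hard to one level every time (visible banding), or, with noise
# added before quantizing (dithering), flips between the two levels
# so its *average* matches the intended shade.

random.seed(0)
STEP = 64  # spacing between representable output levels

def quantize(v):
    return round(v / STEP) * STEP

def quantize_dithered(v):
    noise = random.uniform(-STEP / 2, STEP / 2)
    return round((v + noise) / STEP) * STEP

v = 100                                    # sits between levels 64 and 128
plain = quantize(v)                        # snaps to 128 every time
samples = [quantize_dithered(v) for _ in range(20000)]
avg = sum(samples) / len(samples)          # averages out close to 100
print(plain, round(avg, 1))
```

That’s why 8-bit with dithering can hide banding that straight 8-bit shows, without adding any real precision.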
Hi, could you elaborate what you mean with an issue with the TV? Do you mean this is a hardware issue?
I was specifically interested in how the different modes influence the TV’s display processing, rather than what signal is being fed to the TV. The TV by default does not allow full chroma to be displayed; that requires the specific 4:4:4/PC mode to be enabled, which also has a separate set of presets.
Seeing that Dolby Vision does not have posterization issues, it does seem to be an issue with the TV’s internal processing rather than a hardware issue. Dolby Vision mostly bypasses the TV’s color processing, since it’s fed into the Dolby Vision color engine.
How is the HDR gradient when the PC icon or 4:4:4 pass through switch is enabled?
On previous models, this slightly improved the HDR gradient. I think this is because 4:4:4 bypasses some of the color processing the TV is doing. Considering Dolby Vision also bypasses this, as it has its own color engine, it would be interesting to see if this improves some of the contouring we’re seeing in HDR10.
(Obviously this isn’t a complete solution, as it disables all motion processing, including Real Cinema.)
Hey there. You’re only locked out of using the peak brightness setting in SDR with the 4:4:4 passthrough option enabled or with your input label set to ‘PC’. HDR is unaffected by that.
Hi, thank you for the answer.
On my C3 currently, the peak brightness says ‘High’ when I turn on 4:4:4 pass through in HDR. However, it is grayed out, so I don’t know for sure if it is working.
Is this the case on the G5 as well? And does the peak brightness still work when it is grayed out in this case?
Hi, I have a question about the video latency comparison of the soundbar vs when it is directly connected to the TV.
Since 120Hz support is becoming common on soundbars as well, do you plan on testing the video latency with a 120Hz signal too? Since the latest TVs can achieve very low latency of sub-5ms at 120Hz, I’d imagine the difference could be bigger than it is with a 60Hz signal.
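As a rough sanity check on the numbers involved (just frame-time arithmetic, with the sub-5ms figure taken from the comment above):

```python
# At 60Hz a frame lasts ~16.7ms; at 120Hz only ~8.3ms. So the same
# fixed soundbar delay is a much larger fraction of the total chain
# at 120Hz, where the TV itself may only add ~5ms.

frame_60 = 1000 / 60    # ms per frame at 60Hz
frame_120 = 1000 / 120  # ms per frame at 120Hz
print(round(frame_60, 1), round(frame_120, 1))  # 16.7 8.3
```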
No worries! You shouldn’t run into any issues using Dolby Vision or HDR on the Series X with Freesync enabled.
Thank you for the answer. I have another question if I may ask.
I read that for HDR support, the Freesync Premium Pro tier is required. The Xbox seems to support this, but the LG only supports Freesync Premium (not Pro). How does this work, and does it have any implications for HDR while using Freesync?
Hey! I’m really not sure why it’s disabled by default, but I’m assuming it’s for compatibility reasons, since not all consoles support Freesync. There are no downsides if you enable it since you’re gaming on a Series X. You likely wouldn’t notice much of a difference at all between HDMI Forum VRR and Freesync, but Freesync does have a few minor things going on under the hood that can help give you slightly better motion, so I’d recommend using it. I hope you’re enjoying your C4! It’s an awesome TV and pairs very well with the Series X! Thanks for reaching out.
Thanks for the answer!
I read online that there used to be issues with Dolby Vision in this mode. Have these all been resolved, so I won’t encounter issues with HDR or Dolby Vision?
No worries! There shouldn’t be any difference with the ABL dimming with 4:4:4 passthrough enabled. It’s not something we encountered at all during our testing.
Thank you so much! This clears up any doubt I had.
Another question if I may ask, should I use the PC icon, or enable the 4:4:4 option? Are there any differences in brightness performance between the two?
Hey there! It’s really not necessary for console gaming, as enabling that actually locks you out from using several picture settings. You also likely won’t notice any difference with it enabled. You can try it out and keep it enabled if you prefer the image, but you’ll likely be better off leaving that disabled. Thanks and I hope that helps!
Hi, thanks for the answer. I have another question I’m hoping you have an answer to.
I am trying to hook up a gaming PC to the TV, for which 4:4:4 passthrough is recommended. However, I’m wondering if there’s a difference in automatic brightness limiting with passthrough enabled or not, particularly in HDR.
I already found out thanks to you that the peak brightness is about the same as in normal game mode on the C3, and thus likely the C4 as well, but I feel the ABL dimming is more aggressive with the passthrough option enabled (while using the game console icon) compared to it being disabled. Is this true?
No, this was done in the Game Optimizer mode with Prevent Input Delay set to ‘Boost’. The 444 and PC icon don’t change the input lag, we measured the same delay with all of those settings.
Hi, sorry if this wasn’t clear. I was specifically interested in the HDR brightness.
Hi, we don’t plan to test this as it doesn’t fall into the monitor use case (4:4:4, PC Label, Game Optimizer On, HGIG) or the playback use case (4:4:4 Off, no PC Label, Game Optimizer Off, no HGIG).
In the TV methodology we test Game Mode HDR Brightness with Game Optimizer On and 4:4:4 Passthrough Off. In the results for the LG C4 TV, the HDR real scene measurements still decrease in “Game Mode” when compared to “Out of Game Mode”. With this in mind, we expect minimal or no difference in HDR Real Scene performance with 4:4:4 Passthrough Off, and the label not set to PC while Game Optimizer is On.
Hi, thanks for answering. I was hoping to see if we can make this screen brighter by giving up full chroma, which doesn’t have a large impact on video games, while retaining low input lag.
I realize that it is absolutely essential for text clarity, but the impact of midtone brightness is far greater in video games, as most text isn’t scaled 1:1.
Hello! I’ve retested HDR peak brightness with 4:4:4 passthrough enabled on the latest firmware version (03.30.60), and there are no notable changes in peak brightness performance. Thank you for reaching out.
Hi there, I have another question about this if I may ask.
I’ve recently been configuring this on my TV again on another HDMI port, and I can’t figure out why, to my eyes, leaving 4:4:4 passthrough disabled makes the TV appear brighter in small highlights. Is this something subjective due to differences in color saturation?
I would like to use this option again, as it clears up some posterization issues this TV has at 120Hz, but the image appears more visually pleasing in HDR highlights when this option is disabled.
Hi there, thanks for the review. I have a question about real scene HDR brightness.
In the review you have tested it with both 4:4:4 pass through disabled and enabled, but with different picture settings (game mode with HGiG and cinema with DTM off).
Can you test real scene brightness with the following?
4:4:4 pass through option Disabled.
the label NOT set to PC (as this forces full chroma as well).
the picture mode set to game optimizer with dynamic tone mapping set to HGiG.
It would be interesting to see if the display performs differently with 4:4:4 chroma disabled using the same picture presets. Particularly in real scenes rather than windowed measurements.
The option is still new, and we are still investigating the differences between enabling it versus using PC mode. For now there do seem to be some differences, such as PC mode disabling more processing features than the passthrough option. We’ll let you know as soon as we’ve done more research. However, we can tell you that you certainly do not need both the PC label AND the passthrough enabled to get 4:4:4 passthrough; use one or the other. We’d recommend using the option that makes the most sense for your usage: if you’re on a gaming console, for example, enable the 4:4:4 passthrough option rather than using the PC label. Conversely, if you’re connected to a computer, use the PC label rather than the passthrough option. Hope this helps!
Hi, sorry for reviving an old discussion, but have you ever found a difference between the two options?
They seem to share the same presets from my own testing.
Hello and thanks for reaching out!
Unfortunately, this is a setting we usually disable for our testing, so we don’t have any of our own data about what it does. If you’d like to learn more, it may be worth contacting the manufacturer for additional information!
Thanks for the answer. I’ll have to see if LG has something to say about this setting.
Another question: this option doesn’t affect image quality in any way?
In SDR, PC mode will limit the brightness and remove ABL/ASBL, but in HDR it will be the same as the HDR Game Optimizer mode.
Hi,
Sorry if I’m reviving an old discussion, but is it possible to retest this on the latest firmware?
I’m getting the impression peak brightness in HDR is somewhat lowered when enabling the 4:4:4 passthrough option, maybe due to a change in how ABL/ASBL reacts when enabled.
Hi,
I see. In this case, I would suggest turning the brightness up (from the ~100 nit settings I suggested) until you find a level you’re satisfied with. The actual nit value should be less important here; what looks best to you should be the deciding factor. Due to panel variance and calibration, our numbers will also be slightly different. Not to mention, I am taking a reading of a static pattern and relaying a value, so this may not be the best representation of overall scene appearance for your use case/needs.
I hope this helps!
Thanks for the answer!
I’ve settled on around 25 to 27.
One last question: I’m struggling to make games appear the same brightness. I am, however, using Game Optimizer in PC mode. Is there a brightness difference in PC mode compared to normal HDMI mode, or are they the same?
Hi,
The nit level and the overall brightness depend on what you are displaying on the screen. In this case, these numbers are based on our checkerboard pattern, and we use a luminance meter to take a measurement. This number will change if I display a white window of varying sizes, or a real scene, for instance. Time makes a difference too, of course. Unless you are working in a workflow where you are trying to take measurements, this number shouldn’t really matter, and I would suggest setting your TV to what looks good to you. Why are you interested in the 100 nit setting?
Because I keep changing it and am unable to settle on a value. I want it to look similar to a 100-nit output in HDR, so switching between the two isn’t as jarring. I have my HDR modes set to use no dynamic tone mapping, and HGiG in game mode. And I always use between 100 and 120 nits for the midtones if it’s configurable inside a game.
I used settings similar to previous models (27 for Filmmaker mode and 42 for game), but it seems the C3 is significantly brighter based on this information.
I might also go for 120 nits if I find 100 too dim. Do you happen to have the values for that on the C3?
Hi there,
For our suggested SDR setting, that will be around OLED Pixel Brightness 13 (and ~23 with Peak Brightness off). For Game Optimizer (Peak Brightness off) that will be somewhere around 31.
Hope that helps!
Hi, thanks for the answer. This is a lot lower than what I’ve seen on the C2, where most calibrators seem to put it between 27 and 35 (peak brightness off). Is the C3 significantly brighter in SDR? And are these measurements based on 123 nits at 109%?
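In case it helps anyone reading along, the 123-nit figure is consistent with plain gamma-2.4 arithmetic (my own back-of-the-envelope, assuming reference white at code 235 is calibrated to 100 nits):

```python
# If reference white (video code 235) is set to 100 nits, the 109%
# signal (code 255) comes out noticeably brighter once it goes
# through a pure gamma-2.4 response.

signal = (255 - 16) / (235 - 16)  # ~1.091, i.e. 109% of reference white
nits = 100 * signal ** 2.4        # ~123 nits
print(round(nits, 1))
```

So "100 nits calibrated at 100%" and "123 nits measured at 109%" describe the same calibration, just probed with different patterns.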
From my experience, I have never needed over 120 nits (55% in game mode) as long as the blinds are somewhat closed. As long as you don’t have direct sunlight shining into your room, the brightness should be fine. That being said, these TVs are marketed towards dark-room viewing, and RTINGS specifically has recommendations for bright-room viewing, mostly QLEDs.
Since you mention contrast, do you notice backlight bleed on your current TV under your normal viewing conditions? If you do, then I’d say the conditions are dark enough for an OLED.
Thanks for the reply. Currently I have a C9 and use Cinema mode with ALLM on; I thought the C3 would behave the same way, where the input lag is like game mode but you can use whatever mode you want with ALLM. And I agree with you, LG OLEDs are awesome. I just want to upgrade to get a bigger screen, because to me the C9 is already perfect.
On AVS forums, there are some unverified reports that the input lag in other modes with ALLM enabled varies between 1 and 3 frames of extra delay, which is not terrible, but not very good either. It’s likely the same if not worse on the C9, so if you don’t notice it there, it shouldn’t be a problem on the C3.
However, I think there aren’t many reasons not to use game mode. You get HGIG, and that alone alleviates a lot of tone mapping issues. In Filmmaker mode, you almost always have to use dynamic tone mapping (as games do not send metadata to the TV), which is painful to look at at night and completely blows out the picture.
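For a sense of scale, those unverified frame counts convert to milliseconds like this (assuming a 60Hz signal):

```python
# 1-3 frames of extra delay at 60Hz, expressed in milliseconds on
# top of game mode's baseline input lag.

frame_ms = 1000 / 60
low, high = 1 * frame_ms, 3 * frame_ms
print(round(low, 1), round(high, 1))  # 16.7 50.0
```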
Yup, that’s the case (in HDR at least): the brightness is the same in game mode + Game Optimizer + PC mode as it would be in game mode alone (in HDR).
Another question if I may ask: is there a difference in ABL/ASBL dimming aggressiveness in PC mode? Or is it the same as normal game mode outside PC mode?
I know this is an older post, but this seems to be a bug with the YouTube app not switching its rendering to 4K when switching away from an HDMI port or broadcast. If you switch to another app like Netflix and then back to YouTube, it is temporarily fixed.
I’m not sure how long you’ve had the TV, but every OLED I’ve owned usually had uniformity issues for the first days of use. It goes away after the TV has run a few compensation cycles.
HDTVtest recently confirmed that the Dolby Vision game mode exhibits a raised EOTF overbrightening the picture.
However, I’m now also reading user reports of the same issue when using 4:4:4 chroma passthrough in HDR10. Can you test this in the lab? 4:4:4 chroma is important for proper text rendering, and an inaccurate EOTF in HDR would make this TV less attractive for PC gaming.
No, it doesn’t seem to cause any noticeable input lag. I had the B7 for a long time and looked at various workarounds for dark HDR; all dynamic contrast seems to do is mess with the gamma values, nothing more than that.
The dynamic contrast ‘low’ in Cinema mode is something different: that was a prototype of LG’s dynamic tone mapping feature that came in later models, but it only works in Cinema mode and won’t do the same in game mode.
Most games should have HDR calibration settings at this point, where you can hopefully offset the problem.
The 8 series onwards changed how HDR metadata is handled, and more recent models have HGIG alongside LG’s own dynamic tone mapping feature available in game mode.
It was possible to alter the HDR data for the way the 7 series was designed, but this was complicated and involved an HDFury device sitting between the console and TV to inject the data, and you had to alter it for each game. I forget the details beyond that, but at the time I thought it was more hassle than it was worth.
Do you recommend using the dynamic contrast option or changing the brightness settings in-game?
Hi reerden,
If you’re looking for an Xbox Series X compatible headset with good frequency consistency, you might want to try the Astro A40 TR Headset + MixAmp Pro 2019. They’re great for wired gaming, have wired and analog compatibility with the Series X, and have much better frequency response consistency than the Arctis 9x. Another option could be the Audeze Penrose Wireless. They also have much better frequency response consistency, but only have analog compatibility with the Series X. Feel free to let me know what you think! :)
Thanks for the answer. The Penrose (or the Penrose X, which probably sounds similar) seems like a good option, but carries a rather hefty price tag. There are also concerns about its comfort.
What about the Astro A50 Gen 4? That seems to be the wireless version of the A40. Is there a reason it’s not listed in the recommendations?
I just noticed that there are actually three presets suggested: Cinema under the HDR title and Technicolor under HDR10. Now I’m even more confused.
I’ve read you recommend Cinema these days in other discussions, but it’s still not clear what the color temperature should be in HDR10. Is Warm 1 recommended? And what about the color temperature in Dolby Vision? Should I set Dolby Vision to Warm 1 as well?
Up to 10, it applies edge smoothing without any sharpening effect. This has been shown to occasionally cause ringing artifacts on the 2017 models, and probably the 2018 models. 0 is the true no-processing sharpness.
My guess is that the edge smoothing has been improved in the 2019 models, which is why 10 is now the recommended setting by RTINGS.
My personal opinion is that 10 is better than 0, given the inherent sharpness of edges on OLED compared to LCD. I’ve personally never seen any artifacting at 10, nor any sharpening effect, and the perceived sharpness is better at 10.
Is it also the same, just like the input lag?
Hi, I have a question about the HDR brightness measurement in game mode. Was this done with the input icon set to PC (or 4:4:4 pass through enabled)?
And is there a difference between having ‘PC’ icon (or 4:4:4) enabled or not?
Hi, thanks for the answer!
Does that also mean it remains the same across “picture modes”? For instance, selecting game optimizer while in PC mode won’t make it extra dim?