LG 27GS95QE-B
ASUS ROG Strix OLED XG27AQDMG
The LG 27GS95QE-B and the ASUS ROG Strix OLED XG27AQDMG are very similar 27-inch OLED gaming monitors. The ASUS is a better option if you prefer the vividness of a glossy screen. The ASUS is also a better choice if you find VRR flicker distracting, as it has a setting to reduce this, though it causes some stutter. However, the LG is better if you prefer a more consistent display in different lighting conditions with its matte screen. It's also a better choice if you plan to use the Xbox Series X|S with your monitor, as it's more compatible with that console.
Comments
LG 27GS95QE-B vs ASUS ROG Strix OLED XG27AQDMG: Main Discussion
Does macOS not support scaling with 4K 120Hz HDR output? When I connect one over USB Type-C (Innocn 27M2V), I can select 4K 120Hz with HDR. But when I set 1440p scaling at 120Hz, HDR no longer exists as an option; when I switch to 1440p 60Hz, the HDR option is back.
How many rows and columns of mini LED zones does it have?
No, I think this is an inherent problem with mini LED tech in the monitor space. I really hope RTINGS develops a testing methodology similar to brightness at different % window sizes, testing the minimum black level at which a monitor can still show proper contrast.
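In the meantime, a crude way to eyeball this yourself is to generate near-black checkerboards at increasing gray levels and note the lowest level where the squares are still distinguishable. A minimal sketch using Pillow; the 8-bit levels and square sizes are arbitrary choices of mine, not any official test pattern:

```python
from PIL import Image

def near_black_checkerboard(level, width=1920, height=1080, square=120):
    """Checkerboard alternating pure black (0) and a near-black 8-bit gray `level`."""
    img = Image.new("L", (width, height), 0)
    tile = Image.new("L", (square, square), level)
    for ty in range(0, height, square):
        for tx in range(0, width, square):
            if ((tx // square) + (ty // square)) % 2:  # fill every other square
                img.paste(tile, (tx, ty))
    return img

# View fullscreen in a dark room; the lowest level where the pattern is still
# visible is roughly the display's usable near-black floor.
for level in (1, 2, 4, 8, 16):
    near_black_checkerboard(level).save(f"checker_{level:02d}.png")
```

It won't measure anything objectively, but it makes black crush from aggressive zone dimming easy to spot.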
Will we have a best-by-type list for mini LED/full-array local dimming monitors, like the ones for other types such as OLED? https://www.rtings.com/tv/reviews/best/by-type/oled
Ignoring OLED, which one is the best local dimming monitor? Is there a list showing the best mini LED monitors?
Even if one scores higher on local dimming, it doesn't necessarily mean it's a better mini LED display; black crush, blooming, backlight response time, and dimming zone count are all complicated metrics.
Will the test kit/test patterns be downloadable somewhere? If the test pattern is a website like Blur Busters' TestUFO.com, will it be accessible online for everyone to test with? Or will it be open-sourced on GitHub?
Is there any suggested HDR setting to avoid black crush? It seems like in some movies with darker scenes, e.g. House of the Dragon or Dune, the screen's backlight goes completely off and nothing is left to be seen.
I know there are 5,000+ dimming zones, but how many rows and columns exactly? I think this would be important info in the local dimming section.
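For what it's worth, you can get a ballpark figure yourself if you assume the zone grid has roughly square zones matching the panel's 16:9 aspect ratio; that's an assumption on my part, since manufacturers don't have to lay zones out that way:

```python
import math

def estimate_zone_grid(total_zones, aspect_w=16, aspect_h=9):
    """Estimate rows x columns of a dimming grid, assuming roughly square
    zones on a panel with the given aspect ratio."""
    cols = round(math.sqrt(total_zones * aspect_w / aspect_h))
    rows = round(math.sqrt(total_zones * aspect_h / aspect_w))
    return rows, cols

rows, cols = estimate_zone_grid(5000)
print(f"~{rows} rows x ~{cols} columns = {rows * cols} zones")
# ~53 rows x ~94 columns = 4982 zones, close to the advertised 5,000+
```

The real layout could differ, so I agree the exact grid belongs in the local dimming section.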
There is a Chinese review of it here: https://www.youtube.com/watch?v=6s5E1uujK9Q
It was marketed as a TV or a single-panel multi-monitor setup more than just a 4K monitor.
Can heat and coil whine be measured as well?
From the TV longevity test, it seems like heat is one of the major issues that cause panels to die. Given this is a mini LED display, how does the heat of this monitor, which has no fan, compare to other mini LED displays?
Also, some Reddit users have stated there is coil whine in their units; is that something that will be checked?
That’s unexpected
When you guys have a chance to update or write a new one, it would be nice to have some insight on how to choose a connection based on the desired resolution and refresh rate. E.g., you should use HDMI 2.1 for 4K 120Hz, but why shouldn't you use DP 1.4 with DSC, or with a different color format like 4:2:2? Does adding DSC cause extra latency or image artifacts?
I see there is a tool that states the max refresh rate based on the settings, but there is very little information on which one to choose.
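Until something like that gets written up, here's a rough back-of-envelope check you can do yourself. The link rates below are the standard effective payload rates (DP 1.4 HBR3 after 8b/10b encoding, HDMI 2.1 FRL after 16b/18b); the blanking overhead is my rough approximation of reduced-blanking timings, so treat the output as a ballpark:

```python
# Effective payload rates: DP 1.4 HBR3 = 32.4 Gbps raw * 0.8 (8b/10b) = 25.92 Gbps
#                          HDMI 2.1 FRL = 48 Gbps raw * 16/18 = ~42.67 Gbps
LINKS_GBPS = {"DP 1.4 (HBR3)": 32.4 * 0.8, "HDMI 2.1 (FRL)": 48 * 16 / 18}

def required_gbps(width, height, hz, bits_per_channel=10, subsampling="4:4:4",
                  blanking_overhead=1.07):
    # 4:4:4/RGB sends 3 full channels per pixel; 4:2:2 averages 2; 4:2:0 averages 1.5
    channels = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[subsampling]
    # blanking_overhead approximates the ~5-10% added by reduced-blanking timings
    return width * height * hz * bits_per_channel * channels * blanking_overhead / 1e9

mode = required_gbps(3840, 2160, 120)  # 4K 120Hz 10-bit RGB (HDR)
print(f"4K120 10-bit RGB needs ~{mode:.1f} Gbps")
for name, cap in LINKS_GBPS.items():
    verdict = "fits" if mode <= cap else "needs DSC or chroma subsampling"
    print(f"  {name} ({cap:.2f} Gbps): {verdict}")
```

This shows why 4K 120Hz 10-bit RGB needs either HDMI 2.1 or DP 1.4 with DSC (or 4:2:2): it's roughly 32 Gbps against DP 1.4's ~26 Gbps. As for DSC itself, VESA specifies it as visually lossless, and the added latency is tiny (on the order of scanlines, not frames), though individual implementations can vary.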
Well, that's disappointing. Are non-OLED flagships just not "mainstream" anymore?
How will you guys test the local dimming? There is an option to turn FALD on and off in SDR, but FALD is forced on in HDR.
Please test the local dimming in both SDR and HDR mode. It feels like the blooming control is different, and blooming (especially on a gray background with text) feels less obvious in HDR.
Still a bit difficult. Monitors, earphones, etc. all have diminishing returns. A $10,000 product may provide very similar quality to a $5,000 product. The quality may be 10% better at 100% more expense, yet it may still be worth the price if it provides the absolute best quality in its category.
E.g., Samsung microLED TVs right now are super expensive, like 10 times the price of the highest-end OLED TV, but they are indeed the best TV you can get, with no burn-in and much higher brightness. Is it worth 10 times the price? Maybe not, but the quality they offer is in its own class, so if you want the absolute best TV, it may be worth it for some people.
If RTINGS starts providing suggested price tags, it may create an unnecessary process of adjusting them to the market every year as technology advances.
I think those are just old votes that have been here since this monitor launched. It is now the leader after losing so many rounds (2 months per round).
I think it will be very difficult to set that standard. How would you differentiate the scores properly?
Putting 5 as a “fair” deal?
Can you guys test whether the HDMI 2.1 ports have compatibility issues? There seems to be an issue with the Chromecast Ultra not being able to output 4K when connected, but if you switch the monitor's port to HDMI 2.0 mode, the Chromecast can output 4K HDR.
I think the cycle needs to be much shorter, like 2 weeks to a month max, with more transparency on the criteria by which a certain product is bought. Or maybe the top 3 of each cycle should be bought instead of just the top one.
A lot of new products may be released in 2 to 3 months. Besides this monitor, the current leader, the 27M2V also lost the last cycle on the last day of voting; I mentioned the details in another forum post.
The lack of transparency on when and why some products jump the queue while others are stuck in the voting stage for multiple cycles just wastes/locks people's votes on older products.
This was an issue in the monitor section during the last voting cycle.
The Innocn 27M2V had been gaining positions throughout the month and slowly climbed to the top spot with a week remaining. Then the flagship LG OLED, the 45GR95QE-B, was released and somehow "sniped" the top spot on the last day of the cycle. I think these big-brand flagships should already be planned in RTINGS' review queue (look at the new 500Hz Alienware AW2524H), and there should be no need to vote on them. Flagship voting reduces the opportunity for lesser-known products to gain visibility on RTINGS.
Another suggestion is to add a list of planned reviews to the IN PROGRESS section, instead of just showing the products that have been purchased, "committing to review" an upcoming flagship so people don't need to waste votes on it. This gives smaller, lesser-known but interesting products a better chance of getting the exposure they need.
This is a 27" mini LED monitor from a lesser-known Chinese brand with the following specs:
On paper, this looks very interesting and worth a proper RTINGS review and some recommended settings.
Maybe add a timestamp for the last updated date to the list on this page? Currently, you need to cross-check other posts to know which one on the list has been updated.
You're commenting on a post named "How Long Should A TV Last? Our 100 TV Accelerated Longevity Test" under the Home > TV > Tests section, asking why a monitor isn't included in a TV test?
FYI, the AW3423DW is just a QD-OLED panel, and there are 2 of those in this test: the Samsung S95B and Sony A95K. Just look at those 2 if you care about QD-OLED longevity.