Hi RTINGS team, when will the review of this product be available? (It's been in the lab since May now…) In the meantime I bought it myself, and it's one of the best pieces of hardware I've ever bought. I'm particularly interested in the review of the keyboard; it's very good for a tablet keyboard. The display is also really good: no flickering from my perspective, and fast.
Hey!
Glad you're happy with your purchase! It should be the next one we tackle, but we're currently working on an update to our laptop benchmarking, so it'll likely come out once we've made sure all the scoring makes sense and we've run the new tests on the products.
That said, we're slowly going through the backlog, so it might come out before the update is live; you'd be able to see the screen testing earlier if the timing works out. I'll talk to the team to see if we can make it the next laptop to be tested.
Are the latency tests at 0.1mm? Just wondering; normally settings below that aren't that usable, but the latency seems extremely worth the price at first glance if it's at settings that are usable in-game.
Hey!
I can confirm it was tested at 0.1mm. We always use the settings that give us the best latency when running these tests. If there are ever any exceptions, the reason gets written up in the text, but in this case we're good! :)
Is it possible you used the USB 3.2 port instead of the USB 4 port? I’m not sure if that would make a difference but I know the specs for the two are definitely different.
Hey!
I can confirm that I used both ports. In any case, USB 3.2 is rated for up to 100W and USB 4 for up to 240W, though I don't expect to get anywhere near those figures in actual testing.
So far I’ve been able to get a max of 28W which seems odd. I’m going to investigate it a bit more tomorrow though to see if I have another device I can use to charge.
I looked into this a little today! So, the USB-C port is rated for 100W, but I wasn't able to get anywhere near that. The highest I managed was 10W; however, I noticed the laptop's battery wasn't quite at 100%, so I'll give it another look on Monday once it's fully charged!
Hey Alexander,
This could be from Armoury Crate. People have been uninstalling it ever since G-Helper came out as a replacement. The default app makes the laptop run hotter and causes freezes in the system as well as in games. Maybe this is something you can check out.
I literally had this problem on my wife's laptop this weekend with Armoury Crate and used G-Helper instead, which solved it. I might eke out some time to see if there's any noticeable difference between the two programs as well.
Thanks for the update!
With my spectro, I do however get a very minimal deviation, with blue peaks at different brightness levels. I'm not completely sure what is causing this, but the display's white point (after fine-tuning) is sometimes not stable, even after letting it warm up. Here is the SPD graph I created; both give me the same white point though. Most of the MacBook Pro M4s sold in the UK were as warm as 6000K when I tested them. The one I received (CTO) from China was almost the same as yours: 6380K, x 0.3140, y 0.3330.
https://postimg.cc/0bqBHcBn
It could be any changes you're making as well. We don't currently calibrate our laptop displays, so ours is out of the box, and any deviations from that can cause micro changes in the colours that the tool can pick up. 6000K is quite warm, so I'm surprised to hear that!
What spectro did you use to measure the spectral chart? I measured my nano-texture M4 Pro quantum dot with my EFI ES-3000 in 3.3nm mode, and it pretty much matches the one in this review. I'm guessing they shifted to quantum dot because it increases colour accuracy, and the panels are also much easier to calibrate now. The phosphor MacBook displays only reached around 97–98% P3.
Hey!
We use the Colorimetry Research CR-250 spectroradiometer. There are a few reasons they may have shifted to quantum dot, but I'm not complaining about it. I'm glad that even though we're using different spectro models, the results still look the same, even across the model line. Apple does a good job of ensuring their products give the same experience across the product line; not a lot of variance there.
Hi! I bought this laptop and I love it! But I'm confused: in the "Serviceability" section, it says you only have to remove 4 screws, but I read that there are 2 more screws hidden under the large rubber foot that has to be removed to open it. Is that true? I want to upgrade the SSD to 2TB. Thanks!
Hey!
On our model, there were no screws under the foot. It could be an error in their documentation, a difference in the assembly of this model (or batch of models), or it's possible that different variants of this model simply don't have these screws. To be safe, I'd remove the foot on yours to check before you try to dismantle it and upgrade to a 2TB SSD. Good luck!
Hey, did you enable Mullvad's Lockdown Mode? I feel this should be mentioned in the article.
Hey!
Yes, we always enable the most secure version of the kill switch for each VPN. You can check out the main R&D article we put out that goes into more detail about these settings, and I'll forward this to the writing team to see if we should mention it in the reviews themselves.
The performance tests seem strange. Both the download speeds and the upload latency are better with the VPN turned on. The latency results without VPN also seem much higher than in other VPN reviews. Could you look into this?
Hey!
We’re currently looking into beefing up our VPN performance tests, so the good news is that it’s already being worked on! There’s still value in collecting as much data as we can, as that will help us identify exactly where and what issues exist with the current iteration of the test.
That said, it is possible for speeds to be faster with the VPN than without. It's quite complicated, but essentially, your connection without the VPN may be taking a less efficient route to the speed test servers than the route the VPN takes, which translates to the VPN being faster.
As for the latency, it’s actually quite good. While the highs and lows you can see on the graph are a bit exceptional compared to most of the VPNs we have tested, the consistency is still considered good. You can see this behaviour on one of our best performing VPNs (when it comes to latency and how we score it) in Windscribe as well.
Thanks for the interest in the reviews though! And if there’s any other questions, don’t hesitate to ask!
Hey!
Just wondering when the update is coming or if it is coming? Thanks!
Hey igort!
Just wanted to touch base with you again. Been working on the benchmarks on this laptop, but I’ve been running into some issues with it. After updating the laptop, there are a lot of stability issues on the system itself. Lots of reboots and crashes are making it difficult to collect the data I need. I’m still working on it, but I just wanted to let you know that there’s definitely some bugs to iron out with the update. I’ll be completely wiping it and starting from scratch to see if I can either pinpoint the issue or it resolves itself.
Hey!
Just wondering when the update is coming or if it is coming? Thanks!
Hey!
I have not forgotten about you, don't worry! I was out sick for a few days, so it's been delayed a little longer than I initially thought. It's on its way though!
Mullvad doesn’t allow for port forwarding which hinders torrent seeding. Shouldn’t this affect the score?
Hey!
Currently, we don’t look at port forwarding as a feature yet. I know it’s a feature that a lot of people care about and I’ve already added it to our suggestions for the next test bench update on VPNs.
That said, our 'perfect' score for the Downloads box in the review is a 9.5 because we don't feel we're at a point where we can call any aspect of a VPN a true 10/10 on that test. But we're happy to iterate on our testing in the future to make sure we cover everything that matters in a VPN.
Hello RTINGS!
There has been a BIOS update for Intel processors fixing the inconsistent performance issues. It would be nice to see if the 1% lows are now much better and if average FPS has increased.
I have seen a couple of YouTubers comparing before and after, and the results are pretty surprising!
Hey!
Yeah, I've been reading about this more and more, and it seems like there's some interest here! I'll make some time to give it a shot early/mid next week, so you can expect it to be reflected in the review (whether there's any change or not) by the end of next week. Hopefully there are some noticeable improvements!
How can you measure uniformity with a black picture? Uniformity should be measured with e.g. a grey picture and a measurement device.
You are mixing it up with e.g. IPS bleed.
I think you should learn from www.prad.de
Moreover:
What kind of measurement device do you use when measuring the panels?
When was the last time you calibrated your measurement device?
Hey!
It seems like there might be some confusion about what we're testing. On laptops, we only test black uniformity, i.e. how uniform the screen is when displaying black. IPS bleed is one of the things that gets accounted for in the black uniformity result, and it's still important to know what to expect on a black screen. In the future, we might add a grey uniformity test to the test bench, which would use a grey picture instead of a black one.
There are a few different ways to get the uniformity of a screen, and I'm aware that some people use measurement tools, but our method is different. For laptops, we take a photo of the screen (with specific settings on both the camera and the display) and run it through a job that compares each pixel's data against the others, with a low-pass filter to cut out any noise picked up by our camera. We also exclude the center region around the cross from the comparison, since we want the scene to have a highlight in it; many real scenes are mostly dark with some highlights. At the end, we report the standard deviation as a reference for how different the pixels are.
You can read about how it’s done across our other products such as TVs and monitors. We don’t have a dedicated article for laptops as of yet, but it’s a similar design principle here. I hope that answers your question, but feel free to ask for any clarifications if you need.
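To make that concrete, here's a rough Python sketch of the general idea (this is not our actual pipeline; the block size, center-exclusion region, and synthetic frames are all made up for illustration):

```python
import numpy as np

def black_uniformity_stddev(image, block=8):
    """Rough uniformity estimate: low-pass the luminance by block
    averaging (to cut camera noise), drop the highlighted center
    region, and report the standard deviation of what's left."""
    h, w = image.shape
    h, w = h - h % block, w - w % block          # crop to block multiples
    cells = image[:h, :w].reshape(h // block, block, w // block, block)
    smoothed = cells.mean(axis=(1, 3))           # crude low-pass filter
    bh, bw = smoothed.shape
    mask = np.ones((bh, bw), dtype=bool)
    mask[bh // 2 - bh // 10: bh // 2 + bh // 10,
         bw // 2 - bw // 10: bw // 2 + bw // 10] = False  # exclude center
    return float(smoothed[mask].std())

# Synthetic dark frame with sensor noise vs. one with a brightness gradient
rng = np.random.default_rng(0)
flat = 0.02 + 0.005 * rng.standard_normal((480, 640))
uneven = flat + np.linspace(0.0, 0.1, 640)       # brighter toward one edge
print(black_uniformity_stddev(flat) < black_uniformity_stddev(uneven))  # True
```

A lower standard deviation means a more uniform black screen, which is why noise has to be filtered out first: otherwise the camera's sensor noise would dominate the result.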
Both were supposed to start at the same time, actually, but the truth is that a member of the team was off, which delayed one of them by a day. No need to worry though; they are both in progress as we speak.
Can you use Steam remote play on this laptop to stream a game to it?
Hey!
You absolutely can. The best option is to download the Steam app itself onto the laptop, rather than using Steam Link or anything else, and remote play directly through the app.
Does this or does this not support HDR in Windows (not video streaming, but for 2D windows and games)?
Yep! This works with Windows HDR which you can enable in the display settings of Windows. Usually you’ll need to enable it in games separately if it doesn’t automatically detect that it’s on.
It would be good to see these updated since the performance issues were resolved in the BIOS months ago. It’s a very consistent performer now. I’m also surprised that the brightness of the display was so low when so many other review sites have it at over 400 nits and even higher in HDR mode…
I still think this is one of the best multipurpose gaming laptops on the market!
Hey!
We’re hoping to do a larger scale update to the testing in general for laptops that will do a better job of capturing the performance of the laptop. I agree though, love this laptop personally and it’s quite a good contender in general.
The way we do our brightness test is definitely different from most other reviewers: we take an average of five separate points on the screen using a checkerboard pattern. This can explain the differences between us and other reviewers, but we feel it matches a regular person's usage of the laptop display better than simply trying to get the brightest number possible.
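As a toy illustration of that averaging (the readings below are made-up numbers, not real measurements), a single bright center spot reads higher than the five-point average we'd report:

```python
# Hypothetical five checkerboard readings in cd/m² (illustrative only)
readings = {
    "center": 412.0,
    "top_left": 388.5,
    "top_right": 391.2,
    "bottom_left": 379.8,
    "bottom_right": 384.0,
}
average = sum(readings.values()) / len(readings)
print(f"peak spot: {max(readings.values()):.1f} cd/m²")   # 412.0
print(f"reported:  {average:.1f} cd/m²")                  # 391.1
```

So a site that only measures the brightest point will naturally publish a higher nit figure than a multi-point average, even on the same panel.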
HDR is something we do check on some laptops, particularly ones that advertise the quality of their screen/HDR, so it’s something we can definitely look into for this model in the future.
I’m looking forward to the retest, I’m considering this router for using with Virtual Desktop with VR, I usually play about 40 feet away from the router around a corner, I currently use the ASUS RT-AX86S router which has worked very well for my current set up, but I believe the router is starting to fail, so I’m in the market for a new router.
We’ll be getting to it over the next few days, so it’ll likely be live by the end of the week or latest very early next week.
Could you guys retest this now that firmware 1.1.1 is out, which brings a TON of features/fixes/etc? Especially would love to see its MLO (re)tested, since it has had issues for sure, and might still, but that’s what the test is for!
Hey!
Wanted to give you an update! We should be able to give this a look next week! I’ll make another post when either we’ve finished the retesting of it or when we’re starting depending on whether it’s the beginning or end of the week.
This review is wrong: the switches are replaceable without soldering, and the low actuation settings work flawlessly.
https://www.youtube.com/watch?v=0rB1iHXGizs&t=224s — look at 1:46, the dude removes the switches.
https://www.youtube.com/watch?v=IeulNKIVDJE&t=728s — look at 13:32, the dude just removes a switch with a switch puller.
They are using the REASHA Magnetic switches, which are pretty decent.
Hey! You're absolutely right; I just went to double-check our model, and that is the case. The review should be updated shortly! Thanks for catching that!
Could you guys retest this now that firmware 1.1.1 is out, which brings a TON of features/fixes/etc? Especially would love to see its MLO (re)tested, since it has had issues for sure, and might still, but that’s what the test is for!
Hey!
Yeah, we can give this another look with the new firmware. I expect we can fit it in within the next couple of weeks; I'll update you when I have a better idea of timing, but I think it's definitely worth checking into.
I’d be interested to know if the latency has been improved in the newer firmware updates. The latest release notes mention improvements to 2.4G and Bluetooth. I wonder if it resolves the outliers seen in the 2.4G latency testing. Is this something you could consider updating?
Hey!
We’re keeping an eye on the updates so we’ll likely get to this in the near future. We’re planning on going through a few keyboards, but I don’t have an exact timeline yet. It should be fairly soon, but I can bump this when I have a better idea of when we’ll be able to get to this in particular.
Thanks for bringing it to our attention though; it helps us prioritize which products to focus on.
Unfortunately, the keyboard doesn’t have true NKRO. When you enable the setting in the software, it is still limited to 16 keys and not a true NKRO feature.
Hi,
I have the same router and my data doesn’t match your results.
What device do you use as a receiver?
I suspect you have some interference in the 5GHz band, as your results don't make sense.
I have a server with OpenSpeedTest connected by cable to the router (1Gbps port), and using a Pixel 7a as the receiver, I get the same speed on 5GHz and 6GHz when close to the router, almost 1Gbps. When I move to a room a bit farther away and behind a wall, 6GHz performance drops much more than 5GHz.
Also, if you find a receiver device that has 4x4 Wi-Fi and put an OpenSpeedTest server on the 2.5Gbps port, in theory you would see double the speed on the 5GHz band.
An advantage of 6GHz is the latency; perhaps it's because I have several devices connected to 5GHz and just one device on 6GHz, but a ping to my server was faster over the 6GHz band. Latency would be a good test to run in a proper test environment.
Anyway, the main advantage of the 6GHz band is that I don't have any interference, as I'm the only one with a 6E router in my area :)
Hey!
I'm not completely surprised that we got different results, especially after reading about your setup. Our methodologies are very different, and we absolutely have some interference that, for now, we're unable to isolate. This is something we're fully aware of and are working on for future updates, so if you have any feedback you'd like to share, we take it really seriously.
As for our setup, we've got a server wired to the router running iperf3, and we use a laptop as the measuring device: a Lenovo P15 Gen 2 with an Intel BE200 Wi-Fi 7 NIC running Clear Linux. In our experience, it gave us the most stable results. We do expect the achievable speeds to be higher than our measured results, but a big value of this test in general is that it shows the loss in speed over distance rather than the maximum possible speed. This is also something we'll be tackling in a test bench update, so we'll be able to make the improvements we need in the future!
Nice! At home I'm in the same position, so my 6GHz band is empty except for me.
Hey,
seems like no one has had time to take a look at this yet. Just replying to give it a little bump!
Best regards
Hey!
Yeah, sorry! December always has more vacations and such going on than I anticipate. That said, I’ve made the task and it’ll be pushed through to the next available tester so we should have the results this week! Sorry for the delay!
Have you noticed the MAX brightness is significantly dimmer when the “automatically adjust brightness” setting is turned on? Is this consistent with other models or a new feature?
Heyo!
Absolutely! It's consistent with other models for sure. We always turn it off for our testing since we want a stable brightness whether the lights are on or off, but it's been present on every MacBook I can remember testing. In a brighter environment, it dims more than in a darker one, but it's never as bright as when you turn the setting off.
In the Storage Drive Performance section, you mention "Drive speeds depend on the SoC configuration and capacity." Can you provide any reference for what the speeds are for the different SoC configurations? I'm trying to get an idea of how much the different drives vary, and I can't seem to find anything. I'm interested in the 2TB drive.
As for the noise, can you add a video of it to the review so we can get an idea of the "worst case" scenario? Do you plan to review any of the new MacBook Pros with nano-texture displays?
Hey!
Typically, in our testing of Apple silicon, speed increases with the size of the drive. In general, larger SSDs are faster than smaller ones, usually because they're made of many smaller chips rather than one larger one; the drive can read/write multiple things at the same time, and it often has larger cache chips, etc. There is a maximum speed, so I wouldn't expect it to get much higher on this test. On other tests, the larger drives have hit as high as ~8,000MB/s, but those aren't directly comparable to ours. In either case, these drives are plenty fast and among the fastest we've seen in general, so if you're worried at all about speed, know that you're getting some of the fastest drives out there at any size.
I would expect our method for testing storage speed to change in the next year; our current test is on the basic side, and with feedback like yours, we'd like to improve how we present it and what tools we use to do so.
Ahh I see. IMO frequency response accuracy, and more importantly bass extension, matters more for laptops, as most laptops struggle heavily with it except Apple.
I appreciate the feedback and I’ll make sure to bring it up when we revisit the scoring/update the test! Thanks for the comment and feel free to post more if you have more comments on the tests!
I see. I assume you guys tested it in THX Spatial mode for music; would it be possible to test the frequency response again with music in just normal stereo? The spatial mode makes everything sound tinny, and bass isn't as prominent as in stereo due to the boost in the highs meant to make everything sound more spatial 🤣 The Blade 18 has a larger speaker enclosure and larger force-cancelling woofers than the MacBook Pro 16, and IMO with the correct EQ it should perform similarly, but with reduced peak volume, which the Mac also has compared to the Blade 18.
Yes, that’s correct. We don’t go too much into the speakers as we can’t be sure which mode is best without more extensive testing, as we don’t want to lean too much into subjectivity with this test bench. That said, it’s totally possible that we shift this philosophy for future test benches.
Our current scoring does weight peak volume very heavily (35.2%), so if switching modes reduces the peak volume and decreases the bass extension in our readings, the overall score will trend downwards from its current value. It might then appear worse on our current test even though it may technically sound better. This is absolutely a flaw with the current test, so I'll happily add it to our suggestions for next year.
Hi, may I know what speaker mode/EQ in Synapse was used for the speaker tests? Also, would it be possible to test the mini-LED panel variant of the laptop instead, as it is a far more popular and superior display choice for this laptop? Thanks!!
Hey!
We always test in the default speaker modes; in this case, it was the "Music" setting. As for the mini-LED panel variant, I'd personally love to test that one to see the differences, but I can confidently tell you it'll be the better buy if you're willing to spend the money. You can definitely expect better contrast, better brightness, and, I'd expect, better out-of-the-box color accuracy.
Any of these kinds of enthusiast gaming machines are going to be really good across the board, but you're definitely paying a heftier price for it. That option might be available on the next Razer laptop we buy, so if you want to vote for another one so we can prioritize it, check out the voting poll here.
Am I crazy, or is that a 60Hz flicker I’m seeing in the flicker-free test? It looks like the backlight operates with a 15KHz flicker and a 60Hz flicker at the same time, with the 60Hz flicker being much more prominent. Shouldn’t it be listed as having a 60Hz flicker then and judged accordingly? Was this test taken in 60Hz mode, and does it exhibit the same behavior at 120Hz?
Hey!
You are not crazy! But I can help clarify some of the information for you. There is indeed a flicker within a flicker here. It's fairly uncommon, but we've seen it before on other displays, particularly Apple's. As you mentioned, the backlight is operating at ~15kHz, and it appears in a pattern at 60Hz. All of this is true; however, we need to distinguish the screen's flicker from the refresh modulation of the display itself.
The backlight itself is flickering at 15kHz, but the screen is refreshing at 60Hz, which is what's being picked up by our tool. So while there is a 60Hz pattern feeding the display, the flicker one could actually catch with their eye would be the 15kHz one. I can absolutely vouch for the distinction, as 60Hz flicker is absolutely unusable to me, and I'd absolutely recommend staying away from displays that have 60Hz backlight flicker at all times.
The test itself is taken at 60Hz, and there isn't an easy way to collect the data at 120Hz on Apple with our current setup. On other MacBooks, when you shift to 120Hz, the larger pattern does change to 120Hz, but the backlight itself remains at 15kHz. As far as flickering goes, the experience is identical between the two modes. I did run out and double-check, and I can confirm the same behaviour on the 2024 MacBooks as well.
So, to wrap this all up: it's more of the same as far as flicker goes on the M-chip MacBooks. Hope that answers your question!
Yessir! I’ll add it to our retest list so that we can get to it ASAP. I expect we’ll be able to catch up on a lot of these types of things early next week, so keep an eye out for some updates to come!
Will you retest the keyboard after installing the keyboard deflection kit?
Hey!
Our unit did not come with the keyboard deflection kit. Currently, we’re not planning on buying it but this could change and if it does, we will absolutely retest the keyboard performance.
Dear RTINGS team, could you please verify whether the key-release delay is also present in wired mode? 33ms on key release is 100% noticeable.
Hey!
I’ve added a note for us to go and take a look at it, but it’s likely we’ll only be able to get to it around early December. That said, I quickly plugged it in and gave it a try and wasn’t able to feel a difference, but the data will help to show what the difference is once we’ve got a tester available to jump onto this task.
The K70 MAX has new firmware out (v1.14.90) and a new iCUE (v5.20.89). Actuation is now adjustable from 0.1mm to 4.0mm; before, it was limited to 0.4–3.6mm.
WARNING: After updating iCUE and the firmware, close the iCUE app immediately and re-calibrate the keyboard before testing! Otherwise, in iCUE under "Key Actuations > Actuation Presets > Primary Actuation Point", setting the value to 0.1 will make keys trigger randomly!
Hey!
I’ll create a task for us to look into this. Thanks for bringing it up!
Hi there, were the speakers on the Dolby Balanced profile or was Dolby completely off when you tested the speakers? Thanks.
Hey! Dolby was completely off, and it typically is. The way our speaker test is done, we want to put all laptops on the same playing field, so we make sure there are no active sound profiles or anything enabled, if we can.
One reason for this is that our test doesn't play very nicely with Dolby profiles in particular. We use a lot of pink noise and sweeps, and the Dolby profiles tend to make these significantly quieter than they are on real content, which negatively impacts the way we currently test.
Did the iPad Pro M4 measure any flicker?
Is there any super simple “no set up” tool to measure flicker at home?
Rereading your comments, I'm still confused whether the 13" MacBook from 2021 or 2022 measured flicker-free, or if both did.
When you say "IF" the 2021 13" MacBook is flicker-free, then according to the Apple tech, the larger variants from that year should also be flicker-free. It is the RTINGS review of the 2021 13" MacBook that says it is flicker-free, so are you thinking it's possible there is flicker on that model that the review's measurement tool just wasn't able to pick up, or how should I understand the "IF"?
Yo! The iPad Pro M4 does measure with flicker at 480Hz.
Opple Lightmaster meters are a consumer-grade luminance tool that will give you a bunch of information about a display (brightness, color temperature, flicker, etc). I haven't personally used one, as I have access to much more sophisticated tools, but the reviews of the tool itself seem positive, with the app being more of an annoyance than anything. I'm hesitant only because they don't publish the thresholds of its reading capability, so I'm concerned it might not be able to identify flicker above a certain cap.
Another method is to use a camera with the shutter speed set to 1/4000; you might be able to identify flicker that way. What you'll mostly see with this method is stroboscopic visibility, which shows up as wide gaps in the line. The MacBooks don't have wide stroboscopic visibility, so this method isn't really useful in this particular circumstance.
Sorry, I’ll try to make it a bit clearer. It does get a bit confusing.
So, the M1 MacBook Air we tested from 2021 does have flicker (this should be updated in the review soon). The M2 MacBook Air from 2022 does not have flicker.
What I meant by "if" was a reference to the information I received from the Apple Store Genius. I can confirm what I have here, but for any variants of our products (in this case, a larger M2 MacBook from 2022), I cannot verify it myself; I can only relay what Apple employees told me. So the "if" reflects my inability to 100% guarantee that information. What I can guarantee is that the M2 MacBook Air 13" 2022 does not have flicker.
I'm sorry, but I think I goofed. I just looked at the 2021 13" MacBook Air review on rtings.com, and that is the model that's flicker-free, so I was hoping you could find out if the larger models were also flicker-free that year. I realize you've already spent quite a bit of time on this.
It’s all good! It wasn’t a waste or anything, so no worries about that. :)
If the information I received from Apple is correct, then if one model is flicker-free, all models of that line, regardless of size, should be flicker-free as well. So the 2021 13" MacBook Air being flicker-free should mean that the larger Air sizes from that year are the same.
That would be so great! Thank you so much for your help. FYI, I'm interested in their largest laptop sizes without flicker, but more info on any sizes is always better (or whatever you can find). I say this because I'm not sure if they have three sizes like 13, 15, and 17, or just 13 and one bigger one. I'd probably prefer 16–17.
Hey! Sorry for the delay! The largest I was able to get my hands on was a 15" model from 2022, and it does have the same flicker behavior. I also spoke with some people at my local Apple Store, as I happened to be in the area, and managed to talk my way into a conversation with one of their techs. They told me that all displays within a model type (Air, Pro, etc.) are tuned the same way, so the 14" Pro would have the same screen specs and tuning as its larger variant.
I know it's not the answer you were hoping for, but it appears to be the reality, and it's exactly what folks who are extremely sensitive to flicker say is holding them back from buying Apple in the first place.
Hi, it looks like there might be a math error in the "Portability" volume measurement. If we multiply Thickness x Width x Depth, we should get a volume of 72.3 in³ (1,135.0 cm³), right?
0.7 x 12.3 x 8.4 = 72.324
1.7 x 31.2 x 21.4 = 1,135.056
What’s the math behind the Volume measurement shown in the review?
Hey!
So this isn't technically a math error, but rather a limit on the number of characters we can put in the box. All the results you see are rounded; however, the volume is computed from the pre-rounded values. The way the review is structured, this happens automatically for those particular boxes, but not for the volume. So your math is right for the displayed numbers, but the real numbers in this case are 1.68 x 31.2 x 21.39 = 1,121.178. These tiny differences add up once you've multiplied them together.
However, when we input this data into the site, it rounds up to 1.7 and 21.4, respectively. It's a known issue and is expected to be fixed in our next test bench update. I know it's a bit weird, but it's an old issue that exists on almost every laptop currently; once we've updated our test bench, we can change how it's presented so the final review doesn't automatically round. Hope that clears it up!
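A quick sketch of what's happening, using the example dimensions from above:

```python
# Multiplying the pre-rounded dimensions vs. the rounded ones shown on the site
thickness, width, depth = 1.68, 31.2, 21.39   # measured values (cm)

true_volume = thickness * width * depth
shown_volume = round(thickness, 1) * round(width, 1) * round(depth, 1)

print(f"from measured values: {true_volume:.3f} cm³")   # 1121.178
print(f"from rounded values:  {shown_volume:.3f} cm³")  # 1135.056
```

Rounding each dimension to one decimal before multiplying inflates the product by about 14 cm³ here, which is exactly the gap between the displayed dimensions and the displayed volume.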
Hey! Unfortunately, we don't have any plans to retest or add additional testing to this review at this time. It's one of the most accurate mice we have regardless of polling rate, which has little to no effect on the SRAV for this mouse in our testing, and we expect it to perform similarly at the most common DPI settings (400, 800, 1600, 3200, 6000).
Could you please retest the mouse’s latency with the latest firmware? It underperformed a bit at lower than 8000Hz compared to most other top tier mice in your testing and I would be interested in knowing if it is on par or even better now than the GPX2 for example.
Hey! Just wanted to let you know that I’ve added this retest to our list of 8K updates. We’ll give it a once over and you should see an update as to whether there are any changes soon. We’ve got limited resources at the moment so it’ll be a few days before we can properly start the retests, but it’s on the list.
Thank you for testing the new dongle!
Are the test results at 1000Hz also with the new dongle? Or are these the old ones? The performance at 1000Hz should also be better with the new dongle.
Hey! They are the old ones. However, we’ll take a look at this on the next retests we do on the Lamzu mice and we can check to see what the difference there is, if any! If it’s significant, we’ll be sure to update this review as well, but if it’s within our own testing parameters, we’ll let you guys know so that if you’re only using 1000Hz, you’d know that you may not need the new dongle. Either way, keep an eye out! :)
I would like to see 8khz and 4khz data using atlantis mini4k and Thorn’s 8k dongle!
Hey! Thanks for the feedback! Thankfully, we’ve already planned to retest these two mice (along with some others), but due to limited resources, it’ll be a few more days before we can get started on the retests. But we will be retesting them soon, so keep an eye out on their reviews to see updates!
Would you consider retesting Bluetooth latency now that macOS's Bluetooth firmware has been updated? They have increased the bandwidth of their Bluetooth controllers, so a latency reduction may be seen (there are differences between how Windows and macOS handle peripherals).
Hey! In order to get the most accurate results for keyboard latency, we intercept the data between the keyboard and the computer. This allows us to measure the keyboard's latency without including system latency, which can vary wildly depending on your personal system. Because we need a tool to collect this data, the keyboard doesn't connect to the computer directly via Bluetooth; instead, it connects to this tool, which collects the data.
All that to say that updating the macOS bluetooth firmware will not affect our results, since our tool is the middleman between the keyboard and the computer itself.
This is sort of the ELI5 answer, but you can find more details here.
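As a rough illustration only (this isn't our actual tooling, and the function name is made up), the measurement boils down to timestamping on the interposer rather than on the host:

```python
# Hypothetical sketch: latency is the gap between the physical actuation
# and the first HID report the interposed capture tool sees. The host OS
# (and therefore its Bluetooth stack) is never part of the measured path.
def keyboard_latency_ms(actuation_t, report_timestamps):
    first = min(t for t in report_timestamps if t >= actuation_t)
    return (first - actuation_t) * 1000.0

# e.g. actuation at t=0s, reports captured at 12 ms and 20 ms:
keyboard_latency_ms(0.0, [0.012, 0.020])   # ≈ 12.0 ms
```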
Thank you so much again! You are going above and beyond to help me figure this out, and I really appreciate it. I hope they give you a raise because it sounds like you deserve it haha.
There’s not much more I can ask of you, other than if there’s any way to find out if the larger 2022 MacBooks also have a flicker-free screen like the 13-inch one. I tried going to the Apple Store, but they had no idea. I also tried calling customer support, and I don’t think they even knew what flicker was.
If there is some Apple customer support number to actually talk to their engineers, then maybe I could get an answer myself, but I don’t know if there is or who to contact. Do you know anyone who owns that model and could measure it, or, if not, do you have any way to do it? Please let me know, or if you have any ideas on who I could contact. If you can do it, great, but if not, I understand.
Yeah, unfortunately the level of sensitivity to the flicker that most of their products produce seems to be very rare, so I’m not surprised that the employees don’t have any knowledge of it. It does seem to be a combination of brightness, colour space & contrast more than the flicker, in my opinion.
That said, if you’ll give me some time, I can see if I can find someone who’ll lend me their laptop and try it out for you. I can make another post towards the end of the week if you’d like as to whether I was successful or not and we can proceed from there. How’s that sound?
Thanks so much. Sites like these are not usually so helpful, so I really REALLY appreciate the quick and detailed response. Sounds like the MacBook Air 13-inch 2022 model is the only flicker-free MacBook? Do they have any 15+ inch flicker-free ones? Is there anyone at Apple you can contact, or any way you know to find out if the 2022 MacBook Air in sizes larger than 13 inches also has a fully flicker-free screen?
Any flicker-free iMacs?
Also, do you know if any of the iPads or iPhones have flicker-free screens? Or, for that matter, is there any smartphone, period, with a flicker-free screen? I cannot find a phone/tablet menu on RTings.com to look for myself, and all the research on my current flickering monitor is really starting to bother me. So, if you can somehow help me get to the bottom of this so I can finally pick something, it would be a gigantic help, more than I care to go into details about publicly.
I wish I could say with 100% certainty that the larger models would also not have any PWM flicker. We haven't actually tested them, so I can't guarantee it, and because it's a different screen entirely, I wouldn't be confident suggesting it to you. I can try to contact someone, but we don't really have a direct relationship with them, as we're too small a fish currently.
I would recommend going to the Apple store and checking out the M3 15in Air. Our model had an extremely high flicker rate that was difficult to measure properly (~15000Hz), but I’ve seen users online talk about how the flicker bothers them and others who say there isn’t any flickering at all. Our tool can catch it, but I can personally say I’ve never been bothered by it.
As for phones and iPads, I can't be 100% certain for all models, but I grabbed the iPad Pro M4 and an employee's iPhone 15 and quickly checked them with the oscilloscope. It's a smaller screen on the phone, so it's harder to tell, but I wasn't picking up any flicker until below 20% brightness on the iPhone.
Do you know that you’re sensitive to extremely high flicker rates? It doesn’t seem to be the majority of people, so if you’re willing, you can try the newest Macbook, as the flicker is so high that you might not even notice it. I’m not sure what Apple’s return policy is, but it could be worth a try if you want to get to the bottom of this.
I appreciate your continued coverage of Chromebook devices! I do want to point out an error in this review, however. At time of publication, you say it has 0 storage slots and give it a bad serviceability score. You can clearly see a socketed M.2 2280 SSD in the provided pic, though; you can even make out the model number showing it's a Western Digital SN740. You CAN replace that SSD, I have done it on many Chromebooks myself. If you want to do a write-up when you edit this error, you can tell readers that if they download the "Chromebook Recovery Utility" extension from the Web Store, they can use a USB drive (empty preferred, as it will be formatted) to make a recovery image for the exact Chromebook they're on. Once you swap the SSD, whether for service or to upgrade the capacity, just plug in the recovery USB drive when the "No bootable ChromeOS image found" (something like that) message shows, and it will install the OS. Bam.
Hey!
Thanks for catching that. You’re right, it should absolutely be 1 and not 0. I’ll get that corrected and you should see an update shortly. I’ll also pass along your message to the writing team to see if they’d be interested in including a blurb about that! Thanks for the info!
I literally had this problem on my wife’s laptop this weekend with Armoury Crate and used G-Helper instead, which solved the problem. I might eke out some time to see if there’s any noticeable difference between the two software as well.
It could be any changes you're making as well. We don't currently calibrate our laptop displays, so ours is tested out of the box, and any deviations from that can cause micro changes in the colours that the tool can pick up. 6000K is quite warm, so I'm surprised to hear that!
Hey!
We use the Colorimetry Research CR-250 spectroradiometer. There are a few reasons why they may have shifted to quantum dot, but I'm not complaining about it. I'm glad that even though we're using different models of spectroradiometer, it still looks the same, even across the model line. Apple does a good job of ensuring their products give the same experience across the product line; not a lot of variance there.
Hey!
On our model, there were no screws under the foot. It could be an error in their documentation, a difference in the assembly of this model (or batch of models), or it's possible that different variants of this model don't have these screws. To be safe, I'd remove the foot on yours to check before you try to dismantle it and upgrade to a 2TB SSD. Good luck!
Hey!
Yes, we always enable the most secure version of the kill switch for each VPN. You can check out the main R&D article we put out that talks about these settings in more detail, though I'll forward this to the writing team to see if we'd mention it in the reviews themselves.
Thanks for the interest in our review!
Hey!
We’re currently looking into beefing up our VPN performance tests, so the good news is that it’s already being worked on! There’s still value in collecting as much data as we can, as that will help us identify exactly where and what issues exist with the current iteration of the test.
That said, it is possible for speeds to be faster with the VPN than without. It's quite complicated, but essentially the path your traffic takes to the speed test servers without the VPN may be less efficient than the path through the VPN's servers, which translates to the VPN being faster.
As for the latency, it's actually quite good. While the highs and lows you can see on the graph are more pronounced than on most of the VPNs we've tested, the consistency is still considered good. You can see this behaviour in Windscribe as well, one of our best-performing VPNs when it comes to latency and how we score it.
Thanks for the interest in the reviews though! And if there’s any other questions, don’t hesitate to ask!
Hey igort!
Just wanted to touch base with you again. I've been working on the benchmarks on this laptop, but I've been running into some issues with it. After updating the laptop, there are a lot of stability issues on the system itself; lots of reboots and crashes are making it difficult to collect the data I need. I'm still working on it, but I just wanted to let you know that there are definitely some bugs to iron out with the update. I'll be completely wiping it and starting from scratch to see if I can either pinpoint the issue or see whether it resolves itself.
Hey!
I have not forgotten about you, don't worry! I was out sick for a few days, so it's been delayed a little longer than I initially thought. It's on its way though!
Hey!
Currently, we don’t look at port forwarding as a feature yet. I know it’s a feature that a lot of people care about and I’ve already added it to our suggestions for the next test bench update on VPNs.
That said, the 'perfect' score on the Downloads box in the review is a 9.5 because we don't feel we're at a point where we can say any aspect of a VPN is a true 10/10 on that test. But we're happy to iterate on our testing in the future to make sure we hit everything that matters when it comes to a VPN.
Hey!
Yeah, I’ve been reading about this more and more and it seems like there’s some interest here! I’ll make some time to give this a shot early/mid next week so you can expect it to be reflected in the review (whether there’s any change or not) end of next week. Hopefully there’s some noticeable improvements!
Hey!
It seems like there might be some confusion about what we're testing. For laptops, the only uniformity we test is black uniformity, i.e. how evenly the screen displays black. IPS bleed is one of the aspects accounted for in black uniformity, and it's still important to know what to expect on a black screen. In the future, we might add a grey uniformity test to the test bench, which would use a grey picture instead of a black one.
There are a few different ways to measure the uniformity of a screen, and I'm aware that some people use measurement tools, but our method is different. For laptops, we take a photo of the screen (with specific settings on both the camera and the display) and run it through a job that compares each pixel's value against the others, applying a low-pass filter to cut out any noise introduced by our camera. We also exclude the center region around the cross from the comparison, since we want the scene to contain a highlight; many real scenes have a lot of darkness with some highlights. At the end, we report the standard deviation of these values as a reference for how different they are.
You can read about how it’s done across our other products such as TVs and monitors. We don’t have a dedicated article for laptops as of yet, but it’s a similar design principle here. I hope that answers your question, but feel free to ask for any clarifications if you need.
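As a rough sketch of that pipeline (a simplification, not our production job; the block-averaging stands in for the low-pass filter, and the parameters are illustrative):

```python
import numpy as np

def black_uniformity_std(photo, block=8, center_frac=0.1):
    """Std. dev. of a low-passed black-screen photo, center region excluded."""
    h, w = photo.shape
    h, w = h - h % block, w - w % block
    # Block-average as a crude low-pass to suppress camera noise.
    low = photo[:h, :w].reshape(h // block, block, w // block, block).mean(axis=(1, 3))
    bh, bw = low.shape
    # Mask out the center highlight region so it doesn't skew the result.
    mask = np.ones((bh, bw), dtype=bool)
    ch, cw = max(1, int(bh * center_frac)), max(1, int(bw * center_frac))
    mask[bh // 2 - ch : bh // 2 + ch, bw // 2 - cw : bw // 2 + cw] = False
    return float(low[mask].std())

# A perfectly uniform black frame scores 0; clouding or bleed raises it.
```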
Hey!
Both were supposed to start at the same time, actually. But the truth is that a member of the team was off, so one of them was delayed by a day. No need to worry though, as both are in progress as we speak.
Hey!
You absolutely can. The best option is to download the app itself onto the laptop, rather than using Steam Link or anything similar, and remote play directly through the app.
Yep! This works with Windows HDR, which you can enable in the display settings of Windows. You'll usually need to enable it in games separately if they don't automatically detect that it's on.
Hey!
We're hoping to do a larger-scale update to laptop testing in general that will do a better job of capturing performance. I agree though, I love this laptop personally and it's quite a good contender in general.
The way we do our brightness tests is definitely different from most other reviewers', as we take the average of five separate points on the screen arranged in a checkerboard pattern. This can explain the differences between us and other reviewers, but we feel it matches a regular person's usage of the laptop display better than simply trying to get the brightest number possible.
HDR is something we do check on some laptops, particularly ones that advertise the quality of their screen/HDR, so it’s something we can definitely look into for this model in the future.
We’ll be getting to it over the next few days, so it’ll likely be live by the end of the week or latest very early next week.
Hey!
Wanted to give you an update! We should be able to give this a look next week! I’ll make another post when either we’ve finished the retesting of it or when we’re starting depending on whether it’s the beginning or end of the week.
Hey! You’re absolutely right, just went to double check our model and it is the case. The review should be updated shortly! Thanks for catching that!
Hey!
Yeah, we can give this another looksy with the new firmware. I expect we can fit it in within the next couple of weeks, so I can update you when I have a better idea on when, but I think it’s definitely worth checking into.
Hey!
We’re keeping an eye on the updates so we’ll likely get to this in the near future. We’re planning on going through a few keyboards, but I don’t have an exact timeline yet. It should be fairly soon, but I can bump this when I have a better idea of when we’ll be able to get to this in particular.
Thanks for bringing it to our attention though; it helps us prioritize which products to focus on.
Hey!
Unfortunately, the keyboard doesn't have true NKRO. Even when you enable the setting in the software, it's still limited to 16 simultaneous keys, so it isn't true NKRO.
Hey!
I’m not completely surprised for us to have different results, especially after reading your setup. Our methodology appears to be very different, but we absolutely have some interference that, for now, we’re unable to isolate. This is something we’re completely aware of and are working on updates in the future. So if you have any feedback you’d like to share, we take it really seriously.
As for our setup, we have a server wired to the router, on which we run iperf3, and we use a laptop as the measuring device. The laptop is a Lenovo P15 Gen 2 with an Intel BE200 Wi-Fi 7 NIC running Clear Linux. In our experience, this gave us the most stable results. We do expect other setups to measure higher than our results, but a big part of the value of this test is that it shows the loss in speed over distance rather than the total speed possible. This is also something we'll be tackling in a test bench update, so we'll be able to make the improvements we need in the future!
Nice! At home, I'm in the same position, so my 6GHz band is empty except for me.
Hey!
Yeah, sorry! December always has more vacations and such going on than I anticipate. That said, I’ve made the task and it’ll be pushed through to the next available tester so we should have the results this week! Sorry for the delay!
Heyo!
Absolutely! It's consistent with other models for sure. We always turn it off for our testing, since we want a stable brightness whether the lights are on or off, but it's been on every Macbook that I can remember testing. In a brighter environment, it dims less than in a darker one, but it's never as bright as when you turn the setting off.
Hey!
Typically, in our testing on Apple silicon, speed increases with the size of the drive. In general, larger SSDs are faster than smaller ones, usually because they're made up of many smaller chips rather than one larger one. The drive can read/write multiple things at the same time as a result, and it also tends to have larger caching chips, etc. The drive does have a maximum speed, so I wouldn't expect it to get much higher on this test. On other tests, the larger drives have hit as high as ~8,000MB/s, but those aren't directly comparable to our test. In either case, these drives are plenty fast and among the fastest we have in general, so if you're worried at all about speed, know that you're getting some of the fastest drives out there at any size.
I expect our method for testing storage speed to change in the next year, as our current test is on the basic side, and with feedback like yours, we'd like to improve how we present it and what tools we use to do so.
I appreciate the feedback and I’ll make sure to bring it up when we revisit the scoring/update the test! Thanks for the comment and feel free to post more if you have more comments on the tests!
Yes, that's correct. We don't go too much into the speakers, since we can't be sure which mode is best without more extensive testing, and we don't want to lean too heavily into subjectivity with this test bench. That said, it's totally possible we'll shift this philosophy for future test benches.
Our current scoring weights peak volume very heavily (35.2%), so if switching modes reduces the peak volume and decreases the bass extension in our readings, the overall score will trend downwards from its current value. It might therefore appear worse on our current test even though it may technically sound better. This is absolutely a flaw with our current test, so I'll happily add it to our suggestions for next year.
Hey!
We always test in the default speaker modes. In this case, it was the "Music" setting. As for the mini-LED panel variant, I'd personally love to test that one to see the differences, but I can confidently tell you that it'll be a better buy if you're willing to spend the money. You can definitely expect better contrast and better brightness, and I'd expect it to have better out-of-the-box color accuracy.
Any of these kinds of enthusiast gaming machines are going to be really good across the board, but you're definitely paying a heftier price for it. There might be that option on the next Razer laptop we buy, so if you want to vote for another one so we can prioritize and get it out, check out the voting poll here.
Hey!
You are not crazy! But I can help clarify some of the information for you. There is indeed a flicker within a flicker here. It's fairly uncommon, but we've seen it before on other displays, particularly Apple displays. As you mentioned, the backlight is operating at ~15kHz and it appears within a 60Hz pattern. All of this is true; however, we need to distinguish between the backlight flicker and the modulation of the actual display itself.
The backlight itself is flickering at 15kHz, while the screen refreshes at 60Hz, which is what our tool is picking up. So while there is a 60Hz pattern on the display, the flicker one could actually catch with the eye would be the 15kHz. I can absolutely vouch for that distinction, as 60Hz flicker is absolutely unusable to me, and I'd absolutely recommend staying away from displays that have backlight flicker at 60Hz.
The test itself is taken at 60Hz, and there isn't an easy way to collect the data at 120Hz on Apple with our current setup. On other Macbooks, when you shift to 120Hz, the larger pattern does indeed change to 120Hz, but the backlight itself remains at 15kHz. As far as flickering goes, the experience is identical between the two modes. I did run out and double-check, and I can confirm the same behaviour on the 2024 Macbooks as well.
So just to end this all off, it’s more of the same as far as flicker goes on the M-chip macbooks. Hope that answers your question!
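For the curious, the "flicker within a flicker" is easy to picture with a synthetic signal (only the 15kHz and 60Hz figures come from the measurements above; the sample rate, modulation depth, and everything else here are illustrative):

```python
import numpy as np

fs = 200_000                                   # sample rate (Hz), illustrative
t = np.arange(int(fs * 0.5)) / fs              # half a second of signal

pwm_15k = 0.5 * (1 + np.sign(np.sin(2 * np.pi * 15_000 * t)))   # backlight PWM
refresh_60 = 1 + 0.05 * np.sin(2 * np.pi * 60 * t)              # faint 60 Hz refresh pattern
light = pwm_15k * refresh_60

# Both components show up in the spectrum, but the dominant one — the
# flicker your eye could actually catch — is the 15 kHz backlight.
spectrum = np.abs(np.fft.rfft(light - light.mean()))
freqs = np.fft.rfftfreq(light.size, 1 / fs)
dominant_hz = freqs[spectrum.argmax()]
```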
Hey!
Yessir! I’ll add it to our retest list so that we can get to it ASAP. I expect we’ll be able to catch up on a lot of these types of things early next week, so keep an eye out for some updates to come!
Hey!
Our unit did not come with the keyboard deflection kit. Currently, we’re not planning on buying it but this could change and if it does, we will absolutely retest the keyboard performance.
Hey!
I’ve added a note for us to go and take a look at it, but it’s likely we’ll only be able to get to it around early December. That said, I quickly plugged it in and gave it a try and wasn’t able to feel a difference, but the data will help to show what the difference is once we’ve got a tester available to jump onto this task.
Hey!
I’ll create a task for us to look into this. Thanks for bringing it up!
Hey! The Dolby was completely off, and typically it is. The way our speaker test is done, we want to put all products on the same playing field, so we make sure there are no active sound profiles or anything enabled, if we can.
One of the reasons for this is that our test doesn't play very nicely with Dolby profiles in particular. We use a lot of pink noise and sweeps, and the Dolby profiles tend to make them significantly quieter than they are on real content, which negatively impacts the way we currently test.
Yo! The iPad Pro M4 does measure with flicker at 480Hz.
Opple Lightmaster meters are consumer-grade luminance tools that will give you a bunch of information about a display (brightness, color temperature, flicker, etc.). I've not personally used one, as I have access to much more sophisticated tools, but the reviews of the tool itself seem positive, with the companion app being more of an annoyance than anything. I'm hesitant only because they don't publish the thresholds of its reading capability, so I'm concerned it wouldn't be able to identify flicker above a certain cap.
Another method is to use a camera with the shutter speed set to 1/4000, and you might be able to identify it that way. What you'll mostly see with this method is stroboscopic visibility, which shows up as wide gaps in the line. The Macbooks don't have wide stroboscopic visibility, so this method isn't really useful in this particular circumstance.
Sorry, I’ll try to make it a bit clearer. It does get a bit confusing.
So, the M1 Macbook Air we tested from 2021 does have flicker (this should be updated in the review soon). The M2 Macbook Air from 2022 does not have flicker.
What I meant by "if" was more a reference to the information I received from the Apple Store genius. I can confirm what I have here, but I can't verify any variants of the products we've tested (in this case, a larger M2 Macbook from 2022); I can only relay what Apple employees have told me. So the "if" reflects my inability to 100% guarantee that information. What I can guarantee is that the M2 Macbook Air 13" 2022 does not have flicker.
It’s all good! It wasn’t a waste or anything, so no worries about that. :)
If the information I received from Apple is correct, then if one of the models is flicker-free, all models within that same category, regardless of size, should be flicker-free as well. So the 2022 13" Macbook Air being flicker-free should mean that the larger sizes of the Macbook Air from that year are the same.
I hope that answers your question!