Hello everyone, I recently bought the Sony ULT headphones and unfortunately discovered an issue. I’ve always taken good care of them — never dropped, never used while sleeping, and never applied any pressure. I never expected this to happen, but while I was working, the headphones suddenly broke — right at the hinge.
After that, I did some searching and found that many other users are facing the same problem. It’s really disappointing for a product in this price range.
Sorry to hear it. Do you have one of these covered serial numbers? If you do, it should be an easy (but inconvenient) process to get it replaced. If you don’t, I would recommend that you go through your cc company, bank, or retailer to get reimbursed, depending on their policies.
That’s great news! These have become my daily drivers based on exemplary sound quality and very acceptable comfort, and at least good performance in other areas. I’ll be interested in your take on the annoying delay in response to power-on/off functionality (requires a very long button press).
These headphones, as well as some other Noble Audio models, include a “hearing test” feature (branded “Audiodo” on some models, but not on the Apollo) that creates a custom EQ curve based on your actual hearing acuity. I wonder if there’s any way to test this or evaluate its effectiveness.
Good shoutout on the power-on/off delay; we’ll keep an eye on it. Surely it can’t be worse than most other headphones, right… The hearing test sounds to me like something similar to other headphones’ personalized EQs, where the app does A/B tests with sound samples to personalize your sound.
Am I correctly understanding that you’re saying the $89 Nothing Ear (a) sound better than the $300 Technics and $220 Sony earbuds that are the current high-end picks? That’s…surprising to say the least, but I can’t find anything in the writeup that implies anything other than “These are the best sounding earbuds one can buy.” But you also only explicitly compare them (both here and in the full review) to earbuds that are < $100. Can you clarify? Thanks.
Hi, this is a fantastic question. The way our recommendation articles are structured, we provide picks based on both price and usage. A more complete description would be “Best Sounding (barring the other picks on the list, and considering other factors).” So the Best -> Mid-Range -> Budget picks will be what we think are overall the best in each price category, subject to any use cases we specify in the title of the article. For example, as the article is Best Bluetooth, we still generally pick the best option overall, but we put a bit of extra weight on connectivity features like expanded codec support, Bluetooth LE, or 3-way multipoint. For more defined use cases like Best Gaming, we may elect to disqualify a pick based on the lack of a certain feature (like a microphone).
With this added context, the answer to your question about the Nothing Ear (a): we found them to be the Best Sounding from the pool of other products besides the ones on the list, when balanced with both their price point and overall features.
I hope this helps illuminate our process, and while not always obvious, we do take great care when we create and update recommendation articles to include picks that we feel won’t lead anyone astray, which often leads to some, let’s call it, enthusiastic debate within the audio writing team about which picks to settle on.
Does the midnight black version have the same sound profile or is there some update?
Hi, we haven’t tested it, but in our experience, color variants typically don’t sound any different unless they’re special editions that advertise different tunings.
I have a question. I recently bought the Q990F and would like to know the best way to mount the surround speakers. My sofa is against the wall, so the speakers will be fixed to the wall.
Should I place the Samsung logo facing directly towards my head or at a 45° angle to the wall?
Remember that these speakers have two speakers on the sides.
Thank you.
Ideally, you have the speakers behind the listening position between 130-150 degrees. In your situation, no matter what you do it’ll be suboptimal since the speakers were not designed with your configuration in mind. I’d recommend that you try both out, as well as potentially the front layout on page 11 of the manual, and see what sounds the best to you.
Regarding some issues with the Momentum 4: there’s a common problem with a low max volume limit. I’ve also run into issues with the touch controls not working when you turn them on or after they sit idle for a while.
Although I personally haven’t had the headphones long enough to run into connectivity issues yet, there seems to be a common issue in that area.
Also, the stock tuning is extremely muddy and muffled, and the bass is grossly over-boosted. Luckily, there’s an adjustable EQ in the app, and with the right settings the sound quality can be returned to normal levels, but I wouldn’t say there’s anything that can be called “audiophile” or hi-fi sound quality.
I’m tempted to say that this product wasn’t designed for the same demographic as their legendary HD 600 and HD 800 headphones.
I ended up returning the headphones I received after about a day. I was simply disappointed in the end. They were Used on Amazon, so perhaps that’s why I received a defective product. Regarding just the sound quality, though, these products are grossly overpriced considering that I’m used to things like the HD 600, HiFiMan Arya Stealth, Focal Clear OG, etc.
I really enjoy these reviews, though; I prefer your site way more than, say, Consumer Reports.
Hi thanks for sharing your experience with us. The Momentum 4 definitely make use of the clout of the Sennheiser brand as one of their selling points, though I should point out that the HD products are made by Sennheiser’s professional audio division, while wireless/TWS products are likely designed by Sennheiser’s consumer division (acquired by Sonova in 2022).
The bass is a lot on the Momentum 4’s; it’s definitely not for everyone. If you’re interested in some other options, but don’t want to spring for one of the flagships, you could check out the Bose QuietComfort Headphones (non-ultra, go on sale fairly frequently) or the Sony ULT WEAR. Both will need a bit of tweaking since they’re also pretty bassy, but they have solid performance in other respects, like comfort and noise cancelling, too. I’ve made a compare link here for you to take a look at their performance side-by-side (it’ll look a bit jank since the Sennheiser and Bose are still on v1.8 of our testing).
Why is the Turtle Beach Atlas Air not the Mid-Range pick? It’s $143 on sale, $157 now, and got much better scores than the SteelSeries Arctis Nova 7 Wireless, especially in sound, mic, and build quality.
Hi, thanks for reaching out. When we select products for our articles, we typically don’t assess headphones based on their sale/discounted prices unless we have historical data that they often drop that low, which isn’t the case for the Turtle Beach. It’s also worth pointing out that both headphones aren’t on the same test bench, so comparing their sound scores won’t be 1:1 right now. Their build quality scores are also the same. That said, I think you make some great points, so we’ll make sure to mention the Turtle Beach somewhere in the article the next time we update it.
Why is the rating on the Smart Ultra so much lower in every category than the ratings on the Smart 900? The test methodology is exactly the same; is there some unaccounted-for bias?
https://www.rtings.com/soundbar/tools/table/171066
Thanks for reaching out. Taking a quick look at the products side by side, it looks like incremental losses in most of the sound tests are responsible for the Smart Ultra scoring worse. This is because their ‘No Preset’ mode has a more excited sound signature that doesn’t adhere to our in-room target, which is loosely based on Harman research. This leads to a higher deviation and a lower score in each of the separate sound metrics (Stereo, Center, Surround, Atmos), which are all part of the weighting in the usages at the top of the reviews.
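If it helps, here’s a rough sketch of the “deviation from target” idea. The flat target and the response values below are invented for illustration; this isn’t our exact scoring math:

```python
import numpy as np

def target_deviation_db(measured_db, target_db):
    """RMS deviation (in dB) of a measured response from a target curve.

    Both inputs are level arrays sampled at the same frequencies.
    A larger deviation implies a lower score in the corresponding
    sound metric. Illustrative only, not the exact scoring math.
    """
    measured_db = np.asarray(measured_db, dtype=float)
    target_db = np.asarray(target_db, dtype=float)
    return float(np.sqrt(np.mean((measured_db - target_db) ** 2)))

# A flat target vs. an "excited" (V-shaped) response and a near-flat one.
target = np.zeros(5)
excited = np.array([4.0, 1.0, -2.0, 1.0, 4.0])   # boosted bass/treble
neutral = np.array([0.5, -0.5, 0.0, 0.5, -0.5])  # stays close to target

print(target_deviation_db(excited, target))  # larger -> bigger score penalty
print(target_deviation_db(neutral, target))
```

In short, the more excited response strays further from the target across the board, so it accumulates a larger deviation in every sound metric that uses it.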
Hi,
Just a sincere question
Why did you give a score of 6.6 in the gaming category, mainly due to the latency over Bluetooth, when the dongle provides very good latency? I assume no one would use them over Bluetooth for gaming.
I’m just trying to understand why it received this score and how the reviews are structured, in case I’m missing something obvious.
Thanks for the great work you do.
Regards
Hi, thanks for reaching out. That’s a good question. First, if you hover over the question mark beside the “Wireless Gaming” score, you’ll actually see that we used the Wireless Dongle and not the Bluetooth Connection information, so they get an 8.9 with 16% weighting. You also probably noticed the rest of the score breakdown, which shows you where the earbuds lost the most points. In particular, their target compliance isn’t great, and their frequency response isn’t very smooth either. Hope this helps clarify why they’re scoring a 6.6.
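To illustrate how a strong dongle-latency result can still coexist with a 6.6 overall, here’s a toy weighted average. Apart from the 8.9 at 16% weighting mentioned above, the component names, scores, and weights are all invented for the example:

```python
def usage_score(component_scores, weights):
    """Weighted average of component scores on a 0-10 scale.

    Purely illustrative: one strong component at 16% weight can't
    offset weaker ones elsewhere in the breakdown.
    """
    assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must sum to 1
    return sum(component_scores[name] * w for name, w in weights.items())

scores = {"latency": 8.9, "target_compliance": 5.5, "fr_smoothness": 6.0, "other": 7.0}
weights = {"latency": 0.16, "target_compliance": 0.34, "fr_smoothness": 0.30, "other": 0.20}

print(round(usage_score(scores, weights), 1))  # lands in the mid-6 range
```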
Hi there, I’m debating between these and the AirPods Pro 2. I like the Beats for the ear hook, but the most important part is the sound at the end of the day. Would you say that these are somewhat similar in terms of sound quality?
Thanks so much in advance.
Thanks for reaching out, they’re similar in sound quality, though the AirPods Pro 2 are warmer-sounding. If you’re not in a rush, you could also wait until the AirPods Pro 3 release. Nobody knows if it’ll be this year or next year, though 🙃.
What’s the difference between the attenuation plots and numbers you mention?
When we evaluate performance in Noise Isolation - Full Range, we use attenuation numbers averaged within the bass, mid, and treble ranges. The same is true for Common Scenarios, but there the average is taken over the full measured frequency range rather than the specific ranges. This helps describe products’ relative performance more concretely, since most products trade blows depending on the frequency range of the attenuation.
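As a rough sketch of the averaging (the band edges and measured values here are assumptions for the example, not our exact bins):

```python
import numpy as np

# Illustrative band edges for bass/mid/treble, in Hz.
BANDS = {"bass": (20, 250), "mid": (250, 2500), "treble": (2500, 20000)}

def band_averages(freqs_hz, attenuation_db):
    """Average attenuation per band, plus one full-range average."""
    freqs = np.asarray(freqs_hz, dtype=float)
    atten = np.asarray(attenuation_db, dtype=float)
    out = {}
    for name, (lo, hi) in BANDS.items():
        in_band = (freqs >= lo) & (freqs < hi)
        out[name] = float(atten[in_band].mean())
    # Common Scenarios-style number: one average over everything measured.
    out["full_range"] = float(atten.mean())
    return out

freqs = [50, 100, 500, 1000, 5000, 10000]
atten = [25.0, 22.0, 18.0, 15.0, 8.0, 5.0]  # dB of noise reduction
print(band_averages(freqs, atten))
```

Two products can then “trade blows”: one might win the bass average while losing the treble average, which the single full-range number would hide.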
As of today, you are about 50% of the way through updating 116 headphones to the big Test Bench 2.0 update. I’m sure a lot goes into retesting these headphones, and I appreciate your commitment to excellence. I’m still learning about the technical differences between the eliminated “neutral sound” category, which was a prominent factor in the still most recent “7 Best Audiophile Headphones of 2025” (link here: https://www.rtings.com/headphones/reviews/best/by-usage/critical-listening-audiophile ), and the “audio reproduction accuracy” category, which I believe replaced it. However, even more interesting would be an article that shows headphones that had a very high “neutral sound” rating but have a much lower “audio reproduction accuracy” rating, or vice versa (low to high). Finally, I purchased the AKG N700 NC M2 as a result of it being a good value (neutral sound of 8.0); however, I don’t believe it’s on the list to be retested on the Test Bench 2.0 metrics. Any chance you could make it 117 (+1) headphones to be updated? A man can dream :). Thanks for being my favorite review and rating website!!!
That’s an interesting proposal. I’ll see what the rest of the audio team thinks about it. In case we aren’t able to tackle it in an article, you can check out the 2.0 changelog. I’ll also give a small summary: we shifted our weightings to more heavily consider Peaks/Dips, Stereo Mismatch, and Group Delay (formerly Imaging). We also reduced the weight of compliance with our target curve in all ranges to account more for personal taste. In practice, this means that products with fundamentally good objective metrics that didn’t match our target curve score higher, while products that were “carried” by their adherence to our target are now scored more harshly if they have many fluctuations in their frequency response, or if they have issues with stereo reproduction (we also score stereo matching more harshly now).
Regarding the AKG N700 NC M2, I’ll have to check with some colleagues. Thank you for the positive feedback, you’re our favorite user (but don’t tell the others 😏).
Can you please update this list, if necessary, as time permits? Thank you.
Hi, we’re currently working through our backlog of products to update to v2.0. Many audiophile headphones are on that list, so we’re holding off on updating the article until we can evaluate as many headphones as possible on the same test bench.
Hi team, as always, thanks for the detailed review.
I’m trying to understand whether there’s a significant difference between calls using Bluetooth vs. a 2.4 GHz dongle. My use case: if I want to get on a call, I want something better than the ancient Bluetooth voice profile, and I’m wondering whether the 2.4 GHz dongle circumvents that.
It’s also useful to have something that physically snaps on and is instantly ready for high quality calls as opposed to using multipoint Bluetooth where the software sometimes fails me.
Are these devices good for my use case, or should I be looking elsewhere? Hope my understanding of Bluetooth technology is up to date; otherwise, feel free to school me.
Howdy, you’re correct; there will be a significant difference if you use HFP (worse). When we test products that have multiple connection options (dongle, wired, Bluetooth), we include results for the one that most users are likely to use. In this case, we tested with the proprietary dongle, so measurements and recordings in the Microphone section are going to be with that connection. According to their product page, the dongle should be compatible with both iOS and Android devices. If you’re planning on taking calls on a mobile device, make sure to use the dongle rather than Bluetooth.
Found a little bug: the connection type, in Test Settings, is indicated as Bluetooth. This makes these headphones appear in the table tool when filtering for only wireless headphones.
Are you sure? Midrange goes from 500Hz to 2kHz and the Bose outperforms from 1kHz onwards.
We wouldn’t know unless we had SoundGuys’ mid-range attenuation numbers. At the very least, it’s not a concrete win in their testing, and it’s definitely not the case in our measurements (easier visualization). All this to say, we’re confident in our measurements, though there’s plenty of variation in methodology between publications and in personal experiences.
Besides r/earbuds, What Hi-fi and Soundguys rate the ANC as improved but not better than the Bose. What Hi-fi: “Is it better than Bose QC Ultra Earbuds’ ANC quality? The Technics comes very close; the Bose is better at silencing those noises at the mid frequencies, such as the sound of typing on keyboards” https://www.whathifi.com/reviews/technics-eah-az100
Hmm, it’s possible that differences in test methodology account for the varying conclusions. For example, I believe SoundGuys uses a single speaker for their noise attenuation test, while we use four (to closely simulate diffuse background noise). I’ll also point out that if you check SoundGuys and look at the attenuation plots for the Bose and Technics side by side, there’s a notable drop in attenuation in the 500Hz range compared to the Technics. While we don’t have their products’ mid-range averaged total attenuation numbers, that still contradicts or, at the very least, doesn’t support What Hi-fi’s conclusion. All this isn’t to say either publication’s conclusions (or posts on Reddit) are invalid; it’s just important to consider the context of the claims that one product performs better than the other.
Have you personally tested the AZ & QC back-to-back? I wonder because the consensus seems quite strong even beyond Reddit that the ANC on the AZ remains shy of the best.
Yeah, I did some side-by-side listening with most of the obvious competitors. While it’s not a proper blind test, my personal experience has the Technics on par with the other top-tier ANC picks. I want to make sure to reiterate that ANC in our reviews is purely based on objective measurements that use accepted standards as a framework, though I’m sure there’ll be some users who have a different idea of what products perform the best. That said, just looking at the worse mid-range attenuation on the Bose, I think most people would notice more noise wearing them when compared to the Technics. Were there any communities or publications in particular that have a ‘not quite matching the best’ viewpoint?
Hi, thanks for reaching out. We don’t have any immediate plans to review the QS-750F. Just looking at the spec sheet, it looks pretty decent, though if I were in the market for the soundbar, I’d probably still end up going with the Q910D and just dealing with the wires. That said, we’ll definitely keep this soundbar in mind, and if we see enough requests for it, we’ll probably test it. You can also vote for it here with your additional insider votes.
Speaking from personal experience, this review is pretty disappointing, and I’m not talking about the headphones.
I respected RTINGS, but as a software engineer/digital nomad with tens of thousands of listening hours on the XM5 from working late nights at startups these past few years, I can say the XM6 are far superior in sound quality and far more versatile.
What Hi-fi put these at 5/5 stars overall; CNET put these at 9.3 overall (with some questionable cons). Note I’m not saying they are 5 stars, as I’m assuming there are professional headphones that might be better (I know you guys are generally more thorough and critical with reviews), but to give the XM6 a lower sound quality score than the XM5, seemingly because of the “physical characteristics” of a user and how the headphones are positioned on a user’s ears, is disingenuous. Most people won’t dig into the score, so top-level scores are important. Maybe the reviewer wore glasses and/or has physical characteristics that ruined his experience, but don’t blame the headphones for your own characteristics.
The headphones are far superior in audio quality, and I loved my XM5 to death. And yes, for Android users, be sure to go into Developer Options (unless you game, where latency is an issue): set the Bluetooth audio codec to LDAC, set Bluetooth audio LDAC codec playback quality to optimized for audio quality (990kbps), and max out the sample rate and audio bits per sample.
recordingnow has good XM6 EQ settings online, but I found I don’t need them. I spent years fine-tuning EQ settings for my XM5 for my favourite songs, and the XM6 sounded better out of the box for all of them.
Also, I found the XM5 sounded better over the years after app updates, and the XM6 are so new that there are going to be some firmware kinks to sort out (which may negate your “physical characteristics” issues).
A final point for those who use Spotify: set Wi-Fi streaming quality and cellular streaming quality to very high (unless you’re worried about data), and turn off auto-adjust quality (unless you have slow internet).
Hi, thanks for the genuine and thoughtful feedback. There’s a lot to unpack here, but I’ll try my best to address all your concerns.
The first thing I’d like to tackle is actually the increased variability in the XM6’s audio delivery. I want to mention this at the very beginning: due to how audio delivery varies based on physical aspects of the wearer, it’s entirely possible that the way they sound on your head happens to align with your particular preferences. I also agree that the XM6 are more versatile—they have a wider feature set and a better microphone (especially over LC3), among other things.
While we do check with other reviewers and publications to ensure we don’t miss anything substantial when reviewing a product, we’re also confident in our own measurements, particularly when it comes to objective metrics in sound.

At this point, I’d like to clear up a misconception you might have. Our Audio Reproduction Accuracy score—which I assume is what you’re using to assess sound quality between the XM6 and XM5—does not include Frequency Response Consistency as part of the score breakdown. The XM6 have more peaks and dips in their response (localized to the treble), and more variability in their phase response mismatch passes. It’s really important to mention that after evaluating all the components of ARA, there is only a minor 0.2-point difference in score between the two headphones; they’re both green (above 7.5), which is what we would consider good performance when assessed by an enthusiast like yourself—and most definitely a small enough difference that personal preference can easily outweigh the score difference.

And indeed, part of the ARA score does evaluate compliance with our target curve (which itself is a benchmark based on taste). I want to reiterate that you’re valid in your preference for the XM6’s sound, and that liking the XM6’s sound over the XM5’s is not mutually exclusive with the validity of our objective measurements.
Now, let’s get into Frequency Response Consistency a bit more. To paraphrase some points made in this article: headphones are not typically specialized for physical characteristics, meaning they should perform similarly for a decently wide variety of head shapes, hair thicknesses, and yes, for people who wear glasses (about 63% of Americans wear prescription glasses, according to the Vision Council in 2021). Our test uses measurements from both the industry-standard BK5128 and multiple individuals to calculate the standard deviation relative to the mean passes from the population average. We believe that FRC is a metric worth evaluating because it highlights limitations in headphones’ ability to sound alike when worn on different people and over multiple sessions on the same individual. For a category of product that is targeted at a wide audience, I think it’s fair for us to evaluate whether it can perform consistently for the same population. It’s worth adding some nuance here: we aren’t disqualifying the XM6 as an option for people who wear glasses—because, as you said, that would be disingenuous. Our intention is to highlight a potential drawback and encourage additional consideration for users who might be affected.
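If it’s useful, here’s a bare-bones sketch of the consistency idea. All the numbers are invented; the real pipeline also uses measurements from a B&K 5128 test fixture:

```python
import numpy as np

# Measure the response on several wearers, then look at the spread
# from the population mean at each measured frequency (values in dB).
responses_db = np.array([
    [2.0, 1.0, 0.0, -1.0],   # wearer 1
    [2.5, 1.2, -0.2, -0.8],  # wearer 2
    [1.8, 0.9, 0.3, -1.3],   # wearer 3 (e.g., glasses breaking the seal)
])

mean_response = responses_db.mean(axis=0)   # population average
deviation = responses_db.std(axis=0)        # per-frequency spread (dB)

# A larger spread means the headphones sound less alike from person to
# person, i.e., worse Frequency Response Consistency.
print(deviation.round(2))
```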
It’s entirely possible that the XM6 will improve their sound quality performance in subsequent firmware updates, but it’s worth tempering your expectations, as seal issues are a physical problem. It’s also generally the exception, rather than the rule, that we find frequency response to change in a meaningful way as a result of software updates. If you’d be open to it, please let us know what aspects of the XM6 you found sounded better than the XM5. We try to stay aware of potential blind spots in our approach, so we’d appreciate any insights you can offer.
The ANC rating seems to diverge from reviews on Reddit, which seem to agree that the Bose earbuds win. Maybe the tested ear canal makes the difference?
My guess is it’s related to the seal. I daily-drove them for a while, and I had to rotate them (think Google Pixel Buds Pro 2) to seat them properly. It’s not the most comfortable way to wear them, so some people might notice a more prominent plunger-like effect.
The microphone sound demo sounds way better than what the microphone frequency graph is showing. Was this a recording mistake, or how can this happen?
Hi, is there something specific you think is an issue? The graph looks pretty solid; there isn’t much fluctuation in the fundamental vocal ranges, and while the treble roll-off is a bit early (6.5 kHz), the speaker has a deeper register, so they’re not affected by the issue as much.
Hello,
I’m pairing these headphones with a PC that uses the TUF GAMING B760M-E D4 motherboard. Which dongle or DAC/amp should I use for better sound quality?
Most onboard DACs on motherboards have a response that’s as close to flat as possible. If you aren’t noticing any popping or artifacts when plugged into the 3.5mm jack, there’s not much need for a separate DAC/amp.
Will this be included in “best closed back headphones”?
Hi, it’s definitely a contender for the list. We’re currently prioritizing bringing up many of our educational and informative articles about headphones and our specific tests, so recommendation articles are on the back burner. That said, the headphones have some pretty notable frequency response consistency issues, which may not make them a great option for a general recommendation article (as many users could have a suboptimal listening experience if they wear glasses, for instance). Recommended products in our articles should ideally have a replicable listening experience for a wide variety of users, especially for articles where sound quality is a focus. Feel free to let us know if you have any additional questions.
I’m seeing some contradictions, or I’m just confused. The Q990D is the best Atmos soundbar.
On the lg s95tr review, when you scroll down and see the comparisons, you state ‘ The Samsung HW-Q990D and the LG S95TR are top-of-the-line soundbar models, but the Samsung has the edge. Its sound is a bit more balanced out of the box, and it has a better overall performance across all of its channels, including surround and height, which gives you an immersive, dimensional sound.’
And then, in the top 7 soundbars article, there’s no mention of it at all.
Hi, thanks for leaving us your feedback. I’m not sure I fully understand your question, but I’ll clear up why the LG S95TR isn’t in this article. When we assess soundbars that make it onto our articles, we balance a number of factors, including price, when making the decision. In most of our “Best of X” articles, we recommend products in a number of price ranges to represent the variation in budget that a population of readers may have. As you quoted from the side-by-side, in this case, the HW-Q990D outperforms the S95TR in a number of metrics (performance and flexibility), including our tested Atmos performance, meaning they are the better pick given the scope of the article (Atmos). If we also consider that both soundbars occupy the same price bracket at the moment ($1400-$1600USD), the omission of the LG is fairly justified in our recommendations framework.
I really want to stress that if you are a fan of the S95TR, or you prefer it for utility that is specific to you (e.g., WOWCAST with an LG TV), they are both impressive soundbars, and you’d likely be happy with either. Hope this helps clarify things, and if I’ve misunderstood your question, please let me know.
So what’s the lowest-latency setup, from headsets to mic amps to optical extractors to soundbars and speakers?
Thanks for the question. I’m not fully sure what you’re asking, since you’ve listed a few signaling methods and devices, and latency is pretty device-dependent. But as a rule of thumb, you should get the lowest latency with analog devices that can be used passively, and through most audio interfaces designed for music. From our testing, optical tends to have higher latency, though YMMV.
Keep in mind that latency alone isn’t the most important factor if you’re dealing with video and audio; the interaction between the two will contribute to your perception of A/V-desync.
It’s interesting that a soundbar has a “screen”; won’t it reflect light from the TV screen? And the woofer size is so large.
It will probably reflect some light from the TV, but depending on how low you have the soundbar positioned relative to your eyes, the angle might be large enough that you won’t see the reflections. It’s an interesting design to have the woofer exposed, but as we don’t typically disassemble soundbars that have an integrated sub, we can’t really say much about the relative size.
What is the HW-Q700C’s Bluetooth version? Samsung doesn’t mention it.
Hi, thanks for reaching out. We took a look at the documentation as well and didn’t find the information. As we typically test for functionality, this isn’t something we normally look at. We’ll find out what version the soundbar uses in the lab when we have a chance.
Would be nice to have more reviews of dedicated Wired Gaming headsets for PC, seems like most of these are Wireless or Wireless with optional Wired use.
Dedicated Wired Gaming headsets are usually available at half the price or less, and avoid issues with charging/battery failure/latency.
Common brands in this category are HyperX, Corsair, SteelSeries, Razer and Logitech.
Howdy, thanks for sharing some of your feedback. We typically prioritize products that interest the most users, but that can sometimes leave us with gaps in coverage, so we’ll keep this in mind as we update products to v2.0.
Are there any specific headsets that you’d like to see on our newest test bench update?
Hi there, what EQ settings were used during testing?
Also, what EQ settings would you recommend for the best possible audio/performance?
Thanks :3
Hi, thanks for reaching out. We had the EQ function disabled during testing (so the headphones were “stock”) as you can see in the Test Settings. Sound is pretty subjective, so it’s hard for us to say what is best (and we don’t provide individualized EQs). That said, if you’re using this for competitive shooters, you’ll probably want a bit of emphasis in the treble overall, and a boost in the high-bass/low-mids to help with sound cues. Hope that helps, good luck out there ;)
Without the target curve in the graphs I find it very hard to see the changes in frequency response with ANC ON vs OFF.
Hi, thanks for the feedback. I see what you mean; typically we’d just overlay the ON and OFF FR on the same graph, that way you can directly compare the two settings. In this case, we took the approach of individual graphs due to the AirPods changing their frequency response depending on the listening level; it likely would’ve been too cluttered if we had everything on one graph (though as you said, we could’ve included the target). I’ll flag this with the rest of the team and see if we can generate an On/Off graph using a typical listening level.
I’m actually surprised at this review. I recently bought these just to check them out, and I find them bassier than I expected; they’re not really neutral at all 🤔
Btw, I listened to them with silicone tips. The foam was too big for me.
Hi, thanks for sharing your experience with us. Our calculator assesses the headphones as balanced based on the bass and treble amounts relative to one another. In this case, their relative dB difference is close, but not quite past the threshold of what we’d consider balanced. It’s also worth pointing out that listeners typically prefer a bit of extra bass on IEMs relative to headphones, so you could say these are ‘balanced’ for IEMs.
Preference aside, it’s likely that switching ear tips affected the headphones’ sound signature, so it’s possible you might be experiencing more bass than what we measured with the foam tips. While it’s still early days in v2.0 as we work through our backlog of products to update, I’ve generated this table for you in case you’re still searching for earbuds. Perhaps something like the TRUTHEAR HEXA might be what you’re looking for.
A fault with these headphones inspired me to have a suggestion for your test metrics.
I’ve been having this issue with the Momentum 4 where the right channel is biased and sounds louder, but only with certain frequencies. This means that, say, a podcast can sound right-biased because certain frequencies in voices are thrown to the right, which impacts the perception of where the voice is coming from. This is a very common complaint about these headphones online. I had a similar problem with the Bose QuietComfort series, where one channel is louder than the other at certain frequencies.
The stores always look at me like I’m crazy if I return the headphones and opt to buy the same ones, hoping for a balanced pair, and I will usually drop the brand and go with a different one. And - given that they are CONSUMER headphones, it’s not easily corrected because on an iPhone there’s no way to correct this.
I noticed you have metrics for Sound Profile and Frequency Response Consistency which are broken up by channel. This is convenient for eye-balling the difference between the channels, but it is difficult to get an objective measurement from this data about how matched the headphones are across the panning spectrum. Have you considered adding a new metric that takes in the right/left data per frequency, calculates the difference between the right/left channels in dB and plots it on a frequency graph? Then it would be easy to conclude how balanced the headphones are. Of course, some frequencies are more impacted than others by a mismatch, so weighting it to a Fletcher-Munson curve might also help you calculate a balance score.
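For what it’s worth, the metric being proposed above could be sketched roughly like this. The weights here are invented placeholders; a real implementation would substitute an equal-loudness (Fletcher-Munson style) weighting:

```python
# Illustrative sketch of the proposed metric: per-frequency left/right
# difference in dB, plus a weighted average as a single balance score.
def channel_mismatch(freqs_hz, left_db, right_db, weights=None):
    """Return (per-frequency |L-R| diffs in dB, weighted average mismatch)."""
    diffs = [abs(l - r) for l, r in zip(left_db, right_db)]
    if weights is None:
        weights = [1.0] * len(diffs)  # unweighted by default
    score = sum(d * w for d, w in zip(diffs, weights)) / sum(weights)
    return diffs, score
```

Plotting `diffs` against `freqs_hz` would give the balance-vs-frequency graph described, and `score` condenses it into one number.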
Hi, thanks for sharing your experience with us. The metric you’re looking for that applies to this review (v1.8) is Imaging, specifically the Weighted Phase, Amplitude, and Frequency Mismatch values. Unfortunately, this current iteration doesn’t visualize the affected frequencies. Fortunately, this has been improved in Test Bench v2.0, which you can read more about here. Part of the update is the introduction of a Stereo Mismatch score, which aims to assess issues like directional bias in stereo reproduction with more granularity. The Sennheiser MOMENTUM 4 Wireless are part of our list of headphones to be updated, so sit tight 😊.
As always, feel free to leave us additional feedback if there’s something you’d like to see.
As far as I know, RTINGS only tests using the Bluetooth Classic profiles at the moment, so it won’t be a true reflection of what is possible. You need to test the mic using the TMAP profile, which is part of Bluetooth LE Audio. Although I’m not 100% sure if the XM6 fully supports TMAP or not.
That’s correct, we force HFP (Hands-Free Profile), so the results won’t be with Bluetooth LE. That said, we’ll try to obtain some recordings if the headphones’ Bluetooth LE works with our Creative BT-W6, but we likely won’t score the performance in the review to maintain apples-to-apples comparisons.
The XM6 spec sheet lists support for TMAP and PBP on Bluetooth LE, but I’m personally hoping they ship GMAP support via a firmware update at some point too. 🤞
Let us know if there’s anything else you’d like to see and we’ll try to take a look.
Does using a wired connection help with mic quality?
Hi, good question. We haven’t tested the microphone performance when using an analog connection. Theoretically, the mic could have a different FR, and there won’t be any bandwidth limitations compared to the wireless signaling protocol when connected via the headphones’ dongle.
At the moment, our testers don’t have the resources to look into this, but we have potential plans to update these headphones to TBU 2.0 (and we may squeeze analog microphone testing in for this product). That said, there’s no set timeline for now.
Bought these on your recommendation. Worst-sounding headphones I’ve ever tried. Maybe they were a lemon, but I’m reading a lot of people complaining online. I came from a Sennheiser Game One. Is it possible the analog, more open soundstage made them sound like crap in comparison? Am I tripping?
Hi, thanks for sharing your experience with us here. Could you share more details with us about what you found suboptimal with the Logitech? There’s a known bug where opening the headphones’ software enables Windows sound processing. If you’re using them on a PC, it might be worth checking advanced sound settings to ensure Spatial Sound and Enhancements are disabled. Barring that, it could just be the open design suiting your tastes more.
Absolutely terrible frequency response in the test and Audio Reproduction Accuracy rating is 7.9 - wtf?!
It’s very sad that RTINGS is turning into another corrupt dxomark.
Maybe your testing methodology is OK, but the ratings and conclusions are absolutely inadequate.
Sorry, I can’t trust your reviews and recommendations anymore.
Hi, thanks for sharing your concerns with us. Could you elaborate on what you mean by the earbuds’ frequency response being “absolutely terrible”?
In the interest of supporting more varied preferences in the community, we’ve lessened the weightings based on adherence to our target in line with the reasoning presented in this article. As such, our Audio Reproduction Accuracy (ARA) score includes weightings for more objective measures that are less dependent on sound signature preference, such as Peaks and Dips, Stereo Mismatch, and Total Harmonic Distortion. While ARA still includes compliance scores related to our in-house target curve, they’re not as heavily weighted as in previous iterations of our reviews.
It’s also worth pointing out that these earbuds’ sound signature is not static; you’re able to EQ them, which can be seen as another reason to focus on objective measurements that will remain consistent even after adjustments are made to the earbuds’ sound. If you’re looking for earbuds that are more balanced out of the box, consider taking a look at this table. We broadly categorize products using our new Sound Signature test (which may be better to focus on if you’re primarily interested in products’ FR). We’re working on increasing the number of products on TBU 2.0, but this would be a good starting point.
Dear RTINGS.com team, thank you so much for testing the EAH-AZ100 and posting your review!
Do you think their high-frequency noise attenuation could be further improved by adding foam tips?
We haven’t tried any aftermarket tips out (but depending on the density they could make a difference). Give it a try and let us know :)
The HD 560S sound really muddy and unclear after my beloved DT 770 PRO 250-ohm. These are not critical-listening headphones. Nope. The DT 770 scores 7.9 but the HD 560S scores 8.2? What a joke.
Soundstage not wide enough either.
PS. I have a dampening mod inside the dt770 to make it sound more open and detailed from customcans.co.uk.
For me, no comparison between the 2. 770 is way ahead.
Hi, thanks for sharing your feedback with us. Was there anything specific you can pinpoint that gave you the impression that the HD 560S were muddier than the DT 770? From my experience, elevated treble (which incidentally tends to be the case with Beyerdynamics) can sometimes give me the impression of precision. As sound goes pretty deep into preference, it’s not unexpected that you might disagree with our scoring.
Looking at both headphones, the DT 770 PROs run into some issues with their group delay, contributing to the bass range notch, affecting their peaks and dips scoring. If you like how the headphones sound and where notches in their frequency response are placed, that’s ultimately more important, especially since there aren’t any glaring issues with the headphones.
We only review our headphones stock, so I’m curious about how you found the modding (and what you might recall before and after).
Why did the bass extension change so much after the review update? It used to be at 14Hz, whereas now the result is comparable to the Philips X2HR, which does not go as low as the Superluxes.
Hi, as part of our update from Test Bench 1.7 to 1.8, we changed our measurement rig from the HMS head to the BK5128. In addition, we also adjusted our target curve (from which our bass extension calculations are derived). The slight difference in how the headphones interact with a different head and modifications to the target curve are the culprits for the change in our measured bass extension values. Regarding the Philips and Superlux, you can make a confident comparison between the fine details of the review as they’re both on the same Test Bench.
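To give a rough sense of why a target-curve change moves this number: a bass extension figure of this general kind can be derived by walking up the measured response and reporting the lowest frequency that still sits within some tolerance of the target. The 3 dB tolerance and sample data below are assumptions for illustration, not our actual calculation:

```python
# Hypothetical sketch: lowest frequency where the measured response
# stays within tolerance_db of the target. If the target curve changes,
# this figure changes too, even with an identical measurement.
def bass_extension(freqs_hz, measured_db, target_db, tolerance_db=3.0):
    """freqs_hz must be ascending; returns the lowest in-tolerance frequency."""
    for f, m, t in zip(freqs_hz, measured_db, target_db):
        if abs(m - t) <= tolerance_db:
            return f  # first hit is the lowest, since freqs ascend
    return None  # never within tolerance
```

With a flat target and a response that is 10 dB down at 10Hz but only 2 dB down at 14Hz, this sketch reports 14Hz; raise the target in the bass and the reported extension shifts upward.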
Good day! Could you please tell me: from your test, it’s clear that they have the best “sound positioning” rating (9.2 Imaging) among open-type headphones. Do I understand correctly that this makes them the best solution for eSports games like Counter-Strike, Apex, and Fortnite, in order to accurately determine the location of enemies?
Hi there, the first thing to mention is that imaging varies from unit to unit. So, things like stereo mismatch and group delay, to a certain extent, are dependent on tolerances set by the manufacturer. That said, once you pass a certain point in performance (good stereo matching), you want to care about other factors like frequency response and, of course, whether you like how they sound. These are an excellent pickup for eSports titles, but my opinion is that if you have a decent headphone, it comes down to just getting used to how they sound.
Options like the DT 990 PRO or DT 900 PRO X, can be found significantly cheaper, have good imaging, and have the treble detail. You can also go for the DT 1990 PRO MKI (if you have an amp already).
I HIGHLY recommend removing these headphones from recommended items: they have a lot of issues that make them unusable.
1. The connection with my MacBook Pro and iPhone (14 Pro) is erratic when using multiple devices. It keeps connecting and disconnecting, and a voice keeps announcing CONNECTED TO TWO DEVICES, or DISCONNECTED, ONE DEVICE CONNECTED, etc.
2. The earbuds started generating a VERY LOUD feedback noise after a month of use or so. Sometimes when putting them in, or most often when taking them from their case, they emit a very loud Larsen noise (look up “Larsen effect” on Wikipedia if you don’t know what I am referring to). Including at night.
3. They randomly turn off (one of the earbuds only). I have to manually long-press the button to start it back up, even though the battery isn’t empty.
4. Jabra support gave me a procedure to reset them by re-flashing the firmware, but the iPhone app just crashes and I am not able to proceed.
5. Jabra announced they are leaving the earbuds market. What can we expect from them in terms of firmware updates and support in the future? On Reddit, it has not been great lately.
6. You cannot fully control ANC. I bought them for audio quality reasons (they are good), but you CANNOT disable ANC in calls. It’s explicitly specified in the app.
I bought them after seeing RTINGS’ review, and it’s probably one of my worst gear purchases ever. Not usable at all, which is too bad because the sound quality is very good.
More context: I am a software engineer, I work in a clean environment (no dirt or dust), and I am very careful about my gear. In comparison, my previous earbuds from another brand still work perfectly even though their battery is almost gone.
Hi there, thank you for giving us a detailed list of issues you encountered. We’re aware their stock coverage is dipping after Jabra’s exit from the market, and we are in the process of removing the product from our articles.
As far as the issues you’ve encountered, we’ll see if we can replicate #1-3 on our unit, though we didn’t encounter any problems from our initial testing and more recent retests (for firmware updates and recent test bench update 1.8).
Update: We recently completed some investigation and found:
No connection issues pairing the earbuds with multiple devices (laptop, Mac Mini, iPhone) when switching devices and listening to audio. No connection issues with single earbuds disconnecting either.
We had difficulty replicating the loud feedback users have experienced; we were able to create the issue once by placing hands over the earbuds.
With the company killing off the consumer division, the Jabra Elite 8 Active Gen 2 are unfortunately getting rather difficult to find, it seems - might want to consider updating the top slot at some point.
It’s a real shame, for I got really excited about settling on these in my search for The Ones.
Hey, thanks for the heads up. We’re aware their availability is shrinking and are gradually phasing them out of our lists. Looks like you’ll have to keep looking for The Ones a little longer. ;)
Nah, no pity. They’re not worth it. I cancelled my RTINGS premium membership over this. I only bought them to begin with because of the RTINGS “best ANC” label, and I’ve had constant problems with all 4 of my units now. There are better products out there that don’t STOP NOISE CANCELLING when the noise cancelling gets HARD.
Hi there, I’m sorry you haven’t had a good experience with these earbuds. Thank you for sharing details about the issues you had. Feedback is very important to us as we strive to constantly improve our testing methodology and fill gaps that we uncover: to that end, we’re looking to replicate the ANC issue you’ve described. We’re also working on removing Jabra picks from our recommendation articles as stock is now waning.
I’ve also gone ahead and refunded your Insider; you should be getting that in a few business days.
Does the audio sound better if connected directly to the PS5 via USB adapter instead of the 3.5mm jack on the DualSense controller?
Hi, unfortunately we don’t test the audio quality of console configurations extensively, mostly focusing on compatibility.
That said, conventional wisdom leans toward the USB directly to the console having better sound. In the case of the DS5 controller, sound quality will be limited by the wireless connection to the PS5 and then by an onboard DAC on the controller. Sony is likely not using a high-fidelity focused connection to the controller, focusing instead on latency (gaming product), so you’ll probably have a better time with the USB plugged straight into the PlayStation.
Hi, this is a fantastic question. The way our recommendation articles are structured, we provide picks based on both price and usage. A more complete description would be “Best Sounding (Barring the other picks on the list, and considering other factors).” So the Best -> Mid-Range -> Budget will be what we think are overall the best in each price category, reliant on any use cases we specify in the title of the article. For example, as the article is Best Bluetooth, we still generally pick the best option overall, but we will put a bit of extra weight on connectivity features like expanded codec support, Bluetooth LE, or 3-way multipoint. For more defined use cases like Best Gaming, we may elect to disqualify a pick based on the lack of a certain feature (like a microphone).
With this added context, the answer to your question about the Nothing Ear (a): we found them to be the Best Sounding from the pool of other products besides the ones on the list, when balanced with both their price point and overall features.
I hope this helps illuminate our process, and while not always obvious, we do take great care when we create and update recommendation articles to include picks that we feel won’t lead anyone astray, which often leads to some, let’s call it, enthusiastic debate within the audio writing team about which picks to settle on.
Let me know if you have any other questions, N
Hi, we haven’t tested it, but in our experience, color variants typically don’t sound any different unless they’re special editions that advertise different tunings.
Ideally, you have the speakers behind the listening position between 130-150 degrees. In your situation, no matter what you do it’ll be suboptimal since the speakers were not designed with your configuration in mind. I’d recommend that you try both out, as well as potentially the front layout on page 11 of the manual, and see what sounds the best to you.
Hi thanks for sharing your experience with us. The Momentum 4 definitely make use of the clout of the Sennheiser brand as one of their selling points, though I should point out that the HD products are made by Sennheiser’s professional audio division, while wireless/TWS products are likely designed by Sennheiser’s consumer division (acquired by Sonova in 2022).
The bass is a lot on the Momentum 4’s; it’s definitely not for everyone. If you’re interested in some other options, but don’t want to spring for one of the flagships, you could check out the Bose QuietComfort Headphones (non-ultra, go on sale fairly frequently) or the Sony ULT WEAR. Both will need a bit of tweaking since they’re also pretty bassy, but they have solid performance in other respects, like comfort and noise cancelling, too. I’ve made a compare link here for you to take a look at their performance side-by-side (it’ll look a bit jank since the Sennheiser and Bose are still on v1.8 of our testing).
Hi, thanks for reaching out. When we select products for our articles, we typically don’t assess headphones based on their sale/discounted prices unless we have historical data that they often drop that low, which isn’t the case for the Turtle Beach. It’s also worth pointing out that both headphones aren’t on the same test bench, so comparing their sound scores won’t be 1:1 right now. Their build quality scores are also the same. That said, I think you make some great points, so we’ll make sure to mention the Turtle Beach somewhere in the article the next time we update it.
Thanks for reaching out. Taking a quick look at the products side by side, it looks like incremental losses in most of the sound tests are responsible for the Smart Ultra scoring worse. This is due to their ‘No Preset’ mode having a more excited sound signature that doesn’t adhere to our in-room target loosely based on Harman research. This leads to a higher deviation and a lower score in each of the separate sound metrics (Stereo, Center, Surround, Atmos), which are all part of the weighting in the usages at the top of the reviews.
Hope that clears things up for you.
Hi, thanks for reaching out. That’s a good question. First, if you hover over the question mark beside the “Wireless Gaming” score, you’ll actually see that we used the Wireless Dongle and not the Bluetooth Connection information, so they get an 8.9 with 16% weighting. You also probably noticed the rest of the score breakdown, which shows you where the earbuds lost the most points. In particular, their target compliance isn’t great, and their frequency response isn’t very smooth either. Hope this helps clarify why they’re scoring a 6.6.
Thanks for reaching out, they’re similar in sound quality, though the AirPods Pro 2 are warmer-sounding. If you’re not in a rush, you could also wait until the AirPods Pro 3 release. Nobody knows if it’ll be this year or next year, though 🙃.
When we evaluate performance in Noise Isolation - Full Range, we use averaged attenuation numbers in Bass, Mid, Treble. The same is true for Common Scenarios, but it’s for the full measured frequency range rather than specific ranges. This helps more concretely describe relative performance for products since most products do trade blows depending on the frequency ranges in attenuation.
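To make that concrete, band-averaging attenuation looks roughly like the sketch below. The band edges (bass below 250Hz, mids from 250Hz to 2kHz, treble above) are common round figures assumed for the example, not necessarily the exact boundaries used in the review:

```python
# Minimal sketch of band-averaged noise attenuation. Band edges are
# illustrative assumptions (bass < 250 Hz, mids 250 Hz-2 kHz, treble > 2 kHz).
def band_averages(freqs_hz, attenuation_db):
    """Group per-frequency attenuation into bands and average each band."""
    bands = {"bass": [], "mid": [], "treble": []}
    for f, a in zip(freqs_hz, attenuation_db):
        if f < 250:
            bands["bass"].append(a)
        elif f <= 2000:
            bands["mid"].append(a)
        else:
            bands["treble"].append(a)
    return {name: sum(vals) / len(vals) for name, vals in bands.items() if vals}
```

Averaging over the full measured range instead of per band gives the single Common Scenarios-style figure described above.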
That’s an interesting proposal. I’ll see what the rest of the audio team thinks about it. In case we aren’t able to tackle it in an article, you can check out the 2.0 changelog. I’ll also give a small summary: we shifted our weightings to more heavily consider Peaks/Dips, and Stereo Mismatch and Group Delay (formerly Imaging). We also reduced the weight of compliance with our target curve in all ranges to account more for personal taste. In practice, this means that products with fundamentally good objective metrics that didn’t match our target curve score higher, while products that were “carried” by their adherence to our target are now scored more harshly if they have many fluctuations in their frequency response, or if they have issues with stereo reproduction (we’re also more harsh with how we score stereo matching).
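As a toy illustration of what shifting weightings does to a composite score: the component names below mirror the ones mentioned above, but the weights and scores are invented for the example and are not our actual values.

```python
# Toy example: the same component scores under two weighting schemes.
# All numbers here are made up for illustration.
def composite_score(components, weights):
    """Weighted average of component scores (each on a 0-10 scale)."""
    total_w = sum(weights[k] for k in components)
    return sum(components[k] * weights[k] for k in components) / total_w

old_weights = {"target_compliance": 0.6, "peaks_dips": 0.2, "stereo_mismatch": 0.2}
new_weights = {"target_compliance": 0.3, "peaks_dips": 0.4, "stereo_mismatch": 0.3}

# A product with strong objective metrics but poor target compliance:
scores = {"target_compliance": 5.0, "peaks_dips": 9.0, "stereo_mismatch": 9.0}
```

Under the old weights this product lands around 6.6; under the new ones it rises to about 7.8, which is the “no longer penalized as hard for not matching our taste” effect described above.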
Regarding the AKG N700 NC M2, I’ll have to check with some colleagues. Thank you for the positive feedback, you’re our favorite user (but don’t tell the others 😏).
Hi, we’re currently working through our backlog of products to update to v2.0. Many audiophile headphones are on that list, so we’re holding off on updating the article until we can evaluate as many headphones as possible on the same test bench.
Howdy, you’re correct; there will be a significant difference if you use HFP (worse). When we test products that have multiple connection options (dongle, wired, Bluetooth), we include results for the one that most users are likely to use. In this case, we tested with the proprietary dongle, so measurements and recordings in the Microphone section are going to be with that connection. According to their product page, the dongle should be compatible with both iOS and Android devices. If you’re planning on taking calls on a mobile device, make sure to use the dongle rather than Bluetooth.
Thanks for the catch, we’ll fix the bug shortly.
Wouldn’t know unless we have SoundGuys’ mid-range attenuation numbers. At the very least, it’s not a concrete win in their testing, and definitely not the case with our measurements (easier visualization). All this to say, we’re confident with our measurements, though there’s plenty of variation in methodology between publications and personal experiences.
Hmm, it’s possible that differences in test methodology account for the varying conclusions. For example, I believe SoundGuys uses a single speaker for their noise attenuation test, while we use four (to closely simulate diffuse background noise). I’ll also point out that if you check SoundGuys and look at the attenuation plots for the Bose and Technics side by side, there’s a notable drop in attenuation in the 500Hz range compared to the Technics. While we don’t have their products’ mid-range averaged total attenuation numbers, that still contradicts or, at the very least, doesn’t support What Hi-fi’s conclusion. All this isn’t to say either publication’s conclusions (or posts on Reddit) are invalid; it’s just important to consider the context of the claims that one product performs better than the other.
Yeah, I did some side-by-side listening with most of the obvious competitors. While it’s not a proper blind test, my personal experience has the Technics on par with the other top-tier ANC picks. I want to make sure to reiterate that ANC in our reviews is purely based on objective measurements that use accepted standards as a framework, though I’m sure there’ll be some users who have a different idea of what products perform the best. That said, just looking at the worse mid-range attenuation on the Bose, I think most people would notice more noise wearing them when compared to the Technics. Were there any communities or publications in particular that have a ‘not quite matching the best’ viewpoint?
Hi, thanks for reaching out. We don’t have any immediate plans to review the QS-750F. Just looking at the spec sheet, it looks pretty decent, though if I were in the market for the soundbar, I’d probably still end up going with the Q910D and just dealing with the wires. That said, we’ll definitely keep this soundbar in mind, and if we see enough requests for it, we’ll probably test it. You can also vote for it here with your additional insider votes.
Hi, thanks for the genuine and thoughtful feedback. There’s a lot to unpack here, but I’ll try my best to address all your concerns.
The first thing I’d like to tackle is actually the increased variability in the XM6’s audio delivery. I want to mention this at the very beginning: due to how audio delivery varies based on physical aspects of the wearer, it’s entirely possible that the way they sound on your head happens to align with your particular preferences. I also agree that the XM6 are more versatile—they have a wider feature set and a better microphone (especially over LC3), among other things.
While we do check with other reviewers and publications to ensure we don’t miss anything substantial when reviewing a product, we’re also confident in our own measurements, particularly when it comes to objective metrics in sound. At this point, I’d like to clear up a misconception you might have. Our Audio Reproduction Accuracy score—which I assume is what you’re using to assess sound quality between the XM6 and XM5—does not include Frequency Response Consistency as part of the score breakdown. The XM6 have more peaks and dips in their response (localized to the treble), and more variability in their phase response mismatch passes. It’s really important to mention that after evaluating all the components of ARA, there is only a minor 0.2-point difference in score between the two headphones; they’re both green (above 7.5), which is what we would consider good performance when assessed by an enthusiast like yourself—and most definitely a small enough difference that personal preference can easily outweigh the score difference. And indeed, part of the ARA score does evaluate compliance with our target curve (which itself is a benchmark based on taste). I want to reiterate that you’re valid in your preference for the XM6’s sound, and that liking the XM6’s sound over the XM5’s is not mutually exclusive with the validity of our objective measurements.
Now, let’s get into Frequency Response Consistency a bit more. To paraphrase some points made in this article: headphones are not typically specialized for physical characteristics, meaning they should perform similarly for a decently wide variety of head shapes, hair thicknesses, and yes, for people who wear glasses (about 63% of Americans wear prescription glasses, according to the Vision Council in 2021). Our test uses measurements from both the industry-standard BK5128 and multiple individuals to calculate the standard deviation of measurement passes relative to the population average. We believe that FRC is a metric worth evaluating because it highlights limitations in headphones’ ability to sound alike when worn on different people and over multiple sessions on the same individual. For a category of product that is targeted at a wide audience, I think it’s fair for us to evaluate whether it can perform consistently for the same population. It’s worth adding some nuance here: we aren’t disqualifying the XM6 as an option for people who wear glasses—because, as you said, that would be disingenuous. Our intention is to highlight a potential drawback and encourage additional consideration for users who might be affected.
It’s entirely possible that the XM6 will improve their sound quality performance in subsequent firmware updates, but it’s worth tempering your expectations, as seal issues are a physical problem. It’s also generally the exception, rather than the rule, that we find frequency response to change in a meaningful way as a result of software updates. If you’d be open to it, please let us know what aspects of the XM6 you found sounded better than the XM5. We try to stay aware of potential blind spots in our approach, so we’d appreciate any insights you can offer.
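In the spirit of the FRC description above, a consistency metric of this general shape takes measurements from several wearers (or reseats) on the same frequency grid and computes the per-frequency spread around the mean. This is only a sketch of the concept, not the exact published method:

```python
import math

# Hedged sketch of a frequency response consistency metric: larger
# per-frequency standard deviations mean the headphones sound more
# different from wearer to wearer (or seat to seat).
def frc_std_per_freq(measurements_db):
    """measurements_db: list of per-wearer level lists on the same frequency grid."""
    n = len(measurements_db)
    result = []
    for levels in zip(*measurements_db):  # iterate frequency by frequency
        mean = sum(levels) / n
        var = sum((x - mean) ** 2 for x in levels) / n  # population variance
        result.append(math.sqrt(var))
    return result
```

Frequencies where this spread is large (for the XM6, reportedly bass leakage from broken seals with glasses) are exactly where two listeners can reasonably disagree about how the headphones sound.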
My guess is it’s related to the seal. I daily drove them for a while, and I had to rotate them (think Google Pixel Buds Pro 2) to seat them properly. It’s not the most comfortable to wear them this way so some people might notice a more prominent plunger-like effect.
Hi, is there something specific you think is an issue? The graph looks pretty solid, there isn’t much fluctuation in the fundamental focal ranges, and while the treble roll-off is a bit early (6.5 kHz), the speaker has a deeper register, so they’re not affected by the issue as much.
Yeah, it’s quite something.
Most onboard DACs on motherboards have as close to a flat response as possible. If you aren’t noticing any popping or artifacts when plugged into the 3.5mm jack, there’s not much need for a DAC/AMP.
Hi, it’s definitely a contender for the list. We’re currently prioritizing bringing up many of our educational and informative articles about headphones and our specific tests, so recommendation articles are on the back burner. That said, the headphones have some pretty notable frequency response consistency issues, which may not make them a great option for a general recommendation article (as many users could have a suboptimal listening experience if they wear glasses, for instance). Recommended products in our articles should ideally have a replicable listening experience for a wide variety of users, especially for articles where sound quality is a focus. Feel free to let us know if you have any additional questions.
Thanks for letting us know, we’ve updated the review 8).
Hi, thanks for leaving us your feedback. I’m not sure I fully understand your question, but I’ll clear up why the LG S95TR isn’t in this article. When we assess soundbars that make it onto our articles, we balance a number of factors, including price, when making the decision. In most of our “Best of X” articles, we recommend products in a number of price ranges to represent the variation in budget that a population of readers may have. As you quoted from the side-by-side, in this case, the HW-Q990D outperforms the S95TR in a number of metrics (performance and flexibility), including our tested Atmos performance, meaning they are the better pick given the scope of the article (Atmos). If we also consider that both soundbars occupy the same price bracket at the moment ($1400-$1600USD), the omission of the LG is fairly justified in our recommendations framework.
I really want to stress that if you are a fan of the S95TR, or you prefer it for utility that is specific to you (e.g., WOWCAST with an LG TV), they are both impressive soundbars and you likely would be happy with either. Hope this helps clarify things, and if I’ve misunderstood your question, please let me know.
Thanks for the question, I’m not fully sure what you’re asking since you’ve listed a few signaling methods and devices, and latency is pretty device-dependent. But as a rule of thumb, you should have the lowest latency with analog devices that can be used passively and through most audio interfaces designed for music. From testing, optical tends to have higher latency, though YMMV.
Keep in mind that latency alone isn’t the most important factor if you’re dealing with video and audio; the interaction between the two will contribute to your perception of A/V-desync.
It will probably reflect some light from the TV, but depending on how low you have the soundbar positioned relative to your eyes, the angle might be large enough that you won’t see the reflections. It’s an interesting design to have the woofer exposed, but as we don’t typically disassemble soundbars that have an integrated sub, we can’t really say much about the relative size.
Hi, thanks for reaching out. We took a look at the documentation as well and didn’t find the information. As we typically test for functionality, this isn’t something we normally look at. We’ll find out what version the soundbar uses in the lab when we have a chance.
Howdy, thanks for sharing some of your feedback. We typically prioritize products that interest the most users, but that can sometimes leave us with gaps in coverage, so we’ll keep this in mind as we update products to v2.0.
Are there any specific headsets that you’d like to see on our newest test bench update?
Hi, thanks for reaching out. We had the EQ function disabled during testing (so the headphones were “stock”) as you can see in the Test Settings. Sound is pretty subjective, so it’s hard for us to say what is best (and we don’t provide individualized EQs). That said, if you’re using this for competitive shooters, you’ll probably want a bit of emphasis in the treble overall, and a boost in the high-bass/low-mids to help with sound cues. Hope that helps, good luck out there ;)
Hi, thanks for the feedback. I see what you mean; typically we’d just overlay the ON and OFF FR on the same graph, that way you can directly compare the two settings. In this case, we took the approach of individual graphs due to the AirPods changing their frequency response depending on the listening level; it likely would’ve been too cluttered if we had everything on one graph (though as you said, we could’ve included the target). I’ll flag this with the rest of the team and see if we can generate an On/Off graph using a typical listening level.
Hi, thanks for sharing your experience with us. Our calculator assesses the headphones as balanced based on the bass and treble amounts relative to one another. In this case, their relative dB difference is close, but not quite past the threshold of what we'd consider balanced. It's also worth pointing out that listeners typically prefer a bit of extra bass on IEMs relative to headphones, so you could say these are 'balanced' for IEMs.
Preference aside, it’s likely that switching ear tips affected the headphones’ sound signature, so it’s possible you might be experiencing more bass than what we measured with the foam tips. While it’s still early days in v2.0 as we work through our backlog of products to update, I’ve generated this table for you in case you’re still searching for earbuds. Perhaps something like the TRUTHEAR HEXA might be what you’re looking for.
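For anyone curious how a bass-vs-treble balance check like this can work in principle, here's a toy illustration. The band edges and the 2 dB threshold are made up for the example; our calculator's actual bands and cutoffs are internal.

```python
# Toy balance classifier: compare mean deviation-from-target in a bass band
# against a treble band. Band edges and threshold are invented for illustration.

def band_mean(fr: dict, lo: float, hi: float) -> float:
    """Mean deviation-from-target (dB) over [lo, hi] Hz."""
    vals = [db for f, db in fr.items() if lo <= f <= hi]
    return sum(vals) / len(vals)

def classify(fr: dict, threshold_db: float = 2.0) -> str:
    bass = band_mean(fr, 20, 250)
    treble = band_mean(fr, 2500, 10000)
    diff = bass - treble
    if abs(diff) <= threshold_db:
        return "balanced"
    return "bass-heavy" if diff > 0 else "bright"

# Deviation from target at a few frequencies (dB); a slightly bass-tilted IEM.
fr = {50: 3.1, 100: 2.8, 200: 2.2, 3000: -0.2, 6000: 0.4, 9000: 0.1}
print(classify(fr))  # bass mean ~2.7 dB vs treble ~0.1 dB -> "bass-heavy"
```

A response can sit just under or just over a threshold like this, which is why two headphones that measure similarly can land in different categories.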
Hi, thanks for sharing your experience with us. The metric you’re looking for that applies to this review (v1.8) is Imaging, specifically the Weighted Phase, Amplitude, and Frequency Mismatch values. Unfortunately, this current iteration doesn’t visualize the affected frequencies. Fortunately, this has been improved in Test Bench v2.0, which you can read more about here. Part of the update is the introduction of a Stereo Mismatch score, which aims to assess issues like directional bias in stereo reproduction with more granularity. The Sennheiser MOMENTUM 4 Wireless are part of our list of headphones to be updated, so sit tight 😊.
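To give a feel for what an amplitude-mismatch style metric measures, here's a simplified sketch: average the left/right level difference across frequency, weighting the mids more heavily. The weighting scheme is invented for illustration; the formula used in our reviews isn't something I can reproduce here.

```python
# Sketch of a weighted amplitude-mismatch metric: mean |L - R| level
# difference across frequency. The weighting is invented for illustration.

def amplitude_mismatch(freqs, left_db, right_db) -> float:
    def weight(f: float) -> float:
        # Emphasize 200 Hz - 5 kHz, where level-difference cues matter most.
        return 2.0 if 200 <= f <= 5000 else 1.0
    num = sum(weight(f) * abs(l - r) for f, l, r in zip(freqs, left_db, right_db))
    den = sum(weight(f) for f in freqs)
    return num / den

# Example L/R responses (dB) with a mild rightward... actually leftward bias.
freqs = [100, 500, 1000, 4000, 10000]
left = [0.0, 0.5, 1.0, 0.8, 0.2]
right = [0.1, 0.1, 0.2, 0.2, 0.5]
print(round(amplitude_mismatch(freqs, left, right), 2))  # 0.5 dB average
```

A single number like this is exactly why the v1.8 presentation can't show you *which* frequencies are affected, and why the v2.0 Stereo Mismatch work adds more granularity.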
As always, feel free to leave us additional feedback if there’s something you’d like to see.
That’s correct, we force HFP (the Hands-Free Profile), so the results won’t be with Bluetooth LE. That said, we’ll try to obtain some recordings if the headphones’ Bluetooth LE works with our Creative BT-W6, but we likely won’t score the performance in the review, to maintain apples-to-apples comparisons.
The XM6 spec sheet lists support for TMAP and PBP over Bluetooth LE, but I’m personally hoping they ship GMAP support via a firmware update at some point too. 🤞
Let us know if there’s anything else you’d like to see and we’ll try to take a look.
Hi, good question. We haven’t tested the microphone performance when using an analog connection. Theoretically, the mic could have a different FR, and an analog connection wouldn’t be subject to the bandwidth limitations of the wireless protocol used by the headphones’ dongle. At the moment, our testers don’t have the resources to look into this, but we have potential plans to update these headphones to TBU 2.0 (and we may squeeze analog microphone testing in for this product). That said, there’s no set timeline for now.
Hi, thanks for sharing your experience with us here. Could you share more details with us about what you found suboptimal with the Logitech? There’s a known bug where opening the headphones’ software enables Windows sound processing. If you’re using them on a PC, it might be worth checking advanced sound settings to ensure Spatial Sound and Enhancements are disabled. Barring that, it could just be the open design suiting your tastes more.
Hi, thanks for sharing your concerns with us. Could you elaborate on what you mean when you say the earbuds’ frequency response is “absolutely terrible”?
In the interest of supporting more varied preferences in the community, we’ve reduced the weight placed on adherence to our target, in line with the reasoning presented in this article. As such, our Audio Reproduction Score (ARA) includes weightings for more objective measures that are less dependent on sound signature preference, such as Peaks and Dips, Stereo Mismatch, and Total Harmonic Distortion. While ARA still includes compliance scores related to our in-house target curve, they’re not as heavily weighted as in previous iterations of our reviews.
It’s also worth pointing out that these earbuds’ sound signature isn’t static: you can EQ them, which is another reason to focus on objective measurements that remain consistent even after adjustments are made to the earbuds’ sound. If you’re looking for earbuds that are more balanced out of the box, consider taking a look at this table. We broadly categorize products using our new Sound Signature test (which may be a better place to focus if you’re primarily interested in products’ FR). We’re working on increasing the number of products on TBU 2.0, but this should be a good starting point.
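As a rough illustration of the weighted-score idea described above, here's a minimal sketch where the objective metrics carry most of the weight and target-curve compliance gets a smaller share. The weights and sub-scores are placeholders, not our actual numbers.

```python
# Minimal weighted-score sketch. Weights and sub-scores are placeholders
# for illustration; they are not the review's actual values.

WEIGHTS = {
    "peaks_dips": 0.30,
    "stereo_mismatch": 0.25,
    "thd": 0.25,
    "target_compliance": 0.20,  # smaller share than in earlier test benches
}

def weighted_score(subscores: dict) -> float:
    """Weighted average of 0-10 sub-scores."""
    return sum(WEIGHTS[k] * subscores[k] for k in WEIGHTS)

scores = {"peaks_dips": 8.0, "stereo_mismatch": 9.0, "thd": 9.5,
          "target_compliance": 6.0}
print(weighted_score(scores))  # objective strengths outweigh the target miss
```

The point of the structure: a product that deviates from the target but measures cleanly everywhere else can still score well, which matches the reasoning in the linked article.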
We haven’t tried any aftermarket tips out (but depending on the density they could make a difference). Give it a try and let us know :)
Hi, thanks for sharing your feedback with us. Was there anything specific you can pinpoint that gave you the impression that the HD 560S were muddier than the DT 770? In my experience, elevated treble (which incidentally tends to be the case with Beyerdynamic headphones) can sometimes give the impression of extra precision. As sound comes down heavily to preference, it’s not unexpected that you might disagree with our scoring.
Looking at both headphones, the DT 770 PROs run into some group delay issues, which contribute to the notch in their bass range and affect their peaks and dips scoring. If you like how the headphones sound and where the notches in their frequency response are placed, that’s ultimately more important, especially since there aren’t any glaring issues with the headphones.
We only review headphones stock, so I’m curious how you found the modding (and any before/after differences you recall).
Hi, as part of our update from Test Bench 1.7 to 1.8, we changed our measurement rig from the HMS head to the BK5128. In addition, we also adjusted our target curve (from which our bass extension calculations are derived). The slight difference in how the headphones interact with a different head, along with the modifications to the target curve, is the culprit for the change in our measured bass extension values. Regarding the Philips and Superlux, you can confidently compare the fine details of those reviews, as they’re both on the same Test Bench.
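To show why a target-curve change alone can shift a bass extension number, here's one common way the metric can be defined: the lowest frequency at which the measured response stays within some tolerance of the target. The 3 dB window and the example data are made up for the sketch; our exact definition may differ.

```python
# Illustration: bass extension as the lowest frequency within a tolerance
# window of the target. The 3 dB window and data are invented for the example.

def bass_extension(freqs, deviation_db, tol_db: float = 3.0) -> float:
    """freqs ascending; deviation_db = measured minus target, in dB."""
    for f, dev in zip(freqs, deviation_db):
        if abs(dev) <= tol_db:
            return f  # first (lowest) in-tolerance frequency
    return float("nan")  # never reaches tolerance

freqs = [20, 30, 40, 60, 80, 100]
dev = [-9.0, -6.5, -4.2, -2.1, -0.8, 0.1]  # rolled-off low bass
print(bass_extension(freqs, dev))  # 60 Hz
```

Since `deviation_db` is measured *relative to the target*, moving the target (or changing how the headphones couple to a new measurement head) shifts every deviation value, and the reported extension frequency can move even though the headphones themselves haven't changed.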
Hi there, the first thing to mention is that imaging varies from unit to unit. So, things like stereo mismatch and group delay are, to a certain extent, dependent on tolerances set by the manufacturer. That said, once you pass a certain point in performance (good stereo matching), other factors like frequency response matter more, along with, of course, whether you like how they sound. These are an excellent pickup for eSports titles, but my opinion is that if you have a decent pair of headphones, it mostly comes down to getting used to how they sound. Options like the DT 990 PRO or DT 900 PRO X can be found significantly cheaper, have good imaging, and offer plenty of treble detail. You can also go for the DT 1990 PRO MKI (if you already have an amp).
Hi there, thank you for giving us a detailed list of the issues you encountered. We’re aware that stock is dwindling after Jabra’s exit from the market, and we’re in the process of removing the product from our articles.
As for the issues you’ve encountered, we’ll see if we can replicate #1-3 on our unit, though we didn’t run into any problems in our initial testing or in more recent retests (for firmware updates and Test Bench 1.8).
Update: We recently completed some investigation and found:
Hey, thanks for the heads up. We’re aware their availability is shrinking and are gradually phasing them out of our lists. Looks like you’ll have to keep looking for The Ones a little longer. ;)
Hi there, I’m sorry you haven’t had a good experience with these earbuds. Thank you for sharing details about the issues you had. Feedback is very important to us as we strive to constantly improve our testing methodology and fill gaps that we uncover: to that end, we’re looking to replicate the ANC issue you’ve described. We’re also working on removing Jabra picks from our recommendation articles as stock is now waning.
I’ve also gone ahead and refunded your Insider; you should be getting that in a few business days.
Hi, thanks for bringing this up. Considering their current stock situation, we’re working on phasing out Jabra picks.
Hi, unfortunately we don’t test the audio quality of console configurations extensively, mostly focusing on compatibility.
That said, conventional wisdom leans toward plugging the USB directly into the console for better sound. In the case of the DS5 controller, sound quality is limited first by the wireless connection to the PS5 and then by the controller’s onboard DAC. Sony likely prioritizes latency over fidelity on that link (it’s a gaming product), so you’ll probably have a better time with the USB plugged straight into the PlayStation.
Hope this helps.