AirPods got only a passing mention during the keynote at Apple’s event. That’s understandable, as the iPhone 15 and Apple Watch Series 9 (and Ultra 2) were the center of attention. Nor have the earbuds received the same sort of hardware updates: as a press release issued after the event confirmed, the biggest physical change to the AirPods Pro 2 is the arrival of the (long-awaited) USB-C charging case.
You’d be forgiven for thinking that the AirPods news ended there. However, Apple’s high-end earbuds have also received a useful software update, in the form of new listening modes accessible with a few taps in iOS 17 on both versions of the AirPods Pro 2 (USB-C and Lightning).
With the buds connected, swipe down to pull up Control Center, then long-press on the volume slider. Three mode selections appear below: Noise Cancellation, Conversation Awareness and Spatial Audio. The first two are the ones getting love this year.
Adaptive Audio has been added to the noise-control options, alongside standard Noise Cancellation, Transparency and Off. Tapping the new option highlights it with a rainbow background. The feature seamlessly shifts between the different settings in real time. It’s an attempt to bring both ends of the spectrum into one mode, so you can walk down a busy street with situational awareness while not getting the full noise of a garbage truck as it passes by.
Image Credits: Apple
Although it’s named similarly to last year’s Adaptive Transparency feature, Adaptive Audio offers a full range of modes, with both transparency and noise cancellation playing a role.
“Adaptive Transparency, which we announced last year, needs to happen very quickly,” Eric Treski, director of product marketing, said in a conversation with TechCrunch. “It’s happening at a rate of 40,000 times a second. And that’s not just monitoring, but reduction as well. In order to bring that noise down quickly, it has to happen in real time. Adaptive Audio is a little bit slower, over a few seconds, because it’s meant to be a more methodical shift based on what you’re listening to. We’re sliding between Adaptive Audio and Transparency, so, in order to make it less jarring and more comfortable, it’s much slower for that reason.”
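The two time scales Treski describes can be illustrated with a toy sketch (my illustration, not Apple’s code): per-sample noise reduction runs at audio rate, while the Adaptive Audio blend between cancellation and transparency drifts slowly, smoothed over a few seconds so mode changes aren’t jarring. The function name and time constant here are assumptions.

```python
def smooth_mode_blend(current: float, target: float,
                      dt: float, time_constant: float = 3.0) -> float:
    """Move the ANC/transparency blend slowly toward its target.

    current/target are in [0, 1]: 0 = full transparency, 1 = full ANC.
    time_constant (seconds, a placeholder value) controls how gradual
    the multi-second shift feels, in contrast to the per-sample noise
    reduction running tens of thousands of times per second.
    """
    alpha = dt / (time_constant + dt)  # small step for a small dt
    return current + alpha * (target - current)


# Simulate 3 seconds of drifting toward full cancellation,
# updated at a 100 Hz control rate (10 ms steps).
blend = 0.0
for _ in range(300):
    blend = smooth_mode_blend(blend, 1.0, dt=0.01)
print(round(blend, 2))  # partway to full ANC after 3 seconds
```

The slow exponential smoothing is what makes the transition feel methodical rather than abrupt; the fast per-sample loop is a separate, much tighter process.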
The system also takes into account whether the content you’re listening to is music or a podcast, determined by tagging from apps like Apple Music. A microphone also measures the sound level inside your ear, to get a true sense of how loud playback actually is for you. “Because if you only measure the volume of what you think you’re playing into someone’s ears,” explains vice president of sensing and connectivity Ron Huang, it could be less accurate, depending on how you wear the buds and other factors.
Huang tells TechCrunch that the company considered leveraging your device’s GPS to determine sound settings based on location, but in real-world tests the method proved ineffective.
“During early exploration of Adaptive Audio, we basically put you in ANC versus transparency mode, depending on where you are,” Huang says. “You can imagine that the phone could give a cue to the AirPods and say, ‘Hey, you’re home,’ and so forth. That’s a way to do it, but after everything we learned, we don’t think that’s the right way to do it, and that’s not what we did. Of course, your home isn’t always quiet, and the streets aren’t always noisy. We decided that, instead of relying on a location cue from your phone, the AirPods monitor your environment in real time and intelligently make those decisions on their own.”
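The distinction Huang draws can be made concrete with a small hedged sketch (an assumed decision rule, not Apple’s implementation): pick a target mode from the measured ambient loudness itself, rather than from a location label like “home.” The thresholds below are arbitrary placeholders.

```python
def choose_mode(ambient_db: float) -> str:
    """Map a real-time ambient level to a target listening mode.

    Decides from what the microphones actually measure, not from
    where the phone says you are; dB thresholds are illustrative.
    """
    if ambient_db < 50:
        return "transparency"        # quiet: stay aware of surroundings
    elif ambient_db < 75:
        return "adaptive-blend"      # moderate: partial noise reduction
    return "noise-cancellation"      # loud: e.g. a passing garbage truck


print(choose_mode(42))  # a quiet home -> transparency
print(choose_mode(85))  # a noisy street -> noise-cancellation
```

Note that a quiet street and a noisy home both get the right treatment under this rule, which is exactly what a location cue would get wrong.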
![AirPods Pro 2 with USB-C](https://techcrunch.com/wp-content/uploads/2023/09/Airpods-Pro-2-USB-C-3.jpg)
Image Credits: Darrell Etherington
Personalized Volume is also a big part of the Adaptive Audio experience. The system combines a range of user data with personal preferences to build a fuller picture of listening habits, coupled with “machine learning to understand environmental conditions and listening preferences over time to automatically adjust the media experience,” according to Apple. A number of different metrics factor in.
“We took tens of thousands of hours of different data — different users listening to different content and with different background noise — to really understand different listening preferences, and what the distractions and aggressors are from a noise standpoint, to keep your content really clear,” Huang adds. “We also remember your personal preferences. Given the type of environment, the amount of noise there, and how loudly you listen to your content, it remembers that for you. We add it to our machine learning model to make it work better for you.”
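The remembering Huang describes can be sketched as a toy preference memory (a simplified assumption on my part; Apple’s actual model is far richer): store the volume a listener chose in each band of background noise, then suggest a volume for a similar environment later. Class and method names are hypothetical.

```python
from collections import defaultdict


class PersonalizedVolume:
    """Toy memory of how loudly a user listens in different noise levels."""

    def __init__(self):
        # coarse 10 dB noise bands -> list of volumes the user chose there
        self.history = defaultdict(list)

    def _bucket(self, ambient_db: float) -> int:
        return int(ambient_db // 10)

    def record(self, ambient_db: float, volume: float) -> None:
        """Remember the volume the user actually chose in this environment."""
        self.history[self._bucket(ambient_db)].append(volume)

    def suggest(self, ambient_db: float, default: float = 0.5) -> float:
        """Average of past choices in this noise band, else a default."""
        past = self.history[self._bucket(ambient_db)]
        return sum(past) / len(past) if past else default


pv = PersonalizedVolume()
pv.record(ambient_db=62, volume=0.7)  # listened louder on a noisy bus
pv.record(ambient_db=65, volume=0.8)
print(round(pv.suggest(68), 2))  # same 60-69 dB band: average -> 0.75
print(pv.suggest(30))            # no history for quiet rooms -> 0.5
```

A real system would weight recency and content type as well; the point is only that past choices, keyed by environment, drive the automatic adjustment.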
The other big mode introduced with iOS 17 is Conversation Awareness, which lowers the volume of a track when you start speaking. Other people’s voices won’t produce the effect, only the wearer’s. Apple achieves this without maintaining onboard vocal profiles; instead, it makes use of a number of onboard sensors. When the microphones pick up speech and the accelerometer detects jaw movement, the feature kicks in. How long it lasts depends on a variety of factors. I was impressed by the feature’s ability to avoid being triggered by things like a throat clear or a yawn.
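The sensor fusion described above reduces to a simple AND gate, sketched here as my own illustration rather than Apple’s implementation: playback only ducks when both the microphones hear speech and the accelerometer detects jaw movement, which is why a nearby talker (speech but no jaw movement) or a yawn (jaw movement but no speech) doesn’t trigger it.

```python
def should_duck(mic_hears_speech: bool, jaw_is_moving: bool) -> bool:
    """Duck playback only when the wearer themself is speaking.

    Requires agreement between two independent sensors, so neither
    other people's voices nor silent jaw motion fires on its own.
    """
    return mic_hears_speech and jaw_is_moving


print(should_duck(True, True))    # wearer talking -> True
print(should_duck(True, False))   # someone else talking nearby -> False
print(should_duck(False, True))   # yawning or chewing, no speech -> False
```

Requiring both signals is what lets the feature work without a stored vocal profile: the accelerometer ties the detected speech to this wearer’s jaw.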
The team also took a stab at another long-standing earbud pain point: device switching. The five-second gap between answering a call and hearing it in your earbuds can seem like forever. Taking advantage of the new switching speeds does, however, require being locked into the Apple ecosystem.
![](https://techcrunch.com/wp-content/uploads/2023/09/Apple-AirPods-Pro-2nd-gen-USB-C-connection-demo-230912.jpg)
Image Credits: Apple
“Our AirPods’ connection times to our devices are much faster with this new software update,” says Huang. “That comes from all the different ways we use to detect nearby devices. It’s important for us to know what the iPhone is doing, what the iPad is doing, what the Mac is doing. A phone call is more important than music, so when you answer a call, we make sure we take the AirPods away from your iPhone’s track and connect them to your Mac for a conference call, for example.”
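Huang’s example amounts to priority-based routing, sketched below as an assumed rule of my own (the activity names and priority values are placeholders, not Apple’s): the AirPods route to whichever nearby device currently has the highest-priority audio activity, so an incoming call on one device outranks music playing on another.

```python
# Placeholder priorities: a call beats video, which beats music.
PRIORITY = {"phone-call": 3, "video": 2, "music": 1, "idle": 0}


def pick_route(devices):
    """Choose the device whose current audio activity ranks highest.

    devices maps a device name to its current activity string.
    """
    return max(devices, key=lambda name: PRIORITY[devices[name]])


# Music playing on the iPhone, but a conference call starts on the Mac:
print(pick_route({"iPhone": "music", "Mac": "phone-call", "iPad": "idle"}))
# -> Mac
```

Knowing each device’s activity ahead of time is what lets the switch happen quickly when the call is answered, instead of negotiating from scratch.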
The last big part of the AirPods announcement is Vision Pro connectivity. For the full audio experience, those using Apple’s upcoming spatial computing headset should grab the new AirPods Pro for lossless, ultra-low-latency sound.
“Bluetooth typically operates at 2.4GHz, and that airspace is very noisy,” Huang says. “Everyone runs on 2.4. That’s why Wi-Fi routers, for example, are usually dual-band, if not tri-band, because the 5GHz spectrum is much cleaner. To get really low-latency audio, and to get high-resolution and lossless audio, it’s all about a very clean, real-time channel between the two. The combination of 5GHz and the fact that they’re so close to each other has allowed us to do this. We were able to basically redesign a completely new audio protocol over 5GHz for AirPods.”