Apple Takes Another Step Toward Hearing Aid Functionality
On Monday, June 22, Apple introduced its latest operating system, iOS 14, which includes—among many new features—a substantial move toward its AirPods Pro becoming a hearing-aid-like device. Almost buried as an afterthought at the bottom of Apple’s iOS 14 New Features Preview are identical entries in the “AirPods” and “Accessibility” sections that say:
This new accessibility feature is designed to amplify soft sounds and adjust certain frequencies for an individual’s hearing, to help music, movies, phone calls, and podcasts sound more crisp and clear. Headphone Accommodations also supports Transparency mode on AirPods Pro, making quiet voices more audible and tuning the sounds of your environment to your hearing needs.
Hmmm…sounds a lot like a basic description of wide dynamic range compression (WDRC) or automatic gain control (AGC), doesn’t it? Abram Bailey of Hearing Tracker, who broke the news yesterday, stated, “This is the extremely exciting part, as it indicates that AirPods can now essentially be used to provide typical hearing aid functionality; applying personalised amplification to make it easier to hear those around you.” Bailey went on to show how the headphone accommodations use a custom audio setup with a listening test that generates an “audiogram” from the Apple Health app that “seems to indicate that the AirPods Pro will be capable of providing a very fine-tuned custom amplification experience, based on the audiogram (pitch-by-pitch hearing abilities) unique to the user.”
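For readers unfamiliar with WDRC, the basic idea is easy to sketch in a few lines of code: soft sounds get more gain than loud ones, and the amount of gain differs by frequency according to the user’s audiogram. The following Python snippet is purely illustrative—it is emphatically not Apple’s algorithm, and the half-gain rule, knee point, and compression ratio used here are textbook placeholder values, not anything Apple has published.

```python
# Illustrative sketch of wide dynamic range compression (WDRC):
# quiet sounds receive more gain than loud ones, per frequency band.
# NOT Apple's implementation; all names and numbers are placeholders.

# Hypothetical audiogram: hearing threshold in dB HL at standard test frequencies (Hz),
# roughly resembling a mild-to-moderate sloping high-frequency loss.
audiogram = {250: 15, 500: 20, 1000: 25, 2000: 40, 4000: 55, 8000: 60}

def wdrc_gain(threshold_db, input_level_db, compression_ratio=2.0, knee_db=45.0):
    """Gain (dB) for one band: a classic 'half-gain rule' baseline,
    with gain rolled back for inputs above the compression knee."""
    base_gain = threshold_db / 2.0  # half-gain rule: gain = half the hearing loss
    if input_level_db > knee_db:
        # Above the knee, each extra dB of input yields only
        # 1/compression_ratio dB of extra output, so gain is reduced.
        excess = input_level_db - knee_db
        base_gain -= excess * (1.0 - 1.0 / compression_ratio)
    return max(base_gain, 0.0)  # never attenuate below unity gain

for freq, thresh in sorted(audiogram.items()):
    soft = wdrc_gain(thresh, input_level_db=40)  # soft speech
    loud = wdrc_gain(thresh, input_level_db=70)  # loud speech
    print(f"{freq} Hz: +{soft:.1f} dB for soft input, +{loud:.1f} dB for loud input")
```

Running the loop shows the essential behavior: the 4,000 Hz band (where this hypothetical loss is greatest) gets the most gain, and every band gets less gain for loud input than for soft input—which is what Apple’s description of “amplify soft sounds and adjust certain frequencies” suggests is happening under the hood.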
It should be acknowledged that Apple has for many years been developing hearing-aid-related features, including Live Listen for hearing aids and cochlear implants in 2014 (and later for AirPods and EarPods), in addition to speech audiometry and speech-in-noise packages for developers, noise warning apps for its watchOS, and more. The company sold more than 60 million AirPods in 2019, compared to about 15 million hearing aids sold worldwide by the entire hearing industry (4.2 million units in the US). Apple’s Wearables, Home and Accessories division had the most significant year-on-year growth for the company last year, with its sales increasing 41% thanks to AirPods and the Apple Watch, and the tech giant owns an enviable 36.5% of the wearables market, according to CompareCamp. Mind you, this includes the “hearables” market that Nick Hunn predicted earlier this year will reach $80 billion a year by 2025.
As Paul Dybala, PhD, AuD, of AudiologyDesign points out in a recent LinkedIn post about Apple and its threat to the hearing industry, “If none of this impresses you, buy a pair of AirPod Pros and turn on the Active Noise Cancelling feature. Then change them over to Transparency Mode and listen further. Once you wipe your jaw off the floor, continue reading. Take your time, I’ll wait…” However, he then goes on to point out that hearing loss is widely viewed as a healthcare problem that should be addressed by a hearing healthcare professional, as shown in a 2017 survey he conducted with colleague Brande Plotnick at Healthy Hearing.
As a side note, I’ve personally tried several of the products and hearing tests available in some of the better PSAPs. As one example, Alango Technologies’ BeHear app did an impressive job of replicating an audiogram of my own mild sloping hearing loss and tailoring the sound to suit my preferences. The idea of an app doing this also reminded me of a September 2018 Hearing Review article by James Jerger, PhD, who—after describing three basic forms of automated audiometry—wrote:
“The most important issue is to catch up with the rest of the automated world…In spite of the many examples of successful automated systems summarised above, I suspect that there will be little further progress in the actual clinical use of automated audiometry of any variety until clinicians become part of the solution. It goes back to their initial educational experience. If the only procedure they learn as students is the manual Hughson-Westlake method on a conventional audiometer, it is unlikely that they will be easily diverted from that familiar path, sophisticated technology notwithstanding. PhD and AuD students—in addition to practicing clinicians—need to understand that automated audiometry can be carried out by less credentialed personnel, resulting in time and cost savings in a clinical setting. It is apparent this testing is moving into the digital/consumer realm” [with the link going to Apple’s WWDC 2018 video that includes a demonstration of speech audiometry].
The point is that professional hearing healthcare is so much more than automated tests and apps. As Dr Dybala notes in his article, it’s about assessing an often-complex medical problem and applying all of the tools available to tailor an individual solution that works for the patient in all kinds of listening situations, including (and especially) noise. However, as shown by Apple and others, the world of hearables, with their automated testing and applied amplification, should help millions of consumers make their first moves toward professional hearing care.