Do you remember how your parents tried to convince you to eat your vegetables as a kid by promising they were good for your health? That’s the same tactic a lot of wearable makers are using today, by adding health-tracking features to devices like smartwatches. Now, researchers have developed a way for earbuds to track your ear health, too.
Every time Apple holds an event, it spends a few minutes touting the health benefits of wearing an Apple Watch, whose heart rate tracking features can potentially identify heart problems before they become serious. It's also rumored that the long overdue update to Apple's AirPods Pro wireless earbuds may include body temperature sensing, allowing the devices to detect a fever: an early symptom of countless conditions.
It turns out the inherent capabilities of earbuds—namely blasting sound into your ears—also allow them to potentially detect conditions that can afflict the inner ear and the ear canal, as researchers from the University of Buffalo have found with an experimental device they've called EarHealth.
What's most interesting about EarHealth is that it relies on more or less off-the-shelf earbud hardware, albeit with an upgraded inward-facing microphone designed to pick up sounds inside the ear rather than around the wearer. Based on images shared of the prototype, EarHealth doesn't even appear to rely on wireless earbuds, although the official release about the research on the University of Buffalo website does specifically mention the use of Bluetooth earbuds—which is good, because none of us want to go back to wires.
Whereas the Apple Watch uses optical detection tricks to monitor heart health, EarHealth uses sound instead. The earbuds emit a quick chirp that reverberates through the ear canal, producing unique sounds and echoes that are captured by the microphone. The captured sounds are then processed by a custom app on a connected smartphone, which relies on a deep learning algorithm to generate a profile of the user's inner ear geometry.
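The sensing idea described above can be sketched in a few lines: play a short frequency sweep (a "chirp") and treat the ear canal as a filter that shapes its echo. Everything here—the sample rate, sweep range, and toy impulse response—is an illustrative assumption, not EarHealth's actual code.

```python
import numpy as np

SAMPLE_RATE = 48_000  # Hz, a common earbud audio rate (assumed)

def make_chirp(duration=0.02, f0=1_000, f1=8_000):
    """Linear frequency sweep from f0 to f1 Hz over `duration` seconds."""
    t = np.linspace(0, duration, int(SAMPLE_RATE * duration), endpoint=False)
    # Instantaneous phase of a linear sweep
    phase = 2 * np.pi * (f0 * t + (f1 - f0) * t**2 / (2 * duration))
    return np.sin(phase)

def simulate_echo(chirp, impulse_response):
    """The ear canal acts roughly like a filter: echo = chirp convolved with IR."""
    return np.convolve(chirp, impulse_response)

chirp = make_chirp()
# A toy impulse response standing in for one user's ear-canal geometry:
# a direct path plus two delayed, attenuated reflections
toy_ir = np.array([1.0, 0.0, 0.0, 0.4, 0.0, 0.15])
echo = simulate_echo(chirp, toy_ir)
```

In the real system the "echo" would come from the earbud's inward-facing microphone rather than a simulated convolution, and the deep learning model, not a hand-written filter, would extract the geometry profile.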
The first chirp is recorded while the user is healthy to generate a baseline profile of their inner ear, while later chirps, which can be regularly scheduled, generate profiles that are compared against the original to spot differences. These can be used to diagnose one of three conditions: earwax blockage, ruptured eardrums, and otitis media, a common infection or inflammation of the middle ear caused by colds or sore throats.
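The baseline-comparison step might look something like the sketch below: reduce each captured echo to a spectral "profile" and flag recordings that drift too far from the healthy baseline. This stands in for the paper's deep learning pipeline; the profile function, drift metric, and threshold are all arbitrary assumptions for illustration.

```python
import numpy as np

def profile(echo):
    """Normalized magnitude spectrum as a crude stand-in for the learned profile."""
    spec = np.abs(np.fft.rfft(echo, n=1024))
    return spec / np.linalg.norm(spec)

def drift(baseline_echo, new_echo):
    """1 - cosine similarity between the two profiles (0 means identical)."""
    return 1.0 - float(profile(baseline_echo) @ profile(new_echo))

rng = np.random.default_rng(0)
baseline = rng.standard_normal(1024)
same_ear = baseline + 0.01 * rng.standard_normal(1024)  # healthy re-check
changed = rng.standard_normal(1024)                     # altered ear geometry

THRESHOLD = 0.1  # arbitrary cutoff for "worth flagging"
healthy_flagged = drift(baseline, same_ear) > THRESHOLD   # expect False
changed_flagged = drift(baseline, changed) > THRESHOLD    # expect True
```

A real classifier would go further than a single drift score, since the system distinguishes between three specific conditions rather than just "changed vs. unchanged."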
In tests done with 92 users—27 healthy subjects, 22 with ruptured eardrums, 25 with a confirmed case of otitis media, and 18 with earwax causing a blockage—EarHealth had a diagnosis accuracy of 82.6%. That could potentially improve as the researchers refine both the hardware and the sample base of users: the benefit of AI-powered algorithms is that they become more accurate at making diagnoses over time as more sample data is made available.