These Sonar Glasses Track Your Facial Movements To Let You Communicate Silently
By Alexa Heah, 07 Apr 2023
At first glance, this pair of ordinary glasses looks just like any you’d buy from the optometrist. But look closer, and you’ll see its wearer silently mouthing the passcode to unlock an out-of-reach smartphone, or commanding it to skip to the next song in a playlist.
According to Cornell University, doctoral student Ruidong Zhang isn’t performing an act of telepathy, but making use of a breakthrough technology dubbed ‘EchoSpeech’, a silent-speech recognition system that can identify up to 31 unvocalized commands based on facial movements.
Powered by artificial intelligence (AI), the tool was developed by Cornell’s Smart Computer Interfaces for Future Interactions (SciFi) Lab, and is said to require just “a few minutes” of getting to know its user before it can recognize their lip and mouth movements.
“For people who cannot vocalize sound, this silent speech technology could be an excellent input for a voice synthesizer. It could give patients their voices back,” said Zhang, who explained that EchoSpeech can currently be paired with a smartphone to “speak” to others without making a sound.
The device works by sending sound waves across the user’s face and receiving their echoes, which shift as the lips move. Its deep-learning algorithm then analyzes these echo profiles in real time, with a reported accuracy of about 95%.
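The broad idea of matching an incoming echo profile against per-command templates learned during a short calibration can be sketched in a few lines. This is a hypothetical toy illustration, not the actual EchoSpeech pipeline: the real system uses a deep-learning model, and the command names, vector length, and nearest-template matching here are all invented for clarity.

```python
# Toy sketch: recognize a silent command by comparing a new "echo profile"
# (a fixed-length vector of echo intensities over time) against stored
# per-command templates. Hypothetical, not the EchoSpeech implementation.

import math

def distance(a, b):
    """Euclidean distance between two echo profiles."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def recognize(profile, templates):
    """Return the command whose stored template is closest to the input."""
    return min(templates, key=lambda cmd: distance(profile, templates[cmd]))

# Hypothetical calibration: "a few minutes" of user data averaged into
# one template per silent command.
templates = {
    "next_track": [0.9, 0.2, 0.1, 0.7],
    "unlock":     [0.1, 0.8, 0.6, 0.2],
}

# A noisy new reading of the wearer mouthing "next track".
reading = [0.85, 0.25, 0.15, 0.6]
print(recognize(reading, templates))  # prints "next_track"
```

In practice a learned model replaces the nearest-template step, which is what lets the system scale to 31 distinct commands with high accuracy.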
At the moment, most silent-speech recognition technology only responds to a limited set of commands or requires the user to wear a camera, which isn’t all that practical in daily life, not to mention the serious privacy concerns it raises.
The acoustic-sensing technology in EchoSpeech does away with the need for a camera, and requires much less bandwidth to operate, since audio data is far smaller than image or video recordings.
Going forward, the team plans to harness the technology to track facial, eye, and upper-body movements as well, predicting that the glasses will become “an important personal computing platform” in due time.
[via Engadget and Cornell University, cover image via Cornell University]