
Communications of the ACM

ACM TechNews

Glasses Use Sonar, AI to Interpret Upper Body Poses in 3D

Saif Mahmud, a doctoral student in the field of information science, with PoseSonic glasses.

The technology is a major step up from existing wearable devices that often require a mini video camera, which isn’t always practical.

Credit: Louis DiPietro

Cornell University researchers have developed a wearable device that uses inaudible soundwaves and artificial intelligence (AI) to track the user's upper body movements in three dimensions.

The PoseSonic device combines off-the-shelf eyeglasses with micro sonar. Two pairs of tiny speakers and microphones mounted on the eyeglasses' hinges emit inaudible soundwaves and capture their echoes, which are assembled into an echo-profile image.

A machine learning algorithm analyzes the image and estimates the wearer's body pose at nine joints, without the need for an initial training session with the user.
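The pipeline described above, emitting a chirp, recording the echoes, and building an echo-profile image for a learning model to analyze, might be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' implementation: the chirp shape, frame length, and cross-correlation approach are all assumptions.

```python
import numpy as np

def echo_profile(tx_chirp, rx_signal, frame_len):
    """Cross-correlate the received audio against the transmitted chirp,
    frame by frame, to build a 2D echo-profile image
    (rows = time frames, columns = echo delay bins)."""
    n_frames = len(rx_signal) // frame_len
    profile = np.empty((n_frames, frame_len))
    for i in range(n_frames):
        frame = rx_signal[i * frame_len:(i + 1) * frame_len]
        # mode="same" yields one correlation value per delay bin
        profile[i] = np.correlate(frame, tx_chirp, mode="same")
    return profile

# Synthetic demo: 16 frames of 256 samples, echoes buried in noise.
rng = np.random.default_rng(0)
chirp = np.sin(np.linspace(0, 40 * np.pi, 256))
rx = rng.normal(size=16 * 256) + np.tile(chirp, 16)
img = echo_profile(chirp, rx, 256)
print(img.shape)  # (16, 256)
```

In a full system, a stream of such images would be fed to a neural network that regresses the 3D positions of the nine upper-body joints; the correlation step here stands in for whatever signal processing the actual device performs.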

Said Cornell's Cheng Zhang, "By integrating cutting-edge AI into low-power, low-cost, and privacy-conscious acoustic sensing systems, we use less instrumentation on the body, which is more practical, and battery performance is significantly better for everyday use."

From Cornell Chronicle


Abstracts Copyright © 2023 SmithBucklin, Washington, D.C., USA

