A new wristband developed by researchers at Cornell University uses AI and inaudible sound waves to detect hand positions and interactions with objects.

Wristband uses echoes, AI to track hand positions for VR and more

(Photo: Cornell University)

Introducing the EchoWrist

The device, called EchoWrist, has potential applications in various fields, including virtual reality (VR) systems, smartphone control through hand gestures, and activity tracking for tasks like cooking. Its compact size allows it to fit onto a commercial smartwatch, and it can operate all day on a standard smartwatch battery.

EchoWrist is part of the latest wave of low-power body pose-tracking technology developed by the Smart Computer Interfaces for Future Interactions (SciFi) Lab at Cornell. 

Cheng Zhang, an assistant professor of information science at the Cornell Ann S. Bowers College of Computing and Information Science, leads the lab.

According to Zhang, the hand plays a vital role in almost every activity, making continuous hand pose tracking essential. EchoWrist offers an affordable and highly accurate solution to this need.

EchoWrist not only tracks hand movements but also enables users to control devices with gestures and deliver presentations more effectively.

The device employs two small speakers mounted on a wristband to emit inaudible soundwaves, which bounce off the hand and any held objects. Two microphones capture these echoes, which are then processed by a microcontroller. Despite its advanced capabilities, EchoWrist is powered by a battery smaller than a quarter.
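Active acoustic sensing of this kind typically emits short sweeps just above the human hearing range and measures how long the echoes take to return. The following sketch illustrates the general idea; the sample rate, sweep frequencies, and chirp duration here are illustrative assumptions, not values from the EchoWrist paper.

```python
import numpy as np

# Illustrative sketch: generate a short near-ultrasound chirp of the kind an
# active acoustic sensing wristband might emit. Frequencies, duration, and
# sample rate are assumed values for demonstration only.
SAMPLE_RATE = 50_000              # Hz; must exceed twice the top chirp frequency
F_START, F_END = 18_000, 21_000   # Hz, above the typical adult hearing range
DURATION = 0.012                  # seconds per chirp frame

def make_chirp(f0: float, f1: float, duration: float, sr: int) -> np.ndarray:
    """Linear frequency sweep from f0 to f1, Hann-windowed to suppress clicks."""
    t = np.arange(int(duration * sr)) / sr
    # Instantaneous phase of a linear chirp: 2*pi*(f0*t + (f1 - f0)/(2*duration)*t^2)
    phase = 2 * np.pi * (f0 * t + (f1 - f0) / (2 * duration) * t ** 2)
    return np.sin(phase) * np.hanning(t.size)

chirp = make_chirp(F_START, F_END, DURATION, SAMPLE_RATE)
# An echo delayed by `d` samples corresponds to a round trip of
# d / SAMPLE_RATE * 343 meters (speed of sound ~343 m/s), i.e. half that
# distance from wristband to hand and back.
```

Cross-correlating the microphone signal against this reference chirp would give the delay profile that such a system feeds into its model.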

The researchers developed a neural network model to interpret hand poses from the received echoes. To train it, they paired echo profiles with video of users performing various gestures, and the model learned to accurately reconstruct the positions of 20 hand joints.
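At inference time, a model of this kind maps an echo profile to 3-D coordinates for the 20 tracked joints. Here is a minimal, untrained sketch of that mapping using a small fully connected network; the layer sizes, echo-profile length, and random weights are placeholders, not the actual EchoWrist architecture.

```python
import numpy as np

# Minimal sketch of the inference step: a small fully connected network that
# maps a flattened echo profile to 3-D positions for 20 hand joints.
# ECHO_LEN and HIDDEN are assumed values; the weights are random (untrained)
# and serve only to show the input/output shapes involved.
rng = np.random.default_rng(0)

ECHO_LEN = 256   # samples per echo profile (assumed)
N_JOINTS = 20    # hand joints tracked, per the article
HIDDEN = 64      # hidden layer width (assumed)

W1 = rng.normal(0.0, 0.1, (ECHO_LEN, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(0.0, 0.1, (HIDDEN, N_JOINTS * 3))
b2 = np.zeros(N_JOINTS * 3)

def predict_joints(echo_profile: np.ndarray) -> np.ndarray:
    """Forward pass: echo profile -> (20, 3) array of joint coordinates."""
    h = np.maximum(0.0, echo_profile @ W1 + b1)   # ReLU hidden layer
    return (h @ W2 + b2).reshape(N_JOINTS, 3)

joints = predict_joints(rng.normal(size=ECHO_LEN))
```

In training, the video-derived joint positions would serve as the regression targets for outputs shaped like `joints` above.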

In tests involving 12 volunteers, EchoWrist identified objects and actions with 97.6% accuracy. According to the team, this capability could let interactive recipe apps track users' cooking progress and deliver instructions without requiring them to touch a screen.

The researchers note that EchoWrist offers significant advantages in terms of size and energy efficiency. Additionally, its acoustic tracking feature enhances user privacy while delivering performance comparable to that of camera-based systems.



Revolutionizing VR Applications

The technology could revolutionize VR applications by accurately reproducing hand movements without the need for bulky camera setups. It also has the potential to enhance AI's understanding of human activities by tracking and interpreting hand poses during daily tasks.

While EchoWrist currently faces challenges in distinguishing between similar-shaped objects, such as forks and spoons, the researchers are optimistic about improving its object recognition capabilities through further refinement.

Doctoral students Chi-Jung Lee and Ruidong Zhang, co-first authors of the study, will present their research at the Association for Computing Machinery CHI conference on Human Factors in Computing Systems (CHI '24).

"One of the most exciting applications this technology would enable is to allow AI to understand human activities by tracking and interpreting the hand poses in everyday activities," Cheng Zhang said in a statement.

The study findings were posted on the preprint server arXiv.




ⓒ 2024 TECHTIMES.com All rights reserved. Do not reproduce without permission.