Revolutionary Wearable Tech Translates Arm Gestures into Machine Commands, Even in High-Intensity Environments
November 18, 2025
UC San Diego researchers have developed a first-of-its-kind wearable human-machine interface that translates everyday arm gestures into machine commands, even during high-motion activities such as sprinting, driving, or working in turbulent ocean conditions.
By combining stretchable electronics with artificial intelligence, the system denoises its sensor data in real time, reliably recognizing gesture signals in real-world, high-motion environments, and it can learn from individual users and complex environments.
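The summary does not specify the denoising architecture, so the sketch below is only a rough illustration of the idea: a generic 1-D convolutional model trained to recover clean gesture windows from windows corrupted with synthetic motion disturbance. The channel count, window length, and noise model are all assumptions, not the authors' design.

```python
# Minimal sketch of learned real-time denoising for wearable sensor windows.
# This is one plausible approach, not the published architecture.
import torch
import torch.nn as nn

class ConvDenoiser(nn.Module):
    """Maps a noisy multi-channel sensor window to a cleaned window."""
    def __init__(self, channels: int = 8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(channels, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.Conv1d(32, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.Conv1d(32, channels, kernel_size=7, padding=3),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, samples). The residual connection lets the
        # network focus on predicting the noise component to subtract.
        return x - self.net(x)

# Training pairs: clean gesture windows plus synthetic motion disturbance.
model = ConvDenoiser(channels=8)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
clean = torch.randn(16, 8, 256)                 # placeholder clean windows
noisy = clean + 0.5 * torch.randn_like(clean)   # placeholder disturbance
loss = nn.functional.mse_loss(model(noisy), clean)
loss.backward()
opt.step()
```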
The work is a collaboration at UC San Diego between the labs of Sheng Xu and Joseph Wang, with Xiangjun Chen and Xianjun Chen as co-first authors and additional contributions from Zhiyuan Lou, Xiaoxiang Gao, and Lu Yin.
Potential beneficiaries include rehabilitation patients, people with mobility limitations, industrial workers, first responders, divers, and consumers seeking stable gesture-based controls.
The study detailing the findings appears in Nature Sensors, published November 17, 2025.
DARPA provided funding support for the project under contract HR001120C0093.
The project originated in an effort to help military divers control underwater robots, but it addresses a broader problem common to wearable interfaces: motion-induced interference across diverse environments.
Looking ahead, the team envisions gesture control in real-life applications ranging from medical rehabilitation to underwater robotics and consumer devices, expanding the everyday usability of gesture-based interfaces well beyond underwater or other specialized environments.
This work represents a new method for noise tolerance in wearable sensors, enabling reliable gesture-based controls for diverse users in daily life.
In testing, participants controlled a robotic arm while running, under high-frequency vibrations, and under combined disturbances, with additional validation in simulated ocean conditions at the Scripps Ocean-Atmosphere Research Simulator; the system maintained accuracy and low latency across all scenarios.
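As a hedged illustration of how accuracy and latency under such disturbances might be quantified, the sketch below benchmarks a classifier on a clean condition versus an added-vibration condition. The conditions, noise levels, and classifier are illustrative placeholders, not the study's protocol.

```python
# Toy benchmark: per-window accuracy and inference latency under disturbance.
import time
import numpy as np

def evaluate(classify, windows, labels):
    """Returns (accuracy, mean per-window latency in ms) for a classifier."""
    correct, latencies = 0, []
    for x, y in zip(windows, labels):
        t0 = time.perf_counter()
        pred = classify(x)
        latencies.append((time.perf_counter() - t0) * 1e3)
        correct += int(pred == y)
    return correct / len(labels), float(np.mean(latencies))

rng = np.random.default_rng(0)
clean = [rng.standard_normal((8, 256)) for _ in range(100)]
vibration = [w + 0.5 * rng.standard_normal(w.shape) for w in clean]
labels = rng.integers(0, 4, size=100).tolist()
dummy = lambda x: int(np.argmax(x.mean(axis=1))) % 4  # stand-in classifier
for name, data in [("clean", clean), ("vibration", vibration)]:
    acc, ms = evaluate(dummy, data, labels)
    print(f"{name}: accuracy={acc:.2f}, latency={ms:.2f} ms")
```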
A co-first author notes that the breakthrough removes motion noise as a limiting factor for wearable gesture sensing, enabling intuitive human-machine interfaces in dynamic settings.
Applications span medical rehabilitation, assistive devices, industrial and emergency response settings, and underwater robotics, with broader implications for consumer use.
The hardware is a soft electronic patch worn on a cloth armband, integrating stretchable electronics, motion and muscle sensors, a Bluetooth microcontroller, and a flexible battery. A deep-learning framework denoises the sensor signals in real time, interprets the wearer's gestures, and sends the corresponding commands to machines.
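The description implies a runtime loop of the form sample, denoise, classify, transmit. The sketch below shows that loop in minimal form; every name in it (read_window, denoise, classify, send_ble, GESTURE_TO_COMMAND, the window and channel sizes) is a hypothetical stand-in rather than the authors' API.

```python
# Hedged sketch of the control loop implied by the system description.
import numpy as np

WINDOW = 256      # samples per inference window (assumed)
CHANNELS = 8      # combined motion (IMU) + muscle (EMG) channels (assumed)

GESTURE_TO_COMMAND = {0: "ARM_UP", 1: "ARM_DOWN", 2: "GRIP", 3: "RELEASE"}

def read_window() -> np.ndarray:
    """Stand-in for the sensor driver: returns (CHANNELS, WINDOW) floats."""
    return np.random.randn(CHANNELS, WINDOW)

def denoise(x: np.ndarray) -> np.ndarray:
    """Stand-in for the learned real-time denoiser (see sketch above)."""
    return x  # identity placeholder

def classify(x: np.ndarray) -> int:
    """Stand-in for the gesture classifier; returns a gesture id."""
    return int(np.argmax(x.mean(axis=1))) % len(GESTURE_TO_COMMAND)

def send_ble(command: str) -> None:
    """Stand-in for the Bluetooth microcontroller link to the machine."""
    print(f"-> {command}")

def control_loop(steps: int = 5) -> None:
    for _ in range(steps):
        window = read_window()
        cleaned = denoise(window)
        gesture = classify(cleaned)
        send_ble(GESTURE_TO_COMMAND[gesture])

if __name__ == "__main__":
    control_loop()
```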
The system addresses a long-standing weakness of gesture-based wearables, their loss of accuracy during movement, by tolerating a broad range of motion disturbances.
Summary based on 4 sources
Sources

Newswise • Nov 18, 2025
Wearable Lets Users Control Machines and Robots While on the Move
Interesting Engineering • Nov 17, 2025
Engineers develop AI-powered wearable that turns everyday gestures into robot commands
Neuroscience News • Nov 17, 2025
Next-Gen Wearable Lets You Control Machines with Simple Gestures
UC San Diego
Wearable Lets Users Control Machines and Robots While on the Move