Ting, Natasha, and Yasser’s Paper in Nature Electronics. Congrats!!!
Paper title: A wearable biosensing system with in-sensor adaptive machine learning for hand gesture recognition
Abstract: Wearable devices that monitor muscle activity based on surface electromyography could be of use in the development of hand gesture recognition applications. Such devices typically use machine-learning models, either locally or externally, for gesture classification. However, most devices with local processing cannot offer training and updating of the machine-learning model during use, resulting in suboptimal performance under practical conditions. Here we report a wearable surface electromyography biosensing system that is based on a screen-printed, conformal electrode array and has in-sensor adaptive learning capabilities. Our system implements a neuro-inspired hyperdimensional computing algorithm locally for real-time gesture classification, as well as model training and updating under variable conditions such as different arm positions and sensor replacement. The system can classify 13 hand gestures with 97.12% accuracy for two participants when training with a single trial per gesture. A high accuracy (92.87%) is preserved on expanding to 21 gestures, and accuracy is recovered by 9.5% by implementing model updates in response to varying conditions, without additional computation on an external device.
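The classification approach described in the abstract, hyperdimensional (HD) computing with local training and model updating, can be illustrated with a minimal sketch. The code below is not the authors' implementation: the hypervector dimensionality, the bipolar item memories, the amplitude quantization, the channel count, and the random toy data are all illustrative assumptions; only the 13-gesture, single-trial-per-gesture training setup is taken from the abstract.

import numpy as np

D = 10000          # hypervector dimensionality (assumed)
N_CHANNELS = 64    # number of EMG channels (assumed)
N_LEVELS = 21      # amplitude quantization levels (assumed)
rng = np.random.default_rng(0)

# Item memory: a fixed random bipolar hypervector per channel and per amplitude level.
channel_hv = rng.choice([-1, 1], size=(N_CHANNELS, D))
level_hv = rng.choice([-1, 1], size=(N_LEVELS, D))

def encode(features):
    # features: length-N_CHANNELS array of channel activity, normalized to [0, 1].
    levels = np.clip((features * (N_LEVELS - 1)).astype(int), 0, N_LEVELS - 1)
    bound = channel_hv * level_hv[levels]   # bind each channel with its amplitude level
    s = bound.sum(axis=0)                   # bundle across channels
    return np.where(s >= 0, 1, -1)          # binarize back to a bipolar hypervector

class HDClassifier:
    def __init__(self):
        self.prototypes = {}                # gesture label -> accumulated class prototype

    def train(self, features, label):
        self.prototypes[label] = self.prototypes.get(label, np.zeros(D)) + encode(features)

    def classify(self, features):
        hv = encode(features)
        return max(self.prototypes, key=lambda lbl: np.dot(hv, self.prototypes[lbl])
                   / np.linalg.norm(self.prototypes[lbl]))

    def update(self, features, label):
        # Adaptive update: bundle a newly labeled trial into the stored prototype,
        # e.g. after an arm-position change, without retraining from scratch.
        self.train(features, label)

# Toy usage: one training trial per gesture, with random data standing in for EMG features.
clf = HDClassifier()
for label in range(13):
    clf.train(rng.random(N_CHANNELS), label)
print(clf.classify(rng.random(N_CHANNELS)))

In this sketch the class prototypes are kept as accumulators so that new trials can be folded in locally, and the same training routine doubles as the update path, which corresponds to the abstract's point that model updates happen without additional computation on an external device.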
Berkeley News also published an article about the project! The article can be found here.
Publication:
A wearable biosensing system with in-sensor adaptive machine learning for hand gesture recognition
Ali Moin, Andy Zhou, Abbas Rahimi, Alisha Menon, Simone Benatti, George Alexandrov, Senam Tamakloe, Jonathan Ting, Natasha Yamamoto, Yasser Khan, Fred Burghardt, Luca Benini, Ana C. Arias, and Jan M. Rabaey
Nature Electronics, 2020.
@article{Ting_NatureElectronics,
author = {Moin, Ali and Zhou, Andy and Rahimi, Abbas and Menon, Alisha and Benatti, Simone and Alexandrov, George and Tamakloe, Senam and Ting, Jonathan and Yamamoto, Natasha and Khan, Yasser and Burghardt, Fred and Benini, Luca and Arias, Ana C. and Rabaey, Jan M.},
title = {A wearable biosensing system with in-sensor adaptive machine learning for hand gesture recognition},
year = {2020},
doi = {10.1038/s41928-020-00510-8},
publisher = {Springer Nature},
url = {https://www.nature.com/articles/s41928-020-00510-8},
journal = {Nature Electronics},
thumbnail = {Ting2021emg.PNG},
pdf = {Ting2021emg.pdf}
}