PubMed for Handhelds
Title: A Wearable System for Recognizing American Sign Language in Real-Time Using IMU and Surface EMG Sensors.
Author: Wu J, Sun L, Jafari R.
Journal: IEEE J Biomed Health Inform; 2016 Sep; 20(5):1281-1290.
PubMed ID: 27576269.
Abstract: A sign language recognition system translates signs performed by deaf individuals into text/speech in real time. Inertial measurement units (IMUs) and surface electromyography (sEMG) sensors are both useful modalities for detecting hand/arm gestures: they are able to capture signs, and the fusion of these two complementary sensor modalities can enhance system performance. In this paper, a wearable system for recognizing American Sign Language (ASL) in real time is proposed, fusing information from an inertial sensor and sEMG sensors. An information gain-based feature selection scheme is used to select the best subset of features from a broad range of well-established features. Four popular classification algorithms are evaluated for 80 commonly used ASL signs on four subjects. The experimental results show 96.16% and 85.24% average accuracies for intra-subject and intra-subject cross-session evaluation, respectively, with the selected feature subset and a support vector machine classifier. The significance of adding sEMG for ASL recognition is explored and the best sEMG channel is highlighted.
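
For readers who want a concrete picture of the pipeline the abstract describes, the sketch below ranks fused IMU/sEMG feature vectors with a mutual-information score (an information-gain-style criterion), keeps the top-scoring subset, and classifies the 80 sign classes with a support vector machine. The data shapes, feature counts, and parameters are illustrative assumptions built on scikit-learn; this is not the authors' implementation or feature set.

# Minimal sketch, assuming synthetic stand-in data: information-gain-style
# feature selection followed by an SVM classifier, mirroring the steps named
# in the abstract. All sizes and hyperparameters are hypothetical.
import numpy as np
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_signs, reps, n_features = 80, 5, 200                 # 80 ASL signs, 5 repetitions each
X = rng.normal(size=(n_signs * reps, n_features))      # fused IMU + sEMG features (synthetic)
y = np.repeat(np.arange(n_signs), reps)                # one class label per sign

clf = make_pipeline(
    StandardScaler(),                                  # normalize each feature
    SelectKBest(mutual_info_classif, k=50),            # keep the best-scoring feature subset
    SVC(kernel="rbf", C=10.0),                         # support vector machine classifier
)
scores = cross_val_score(clf, X, y, cv=5)              # intra-subject-style cross validation
print(f"mean cross-validation accuracy: {scores.mean():.3f}")

On real recordings the feature matrix X would hold time- and frequency-domain descriptors extracted from the inertial and sEMG channels per sign segment, and cross-session evaluation would train and test on recordings from different sessions rather than random folds.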