Pubmed for Handhelds
Title: Mobile Spatiotemporal Gait Segmentation Using an Ear-Worn Motion Sensor and Deep Learning.
Author: Decker J, Boborzi L, Schniepp R, Jahn K, Wuehr M.
Journal: Sensors (Basel); 2024 Oct 04; 24(19):.
PubMed ID: 39409482.

Abstract: Mobile health technologies enable continuous, quantitative assessment of mobility and gait in real-world environments, facilitating early diagnosis of gait disorders, monitoring of disease progression, and prediction of adverse events such as falls. Traditionally, mobile gait assessment has relied predominantly on body-fixed sensors positioned at the feet or lower trunk. Here, we investigate the potential of an algorithm utilizing an ear-worn motion sensor for spatiotemporal segmentation of gait patterns. We collected 3D acceleration profiles from the ear-worn sensor during walking at varied speeds in 53 healthy adults. Temporal convolutional networks were trained to detect stepping sequences and predict spatial relations between steps. The resulting algorithm, mEar, accurately detects initial and final ground contacts (F1 scores of 99% and 91%, respectively). It enables the determination of temporal and spatial gait cycle characteristics (among others, stride time and stride length) with good to excellent validity, at a precision sufficient to monitor clinically relevant changes in walking speed, stride-to-stride variability, and side asymmetry. This study highlights the ear as a viable site for monitoring gait and proposes its potential integration with in-ear vital-sign monitoring. Such integration offers a practical approach to comprehensive health monitoring and telemedical applications by combining multiple sensors in a single anatomical location.
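To illustrate the kind of pipeline the abstract describes (acceleration trace → dilated-convolution feature extraction → ground-contact detection → stride timing), here is a minimal, self-contained sketch. It is not the published mEar model: the synthetic signal, the tiny untrained TCN-style smoothing stack, the threshold rule, and all parameter values are illustrative assumptions.

```python
import numpy as np

# --- Synthetic vertical acceleration with impact spikes at each ground contact ---
fs = 100                      # sample rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)  # 10 s of data
stride_time = 1.1             # seconds per stride (synthetic ground truth)
rng = np.random.default_rng(0)
signal = 0.05 * rng.standard_normal(t.size)          # sensor noise
contacts_true = np.arange(0.5, 10, stride_time / 2)  # steps from both feet
for c in contacts_true:
    idx = int(c * fs)
    signal[idx:idx + 5] += np.hanning(5) * 2.0       # impact transient

def dilated_smooth(x, dilation):
    """Causal 3-tap averaging with the given dilation (a TCN building block)."""
    pad = 2 * dilation
    xp = np.concatenate([np.zeros(pad), x])
    return (xp[pad:] + xp[dilation:dilation + x.size] + xp[:x.size]) / 3

# Stacking dilated layers widens the receptive field exponentially,
# mimicking how a temporal convolutional network sees longer context.
features = np.abs(signal)
for d in (1, 2, 4):
    features = dilated_smooth(features, d)

# Threshold with a refractory period -> detected contact times (seconds)
threshold = 0.5 * features.max()
detected, last = [], -np.inf
for i, v in enumerate(features):
    if v > threshold and i - last > 0.3 * fs:
        detected.append(i / fs)
        last = i

step_times = np.diff(detected)
print(f"detected {len(detected)} contacts, mean step time {step_times.mean():.2f} s")
```

In the real system, the fixed averaging kernels would be replaced by learned convolution weights trained to emit initial- and final-contact probabilities, and spatial quantities such as stride length would be regressed by a second network head.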