

PUBMED FOR HANDHELDS



  • Title: Human Pose Estimation for Clinical Analysis of Gait Pathologies.
    Author: Ali MM, Medhat Hassan M, Zaki M.
    Journal: Bioinform Biol Insights; 2024; 18():11779322241231108. PubMed ID: 38757143.
    Abstract:
    Gait analysis serves as a critical diagnostic tool for identifying neurologic and musculoskeletal damage. Traditional manual analysis of motion data, however, is labor-intensive and heavily reliant on the expertise and judgment of the therapist. This study introduces a binary classification method for the quantitative assessment of gait impairments, specifically focusing on Duchenne muscular dystrophy (DMD), a prevalent and fatal neuromuscular genetic disorder. The research compares spatiotemporal and sagittal kinematic gait features derived from 2D and 3D human pose estimation trajectories against concurrently recorded 3D motion capture (MoCap) data from healthy children. The proposed model leverages a novel benchmark dataset, collected from YouTube videos of DMD patients and publicly available datasets of their typically developed peers, to extract time-distance variables (e.g., speed, step length, stride time, and cadence) and sagittal joint angles of the lower extremity (e.g., hip, knee, and ankle flexion angles). Machine learning and deep learning techniques are employed to discern patterns that can identify children exhibiting DMD gait disturbances. While the current model is capable of distinguishing between healthy subjects and those with DMD, it does not specifically differentiate between DMD patients and patients with other gait impairments. Experimental results validate the efficacy of this cost-effective method, which relies on recorded RGB video, in detecting gait abnormalities, achieving a prediction accuracy of 96.2% for the Support Vector Machine (SVM) and 97% for the deep network.
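The pipeline the abstract describes (pose keypoints → sagittal joint angles and time-distance features → binary SVM classifier) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature means, spreads, and synthetic data below are invented for demonstration, and the angle helper assumes 2D sagittal-plane keypoints such as hip-knee-ankle.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by 2D keypoints a-b-c,
    e.g. hip-knee-ankle for a sagittal knee flexion angle."""
    v1 = np.asarray(a, float) - np.asarray(b, float)
    v2 = np.asarray(c, float) - np.asarray(b, float)
    cosang = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# Synthetic stand-ins for the paper's per-trial feature vectors:
# [speed, step length, stride time, cadence, hip/knee/ankle peak angles].
# All numbers here are illustrative, not from the study.
rng = np.random.default_rng(0)
n = 200
healthy = rng.normal([1.2, 0.55, 1.0, 115, 35, 60, 20], 0.05, size=(n, 7))
dmd     = rng.normal([0.8, 0.40, 1.4,  90, 25, 45, 12], 0.05, size=(n, 7))
X = np.vstack([healthy, dmd])
y = np.array([0] * n + [1] * n)  # 0 = typically developed, 1 = DMD gait

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# Standardize features, then fit an RBF-kernel SVM, as in the abstract's
# SVM baseline (kernel choice here is an assumption).
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)

print(f"knee angle example: {joint_angle((0, 0), (0, 1), (1, 1)):.0f} deg")
print(f"held-out accuracy: {acc:.3f}")
```

Because the synthetic classes are well separated, the sketch reaches near-perfect held-out accuracy; on real pose-estimation trajectories the features are far noisier, which is where the reported 96.2% SVM figure applies.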