These tools will no longer be maintained as of December 31, 2024. Archived website can be found here. PubMed4Hh GitHub repository can be found here. Contact NLM Customer Service if you have questions.


PUBMED FOR HANDHELDS



  • Title: Assessment of automated scoring of polysomnographic recordings in a population with suspected sleep-disordered breathing.
    Author: Pittman SD, MacDonald MM, Fogel RB, Malhotra A, Todros K, Levy B, Geva AB, White DP.
    Journal: Sleep; 2004 Nov 01; 27(7):1394-403. PubMed ID: 15586793.
    Abstract:
    STUDY OBJECTIVES: To assess the accuracy of an automated system (Morpheus I Sleep Scoring System) for analyzing and quantifying polysomnographic data from a population with sleep-disordered breathing.
    SETTING: Sleep laboratory affiliated with a tertiary care academic medical center.
    MEASUREMENTS AND RESULTS: Thirty-one diagnostic polysomnograms were prospectively and blindly analyzed with the investigational automated system and manually by two registered polysomnography technologists (M1 and M2) from the same laboratory. Sleep stages, arousals, periodic limb movements, and respiratory events (apneas and hypopneas) were scored by all three. Agreement, Cohen kappa, and intraclass correlation coefficients were tabulated for each variable and compared between scoring pairs (A-M1, A-M2, M1-M2). In total, 26,876 epochs (224 hours of recording time) were analyzed. For sleep staging, agreement/kappa were A-M1: 78%/0.67, A-M2: 73%/0.61, and M1-M2: 82%/0.73. The mean respiratory disturbance indexes were M1: 20.6 ± 23.0, M2: 22.5 ± 24.5, and A: 23.7 ± 23.4 events per hour of sleep. The respiratory disturbance index concordance between each scoring pair was excellent (intraclass correlation coefficients ≥ 0.95 for all pairs), although there was disagreement in the classification of moderate sleep-disordered breathing (percentage of positive agreement: A-M1, 37.5% and A-M2, 44.4%), defined as a respiratory disturbance index between 15 and 30 events per hour of sleep. For respiratory-event detection, agreement/kappa were A-M1 and A-M2: 90%/0.66 and M1-M2: 95%/0.82. The agreement and kappa for limb-movement detection were A-M1: 93%/0.68, A-M2: 92%/0.66, and M1-M2: 96%/0.77. The scoring of arousals was less reliable (agreement range: 76%-84%, kappa range: 0.28-0.57) for all pairs.
    CONCLUSIONS: Agreement between manual scorers in a population with moderate sleep-disordered breathing was close to the average pairwise agreement of 87% reported in the Sleep Heart Health Study. The automated classification of sleep stages was also close to this standard. The automated scoring system holds promise as a rapid method to score polysomnographic records, but expert verification of the automated scoring is required.
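    The epoch-by-epoch agreement figures quoted above (e.g., 78%/0.67 for A-M1 staging) pair raw percent agreement with Cohen's kappa, which corrects for agreement expected by chance given each scorer's stage distribution. A minimal sketch of that calculation, using short hypothetical stage sequences (the labels and values below are illustrative, not data from the study):

    ```python
    from collections import Counter

    def cohen_kappa(a, b):
        """Chance-corrected agreement between two raters over the same epochs."""
        assert len(a) == len(b) and len(a) > 0
        n = len(a)
        # observed agreement: fraction of epochs scored identically
        p_o = sum(x == y for x, y in zip(a, b)) / n
        # expected agreement if raters labelled independently at their marginal rates
        ca, cb = Counter(a), Counter(b)
        p_e = sum(ca[k] * cb[k] for k in set(a) | set(b)) / (n * n)
        return (p_o - p_e) / (1 - p_e)

    # toy 30-second-epoch stage labels from two hypothetical scorers
    m1 = ["W", "N2", "N2", "REM", "N2", "W", "N2", "REM"]
    m2 = ["W", "N2", "N1", "REM", "N2", "W", "N2", "W"]
    ```

    Here the two scorers agree on 6 of 8 epochs (75% raw agreement), but the kappa is lower because some of that agreement would occur by chance alone, which is why the abstract reports both numbers.
    
    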