These tools will no longer be maintained as of December 31, 2024. Contact NLM Customer Service if you have questions.


PUBMED FOR HANDHELDS

Search MEDLINE/PubMed


  • Title: Spine adverse events severity system: content validation and interobserver reliability assessment.
    Author: Rampersaud YR, Neary MA, White K.
    Journal: Spine (Phila Pa 1976); 2010 Apr 01; 35(7):790-5. PubMed ID: 20195203.
    Abstract:
    STUDY DESIGN: A prospective validation study; preliminary single-center report.
    OBJECTIVE: To assess the content validity and interobserver reliability of a simple severity classification system for adverse events (AEs) associated with spinal surgery.
    SUMMARY OF BACKGROUND DATA: In the surgical literature, what is defined as an AE, the severity of an AE, and the reporting of AEs are all variable. Consequently, valid comparison of AEs within or among specialties or surgical centers for the same or different procedures is often impossible.
    METHODS: Since 2002, a Spine Adverse Events Severity system (SAVES) has been developed locally and used prospectively. AEs were graded as I (requires none/minimal treatment, minimal effect [<1-2 days] on length of stay [LOS]), II (requires treatment and/or increases LOS [3-7 days] with no long-term sequelae), III (requires treatment and/or increases LOS [>7 days] with long-term sequelae [>6 months]), and IV (death). Content validity of the grading system was assessed by comparing hospital chart abstraction (the current de facto gold standard) with the SAVES forms from 200 randomly selected patients. Interobserver reliability was assessed in consecutive operative cases for 1 spine surgeon during a 1-year period (2006) using 3 raters (staff surgeon, fellow, and/or resident).
    RESULTS: The prospectively administered form captured a higher number of surgical AEs (n = 43 vs. n = 30) and a similar number of medical AEs (n = 31 vs. n = 27). Compared with the chart, the AE form displayed substantial agreement for number (70%; weighted kappa [wK] = 0.60) and type (75%; wK = 0.67) of AE. Interobserver reliability was near perfect (kappa = 0.8) for the actual grade of AE and moderate (kappa = 0.5) for the criteria behind the grading (i.e., the clinical effect of the AE, the effect of the AE on LOS, or both).
    CONCLUSION: The results of this study demonstrate improved capture of surgical AEs using SAVES. Excellent interobserver reliability between surgeons at different levels of training was demonstrated with minimal education or training regarding the use of SAVES.