  • Title: Correlation between housestaff performance on the United States Medical Licensing Examination and standardized patient encounters.
    Author: Rifkin WD, Rifkin A.
    Journal: Mt Sinai J Med. 2005 Jan;72(1):47-9. PubMed ID: 15682263.
    Abstract:
    BACKGROUND: There is interest in the use of "standardized patients" to assist in evaluating medical trainees' clinical skills, which may be difficult to evaluate with written exams alone. Previous studies of the validity of observed structured clinical exams have found low correlation with various written exams as well as with faculty evaluations. Since United States Medical Licensing Examination (USMLE) results are often used by training programs in the selection of applicants, we assessed the correlation between performance on an observed structured clinical exam and USMLE Steps 1 and 2 for internal medicine housestaff.
    METHODS: We collected scores on USMLE Steps 1 and 2, along with the overall score from a required standardized patient encounter, for all PGY-1 trainees at a single urban teaching hospital. Pearson correlation coefficients were used to compare USMLE and observed structured clinical exam performance.
    RESULTS: The two steps of the USMLE correlated strongly with each other (r=0.65, df=30, p=0.0001). However, both steps correlated poorly with the observed structured clinical exam (Step 1: r=0.2, df=32, p=0.27; Step 2: r=0.09, df=30, p=0.61).
    CONCLUSIONS: The low correlation between the USMLE and performance on a structured clinical exam suggests that the written exam is a poor predictor of actual clinical performance, that the narrow range of clinical skills measured by the structured clinical exam is inadequate, or that the two methods evaluate entirely different skill sets. Our findings are consistent with previous work finding low correlations between structured clinical exams and accepted means of evaluation such as faculty evaluations, other written exams, and program director assessments. The medical education community needs to develop an objective, valid method of measuring important yet subjective skill sets such as interpersonal communication, empathy, and efficient data collection.
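    The p-values in the RESULTS section follow from the standard t transformation of a Pearson correlation, t = r * sqrt(df / (1 - r^2)) with df = n - 2. Below is a minimal sketch in Python (assuming SciPy is available; it is not part of the original study) that reproduces the reported significance tests from the r and df values above. Small differences from the published p-values reflect rounding of r in the abstract.

        from math import sqrt
        from scipy import stats

        def pearson_p(r, df):
            """Two-sided p-value for a Pearson correlation r with df = n - 2,
            via the t transformation t = r * sqrt(df / (1 - r**2))."""
            t = r * sqrt(df / (1.0 - r ** 2))
            return 2.0 * stats.t.sf(abs(t), df)

        # (r, df) pairs as reported in the abstract's RESULTS section.
        for label, r, df in [("Step 1 vs Step 2", 0.65, 30),
                             ("Step 1 vs clinical exam", 0.20, 32),
                             ("Step 2 vs clinical exam", 0.09, 30)]:
            print(f"{label}: r={r:.2f}, df={df}, p={pearson_p(r, df):.4f}")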