PUBMED FOR HANDHELDS

  • Title: Novel examination for evaluating medical student clinical reasoning: reliability and association with patients seen.
    Author: Hemmer PA, Dong T, Durning SJ, Pangaro LN.
    Journal: Mil Med; 2015 Apr; 180(4 Suppl):79-87. PubMed ID: 25850132.
    Abstract:
    BACKGROUND: Medical students learn clinical reasoning, in part, through patient care. Although the number of patients seen is associated with knowledge examination scores, studies have not demonstrated an association between patient problems and an assessment of clinical reasoning.
    AIM: To examine the reliability of a clinical reasoning examination and to investigate whether internal medicine core clerkship students' performance on this examination was associated with the number of patients they saw with matching problems during their internal medicine clerkship.
    METHODS: Students on the core internal medicine clerkship at the Uniformed Services University log 11 core patient problems based on the Clerkship Directors in Internal Medicine curriculum. On a final clerkship examination (Multistep), students watch a scripted video encounter between physician and patient actors that assesses three sequential steps in clinical reasoning: in Step One, students focus on the history and physical examination; in Step Two, they write a problem list after viewing additional clinical findings; in Step Three, they complete a prioritized differential diagnosis and treatment plan. Each Multistep examination has three different cases. For graduating classes 2010-2012 (n = 497), we matched the number of patients seen with the problem most represented by the Multistep cases (epigastric pain, generalized edema, monoarticular arthritis, angina, syncope, pleuritic chest pain). We report two-way Pearson correlations between the number of patients students reported with similar problems and each student's percent score on Step One, Step Two, Step Three, and the overall test.
    RESULTS: Multistep reliability: Step One, 0.6 to 0.8; Step Two, 0.41 to 0.65; Step Three, 0.53 to 0.78; overall examination (3 cases), 0.74 to 0.83. For three problems, the number of patients seen had small to modest correlations with the Multistep Examination of Analytic Ability total score (r = 0.27 for pleuritic pain, p < 0.05, n = 81 patients; r = 0.14 for epigastric pain, p < 0.05, n = 324 patients; r = 0.19 for generalized edema, p < 0.05, n = 118 patients).
    DISCUSSION/CONCLUSION: Although the examination was reliable, student performance on this clinical reasoning examination was only weakly associated with the number of patients seen with similar problems. This may be a result of transfer of knowledge between clinical and examination settings, the complexity of clinical reasoning, or the limits of the reliability of patient logs and the Multistep.
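The per-problem analysis described in the METHODS section is a Pearson correlation between two paired, per-student vectors: the count of logged patients matching a core problem and the percent score on the corresponding part of the Multistep. A minimal sketch of that computation in Python, using scipy.stats.pearsonr and entirely hypothetical placeholder values (neither the variable names nor the numbers come from the study), might look like this:

    # Illustrative only: correlate, per student, the number of logged patients with
    # one core problem against that student's percent score on the Multistep.
    # All values below are placeholders, not data from the study.
    from scipy.stats import pearsonr

    patients_logged = [0, 1, 1, 2, 3, 1, 0, 2, 4, 1]              # hypothetical patient counts
    overall_pct_score = [62, 70, 68, 75, 81, 66, 60, 73, 85, 69]  # hypothetical % scores

    r, p_value = pearsonr(patients_logged, overall_pct_score)
    print(f"Pearson r = {r:.2f}, p = {p_value:.3f}")

In the study itself, the same calculation would be repeated for each matched problem and for each step of the examination as well as the overall test score.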