BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

168 related articles for article (PubMed ID: 30458741)

  • 1. Borderline grades in high stakes clinical examinations: resolving examiner uncertainty.
    Shulruf B; Adelstein BA; Damodaran A; Harris P; Kennedy S; O'Sullivan A; Taylor S
    BMC Med Educ; 2018 Nov; 18(1):272. PubMed ID: 30458741

  • 2. Predictive validity of a tool to resolve borderline grades in OSCEs.
    Klein Nulend R; Harris P; Shulruf B
    GMS J Med Educ; 2020; 37(3):Doc31. PubMed ID: 32566733

  • 3. Enhancing the defensibility of examiners' marks in high stake OSCEs.
    Shulruf B; Damodaran A; Jones P; Kennedy S; Mangos G; O'Sullivan AJ; Rhee J; Taylor S; Velan G; Harris P
    BMC Med Educ; 2018 Jan; 18(1):10. PubMed ID: 29304806

  • 4. Sources of variation in performance on a shared OSCE station across four UK medical schools.
    Chesser A; Cameron H; Evans P; Cleland J; Boursicot K; Mires G
    Med Educ; 2009 Jun; 43(6):526-32. PubMed ID: 19493176

  • 5. Pass/fail decisions and standards: the impact of differential examiner stringency on OSCE outcomes.
    Homer M
    Adv Health Sci Educ Theory Pract; 2022 May; 27(2):457-473. PubMed ID: 35230590

  • 6. Is the assumption of equal distances between global assessment categories used in borderline regression valid?
    McGown PJ; Brown CA; Sebastian A; Le R; Amin A; Greenland A; Sam AH
    BMC Med Educ; 2022 Oct; 22(1):708. PubMed ID: 36199083

  • 7. Developing a video-based method to compare and adjust examiner effects in fully nested OSCEs.
    Yeates P; Cope N; Hawarden A; Bradshaw H; McCray G; Homer M
    Med Educ; 2019 Mar; 53(3):250-263. PubMed ID: 30575092

  • 8. Towards a more nuanced conceptualisation of differential examiner stringency in OSCEs.
    Homer M
    Adv Health Sci Educ Theory Pract; 2024 Jul; 29(3):919-934. PubMed ID: 37843678

  • 9. Factor analysis can be a useful standard setting tool in a high stakes OSCE assessment.
    Chesser AM; Laing MR; Miedzybrodzka ZH; Brittenden J; Heys SD
    Med Educ; 2004 Aug; 38(8):825-31. PubMed ID: 15271042

  • 10. Standard Setting Methods for Pass/Fail Decisions on High-Stakes Objective Structured Clinical Examinations: A Validity Study.
    Yousuf N; Violato C; Zuberi RW
    Teach Learn Med; 2015; 27(3):280-91. PubMed ID: 26158330

  • 11. Who will pass the dental OSCE? Comparison of the Angoff and the borderline regression standard setting methods.
    Schoonheim-Klein M; Muijtjens A; Habets L; Manogue M; van der Vleuten C; van der Velden U
    Eur J Dent Educ; 2009 Aug; 13(3):162-71. PubMed ID: 19630935

  • 12. Order effects in high stakes undergraduate examinations: an analysis of 5 years of administrative data in one UK medical school.
    Burt J; Abel G; Barclay M; Evans R; Benson J; Gurnell M
    BMJ Open; 2016 Oct; 6(10):e012541. PubMed ID: 27729351

  • 13. Comparison of two methods of standard setting: the performance of the three-level Angoff method.
    Jalili M; Hejri SM; Norcini JJ
    Med Educ; 2011 Dec; 45(12):1199-208. PubMed ID: 22122428

  • 14. An empirical study of the predictive validity of number grades in medical school using 3 decades of longitudinal data: implications for a grading system.
    Gonnella JS; Erdmann JB; Hojat M
    Med Educ; 2004 Apr; 38(4):425-34. PubMed ID: 15025644

  • 15. The consistency and uncertainty in examiners' definitions of pass/fail performance on OSCE (objective structured clinical examination) stations.
    Rothman AI; Blackmore D; Cohen R; Reznick R
    Eval Health Prof; 1996 Mar; 19(1):118-24. PubMed ID: 10186899

  • 16. Is There Variability in Scoring of Student Surgical OSCE Performance Based on Examiner Experience and Expertise?
    Donohoe CL; Reilly F; Donnelly S; Cahill RA
    J Surg Educ; 2020; 77(5):1202-1210. PubMed ID: 32336628

  • 17. The Objective Borderline method (OBM): a probability-based model for setting up an objective pass/fail cut-off score in medical programme assessments.
    Shulruf B; Turner R; Poole P; Wilkinson T
    Adv Health Sci Educ Theory Pract; 2013 May; 18(2):231-44. PubMed ID: 22484963

  • 18. The practical value of the standard error of measurement in borderline pass/fail decisions.
    Hays R; Gupta TS; Veitch J
    Med Educ; 2008 Aug; 42(8):810-5. PubMed ID: 18564094

  • 19. Achieving acceptable reliability in oral examinations: an analysis of the Royal College of General Practitioners membership examination's oral component.
    Wass V; Wakeford R; Neighbour R; Van der Vleuten C
    Med Educ; 2003 Feb; 37(2):126-31. PubMed ID: 12558883

  • 20. Standard setting in an objective structured clinical examination: use of global ratings of borderline performance to determine the passing score.
    Wilkinson TJ; Newble DI; Frampton CM
    Med Educ; 2001 Nov; 35(11):1043-9. PubMed ID: 11703640
