


321 related articles for article (PubMed ID: 23585065)

  • 1. Clinicians are right not to like Cohen's κ.
    de Vet HC; Mokkink LB; Terwee CB; Hoekstra OS; Knol DL
    BMJ; 2013 Apr; 346():f2125. PubMed ID: 23585065

  • 2. Interobserver agreement: Cohen's kappa coefficient does not necessarily reflect the percentage of patients with congruent classifications.
    Steinijans VW; Diletti E; Bömches B; Greis C; Solleder P
    Int J Clin Pharmacol Ther; 1997 Mar; 35(3):93-5. PubMed ID: 9088995

  • 3. [Analyzing interrater agreement for categorical data using Cohen's kappa and alternative coefficients] [Article in German].
    Wirtz M; Kutschmann M
    Rehabilitation (Stuttg); 2007 Dec; 46(6):370-7. PubMed ID: 18188809

  • 4. [Quality criteria of assessment scales--Cohen's kappa as measure of interrater reliability (1)] [Article in German].
    Mayer H; Nonn C; Osterbrink J; Evers GC
    Pflege; 2004 Feb; 17(1):36-46. PubMed ID: 15040245

  • 5. Pitfalls in the use of kappa when interpreting agreement between multiple raters in reliability studies.
    O'Leary S; Lund M; Ytre-Hauge TJ; Holm SR; Naess K; Dalland LN; McPhail SM
    Physiotherapy; 2014 Mar; 100(1):27-35. PubMed ID: 24262334

  • 6. Reproducibility of the implant crown aesthetic index--rating aesthetics of single-implant crowns and adjacent soft tissues with regard to observer dental specialization.
    Gehrke P; Degidi M; Lulay-Saad Z; Dhom G
    Clin Implant Dent Relat Res; 2009 Sep; 11(3):201-13. PubMed ID: 18657148

  • 7. Weighted specific-category kappa measure of interobserver agreement.
    Kvålseth TO
    Psychol Rep; 2003 Dec; 93(3 Pt 2):1283-90. PubMed ID: 14765602

  • 8. Chance-corrected measures for 2 × 2 tables that coincide with weighted kappa.
    Warrens MJ
    Br J Math Stat Psychol; 2011 May; 64(Pt 2):355-65. PubMed ID: 21492138

  • 9. Kappa-like indices of observer agreement viewed from a latent class perspective.
    Guggenmoos-Holzmann I; Vonk R
    Stat Med; 1998 Apr; 17(8):797-812. PubMed ID: 9595612

  • 10. Interrater and intrarater agreement of the Chicago Classification of achalasia subtypes using high-resolution esophageal manometry.
    Hernandez JC; Ratuapli SK; Burdick GE; Dibaise JK; Crowell MD
    Am J Gastroenterol; 2012 Feb; 107(2):207-14. PubMed ID: 22008895

  • 11. Inter- and intraobserver agreement on the Load Sharing Classification of thoracolumbar spine fractures.
    Elzinga M; Segers M; Siebenga J; Heilbron E; de Lange-de Klerk ES; Bakker F
    Injury; 2012 Apr; 43(4):416-22. PubMed ID: 21645896

  • 12. Inter-observer reproducibility of 15 tests used for predicting difficult intubation.
    Adamus M; Jor O; Vavreckova T; Hrabalek L; Zapletalova J; Gabrhelik T; Tomaskova H; Janout V
    Biomed Pap Med Fac Univ Palacky Olomouc Czech Repub; 2011 Sep; 155(3):275-81. PubMed ID: 22286814

  • 13. MR reproducibility in the assessment of uterine fibroids for patients scheduled for uterine artery embolization.
    Volkers NA; Hehenkamp WJ; Spijkerboer AM; Moolhuijzen AD; Birnie E; Ankum WM; Reekers JA
    Cardiovasc Intervent Radiol; 2008; 31(2):260-8. PubMed ID: 18057985

  • 14. Midwives' visual interpretation of intrapartum cardiotocographs: intra- and inter-observer agreement.
    Devane D; Lalor J
    J Adv Nurs; 2005 Oct; 52(2):133-41. PubMed ID: 16164474

  • 15. [VII: Diagnostic trials: Simple measures of validity and reliability] [Article in German].
    Krummenauer F
    Klin Monbl Augenheilkd; 2003 Apr; 220(4):281-3. PubMed ID: 12695973

  • 16. Learning how to differ: agreement and reliability statistics in psychiatry.
    Streiner DL
    Can J Psychiatry; 1995 Mar; 40(2):60-6. PubMed ID: 7788619

  • 17. The disc damage likelihood scale (DDLS): interobserver agreement of a new grading system to assess glaucomatous optic disc damage.
    Bochmann F; Howell JP; Meier C; Becht C; Thiel MA
    Klin Monbl Augenheilkd; 2009 Apr; 226(4):280-3. PubMed ID: 19384783

  • 18. Beyond kappa: an informational index for diagnostic agreement in dichotomous and multivalue ordered-categorical ratings.
    Casagrande A; Fabris F; Girometti R
    Med Biol Eng Comput; 2020 Dec; 58(12):3089-3099. PubMed ID: 33145661

  • 19. The Kappa Paradox Explained.
    Derksen BM; Bruinsma W; Goslings JC; Schep NWL
    J Hand Surg Am; 2024 May; 49(5):482-485. PubMed ID: 38372689

  • 20. Reliability of clinical findings in temporomandibular disorders.
    de Wijer A; Lobbezoo-Scholte AM; Steenks MH; Bosman F
    J Orofac Pain; 1995; 9(2):181-91. PubMed ID: 7488988
