These tools will no longer be maintained as of December 31, 2024. The archived website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors

- a resource for Precision Medicine

107 related articles for article (PubMed ID: 28420347)

  • 1. Kappa statistic to measure agreement beyond chance in free-response assessments.
    Carpentier M; Combescure C; Merlini L; Perneger TV
    BMC Med Res Methodol; 2017 Apr; 17(1):62. PubMed ID: 28420347

  • 2. Accounting for overlap? An application of Mezzich's kappa statistic to test interrater reliability of interview data on parental accident and emergency attendance.
    Eccleston P; Werneke U; Armon K; Stephenson T; MacFaul R
    J Adv Nurs; 2001 Mar; 33(6):784-90. PubMed ID: 11298216

  • 3. Interrater reliability: the kappa statistic.
    McHugh ML
    Biochem Med (Zagreb); 2012; 22(3):276-82. PubMed ID: 23092060

  • 4. Interobserver agreement: Cohen's kappa coefficient does not necessarily reflect the percentage of patients with congruent classifications.
    Steinijans VW; Diletti E; Bömches B; Greis C; Solleder P
    Int J Clin Pharmacol Ther; 1997 Mar; 35(3):93-5. PubMed ID: 9088995

  • 5. The prediction of pouch of Douglas obliteration using offline analysis of the transvaginal ultrasound 'sliding sign' technique: inter- and intra-observer reproducibility.
    Reid S; Lu C; Casikar I; Mein B; Magotti R; Ludlow J; Benzie R; Condous G
    Hum Reprod; 2013 May; 28(5):1237-46. PubMed ID: 23482338

  • 6. Observer reliability of arteriovenous malformations grading scales using current imaging modalities.
    Griessenauer CJ; Miller JH; Agee BS; Fisher WS; Curé JK; Chapman PR; Foreman PM; Fisher WA; Witcher AC; Walters BC
    J Neurosurg; 2014 May; 120(5):1179-87. PubMed ID: 24628617

  • 7. [Quality criteria of assessment scales--Cohen's kappa as a measure of interrater reliability (1)].
    Mayer H; Nonn C; Osterbrink J; Evers GC
    Pflege; 2004 Feb; 17(1):36-46. PubMed ID: 15040245

  • 8. Interrater agreement statistics with skewed data: evaluation of alternatives to Cohen's kappa.
    Xu S; Lorber MF
    J Consult Clin Psychol; 2014 Dec; 82(6):1219-27. PubMed ID: 25090041

  • 9. [A new method for agreement evaluation based on the AC1 statistic].
    Zhang JW; Xu J; An SL
    Nan Fang Yi Ke Da Xue Xue Bao; 2018 Apr; 38(4):455-459. PubMed ID: 29735447

  • 10. The kappa statistic was representative of empirically observed inter-rater agreement for physical findings.
    Gorelick MH; Yen K
    J Clin Epidemiol; 2006 Aug; 59(8):859-61. PubMed ID: 16828681

  • 11. Assessing Binary Diagnoses of Bio-behavioral Disorders: The Clinical Relevance of Cohen's Kappa.
    Cicchetti DV; Klin A; Volkmar FR
    J Nerv Ment Dis; 2017 Jan; 205(1):58-65. PubMed ID: 27741082

  • 12. Clinicians are right not to like Cohen's κ.
    de Vet HC; Mokkink LB; Terwee CB; Hoekstra OS; Knol DL
    BMJ; 2013 Apr; 346():f2125. PubMed ID: 23585065

  • 13. Kappa statistic for clustered dichotomous responses from physicians and patients.
    Kang C; Qaqish B; Monaco J; Sheridan SL; Cai J
    Stat Med; 2013 Sep; 32(21):3700-19. PubMed ID: 23533082

  • 14. Reliability of the modified Rankin Scale across multiple raters: benefits of a structured interview.
    Wilson JT; Hareendran A; Hendry A; Potter J; Bone I; Muir KW
    Stroke; 2005 Apr; 36(4):777-81. PubMed ID: 15718510

  • 15. Statistics Refresher for Molecular Imaging Technologists, Part 2: Accuracy of Interpretation, Significance, and Variance.
    Farrell MB
    J Nucl Med Technol; 2018 Jun; 46(2):76-80. PubMed ID: 29438006

  • 16. Pitfalls in the use of kappa when interpreting agreement between multiple raters in reliability studies.
    O'Leary S; Lund M; Ytre-Hauge TJ; Holm SR; Naess K; Dalland LN; McPhail SM
    Physiotherapy; 2014 Mar; 100(1):27-35. PubMed ID: 24262334

  • 17. Variance estimation for the Kappa statistic in the presence of clustered data and heterogeneous observations.
    Ryan MM; Spotnitz WD; Gillen DL
    Stat Med; 2020 Jun; 39(14):1941-1951. PubMed ID: 32180248

  • 18. Evidence-based medicine (EBM) in practice: agreement between observers rating esophageal varices: how to cope with chance?
    Sierra F; Cárdenas A
    Am J Gastroenterol; 2007 Nov; 102(11):2363-6. PubMed ID: 17958753

  • 19. Statistical methods in epidemiology. V. Towards an understanding of the kappa coefficient.
    Rigby AS
    Disabil Rehabil; 2000 May; 22(8):339-44. PubMed ID: 10896093

  • 20. Beyond kappa: an informational index for diagnostic agreement in dichotomous and multivalue ordered-categorical ratings.
    Casagrande A; Fabris F; Girometti R
    Med Biol Eng Comput; 2020 Dec; 58(12):3089-3099. PubMed ID: 33145661
