These tools are no longer maintained as of December 31, 2024. An archived website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

123 related articles for article (PubMed ID: 8691228)

  • 1. The meaning of kappa: probabilistic concepts of reliability and validity revisited.
    Guggenmoos-Holzmann I
    J Clin Epidemiol; 1996 Jul; 49(7):775-82. PubMed ID: 8691228

  • 2. Kappa-like indices of observer agreement viewed from a latent class perspective.
    Guggenmoos-Holzmann I; Vonk R
    Stat Med; 1998 Apr; 17(8):797-812. PubMed ID: 9595612

  • 3. How reliable are chance-corrected measures of agreement?
    Guggenmoos-Holzmann I
    Stat Med; 1993 Dec; 12(23):2191-205. PubMed ID: 8310189

  • 4. Chance-corrected measures for 2 × 2 tables that coincide with weighted kappa.
    Warrens MJ
    Br J Math Stat Psychol; 2011 May; 64(Pt 2):355-65. PubMed ID: 21492138

  • 5. Chance-corrected measures of the validity of a binary diagnostic test.
    Brenner H; Gefeller O
    J Clin Epidemiol; 1994 Jun; 47(6):627-33. PubMed ID: 7722575

  • 6. [Quality criteria of assessment scales--Cohen's kappa as a measure of interrater reliability (1)].
    Mayer H; Nonn C; Osterbrink J; Evers GC
    Pflege; 2004 Feb; 17(1):36-46. PubMed ID: 15040245

  • 7. Clinicians are right not to like Cohen's κ.
    de Vet HC; Mokkink LB; Terwee CB; Hoekstra OS; Knol DL
    BMJ; 2013 Apr; 346():f2125. PubMed ID: 23585065

  • 8. A paired kappa to compare binary ratings across two medical tests.
    Nelson KP; Edwards D
    Stat Med; 2019 Jul; 38(17):3272-3287. PubMed ID: 31099902

  • 9. Quantifying Interrater Agreement and Reliability Between Thoracic Pathologists: Paradoxical Behavior of Cohen's Kappa in the Presence of a High Prevalence of the Histopathologic Feature in Lung Cancer.
    Tan KS; Yeh YC; Adusumilli PS; Travis WD
    JTO Clin Res Rep; 2024 Jan; 5(1):100618. PubMed ID: 38283651

  • 10. Sensitivity and specificity-like measures of the validity of a diagnostic test that are corrected for chance agreement.
    Coughlin SS; Pickle LW
    Epidemiology; 1992 Mar; 3(2):178-81. PubMed ID: 1576224

  • 11. Influence of true within-herd prevalence of small ruminant lentivirus infection in goats on agreement between serological immunoenzymatic tests.
    Czopowicz M; Szaluś-Jordanow O; Mickiewicz M; Moroz A; Witkowski L; Markowska-Daniel I; Bagnicka E; Kaba J
    Prev Vet Med; 2017 Sep; 144():75-80. PubMed ID: 28716207

  • 12. The dependence of Cohen's kappa on the prevalence does not matter.
    Vach W
    J Clin Epidemiol; 2005 Jul; 58(7):655-61. PubMed ID: 15939215

  • 13. Interobserver agreement: Cohen's kappa coefficient does not necessarily reflect the percentage of patients with congruent classifications.
    Steinijans VW; Diletti E; Bömches B; Greis C; Solleder P
    Int J Clin Pharmacol Ther; 1997 Mar; 35(3):93-5. PubMed ID: 9088995

  • 14. Relationships between statistical measures of agreement: sensitivity, specificity and kappa.
    Feuerman M; Miller AR
    J Eval Clin Pract; 2008 Oct; 14(5):930-3. PubMed ID: 19018927

  • 15. Modelling patterns of agreement and disagreement.
    Agresti A
    Stat Methods Med Res; 1992; 1(2):201-18. PubMed ID: 1341658

  • 16. The Effect of the Raters' Marginal Distributions on Their Matched Agreement: A Rescaling Framework for Interpreting Kappa.
    Karelitz TM; Budescu DV
    Multivariate Behav Res; 2013 Nov; 48(6):923-52. PubMed ID: 26745599

  • 17. [Analyzing interrater agreement for categorical data using Cohen's kappa and alternative coefficients].
    Wirtz M; Kutschmann M
    Rehabilitation (Stuttg); 2007 Dec; 46(6):370-7. PubMed ID: 18188809

  • 18. Inter-observer agreement between two observers for bovine digital dermatitis identification in New Zealand using digital photographs.
    Yang DA; Laven RA
    N Z Vet J; 2019 May; 67(3):143-147. PubMed ID: 30753789

  • 19. High Agreement and High Prevalence: The Paradox of Cohen's Kappa.
    Zec S; Soriani N; Comoretto R; Baldi I
    Open Nurs J; 2017; 11():211-218. PubMed ID: 29238424

  • 20. Chance-corrected measures of reliability and validity in K x K tables.
    Andrés AM; Marzo PF
    Stat Methods Med Res; 2005 Oct; 14(5):473-92. PubMed ID: 16248349
