139 related articles for article (PubMed ID: 37291419)
1. Measuring Agreement Using Guessing Models and Knowledge Coefficients.
Moss J
Psychometrika; 2023 Sep; 88(3):1002-1025. PubMed ID: 37291419
2. A new coefficient of interrater agreement: The challenge of highly unequal category proportions.
van Oest R
Psychol Methods; 2019 Aug; 24(4):439-451. PubMed ID: 29723005
3. Weighting schemes and incomplete data: A generalized Bayesian framework for chance-corrected interrater agreement.
van Oest R; Girard JM
Psychol Methods; 2022 Dec; 27(6):1069-1088. PubMed ID: 34766799
4. A comparison of Cohen's Kappa and Gwet's AC1 when calculating inter-rater reliability coefficients: a study conducted with personality disorder samples.
Wongpakaran N; Wongpakaran T; Wedding D; Gwet KL
BMC Med Res Methodol; 2013 Apr; 13():61. PubMed ID: 23627889
5. Homogeneity score test of AC1.
Honda C; Ohyama T
BMC Med Res Methodol; 2020 Feb; 20(1):20. PubMed ID: 32020851
6. Measuring inter-rater reliability for nominal data - which coefficients and confidence intervals are appropriate?
Zapf A; Castell S; Morawietz L; Karch A
BMC Med Res Methodol; 2016 Aug; 16():93. PubMed ID: 27495131
7. Interrater reliability estimators tested against true interrater reliabilities.
Zhao X; Feng GC; Ao SH; Liu PL
BMC Med Res Methodol; 2022 Aug; 22(1):232. PubMed ID: 36038846
8. Measures of Agreement with Multiple Raters: Fréchet Variances and Inference.
Moss J
Psychometrika; 2024 Jun; 89(2):517-541. PubMed ID: 38190018
9. The prediction of pouch of Douglas obliteration using offline analysis of the transvaginal ultrasound 'sliding sign' technique: inter- and intra-observer reproducibility.
Reid S; Lu C; Casikar I; Mein B; Magotti R; Ludlow J; Benzie R; Condous G
Hum Reprod; 2013 May; 28(5):1237-46. PubMed ID: 23482338
10. [Interrater reliability of the Braden scale].
Kottner J; Tannen A; Dassen T
Pflege; 2008 Apr; 21(2):85-94. PubMed ID: 18622997
11. Computing inter-rater reliability and its variance in the presence of high agreement.
Gwet KL
Br J Math Stat Psychol; 2008 May; 61(Pt 1):29-48. PubMed ID: 18482474
12. Measures of agreement between many raters for ordinal classifications.
Nelson KP; Edwards D
Stat Med; 2015 Oct; 34(23):3116-32. PubMed ID: 26095449
13. Quantifying Interrater Agreement and Reliability Between Thoracic Pathologists: Paradoxical Behavior of Cohen's Kappa in the Presence of a High Prevalence of the Histopathologic Feature in Lung Cancer.
Tan KS; Yeh YC; Adusumilli PS; Travis WD
JTO Clin Res Rep; 2024 Jan; 5(1):100618. PubMed ID: 38283651
14. Assessing the inter-rater agreement for ordinal data through weighted indexes.
Marasini D; Quatto P; Ripamonti E
Stat Methods Med Res; 2016 Dec; 25(6):2611-2633. PubMed ID: 24740999
15. Inter-rater agreement and reliability of thoracic ultrasonographic findings in feedlot calves, with or without naturally occurring bronchopneumonia.
Buczinski S; Buathier C; Bélanger AM; Michaux H; Tison N; Timsit E
J Vet Intern Med; 2018 Sep; 32(5):1787-1792. PubMed ID: 30133838
16. Inter-observer reproducibility of 15 tests used for predicting difficult intubation.
Adamus M; Jor O; Vavreckova T; Hrabalek L; Zapletalova J; Gabrhelik T; Tomaskova H; Janout V
Biomed Pap Med Fac Univ Palacky Olomouc Czech Repub; 2011 Sep; 155(3):275-81. PubMed ID: 22286814
17. Gwet's AC1 is not a substitute for Cohen's kappa - A comparison of basic properties.
Vach W; Gerke O
MethodsX; 2023; 10():102212. PubMed ID: 37234937
18. [Analyzing interrater agreement for categorical data using Cohen's kappa and alternative coefficients].
Wirtz M; Kutschmann M
Rehabilitation (Stuttg); 2007 Dec; 46(6):370-7. PubMed ID: 18188809
19. Hubert's multi-rater kappa revisited.
Martín Andrés A; Álvarez Hernández M
Br J Math Stat Psychol; 2020 Feb; 73(1):1-22. PubMed ID: 31056757
20. Interobserver agreement: Cohen's kappa coefficient does not necessarily reflect the percentage of patients with congruent classifications.
Steinijans VW; Diletti E; Bömches B; Greis C; Solleder P
Int J Clin Pharmacol Ther; 1997 Mar; 35(3):93-5. PubMed ID: 9088995