
232 related articles for article (PubMed ID: 36038846)

  • 1. Interrater reliability estimators tested against true interrater reliabilities.
    Zhao X; Feng GC; Ao SH; Liu PL
    BMC Med Res Methodol; 2022 Aug; 22(1):232. PubMed ID: 36038846

  • 2. A comparison of Cohen's Kappa and Gwet's AC1 when calculating inter-rater reliability coefficients: a study conducted with personality disorder samples.
    Wongpakaran N; Wongpakaran T; Wedding D; Gwet KL
    BMC Med Res Methodol; 2013 Apr; 13():61. PubMed ID: 23627889

  • 3. An Evaluation of Interrater Reliability Measures on Binary Tasks Using d-Prime.
    Grant MJ; Button CM; Snook B
    Appl Psychol Meas; 2017 Jun; 41(4):264-276. PubMed ID: 29881092

  • 4. A new coefficient of interrater agreement: The challenge of highly unequal category proportions.
    van Oest R
    Psychol Methods; 2019 Aug; 24(4):439-451. PubMed ID: 29723005

  • 5. The Flexor Pollicis Longus Reflex: Interrater and Intrarater Reliability in Comparison With Established Muscle Stretch Reflexes.
    Gladitz LM; Schöttker-Königer T; Sturm C; Gutenbrunner C; Ranker A
    Am J Phys Med Rehabil; 2021 Jun; 100(6):539-545. PubMed ID: 33998607

  • 6. The role of raters' threshold in estimating interrater agreement.
    Nucci M; Spoto A; Altoè G; Pastore M
    Psychol Methods; 2021 Oct; 26(5):622-634. PubMed ID: 34855432

  • 7. Interrater agreement statistics with skewed data: evaluation of alternatives to Cohen's kappa.
    Xu S; Lorber MF
    J Consult Clin Psychol; 2014 Dec; 82(6):1219-27. PubMed ID: 25090041

  • 8. Gwet's AC1 is not a substitute for Cohen's kappa - A comparison of basic properties.
    Vach W; Gerke O
    MethodsX; 2023; 10():102212. PubMed ID: 37234937

  • 9. Better to be in agreement than in bad company: A critical analysis of many kappa-like tests.
    Silveira PSP; Siqueira JO
    Behav Res Methods; 2023 Oct; 55(7):3326-3347. PubMed ID: 36114386

  • 10. Interrater reliability of the categorization of late radiographic changes after lung stereotactic body radiation therapy.
    Faruqi S; Giuliani ME; Raziee H; Yap ML; Roberts H; Le LW; Brade A; Cho J; Sun A; Bezjak A; Hope AJ
    Int J Radiat Oncol Biol Phys; 2014 Aug; 89(5):1076-1083. PubMed ID: 25035211

  • 11. Reliability in evaluator-based tests: using simulation-constructed models to determine contextually relevant agreement thresholds.
    Beckler DT; Thumser ZC; Schofield JS; Marasco PD
    BMC Med Res Methodol; 2018 Nov; 18(1):141. PubMed ID: 30453897

  • 12. Interrater reliability: the kappa statistic.
    McHugh ML
    Biochem Med (Zagreb); 2012; 22(3):276-82. PubMed ID: 23092060

  • 13. Influence of true within-herd prevalence of small ruminant lentivirus infection in goats on agreement between serological immunoenzymatic tests.
    Czopowicz M; Szaluś-Jordanow O; Mickiewicz M; Moroz A; Witkowski L; Markowska-Daniel I; Bagnicka E; Kaba J
    Prev Vet Med; 2017 Sep; 144():75-80. PubMed ID: 28716207

  • 14. Validity and reliability of exposure assessors' ratings of exposure intensity by type of occupational questionnaire and type of rater.
    Friesen MC; Coble JB; Katki HA; Ji BT; Xue S; Lu W; Stewart PA
    Ann Occup Hyg; 2011 Jul; 55(6):601-11. PubMed ID: 21511891

  • 15. Degenerative findings in lumbar spine MRI: an inter-rater reliability study involving three raters.
    Doktor K; Jensen TS; Christensen HW; Fredberg U; Kindt M; Boyle E; Hartvigsen J
    Chiropr Man Therap; 2020 Feb; 28(1):8. PubMed ID: 32041626

  • 16. Interrater agreement and interrater reliability: key concepts, approaches, and applications.
    Gisev N; Bell JS; Chen TF
    Res Social Adm Pharm; 2013; 9(3):330-8. PubMed ID: 22695215

  • 17. The interrater reliability and agreement of a 0 to 10 uterine tone score in cesarean delivery.
    Cole NM; Abushoshah I; Fields KG; Carusi DA; Robinson JN; Bateman BT; Farber MK
    Am J Obstet Gynecol MFM; 2021 May; 3(3):100342. PubMed ID: 33652161

  • 18. Interobserver agreement: Cohen's kappa coefficient does not necessarily reflect the percentage of patients with congruent classifications.
    Steinijans VW; Diletti E; Bömches B; Greis C; Solleder P
    Int J Clin Pharmacol Ther; 1997 Mar; 35(3):93-5. PubMed ID: 9088995

  • 19. Weighting schemes and incomplete data: A generalized Bayesian framework for chance-corrected interrater agreement.
    van Oest R; Girard JM
    Psychol Methods; 2022 Dec; 27(6):1069-1088. PubMed ID: 34766799

  • 20. Homogeneity score test of AC1 statistics and estimation of common AC1 in multiple or stratified inter-rater agreement studies.
    Honda C; Ohyama T
    BMC Med Res Methodol; 2020 Feb; 20(1):20. PubMed ID: 32020851
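Several of the listed articles (e.g. entries 2, 7, and 8) contrast Cohen's kappa with Gwet's AC1 under skewed category proportions. The sketch below illustrates that contrast on a hypothetical 2x2 contingency table (not data from any cited study): with highly imbalanced marginals, kappa can be slightly negative despite 94% raw agreement, while AC1 remains high, because the two statistics estimate chance agreement differently.

```python
def cohens_kappa(table):
    """Cohen's kappa for a square contingency table
    (rows = rater A's category, columns = rater B's category)."""
    n = sum(sum(row) for row in table)
    k = len(table)
    po = sum(table[i][i] for i in range(k)) / n          # observed agreement
    p_row = [sum(row) / n for row in table]              # rater A marginals
    p_col = [sum(table[i][j] for i in range(k)) / n for j in range(k)]
    pe = sum(p_row[i] * p_col[i] for i in range(k))      # chance agreement
    return (po - pe) / (1 - pe)

def gwets_ac1(table):
    """Gwet's AC1 for the same table; chance agreement is based on the
    mean marginal proportion pi_k of each category."""
    n = sum(sum(row) for row in table)
    k = len(table)
    po = sum(table[i][i] for i in range(k)) / n
    p_row = [sum(row) / n for row in table]
    p_col = [sum(table[i][j] for i in range(k)) / n for j in range(k)]
    pi = [(p_row[i] + p_col[i]) / 2 for i in range(k)]
    pe = sum(p * (1 - p) for p in pi) / (k - 1)
    return (po - pe) / (1 - pe)

# Hypothetical skewed table: raters agree on 118 positives and 0 negatives
# out of 125 cases, i.e. 94.4% raw agreement.
table = [[118, 5], [2, 0]]
print(round(cohens_kappa(table), 3))  # -0.023
print(round(gwets_ac1(table), 3))     # 0.941
```

The divergence comes from the chance-correction term: kappa's expected agreement is nearly 0.945 here (because both raters almost always say "positive"), leaving almost no room above chance, whereas AC1's expected agreement is only about 0.054.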
