

BIOMARKERS

Molecular Biopsy of Human Tumors

- a resource for Precision Medicine

125 related articles for the article with PubMed ID 30786820

  • 1. Prediction intervals with random forests.
    Roy MH; Larocque D
    Stat Methods Med Res; 2020 Jan; 29(1):205-229. PubMed ID: 30786820

  • 2. Personalized Risk Prediction in Clinical Oncology Research: Applications and Practical Issues Using Survival Trees and Random Forests.
    Hu C; Steingrimsson JA
    J Biopharm Stat; 2018; 28(2):333-349. PubMed ID: 29048993

  • 3. A comparative study of forest methods for time-to-event data: variable selection and predictive performance.
    Liu Y; Zhou S; Wei H; An S
    BMC Med Res Methodol; 2021 Sep; 21(1):193. PubMed ID: 34563138

  • 4. A multicenter random forest model for effective prognosis prediction in collaborative clinical research network.
    Li J; Tian Y; Zhu Y; Zhou T; Li J; Ding K; Li J
    Artif Intell Med; 2020 Mar; 103():101814. PubMed ID: 32143809

  • 5. A random forest approach for competing risks based on pseudo-values.
    Mogensen UB; Gerds TA
    Stat Med; 2013 Aug; 32(18):3102-14. PubMed ID: 23508720

  • 6. Standard errors and confidence intervals for variable importance in random forest regression, classification, and survival.
    Ishwaran H; Lu M
    Stat Med; 2019 Feb; 38(4):558-582. PubMed ID: 29869423

  • 7. Survival forests for data with dependent censoring.
    Moradian H; Larocque D; Bellavance F
    Stat Methods Med Res; 2019 Feb; 28(2):445-461. PubMed ID: 28835170

  • 8. Random forests for homogeneous and non-homogeneous Poisson processes with excess zeros.
    Mathlouthi W; Larocque D; Fredette M
    Stat Methods Med Res; 2020 Aug; 29(8):2217-2237. PubMed ID: 31762374

  • 9. On the overestimation of random forest's out-of-bag error.
    Janitza S; Hornung R
    PLoS One; 2018; 13(8):e0201904. PubMed ID: 30080866

  • 10. A comparative evaluation of the generalised predictive ability of eight machine learning algorithms across ten clinical metabolomics data sets for binary classification.
    Mendez KM; Reinke SN; Broadhurst DI
    Metabolomics; 2019 Nov; 15(12):150. PubMed ID: 31728648

  • 11. Block Forests: random forests for blocks of clinical and omics covariate data.
    Hornung R; Wright MN
    BMC Bioinformatics; 2019 Jun; 20(1):358. PubMed ID: 31248362

  • 12. Calibrating random forests for probability estimation.
    Dankowski T; Ziegler A
    Stat Med; 2016 Sep; 35(22):3949-60. PubMed ID: 27074747

  • 13. Prediction models for clustered data with informative priors for the random effects: a simulation study.
    Ni H; Groenwold RHH; Nielen M; Klugkist I
    BMC Med Res Methodol; 2018 Aug; 18(1):83. PubMed ID: 30081875

  • 14. Covariance regression with random forests.
    Alakus C; Larocque D; Labbe A
    BMC Bioinformatics; 2023 Jun; 24(1):258. PubMed ID: 37330468

  • 15. GIS-based groundwater potential mapping using boosted regression tree, classification and regression tree, and random forest machine learning models in Iran.
    Naghibi SA; Pourghasemi HR; Dixon B
    Environ Monit Assess; 2016 Jan; 188(1):44. PubMed ID: 26687087

  • 16. Effects of nonlinearities and uncorrelated or correlated errors in realistic simulated data on the prediction abilities of augmented classical least squares and partial least squares.
    Melgaard DK; Haaland DM
    Appl Spectrosc; 2004 Sep; 58(9):1065-73. PubMed ID: 15479523

  • 17. Using Classification and Regression Trees (CART) and random forests to analyze attrition: Results from two simulations.
    Hayes T; Usami S; Jacobucci R; McArdle JJ
    Psychol Aging; 2015 Dec; 30(4):911-29. PubMed ID: 26389526

  • 18. Survival Forests with R-Squared Splitting Rules.
    Wang H; Chen X; Li G
    J Comput Biol; 2018 Apr; 25(4):388-395. PubMed ID: 29265882

  • 19. A comparison of the conditional inference survival forest model to random survival forests based on a simulation study as well as on two applications with time-to-event data.
    Nasejje JB; Mwambi H; Dheda K; Lesosky M
    BMC Med Res Methodol; 2017 Jul; 17(1):115. PubMed ID: 28754093

  • 20. Bias in random forest variable importance measures: illustrations, sources and a solution.
    Strobl C; Boulesteix AL; Zeileis A; Hothorn T
    BMC Bioinformatics; 2007 Jan; 8():25. PubMed ID: 17254353
