BIOMARKERS: Molecular Biopsy of Human Tumors - a resource for Precision Medicine

299 related articles for article (PubMed ID: 34942412)

  • 1. Explanation of machine learning models using shapley additive explanation and application for real data in hospital.
    Nohara Y; Matsumoto K; Soejima H; Nakashima N
    Comput Methods Programs Biomed; 2022 Feb; 214():106584. PubMed ID: 34942412

  • 2. Interpretation of machine learning models using shapley values: application to compound potency and multi-target activity predictions.
    Rodríguez-Pérez R; Bajorath J
    J Comput Aided Mol Des; 2020 Oct; 34(10):1013-1026. PubMed ID: 32361862

  • 3. Interpretable machine learning with tree-based shapley additive explanations: Application to metabolomics datasets for binary classification.
    Bifarin OO
    PLoS One; 2023; 18(5):e0284315. PubMed ID: 37141218

  • 4. Towards better process management in wastewater treatment plants: Process analytics based on SHAP values for tree-based machine learning methods.
    Wang D; Thunéll S; Lindberg U; Jiang L; Trygg J; Tysklind M
    J Environ Manage; 2022 Jan; 301():113941. PubMed ID: 34731954

  • 5. Machine Learning Models for Predicting Influential Factors of Early Outcomes in Acute Ischemic Stroke: Registry-Based Study.
    Su PY; Wei YC; Luo H; Liu CH; Huang WY; Chen KF; Lin CP; Wei HY; Lee TH
    JMIR Med Inform; 2022 Mar; 10(3):e32508. PubMed ID: 35072631

  • 6. Explainable machine learning for chronic lymphocytic leukemia treatment prediction using only inexpensive tests.
    Meiseles A; Paley D; Ziv M; Hadid Y; Rokach L; Tadmor T
    Comput Biol Med; 2022 Jun; 145():105490. PubMed ID: 35405402

  • 7. An explainable predictive model for suicide attempt risk using an ensemble learning and Shapley Additive Explanations (SHAP) approach.
    Nordin N; Zainol Z; Mohd Noor MH; Chan LF
    Asian J Psychiatr; 2023 Jan; 79():103316. PubMed ID: 36395702

  • 8. A hybrid stacked ensemble and Kernel SHAP-based model for intelligent cardiotocography classification and interpretability.
    Feng J; Liang J; Qiang Z; Hao Y; Li X; Li L; Chen Q; Liu G; Wei H
    BMC Med Inform Decis Mak; 2023 Nov; 23(1):273. PubMed ID: 38017460

  • 9. Predicting Bulk Average Velocity with Rigid Vegetation in Open Channels Using Tree-Based Machine Learning: A Novel Approach Using Explainable Artificial Intelligence.
    Meddage DPP; Ekanayake IU; Herath S; Gobirahavan R; Muttil N; Rathnayake U
    Sensors (Basel); 2022 Jun; 22(12):. PubMed ID: 35746184

  • 10. Modeling the Effect of Streetscape Environment on Crime Using Street View Images and Interpretable Machine-Learning Technique.
    Xie H; Liu L; Yue H
    Int J Environ Res Public Health; 2022 Oct; 19(21):. PubMed ID: 36360717

  • 11. Interpretable prediction of mortality in liver transplant recipients based on machine learning.
    Zhang X; Gavaldà R; Baixeries J
    Comput Biol Med; 2022 Dec; 151(Pt A):106188. PubMed ID: 36306583

  • 12. Comparative analysis of explainable machine learning prediction models for hospital mortality.
    Stenwig E; Salvi G; Rossi PS; Skjærvold NK
    BMC Med Res Methodol; 2022 Feb; 22(1):53. PubMed ID: 35220950

  • 13. On the interpretability of machine learning-based model for predicting hypertension.
    Elshawi R; Al-Mallah MH; Sakr S
    BMC Med Inform Decis Mak; 2019 Jul; 19(1):146. PubMed ID: 31357998

  • 14. Explaining multivariate molecular diagnostic tests via Shapley values.
    Roder J; Maguire L; Georgantas R; Roder H
    BMC Med Inform Decis Mak; 2021 Jul; 21(1):211. PubMed ID: 34238309

  • 15. Application of explainable machine learning for real-time safety analysis toward a connected vehicle environment.
    Yuan C; Li Y; Huang H; Wang S; Sun Z; Wang H
    Accid Anal Prev; 2022 Jun; 171():106681. PubMed ID: 35468530

  • 16. A data-driven interpretable ensemble framework based on tree models for forecasting the occurrence of COVID-19 in the USA.
    Zheng HL; An SY; Qiao BJ; Guan P; Huang DS; Wu W
    Environ Sci Pollut Res Int; 2023 Jan; 30(5):13648-13659. PubMed ID: 36131178

  • 17. Breast cancer molecular subtype prediction: Improving interpretability of complex machine-learning models based on multiparametric-MRI features using SHapley Additive exPlanations (SHAP) methodology.
    Crombé A; Kataoka M
    Diagn Interv Imaging; 2024 May; 105(5):161-162. PubMed ID: 38365542
    [No Abstract]

  • 18. Development and Interpretation of Multiple Machine Learning Models for Predicting Postoperative Delayed Remission of Acromegaly Patients During Long-Term Follow-Up.
    Dai C; Fan Y; Li Y; Bao X; Li Y; Su M; Yao Y; Deng K; Xing B; Feng F; Feng M; Wang R
    Front Endocrinol (Lausanne); 2020; 11():643. PubMed ID: 33042013
    [No Abstract]

  • 19. Classification and Explanation for Intrusion Detection System Based on Ensemble Trees and SHAP Method.
    Le TT; Kim H; Kang H; Kim H
    Sensors (Basel); 2022 Feb; 22(3):. PubMed ID: 35161899

  • 20. Interpretability analysis for thermal sensation machine learning models: An exploration based on the SHAP approach.
    Yang Y; Yuan Y; Han Z; Liu G
    Indoor Air; 2022 Jan; 32(2):e12984. PubMed ID: 35048421
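Every entry in this listing applies Shapley additive explanations (SHAP) to interpret a fitted model. As a minimal, self-contained sketch of the underlying idea (not drawn from any of the cited papers; the feature names and toy value function below are purely hypothetical), exact Shapley values can be computed by enumerating feature coalitions:

```python
from itertools import combinations
from math import factorial

def shapley_values(features, value):
    """Exact Shapley values by coalition enumeration (exponential in n)."""
    n = len(features)
    phi = {}
    for f in features:
        rest = [g for g in features if g != f]
        total = 0.0
        for k in range(n):
            for s in combinations(rest, k):
                # Shapley kernel weight: |S|! (n - |S| - 1)! / n!
                w = factorial(k) * factorial(n - k - 1) / factorial(n)
                # Marginal contribution of f to coalition S
                total += w * (value(frozenset(s) | {f}) - value(frozenset(s)))
        phi[f] = total
    return phi

# Hypothetical toy model: additive effects plus one interaction term
def model_value(coalition):
    v = 0.0
    if "age" in coalition:
        v += 2.0
    if "bmi" in coalition:
        v += 3.0
    if {"age", "bmi"} <= set(coalition):
        v += 1.0  # interaction bonus shared between the two features
    return v

phi = shapley_values(["age", "bmi"], model_value)
# Efficiency property: attributions sum to v(all features) - v(empty set)
assert abs(sum(phi.values()) - (model_value({"age", "bmi"}) - model_value(set()))) < 1e-9
```

Here each feature receives its additive effect plus half of the shared interaction term, which is exactly the "fair division" property that motivates SHAP-based interpretation in the studies listed above. Practical implementations (e.g. the TreeSHAP variants used in several entries) compute these values far more efficiently for tree ensembles.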
