BIOMARKERS

Molecular Biopsy of Human Tumors

- a resource for Precision Medicine

125 articles related to PubMed ID 38339945

  • 1. Detection of anemic condition in patients from clinical markers and explainable artificial intelligence.
    Darshan BSD; Sampathila N; Bairy MG; Belurkar S; Prabhu S; Chadaga K
    Technol Health Care; 2024 Feb; ():. PubMed ID: 38339945

  • 2. Investigation on explainable machine learning models to predict chronic kidney diseases.
    Ghosh SK; Khandoker AH
    Sci Rep; 2024 Feb; 14(1):3687. PubMed ID: 38355876

  • 3. A decision support system for osteoporosis risk prediction using machine learning and explainable artificial intelligence.
    Khanna VV; Chadaga K; Sampathila N; Chadaga R; Prabhu S; K S S; Jagdale AS; Bhat D
    Heliyon; 2023 Dec; 9(12):e22456. PubMed ID: 38144333

  • 4. Explainable artificial intelligence model for identifying COVID-19 gene biomarkers.
    Yagin FH; Cicek İB; Alkhateeb A; Yagin B; Colak C; Azzeh M; Akbulut S
    Comput Biol Med; 2023 Mar; 154():106619. PubMed ID: 36738712

  • 5. An Explainable Artificial Intelligence Framework for the Deterioration Risk Prediction of Hepatitis Patients.
    Peng J; Zou K; Zhou M; Teng Y; Zhu X; Zhang F; Xu J
    J Med Syst; 2021 Apr; 45(5):61. PubMed ID: 33847850

  • 6. Explainable machine learning models based on multimodal time-series data for the early detection of Parkinson's disease.
    Junaid M; Ali S; Eid F; El-Sappagh S; Abuhmed T
    Comput Methods Programs Biomed; 2023 Jun; 234():107495. PubMed ID: 37003039

  • 7. A Decision Support System for Diagnosis of COVID-19 from Non-COVID-19 Influenza-like Illness Using Explainable Artificial Intelligence.
    Chadaga K; Prabhu S; Bhat V; Sampathila N; Umakanth S; Chadaga R
    Bioengineering (Basel); 2023 Mar; 10(4):. PubMed ID: 37106626

  • 8. IHCP: interpretable hepatitis C prediction system based on black-box machine learning models.
    Fan Y; Lu X; Sun G
    BMC Bioinformatics; 2023 Sep; 24(1):333. PubMed ID: 37674125

  • 9. Explainable Artificial Intelligence in Quantifying Breast Cancer Factors: Saudi Arabia Context.
    Alelyani T; Alshammari MM; Almuhanna A; Asan O
    Healthcare (Basel); 2024 May; 12(10):. PubMed ID: 38786433

  • 10. HGSORF: Henry Gas Solubility Optimization-based Random Forest for C-Section prediction and XAI-based cause analysis.
    Islam MS; Awal MA; Laboni JN; Pinki FT; Karmokar S; Mumenin KM; Al-Ahmadi S; Rahman MA; Hossain MS; Mirjalili S
    Comput Biol Med; 2022 Aug; 147():105671. PubMed ID: 35660327

  • 11. Model-agnostic explainable artificial intelligence tools for severity prediction and symptom analysis on Indian COVID-19 data.
    Nambiar A; S H; S S
    Front Artif Intell; 2023; 6():1272506. PubMed ID: 38111787

  • 12. Interpreting artificial intelligence models: a systematic review on the application of LIME and SHAP in Alzheimer's disease detection.
    Vimbi V; Shaffi N; Mahmud M
    Brain Inform; 2024 Apr; 11(1):10. PubMed ID: 38578524

  • 13. A Machine Learning Approach with Human-AI Collaboration for Automated Classification of Patient Safety Event Reports: Algorithm Development and Validation Study.
    Chen H; Cohen E; Wilson D; Alfred M
    JMIR Hum Factors; 2024 Jan; 11():e53378. PubMed ID: 38271086

  • 14. Evaluation of nutritional status and clinical depression classification using an explainable machine learning method.
    Hosseinzadeh Kasani P; Lee JE; Park C; Yun CH; Jang JW; Lee SA
    Front Nutr; 2023; 10():1165854. PubMed ID: 37229464

  • 15. Interpretable machine learning model for early prediction of 28-day mortality in ICU patients with sepsis-induced coagulopathy: development and validation.
    Zhou S; Lu Z; Liu Y; Wang M; Zhou W; Cui X; Zhang J; Xiao W; Hua T; Zhu H; Yang M
    Eur J Med Res; 2024 Jan; 29(1):14. PubMed ID: 38172962

  • 16. Explainable machine learning approach to predict extubation in critically ill ventilated patients: a retrospective study in central Taiwan.
    Pai KC; Su SA; Chan MC; Wu CL; Chao WC
    BMC Anesthesiol; 2022 Nov; 22(1):351. PubMed ID: 36376785

  • 17. Machine learning explainability in nasopharyngeal cancer survival using LIME and SHAP.
    Alabi RO; Elmusrati M; Leivo I; Almangush A; Mäkitie AA
    Sci Rep; 2023 Jun; 13(1):8984. PubMed ID: 37268685

  • 18. An Ensemble Approach for the Prediction of Diabetes Mellitus Using a Soft Voting Classifier with an Explainable AI.
    Kibria HB; Nahiduzzaman M; Goni MOF; Ahsan M; Haider J
    Sensors (Basel); 2022 Sep; 22(19):. PubMed ID: 36236367

  • 19. Predicting preterm birth using explainable machine learning in a prospective cohort of nulliparous and multiparous pregnant women.
    Khan W; Zaki N; Ghenimi N; Ahmad A; Bian J; Masud MM; Ali N; Govender R; Ahmed LA
    PLoS One; 2023; 18(12):e0293925. PubMed ID: 38150456

  • 20. Explainable machine learning to predict long-term mortality in critically ill ventilated patients: a retrospective study in central Taiwan.
    Chan MC; Pai KC; Su SA; Wang MS; Wu CL; Chao WC
    BMC Med Inform Decis Mak; 2022 Mar; 22(1):75. PubMed ID: 35337303

    Page 1 of 7.
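A related-article listing like the one above can be retrieved programmatically through NCBI's E-utilities `elink` endpoint (`elink.fcgi?dbfrom=pubmed&db=pubmed&id=38339945&cmd=neighbor`), whose XML response groups related PMIDs under the `pubmed_pubmed` link name. The sketch below parses such a response offline; the sample XML is a hypothetical minimal fragment for illustration, not captured server output.

```python
import xml.etree.ElementTree as ET

def related_pmids(elink_xml: str) -> list[str]:
    """Extract related-article PMIDs from an E-utilities elink XML response."""
    root = ET.fromstring(elink_xml)
    pmids = []
    for linksetdb in root.iter("LinkSetDb"):
        # The "pubmed_pubmed" link name holds the related-articles neighbours.
        if linksetdb.findtext("LinkName") == "pubmed_pubmed":
            pmids.extend(link.findtext("Id") for link in linksetdb.iter("Link"))
    return pmids

# Hypothetical minimal response fragment; a real query for PMID 38339945
# would list all 125 related PMIDs shown across the paginated results.
sample = """<eLinkResult><LinkSet>
  <LinkSetDb><DbTo>pubmed</DbTo><LinkName>pubmed_pubmed</LinkName>
    <Link><Id>38355876</Id></Link>
    <Link><Id>38144333</Id></Link>
  </LinkSetDb>
</LinkSet></eLinkResult>"""

print(related_pmids(sample))  # → ['38355876', '38144333']
```

Filtering on `LinkName` matters because a real `elink` response also carries other neighbour sets (e.g. `pubmed_pubmed_citedin`) in sibling `LinkSetDb` blocks.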