207 related articles for article (PubMed ID: 35746184)
41. Clinical decision support tool for breast cancer recurrence prediction using SHAP value in cooperative game theory.
Liu Y; Fu Y; Peng Y; Ming J
Heliyon; 2024 Jan; 10(2):e24876. PubMed ID: 38312672
42. Towards better process management in wastewater treatment plants: Process analytics based on SHAP values for tree-based machine learning methods.
Wang D; Thunéll S; Lindberg U; Jiang L; Trygg J; Tysklind M
J Environ Manage; 2022 Jan; 301():113941. PubMed ID: 34731954
43. Analysis and evaluation of explainable artificial intelligence on suicide risk assessment.
Tang H; Miri Rekavandi A; Rooprai D; Dwivedi G; Sanfilippo FM; Boussaid F; Bennamoun M
Sci Rep; 2024 Mar; 14(1):6163. PubMed ID: 38485985
44. Explaining Multiclass Compound Activity Predictions Using Counterfactuals and Shapley Values.
Lamens A; Bajorath J
Molecules; 2023 Jul; 28(14):. PubMed ID: 37513472
45. How does the model make predictions? A systematic literature review on the explainability power of machine learning in healthcare.
Allgaier J; Mulansky L; Draelos RL; Pryss R
Artif Intell Med; 2023 Sep; 143():102616. PubMed ID: 37673561
46. XAI-reduct: accuracy preservation despite dimensionality reduction for heart disease classification using explainable AI.
Das S; Sultana M; Bhattacharya S; Sengupta D; De D
J Supercomput; 2023 May; ():1-31. PubMed ID: 37359323
47. Real-world data to build explainable trustworthy artificial intelligence models for prediction of immunotherapy efficacy in NSCLC patients.
Prelaj A; Galli EG; Miskovic V; Pesenti M; Viscardi G; Pedica B; Mazzeo L; Bottiglieri A; Provenzano L; Spagnoletti A; Marinacci R; De Toma A; Proto C; Ferrara R; Brambilla M; Occhipinti M; Manglaviti S; Galli G; Signorelli D; Giani C; Beninato T; Pircher CC; Rametta A; Kosta S; Zanitti M; Di Mauro MR; Rinaldi A; Di Gregorio S; Antonia M; Garassino MC; de Braud FGM; Restelli M; Lo Russo G; Ganzinelli M; Trovò F; Pedrocchi ALG
Front Oncol; 2022; 12():1078822. PubMed ID: 36755856
48. Explainable artificial intelligence (XAI) for interpreting the contributing factors feed into the wildfire susceptibility prediction model.
Abdollahi A; Pradhan B
Sci Total Environ; 2023 Jun; 879():163004. PubMed ID: 36965733
49. Interpretable machine learning with tree-based shapley additive explanations: Application to metabolomics datasets for binary classification.
Bifarin OO
PLoS One; 2023; 18(5):e0284315. PubMed ID: 37141218
50. XGBoost, A Novel Explainable AI Technique, in the Prediction of Myocardial Infarction: A UK Biobank Cohort Study.
Moore A; Bell M
Clin Med Insights Cardiol; 2022; 16():11795468221133611. PubMed ID: 36386405
51. Exploratory Data Mining Techniques (Decision Tree Models) for Examining the Impact of Internet-Based Cognitive Behavioral Therapy for Tinnitus: Machine Learning Approach.
Rodrigo H; Beukes EW; Andersson G; Manchaiah V
J Med Internet Res; 2021 Nov; 23(11):e28999. PubMed ID: 34726612
52. Spatio-temporal feature attribution of European summer wildfires with Explainable Artificial Intelligence (XAI).
Li H; Vulova S; Rocha AD; Kleinschmit B
Sci Total Environ; 2024 Mar; 916():170330. PubMed ID: 38278254
53. Development and Interpretation of Multiple Machine Learning Models for Predicting Postoperative Delayed Remission of Acromegaly Patients During Long-Term Follow-Up.
Dai C; Fan Y; Li Y; Bao X; Li Y; Su M; Yao Y; Deng K; Xing B; Feng F; Feng M; Wang R
Front Endocrinol (Lausanne); 2020; 11():643. PubMed ID: 33042013
54. Predicting exclusive breastfeeding in maternity wards using machine learning techniques.
Oliver-Roig A; Rico-Juan JR; Richart-Martínez M; Cabrero-García J
Comput Methods Programs Biomed; 2022 Jun; 221():106837. PubMed ID: 35544962
55. A comparison of explainable artificial intelligence methods in the phase classification of multi-principal element alloys.
Lee K; Ayyasamy MV; Ji Y; Balachandran PV
Sci Rep; 2022 Jul; 12(1):11591. PubMed ID: 35804179
56. Integrated Evolutionary Learning: An Artificial Intelligence Approach to Joint Learning of Features and Hyperparameters for Optimized, Explainable Machine Learning.
de Lacy N; Ramshaw MJ; Kutz JN
Front Artif Intell; 2022; 5():832530. PubMed ID: 35493616
57. Identifying key multi-modal predictors of incipient dementia in Parkinson's disease: a machine learning analysis and Tree SHAP interpretation.
McFall GP; Bohn L; Gee M; Drouin SM; Fah H; Han W; Li L; Camicioli R; Dixon RA
Front Aging Neurosci; 2023; 15():1124232. PubMed ID: 37455938
58. Generation of Molecular Counterfactuals for Explainable Machine Learning Based on Core-Substituent Recombination.
Lamens A; Bajorath J
ChemMedChem; 2024 Feb; 19(3):e202300586. PubMed ID: 37983655
59. An interpretable machine learning model of cross-sectional U.S. county-level obesity prevalence using explainable artificial intelligence.
Allen B
PLoS One; 2023; 18(10):e0292341. PubMed ID: 37796874
60. Earthquake-Induced Building-Damage Mapping Using Explainable AI (XAI).
Matin SS; Pradhan B
Sensors (Basel); 2021 Jun; 21(13):. PubMed ID: 34209169