PubMed for Handhelds
Search MEDLINE/PubMed
Title: Federated Learning Approach for Secured Medical Recommendation in Internet of Medical Things Using Homomorphic Encryption.
Author: Mantey EA, Zhou C, Anajemba JH, Arthur JK, Hamid Y, Chowhan A, Otuu OO.
Journal: IEEE J Biomed Health Inform; 2024 Jun; 28(6):3329-3340.
PubMed ID: 38190666.
Abstract: Federated Learning (FL) is a distributed machine learning (ML) approach that trains models on edge devices. It aims to maintain privacy by transmitting only gradient updates and learning parameters to the global server during training, while preserving the integrity of data held on user-end Internet of Medical Things (IoMT) devices. Rather than using user data directly, training at the global server operates on these parameters, while model updates are performed locally on the IoMT devices. The major drawback of this federated learning approach, however, is that it cannot preserve user privacy completely, which results in gradient leakage. This study first summarizes the learning process and then proposes a new federated medical recommender system that employs homomorphic encryption to better preserve the privacy of user gradients during recommendation. Experimental results indicate an insignificant decrease in accuracy while achieving substantially stronger user privacy. Further analysis shows that computing on encrypted gradients at the global server has negligible impact on the recommendation output while providing an additional secure channel for transmitting user gradients to and from the global server. The analysis also indicates that the performance of the proposed federated stochastic modification minimized gradient (FSMMG) algorithm improves as the number of users increases, and that good convergence is achieved. Experiments further show that, compared with existing techniques, the proposed FSMMG performs best, achieving 98.3% encryption accuracy.
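
The abstract's central mechanism, a global server aggregating client gradients without decrypting them, can be sketched with an additively homomorphic scheme such as Paillier. The sketch below is a minimal illustration and is not the paper's FSMMG algorithm: it assumes the open-source python-paillier (phe) package and hypothetical gradient values, and shows clients encrypting local gradients element-wise while the server averages the ciphertexts without ever seeing plaintext.

# Minimal sketch of homomorphic gradient aggregation, assuming the
# python-paillier ("phe") package. Illustrative only; not the paper's
# FSMMG algorithm, and the gradient values are hypothetical.
from phe import paillier

# Key pair held on the client/key-holding side; the server sees only the
# public key and ciphertexts. (Real key management varies by design.)
public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# Hypothetical local gradients computed by three IoMT clients.
client_gradients = [
    [0.12, -0.05, 0.30],
    [0.10, -0.02, 0.25],
    [0.15, -0.07, 0.28],
]

# Each client encrypts its gradient vector element-wise before upload.
encrypted_uploads = [
    [public_key.encrypt(g) for g in grads] for grads in client_gradients
]

# The server sums ciphertexts and scales by 1/n without decrypting anything:
# Paillier supports ciphertext + ciphertext and ciphertext * plaintext scalar.
n_clients = len(encrypted_uploads)
encrypted_avg = []
for dim in range(len(encrypted_uploads[0])):
    total = encrypted_uploads[0][dim]
    for upload in encrypted_uploads[1:]:
        total = total + upload[dim]
    encrypted_avg.append(total * (1.0 / n_clients))

# Back on the key-holding side, decryption recovers the averaged gradient.
averaged = [private_key.decrypt(c) for c in encrypted_avg]
print(averaged)  # approximately [0.1233, -0.0467, 0.2767]

Because only ciphertexts travel to and from the global server, the aggregation step never exposes individual gradients, which is the property the abstract credits with mitigating gradient leakage.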