PubMed for Handhelds
Title: Multi-view Multichannel Attention Graph Convolutional Network for miRNA-disease association prediction.
Author: Tang X, Luo J, Shen C, Lai Z.
Journal: Brief Bioinform; 2021 Nov 05; 22(6). PubMed ID: 33963829.
Abstract:
MOTIVATION: In recent years, a growing number of studies have shown that microRNAs (miRNAs) play significant roles in the development of complex human diseases. Discovering associations between miRNAs and diseases has therefore become an important part of disease discovery and treatment. Since uncovering associations via traditional experimental methods is complicated and time-consuming, many computational methods have been proposed to identify potential associations. However, accurately determining potential associations between miRNAs and diseases from multisource data remains challenging.
RESULTS: In this study, we develop a Multi-view Multichannel Attention Graph Convolutional Network (MMGCN) to predict potential miRNA-disease associations. Unlike simple multisource information integration, MMGCN employs a GCN encoder to obtain miRNA and disease features in each similarity view. Moreover, MMGCN enhances the learned latent representations for association prediction through multichannel attention, which adaptively learns the importance of different features. Empirical results on two datasets demonstrate that the MMGCN model achieves superior performance compared with nine state-of-the-art methods on most metrics. Furthermore, we demonstrate the effectiveness of the multichannel attention mechanism and the value of multisource data for miRNA-disease association prediction. Case studies also indicate the method's ability to discover new associations.
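
For readers who want a concrete picture of the architecture described in the abstract, the following is a minimal sketch in PyTorch (not the authors' code): one GCN encoder per similarity view, followed by a channel-attention module that learns per-view weights before the view embeddings are fused and scored. All class names, dimensions, and the inner-product scoring step are illustrative assumptions, not details taken from the paper.

import torch
import torch.nn as nn
import torch.nn.functional as F


class GCNLayer(nn.Module):
    """One graph convolution: H' = ReLU(A_hat @ H @ W)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, a_hat, h):
        # a_hat: normalized adjacency of one similarity view, h: node features
        return F.relu(a_hat @ self.lin(h))


class MultiViewChannelAttention(nn.Module):
    """Encode each similarity view with its own GCN, then fuse views with
    learned per-view (channel) attention weights. Illustrative sketch only."""
    def __init__(self, n_views, in_dim, emb_dim):
        super().__init__()
        self.encoders = nn.ModuleList([GCNLayer(in_dim, emb_dim) for _ in range(n_views)])
        self.att = nn.Sequential(
            nn.Linear(n_views, n_views), nn.ReLU(),
            nn.Linear(n_views, n_views), nn.Sigmoid(),
        )

    def forward(self, a_hats, feats):
        # Stack per-view embeddings: shape (n_nodes, emb_dim, n_views)
        views = torch.stack([enc(a, feats) for enc, a in zip(self.encoders, a_hats)], dim=-1)
        # Squeeze each view to a scalar summary, then learn a weight per view
        weights = self.att(views.mean(dim=(0, 1)))   # shape: (n_views,)
        fused = (views * weights).sum(dim=-1)        # weighted fusion of views
        return fused                                 # shape: (n_nodes, emb_dim)


def predict(mirna_emb, disease_emb):
    # Association scores as inner products between miRNA and disease embeddings
    return torch.sigmoid(mirna_emb @ disease_emb.T)

In this sketch, separate MultiViewChannelAttention modules would be applied to the miRNA similarity views and the disease similarity views, and the resulting embeddings passed to predict; the choice of an inner-product decoder is an assumption for brevity.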