189 related articles for article (PubMed ID: 38089911)
1. BERT-5mC: an interpretable model for predicting 5-methylcytosine sites of DNA based on BERT.
Wang S; Liu Y; Liu Y; Zhang Y; Zhu X
PeerJ; 2023; 11():e16600. PubMed ID: 38089911
2. BERT-Kcr: prediction of lysine crotonylation sites by a transfer learning method with pre-trained BERT models.
Qiao Y; Zhu X; Gong H
Bioinformatics; 2022 Jan; 38(3):648-654. PubMed ID: 34643684
3. BiLSTM-5mC: A Bidirectional Long Short-Term Memory-Based Approach for Predicting 5-Methylcytosine Sites in Genome-Wide DNA Promoters.
Cheng X; Wang J; Li Q; Liu T
Molecules; 2021 Dec; 26(24):. PubMed ID: 34946497
4. DGA-5mC: A 5-methylcytosine site prediction model based on an improved DenseNet and bidirectional GRU method.
Jia J; Qin L; Lei R
Math Biosci Eng; 2023 Mar; 20(6):9759-9780. PubMed ID: 37322910
5. Comparing Pre-trained and Feature-Based Models for Prediction of Alzheimer's Disease Based on Speech.
Balagopalan A; Eyre B; Robin J; Rudzicz F; Novikova J
Front Aging Neurosci; 2021; 13():635945. PubMed ID: 33986655
6. iPromoter-5mC: A Novel Fusion Decision Predictor for the Identification of 5-Methylcytosine Sites in Genome-Wide DNA Promoters.
Zhang L; Xiao X; Xu ZC
Front Cell Dev Biol; 2020; 8():614. PubMed ID: 32850787
7. Automatic text classification of actionable radiology reports of tinnitus patients using bidirectional encoder representations from transformer (BERT) and in-domain pre-training (IDPT).
Li J; Lin Y; Zhao P; Liu W; Cai L; Sun J; Zhao L; Yang Z; Song H; Lv H; Wang Z
BMC Med Inform Decis Mak; 2022 Jul; 22(1):200. PubMed ID: 35907966
8. Prediction of RNA-protein interactions using a nucleotide language model.
Yamada K; Hamada M
Bioinform Adv; 2022; 2(1):vbac023. PubMed ID: 36699410
9. Extracting comprehensive clinical information for breast cancer using deep learning methods.
Zhang X; Zhang Y; Zhang Q; Ren Y; Qiu T; Ma J; Sun Q
Int J Med Inform; 2019 Dec; 132():103985. PubMed ID: 31627032
10. A transformer architecture based on BERT and 2D convolutional neural network to identify DNA enhancers from sequence information.
Le NQK; Ho QT; Nguyen TT; Ou YY
Brief Bioinform; 2021 Sep; 22(5):. PubMed ID: 33539511
11. An Extensive Examination of Discovering 5-Methylcytosine Sites in Genome-Wide DNA Promoters Using Machine Learning Based Approaches.
Nguyen TT; Tran TA; Le NQ; Pham DM; Ou YY
IEEE/ACM Trans Comput Biol Bioinform; 2022; 19(1):87-94. PubMed ID: 34014828
12. i5mC-DCGA: an improved hybrid network framework based on the CBAM attention mechanism for identifying promoter 5mC sites.
Jia J; Lei R; Qin L; Wei X
BMC Genomics; 2024 Mar; 25(1):242. PubMed ID: 38443802
13. BERT-Kgly: A Bidirectional Encoder Representations From Transformers (BERT)-Based Model for Predicting Lysine Glycation Site for
Liu Y; Liu Y; Wang GA; Cheng Y; Bi S; Zhu X
Front Bioinform; 2022; 2():834153. PubMed ID: 36304324
14. Investigation of improving the pre-training and fine-tuning of BERT model for biomedical relation extraction.
Su P; Vijay-Shanker K
BMC Bioinformatics; 2022 Apr; 23(1):120. PubMed ID: 35379166
15. Use of BERT (Bidirectional Encoder Representations from Transformers)-Based Deep Learning Method for Extracting Evidences in Chinese Radiology Reports: Development of a Computer-Aided Liver Cancer Diagnosis Framework.
Liu H; Zhang Z; Xu Y; Wang N; Huang Y; Yang Z; Jiang R; Chen H
J Med Internet Res; 2021 Jan; 23(1):e19689. PubMed ID: 33433395
16. BERT2OME: Prediction of 2'-O-Methylation Modifications From RNA Sequence by Transformer Architecture Based on BERT.
Soylu NN; Sefer E
IEEE/ACM Trans Comput Biol Bioinform; 2023; 20(3):2177-2189. PubMed ID: 37819796
17. A Natural Language Processing Model for COVID-19 Detection Based on Dutch General Practice Electronic Health Records by Using Bidirectional Encoder Representations From Transformers: Development and Validation Study.
Homburg M; Meijer E; Berends M; Kupers T; Olde Hartman T; Muris J; de Schepper E; Velek P; Kuiper J; Berger M; Peters L
J Med Internet Res; 2023 Oct; 25():e49944. PubMed ID: 37792444
18. A Question-and-Answer System to Extract Data From Free-Text Oncological Pathology Reports (CancerBERT Network): Development Study.
Mitchell JR; Szepietowski P; Howard R; Reisman P; Jones JD; Lewis P; Fridley BL; Rollison DE
J Med Internet Res; 2022 Mar; 24(3):e27210. PubMed ID: 35319481
19. Bidirectional Encoder Representations from Transformers-like large language models in patient safety and pharmacovigilance: A comprehensive assessment of causal inference implications.
Wang X; Xu X; Liu Z; Tong W
Exp Biol Med (Maywood); 2023 Nov; 248(21):1908-1917. PubMed ID: 38084745
20. A BERT Framework to Sentiment Analysis of Tweets.
Bello A; Ng SC; Leung MF
Sensors (Basel); 2023 Jan; 23(1):. PubMed ID: 36617101