BIOMARKERS

Molecular Biopsy of Human Tumors

- a resource for Precision Medicine

137 related articles for article (PubMed ID: 37689906)

  • 1. Umami-BERT: An interpretable BERT-based model for umami peptides prediction.
    Zhang J; Yan W; Zhang Q; Li Z; Liang L; Zuo M; Zhang Y
    Food Res Int; 2023 Oct; 172():113142. PubMed ID: 37689906

  • 2. IUP-BERT: Identification of Umami Peptides Based on BERT Features.
    Jiang L; Jiang J; Wang X; Zhang Y; Zheng B; Liu S; Zhang Y; Liu C; Wan Y; Xiang D; Lv Z
    Foods; 2022 Nov; 11(22):. PubMed ID: 36429332

  • 3. BERT4Bitter: a bidirectional encoder representations from transformers (BERT)-based model for improving the prediction of bitter peptides.
    Charoenkwan P; Nantasenamat C; Hasan MM; Manavalan B; Shoombuatong W
    Bioinformatics; 2021 Sep; 37(17):2556-2562. PubMed ID: 33638635

  • 4. AMP-BERT: Prediction of antimicrobial peptide function based on a BERT model.
    Lee H; Lee S; Lee I; Nam H
    Protein Sci; 2023 Jan; 32(1):e4529. PubMed ID: 36461699

  • 5. UmamiPreDL: Deep learning model for umami taste prediction of peptides using BERT and CNN.
    Indiran AP; Fatima H; Chattopadhyay S; Ramadoss S; Radhakrishnan Y
    Comput Biol Chem; 2024 Aug; 111():108116. PubMed ID: 38823360

  • 6. UMPred-FRL: A New Approach for Accurate Prediction of Umami Peptides Using Feature Representation Learning.
    Charoenkwan P; Nantasenamat C; Hasan MM; Moni MA; Manavalan B; Shoombuatong W
    Int J Mol Sci; 2021 Dec; 22(23):. PubMed ID: 34884927

  • 7. Automatic text classification of actionable radiology reports of tinnitus patients using bidirectional encoder representations from transformer (BERT) and in-domain pre-training (IDPT).
    Li J; Lin Y; Zhao P; Liu W; Cai L; Sun J; Zhao L; Yang Z; Song H; Lv H; Wang Z
    BMC Med Inform Decis Mak; 2022 Jul; 22(1):200. PubMed ID: 35907966

  • 8. GT-Finder: Classify the family of glucose transporters with pre-trained BERT language models.
    Ali Shah SM; Taju SW; Ho QT; Nguyen TT; Ou YY
    Comput Biol Med; 2021 Apr; 131():104259. PubMed ID: 33581474

  • 9. FG-BERT: a generalized and self-supervised functional group-based molecular representation learning framework for properties prediction.
    Li B; Lin M; Chen T; Wang L
    Brief Bioinform; 2023 Sep; 24(6):. PubMed ID: 37930026

  • 10. PD-BertEDL: An Ensemble Deep Learning Method Using BERT and Multivariate Representation to Predict Peptide Detectability.
    Wang H; Wang J; Feng Z; Li Y; Zhao H
    Int J Mol Sci; 2022 Oct; 23(20):. PubMed ID: 36293242

  • 11. Transfer Learning for Sentiment Classification Using Bidirectional Encoder Representations from Transformers (BERT) Model.
    Areshey A; Mathkour H
    Sensors (Basel); 2023 May; 23(11):. PubMed ID: 37299959

  • 12. BERT-5mC: an interpretable model for predicting 5-methylcytosine sites of DNA based on BERT.
    Wang S; Liu Y; Liu Y; Zhang Y; Zhu X
    PeerJ; 2023; 11():e16600. PubMed ID: 38089911

  • 13. TRP-BERT: Discrimination of transient receptor potential (TRP) channels using contextual representations from deep bidirectional transformer based on BERT.
    Ali Shah SM; Ou YY
    Comput Biol Med; 2021 Oct; 137():104821. PubMed ID: 34508974

  • 14. BERT-PPII: The Polyproline Type II Helix Structure Prediction Model Based on BERT and Multichannel CNN.
    Feng C; Wang Z; Li G; Yang X; Wu N; Wang L
    Biomed Res Int; 2022; 2022():9015123. PubMed ID: 36060139

  • 15. Modified Bidirectional Encoder Representations From Transformers Extractive Summarization Model for Hospital Information Systems Based on Character-Level Tokens (AlphaBERT): Development and Performance Evaluation.
    Chen YP; Chen YY; Lin JJ; Huang CH; Lai F
    JMIR Med Inform; 2020 Apr; 8(4):e17787. PubMed ID: 32347806

  • 16. Med-BERT: pretrained contextualized embeddings on large-scale structured electronic health records for disease prediction.
    Rasmy L; Xiang Y; Xie Z; Tao C; Zhi D
    NPJ Digit Med; 2021 May; 4(1):86. PubMed ID: 34017034

  • 17. A transformer architecture based on BERT and 2D convolutional neural network to identify DNA enhancers from sequence information.
    Le NQK; Ho QT; Nguyen TT; Ou YY
    Brief Bioinform; 2021 Sep; 22(5):. PubMed ID: 33539511

  • 18. MRM-BERT: a novel deep neural network predictor of multiple RNA modifications by fusing BERT representation and sequence features.
    Wang L; Zhou Y
    RNA Biol; 2024 Jan; 21(1):1-10. PubMed ID: 38357904

  • 19. BERT-Kcr: prediction of lysine crotonylation sites by a transfer learning method with pre-trained BERT models.
    Qiao Y; Zhu X; Gong H
    Bioinformatics; 2022 Jan; 38(3):648-654. PubMed ID: 34643684

  • 20. BERT-Promoter: An improved sequence-based predictor of DNA promoter using BERT pre-trained model and SHAP feature selection.
    Le NQK; Ho QT; Nguyen VN; Chang JS
    Comput Biol Chem; 2022 Aug; 99():107732. PubMed ID: 35863177
