These tools are no longer maintained as of December 31, 2024. An archived version of the website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service with any questions.


BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

281 related articles for article (PubMed ID: 35863177)

  • 1. BERT-Promoter: An improved sequence-based predictor of DNA promoter using BERT pre-trained model and SHAP feature selection.
    Le NQK; Ho QT; Nguyen VN; Chang JS
    Comput Biol Chem; 2022 Aug; 99():107732. PubMed ID: 35863177

  • 2. A transformer architecture based on BERT and 2D convolutional neural network to identify DNA enhancers from sequence information.
    Le NQK; Ho QT; Nguyen TT; Ou YY
    Brief Bioinform; 2021 Sep; 22(5). PubMed ID: 33539511

  • 3. BERT-Kcr: prediction of lysine crotonylation sites by a transfer learning method with pre-trained BERT models.
    Qiao Y; Zhu X; Gong H
    Bioinformatics; 2022 Jan; 38(3):648-654. PubMed ID: 34643684

  • 4. GT-Finder: Classify the family of glucose transporters with pre-trained BERT language models.
    Ali Shah SM; Taju SW; Ho QT; Nguyen TT; Ou YY
    Comput Biol Med; 2021 Apr; 131():104259. PubMed ID: 33581474

  • 5. TRP-BERT: Discrimination of transient receptor potential (TRP) channels using contextual representations from deep bidirectional transformer based on BERT.
    Ali Shah SM; Ou YY
    Comput Biol Med; 2021 Oct; 137():104821. PubMed ID: 34508974

  • 6. Automatic detection of actionable radiology reports using bidirectional encoder representations from transformers.
    Nakamura Y; Hanaoka S; Nomura Y; Nakao T; Miki S; Watadani T; Yoshikawa T; Hayashi N; Abe O
    BMC Med Inform Decis Mak; 2021 Sep; 21(1):262. PubMed ID: 34511100

  • 7. Automatic text classification of actionable radiology reports of tinnitus patients using bidirectional encoder representations from transformer (BERT) and in-domain pre-training (IDPT).
    Li J; Lin Y; Zhao P; Liu W; Cai L; Sun J; Zhao L; Yang Z; Song H; Lv H; Wang Z
    BMC Med Inform Decis Mak; 2022 Jul; 22(1):200. PubMed ID: 35907966

  • 8. Identification of efflux proteins based on contextual representations with deep bidirectional transformer encoders.
    Taju SW; Shah SMA; Ou YY
    Anal Biochem; 2021 Nov; 633():114416. PubMed ID: 34656612

  • 9. Korean clinical entity recognition from diagnosis text using BERT.
    Kim YM; Lee TH
    BMC Med Inform Decis Mak; 2020 Sep; 20(Suppl 7):242. PubMed ID: 32998724

  • 10. Deep transformers and convolutional neural network in identifying DNA N6-methyladenine sites in cross-species genomes.
    Le NQK; Ho QT
    Methods; 2022 Aug; 204():199-206. PubMed ID: 34915158

  • 11. AMP-BERT: Prediction of antimicrobial peptide function based on a BERT model.
    Lee H; Lee S; Lee I; Nam H
    Protein Sci; 2023 Jan; 32(1):e4529. PubMed ID: 36461699

  • 12. When BERT meets Bilbo: a learning curve analysis of pretrained language model on disease classification.
    Li X; Yuan W; Peng D; Mei Q; Wang Y
    BMC Med Inform Decis Mak; 2022 Apr; 21(Suppl 9):377. PubMed ID: 35382811

  • 13. Highly accurate classification of chest radiographic reports using a deep learning natural language model pre-trained on 3.8 million text reports.
    Bressem KK; Adams LC; Gaudin RA; Tröltzsch D; Hamm B; Makowski MR; Schüle CY; Vahldiek JL; Niehues SM
    Bioinformatics; 2021 Jan; 36(21):5255-5261. PubMed ID: 32702106

  • 14. Combat COVID-19 infodemic using explainable natural language processing models.
    Ayoub J; Yang XJ; Zhou F
    Inf Process Manag; 2021 Jul; 58(4):102569. PubMed ID: 33776192

  • 15. BERT-TFBS: a novel BERT-based model for predicting transcription factor binding sites by transfer learning.
    Wang K; Zeng X; Zhou J; Liu F; Luan X; Wang X
    Brief Bioinform; 2024 Mar; 25(3). PubMed ID: 38701417

  • 16. Use of BERT (Bidirectional Encoder Representations from Transformers)-Based Deep Learning Method for Extracting Evidences in Chinese Radiology Reports: Development of a Computer-Aided Liver Cancer Diagnosis Framework.
    Liu H; Zhang Z; Xu Y; Wang N; Huang Y; Yang Z; Jiang R; Chen H
    J Med Internet Res; 2021 Jan; 23(1):e19689. PubMed ID: 33433395

  • 17. ActTRANS: Functional classification in active transport proteins based on transfer learning and contextual representations.
    Taju SW; Shah SMA; Ou YY
    Comput Biol Chem; 2021 Aug; 93():107537. PubMed ID: 34217007

  • 18. msBERT-Promoter: a multi-scale ensemble predictor based on BERT pre-trained model for the two-stage prediction of DNA promoters and their strengths.
    Li Y; Wei X; Yang Q; Xiong A; Li X; Zou Q; Cui F; Zhang Z
    BMC Biol; 2024 May; 22(1):126. PubMed ID: 38816885

  • 19. Arabic Syntactic Diacritics Restoration Using BERT Models.
    Nazih W; Hifny Y
    Comput Intell Neurosci; 2022; 2022():3214255. PubMed ID: 36348654

  • 20. BERT-5mC: an interpretable model for predicting 5-methylcytosine sites of DNA based on BERT.
    Wang S; Liu Y; Liu Y; Zhang Y; Zhu X
    PeerJ; 2023; 11():e16600. PubMed ID: 38089911
