BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

168 related articles for article (PubMed ID: 33638635)

  • 21. BERT-5mC: an interpretable model for predicting 5-methylcytosine sites of DNA based on BERT.
    Wang S; Liu Y; Liu Y; Zhang Y; Zhu X
    PeerJ; 2023; 11():e16600. PubMed ID: 38089911

  • 22. BERT-Promoter: An improved sequence-based predictor of DNA promoter using BERT pre-trained model and SHAP feature selection.
    Le NQK; Ho QT; Nguyen VN; Chang JS
    Comput Biol Chem; 2022 Aug; 99():107732. PubMed ID: 35863177

  • 23. BERT-GT: cross-sentence n-ary relation extraction with BERT and Graph Transformer.
    Lai PT; Lu Z
    Bioinformatics; 2021 Apr; 36(24):5678-5685. PubMed ID: 33416851

  • 24. ToxIBTL: prediction of peptide toxicity based on information bottleneck and transfer learning.
    Wei L; Ye X; Sakurai T; Mu Z; Wei L
    Bioinformatics; 2022 Mar; 38(6):1514-1524. PubMed ID: 34999757

  • 25. PEPred-Suite: improved and robust prediction of therapeutic peptides using adaptive feature representation learning.
    Wei L; Zhou C; Su R; Zou Q
    Bioinformatics; 2019 Nov; 35(21):4272-4280. PubMed ID: 30994882

  • 26. Modified Bidirectional Encoder Representations From Transformers Extractive Summarization Model for Hospital Information Systems Based on Character-Level Tokens (AlphaBERT): Development and Performance Evaluation.
    Chen YP; Chen YY; Lin JJ; Huang CH; Lai F
    JMIR Med Inform; 2020 Apr; 8(4):e17787. PubMed ID: 32347806

  • 27. GT-Finder: Classify the family of glucose transporters with pre-trained BERT language models.
    Ali Shah SM; Taju SW; Ho QT; Nguyen TT; Ou YY
    Comput Biol Med; 2021 Apr; 131():104259. PubMed ID: 33581474

  • 28. EnAMP: A novel deep learning ensemble antibacterial peptide recognition algorithm based on multi-features.
    Zhuang J; Gao W; Su R
    J Bioinform Comput Biol; 2024 Feb; 22(1):2450001. PubMed ID: 38406833

  • 29. StackIL6: a stacking ensemble model for improving the prediction of IL-6 inducing peptides.
    Charoenkwan P; Chiangjong W; Nantasenamat C; Hasan MM; Manavalan B; Shoombuatong W
    Brief Bioinform; 2021 Nov; 22(6):. PubMed ID: 33963832

  • 30. FAD-BERT: Improved prediction of FAD binding sites using pre-training of deep bidirectional transformers.
    Ho QT; Nguyen TT; Khanh Le NQ; Ou YY
    Comput Biol Med; 2021 Apr; 131():104258. PubMed ID: 33601085

  • 31. Automatic detection of actionable radiology reports using bidirectional encoder representations from transformers.
    Nakamura Y; Hanaoka S; Nomura Y; Nakao T; Miki S; Watadani T; Yoshikawa T; Hayashi N; Abe O
    BMC Med Inform Decis Mak; 2021 Sep; 21(1):262. PubMed ID: 34511100

  • 32. BERT-based Ranking for Biomedical Entity Normalization.
    Ji Z; Wei Q; Xu H
    AMIA Jt Summits Transl Sci Proc; 2020; 2020():269-277. PubMed ID: 32477646

  • 33. Use of BERT (Bidirectional Encoder Representations from Transformers)-Based Deep Learning Method for Extracting Evidences in Chinese Radiology Reports: Development of a Computer-Aided Liver Cancer Diagnosis Framework.
    Liu H; Zhang Z; Xu Y; Wang N; Huang Y; Yang Z; Jiang R; Chen H
    J Med Internet Res; 2021 Jan; 23(1):e19689. PubMed ID: 33433395

  • 34. NEPTUNE: A novel computational approach for accurate and large-scale identification of tumor homing peptides.
    Charoenkwan P; Schaduangrat N; Lio' P; Moni MA; Manavalan B; Shoombuatong W
    Comput Biol Med; 2022 Sep; 148():105700. PubMed ID: 35715261

  • 35. Adapting Bidirectional Encoder Representations from Transformers (BERT) to Assess Clinical Semantic Textual Similarity: Algorithm Development and Validation Study.
    Kades K; Sellner J; Koehler G; Full PM; Lai TYE; Kleesiek J; Maier-Hein KH
    JMIR Med Inform; 2021 Feb; 9(2):e22795. PubMed ID: 33533728

  • 36. NeuroPpred-SVM: A New Model for Predicting Neuropeptides Based on Embeddings of BERT.
    Liu Y; Wang S; Li X; Liu Y; Zhu X
    J Proteome Res; 2023 Mar; 22(3):718-728. PubMed ID: 36749151

  • 37. Using Bidirectional Encoder Representations from Transformers (BERT) to predict criminal charges and sentences from Taiwanese court judgments.
    Peng YT; Lei CL
    PeerJ Comput Sci; 2024; 10():e1841. PubMed ID: 38435559

  • 38. Extracting Multiple Worries From Breast Cancer Patient Blogs Using Multilabel Classification With the Natural Language Processing Model Bidirectional Encoder Representations From Transformers: Infodemiology Study of Blogs.
    Watanabe T; Yada S; Aramaki E; Yajima H; Kizaki H; Hori S
    JMIR Cancer; 2022 Jun; 8(2):e37840. PubMed ID: 35657664

  • 39. A novel sequence-based predictor for identifying and characterizing thermophilic proteins using estimated propensity scores of dipeptides.
    Charoenkwan P; Chotpatiwetchkul W; Lee VS; Nantasenamat C; Shoombuatong W
    Sci Rep; 2021 Dec; 11(1):23782. PubMed ID: 34893688

  • 40. Identify Bitter Peptides by Using Deep Representation Learning Features.
    Jiang J; Lin X; Jiang Y; Jiang L; Lv Z
    Int J Mol Sci; 2022 Jul; 23(14):. PubMed ID: 35887225
