BIOMARKERS: Molecular Biopsy of Human Tumors, a resource for Precision Medicine

33 articles related to the source article (PubMed ID: 32818666)

  • 1. Relation Extraction from Clinical Narratives Using Pre-trained Language Models.
    Wei Q; Ji Z; Si Y; Du J; Wang J; Tiryaki F; Wu S; Tao C; Roberts K; Xu H
    AMIA Annu Symp Proc; 2019; 2019():1236-1245. PubMed ID: 32308921

  • 2. Incorporating entity-level knowledge in pretrained language model for biomedical dense retrieval.
    Tan J; Hu J; Dong S
    Comput Biol Med; 2023 Nov; 166():107535. PubMed ID: 37788508

  • 3. List-wise learning to rank biomedical question-answer pairs with deep ranking recursive autoencoders.
    Yan Y; Zhang BW; Li XF; Liu Z
    PLoS One; 2020; 15(11):e0242061. PubMed ID: 33166367

  • 4. Training-Free Transformer Architecture Search With Zero-Cost Proxy Guided Evolution.
    Zhou Q; Sheng K; Zheng X; Li K; Tian Y; Chen J; Ji R
    IEEE Trans Pattern Anal Mach Intell; 2024 Mar; PP():. PubMed ID: 38502633

  • 5. Using Natural Language Processing to Extract and Classify Symptoms Among Patients with Thyroid Dysfunction.
    Hwang S; Reddy S; Wainwright K; Schriver E; Cappola A; Mowery D
    Stud Health Technol Inform; 2024 Jan; 310():614-618. PubMed ID: 38269882

  • 6. Enhancing psychiatric rehabilitation outcomes through a multimodal multitask learning model based on BERT and TabNet: An approach for personalized treatment and improved decision-making.
    Yang H; Zhu D; He S; Xu Z; Liu Z; Zhang W; Cai J
    Psychiatry Res; 2024 Jun; 336():115896. PubMed ID: 38626625

  • 7. Online Health Search Via Multidimensional Information Quality Assessment Based on Deep Language Models: Algorithm Development and Validation.
    Zhang B; Naderi N; Mishra R; Teodoro D
    JMIR AI; 2024 May; 3():e42630. PubMed ID: 38875551

  • 8. Linking Cancer Clinical Trials to their Result Publications.
    Pan E; Roberts K
    AMIA Jt Summits Transl Sci Proc; 2024; 2024():642-651. PubMed ID: 38827077

  • 9. Matching Patients to Clinical Trials with Large Language Models.
    Jin Q; Wang Z; Floudas CS; Chen F; Gong C; Bracken-Clarke D; Xue E; Yang Y; Sun J; Lu Z
    ArXiv; 2024 Apr; ():. PubMed ID: 37576126

  • 10. Correction: Large Language Models in Medicine.
    Ann Intern Med; 2024 Apr; 177(4):548. PubMed ID: 38498881

  • 11. A hybrid algorithm for clinical decision support in precision medicine based on machine learning.
    Zhang Z; Lin X; Wu S
    BMC Bioinformatics; 2023 Jan; 24(1):3. PubMed ID: 36597033

  • 12. Clinical trial search: Using biomedical language understanding models for re-ranking.
    Rybinski M; Xu J; Karimi S
    J Biomed Inform; 2020 Sep; 109():103530. PubMed ID: 32818666

  • 13. Oversampling effect in pretraining for bidirectional encoder representations from transformers (BERT) to localize medical BERT and enhance biomedical BERT.
    Wada S; Takeda T; Okada K; Manabe S; Konishi S; Kamohara J; Matsumura Y
    Artif Intell Med; 2024 Jul; 153():102889. PubMed ID: 38728811

  • 14. BioBERT and Similar Approaches for Relation Extraction.
    Bhasuran B
    Methods Mol Biol; 2022; 2496():221-235. PubMed ID: 35713867

  • 15. Bidirectional Encoder Representations from Transformers in Radiology: A Systematic Review of Natural Language Processing Applications.
    Gorenstein L; Konen E; Green M; Klang E
    J Am Coll Radiol; 2024 Jun; 21(6):914-941. PubMed ID: 38302036

  • 16. GT-Finder: Classify the family of glucose transporters with pre-trained BERT language models.
    Ali Shah SM; Taju SW; Ho QT; Nguyen TT; Ou YY
    Comput Biol Med; 2021 Apr; 131():104259. PubMed ID: 33581474

  • 17. AMMU: A survey of transformer-based biomedical pretrained language models.
    Kalyan KS; Rajasekharan A; Sangeetha S
    J Biomed Inform; 2022 Feb; 126():103982. PubMed ID: 34974190

