These tools will no longer be maintained as of December 31, 2024. The archived website can be found here. The PubMed4Hh GitHub repository can be found here. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors

- a resource for Precision Medicine

182 related articles for article (PubMed ID: 36945685)

  • 1. Bioformer: an efficient transformer language model for biomedical text mining.
    Fang L; Chen Q; Wei CH; Lu Z; Wang K
    ArXiv; 2023 Feb; ():. PubMed ID: 36945685

  • 2. BioBERT: a pre-trained biomedical language representation model for biomedical text mining.
    Lee J; Yoon W; Kim S; Kim D; Kim S; So CH; Kang J
    Bioinformatics; 2020 Feb; 36(4):1234-1240. PubMed ID: 31501885

  • 3. BioBERT and Similar Approaches for Relation Extraction.
    Bhasuran B
    Methods Mol Biol; 2022; 2496():221-235. PubMed ID: 35713867

  • 4. Benchmarking for biomedical natural language processing tasks with a domain specific ALBERT.
    Naseem U; Dunn AG; Khushi M; Kim J
    BMC Bioinformatics; 2022 Apr; 23(1):144. PubMed ID: 35448946

  • 5. Oversampling effect in pretraining for bidirectional encoder representations from transformers (BERT) to localize medical BERT and enhance biomedical BERT.
    Wada S; Takeda T; Okada K; Manabe S; Konishi S; Kamohara J; Matsumura Y
    Artif Intell Med; 2024 Jul; 153():102889. PubMed ID: 38728811

  • 6. Deep learning to refine the identification of high-quality clinical research articles from the biomedical literature: Performance evaluation.
    Lokker C; Bagheri E; Abdelkader W; Parrish R; Afzal M; Navarro T; Cotoi C; Germini F; Linkins L; Haynes RB; Chu L; Iorio A
    J Biomed Inform; 2023 Jun; 142():104384. PubMed ID: 37164244

  • 7. Fine-Tuning Bidirectional Encoder Representations From Transformers (BERT)-Based Models on Large-Scale Electronic Health Record Notes: An Empirical Study.
    Li F; Jin Y; Liu W; Rawat BPS; Cai P; Yu H
    JMIR Med Inform; 2019 Sep; 7(3):e14830. PubMed ID: 31516126

  • 8. Few-Shot Learning for Clinical Natural Language Processing Using Siamese Neural Networks: Algorithm Development and Validation Study.
    Oniani D; Chandrasekar P; Sivarajkumar S; Wang Y
    JMIR AI; 2023 May; 2():e44293. PubMed ID: 38875537

  • 9. BERT-based Ranking for Biomedical Entity Normalization.
    Ji Z; Wei Q; Xu H
    AMIA Jt Summits Transl Sci Proc; 2020; 2020():269-277. PubMed ID: 32477646

  • 10. Evaluation of GPT and BERT-based models on identifying protein-protein interactions in biomedical text.
    Rehana H; Çam NB; Basmaci M; Zheng J; Jemiyo C; He Y; Özgür A; Hur J
    ArXiv; 2023 Dec; ():. PubMed ID: 38764593

  • 11. Multi-label topic classification for COVID-19 literature with Bioformer.
    Fang L; Wang K
    ArXiv; 2022 Apr; ():. PubMed ID: 35441084

  • 12. Clinical concept extraction using transformers.
    Yang X; Bian J; Hogan WR; Wu Y
    J Am Med Inform Assoc; 2020 Dec; 27(12):1935-1942. PubMed ID: 33120431

  • 13. BioGPT: generative pre-trained transformer for biomedical text generation and mining.
    Luo R; Sun L; Xia Y; Qin T; Zhang S; Poon H; Liu TY
    Brief Bioinform; 2022 Nov; 23(6):. PubMed ID: 36156661

  • 14. BERT-GT: cross-sentence n-ary relation extraction with BERT and Graph Transformer.
    Lai PT; Lu Z
    Bioinformatics; 2021 Apr; 36(24):5678-5685. PubMed ID: 33416851

  • 15. When BERT meets Bilbo: a learning curve analysis of pretrained language model on disease classification.
    Li X; Yuan W; Peng D; Mei Q; Wang Y
    BMC Med Inform Decis Mak; 2022 Apr; 21(Suppl 9):377. PubMed ID: 35382811

  • 16. Do syntactic trees enhance Bidirectional Encoder Representations from Transformers (BERT) models for chemical-drug relation extraction?
    Tang A; Deléger L; Bossy R; Zweigenbaum P; Nédellec C
    Database (Oxford); 2022 Aug; 2022():. PubMed ID: 36006843

  • 17. Transformers-sklearn: a toolkit for medical language understanding with transformer-based models.
    Yang F; Wang X; Ma H; Li J
    BMC Med Inform Decis Mak; 2021 Jul; 21(Suppl 2):90. PubMed ID: 34330244

  • 18. AMMU: A survey of transformer-based biomedical pretrained language models.
    Kalyan KS; Rajasekharan A; Sangeetha S
    J Biomed Inform; 2022 Feb; 126():103982. PubMed ID: 34974190

  • 19. Contextual Word Embedding for Biomedical Knowledge Extraction: a Rapid Review and Case Study.
    Vithanage D; Yu P; Wang L; Deng C
    J Healthc Inform Res; 2024 Mar; 8(1):158-179. PubMed ID: 38273979

  • 20. BatteryDataExtractor: battery-aware text-mining software embedded with BERT models.
    Huang S; Cole JM
    Chem Sci; 2022 Oct; 13(39):11487-11495. PubMed ID: 36348711
