These tools are no longer maintained as of December 31, 2024. An archived version of the website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors: a resource for Precision Medicine

355 related articles for article (PubMed ID: 34974190)

  • 1. AMMU: A survey of transformer-based biomedical pretrained language models.
    Kalyan KS; Rajasekharan A; Sangeetha S
    J Biomed Inform; 2022 Feb; 126():103982. PubMed ID: 34974190

  • 2. Clinical concept extraction using transformers.
    Yang X; Bian J; Hogan WR; Wu Y
    J Am Med Inform Assoc; 2020 Dec; 27(12):1935-1942. PubMed ID: 33120431

  • 3. Transformers-sklearn: a toolkit for medical language understanding with transformer-based models.
    Yang F; Wang X; Ma H; Li J
    BMC Med Inform Decis Mak; 2021 Jul; 21(Suppl 2):90. PubMed ID: 34330244

  • 4. BioBERT and Similar Approaches for Relation Extraction.
    Bhasuran B
    Methods Mol Biol; 2022; 2496():221-235. PubMed ID: 35713867

  • 5. BioBERT: a pre-trained biomedical language representation model for biomedical text mining.
    Lee J; Yoon W; Kim S; Kim D; Kim S; So CH; Kang J
    Bioinformatics; 2020 Feb; 36(4):1234-1240. PubMed ID: 31501885

  • 6. Few-Shot Learning for Clinical Natural Language Processing Using Siamese Neural Networks: Algorithm Development and Validation Study.
    Oniani D; Chandrasekar P; Sivarajkumar S; Wang Y
    JMIR AI; 2023 May; 2():e44293. PubMed ID: 38875537

  • 7. KEBLM: Knowledge-Enhanced Biomedical Language Models.
    Lai TM; Zhai C; Ji H
    J Biomed Inform; 2023 Jul; 143():104392. PubMed ID: 37211194

  • 8. Oversampling effect in pretraining for bidirectional encoder representations from transformers (BERT) to localize medical BERT and enhance biomedical BERT.
    Wada S; Takeda T; Okada K; Manabe S; Konishi S; Kamohara J; Matsumura Y
    Artif Intell Med; 2024 Jul; 153():102889. PubMed ID: 38728811

  • 9. A comparative study of pretrained language models for long clinical text.
    Li Y; Wehbe RM; Ahmad FS; Wang H; Luo Y
    J Am Med Inform Assoc; 2023 Jan; 30(2):340-347. PubMed ID: 36451266

  • 10. Bioformer: an efficient transformer language model for biomedical text mining.
    Fang L; Chen Q; Wei CH; Lu Z; Wang K
ArXiv; 2023 Feb. PubMed ID: 36945685

  • 11. RadBERT: Adapting Transformer-based Language Models to Radiology.
    Yan A; McAuley J; Lu X; Du J; Chang EY; Gentili A; Hsu CN
    Radiol Artif Intell; 2022 Jul; 4(4):e210258. PubMed ID: 35923376

  • 12. Do syntactic trees enhance Bidirectional Encoder Representations from Transformers (BERT) models for chemical-drug relation extraction?
    Tang A; Deléger L; Bossy R; Zweigenbaum P; Nédellec C
    Database (Oxford); 2022 Aug; 2022():. PubMed ID: 36006843

  • 13. Disease Concept-Embedding Based on the Self-Supervised Method for Medical Information Extraction from Electronic Health Records and Disease Retrieval: Algorithm Development and Validation Study.
    Chen YP; Lo YH; Lai F; Huang CH
    J Med Internet Res; 2021 Jan; 23(1):e25113. PubMed ID: 33502324

  • 14. Measurement of Semantic Textual Similarity in Clinical Texts: Comparison of Transformer-Based Models.
    Yang X; He X; Zhang H; Ma Y; Bian J; Wu Y
    JMIR Med Inform; 2020 Nov; 8(11):e19735. PubMed ID: 33226350

  • 15. Pretrained Transformer Language Models Versus Pretrained Word Embeddings for the Detection of Accurate Health Information on Arabic Social Media: Comparative Study.
    Albalawi Y; Nikolov NS; Buckley J
    JMIR Form Res; 2022 Jun; 6(6):e34834. PubMed ID: 35767322

  • 16. Improved biomedical word embeddings in the transformer era.
    Noh J; Kavuluru R
    J Biomed Inform; 2021 Aug; 120():103867. PubMed ID: 34284119

  • 17. Predicting Semantic Similarity Between Clinical Sentence Pairs Using Transformer Models: Evaluation and Representational Analysis.
    Ormerod M; Martínez Del Rincón J; Devereux B
    JMIR Med Inform; 2021 May; 9(5):e23099. PubMed ID: 34037527

  • 18. Benchmarking for biomedical natural language processing tasks with a domain specific ALBERT.
    Naseem U; Dunn AG; Khushi M; Kim J
    BMC Bioinformatics; 2022 Apr; 23(1):144. PubMed ID: 35448946

  • 19. Critical assessment of transformer-based AI models for German clinical notes.
    Lentzen M; Madan S; Lage-Rupprecht V; Kühnel L; Fluck J; Jacobs M; Mittermaier M; Witzenrath M; Brunecker P; Hofmann-Apitius M; Weber J; Fröhlich H
    JAMIA Open; 2022 Dec; 5(4):ooac087. PubMed ID: 36380848

  • 20. A review on Natural Language Processing Models for COVID-19 research.
    Hall K; Chang V; Jayne C
    Healthc Anal (N Y); 2022 Nov; 2():100078. PubMed ID: 37520621
