BIOMARKERS

Molecular Biopsy of Human Tumors: a resource for Precision Medicine

128 articles related to the article with PubMed ID 34752490

  • 1. A clinical specific BERT developed using a huge Japanese clinical text corpus.
    Kawazoe Y; Shibata D; Shinohara E; Aramaki E; Ohe K
    PLoS One; 2021; 16(11):e0259763. PubMed ID: 34752490

  • 2. Semantic Textual Similarity in Japanese Clinical Domain Texts Using BERT.
    Mutinda FW; Yada S; Wakamiya S; Aramaki E
    Methods Inf Med; 2021 Jun; 60(S 01):e56-e64. PubMed ID: 34237783

  • 3. Oversampling effect in pretraining for bidirectional encoder representations from transformers (BERT) to localize medical BERT and enhance biomedical BERT.
    Wada S; Takeda T; Okada K; Manabe S; Konishi S; Kamohara J; Matsumura Y
    Artif Intell Med; 2024 Jul; 153():102889. PubMed ID: 38728811

  • 4. COVID-Twitter-BERT: A natural language processing model to analyse COVID-19 content on Twitter.
    Müller M; Salathé M; Kummervold PE
    Front Artif Intell; 2023; 6():1023281. PubMed ID: 36998290

  • 5. Classifying the lifestyle status for Alzheimer's disease from clinical notes using deep learning with weak supervision.
    Shen Z; Schutte D; Yi Y; Bompelli A; Yu F; Wang Y; Zhang R
    BMC Med Inform Decis Mak; 2022 Jul; 22(Suppl 1):88. PubMed ID: 35799294

  • 6. A pre-trained BERT for Korean medical natural language processing.
    Kim Y; Kim JH; Lee JM; Jang MJ; Yum YJ; Kim S; Shin U; Kim YM; Joo HJ; Song S
    Sci Rep; 2022 Aug; 12(1):13847. PubMed ID: 35974113

  • 7. A Question-and-Answer System to Extract Data From Free-Text Oncological Pathology Reports (CancerBERT Network): Development Study.
    Mitchell JR; Szepietowski P; Howard R; Reisman P; Jones JD; Lewis P; Fridley BL; Rollison DE
    J Med Internet Res; 2022 Mar; 24(3):e27210. PubMed ID: 35319481

  • 8. BatteryBERT: A Pretrained Language Model for Battery Database Enhancement.
    Huang S; Cole JM
    J Chem Inf Model; 2022 Dec; 62(24):6365-6377. PubMed ID: 35533012

  • 9. Fine-Tuning Bidirectional Encoder Representations From Transformers (BERT)-Based Models on Large-Scale Electronic Health Record Notes: An Empirical Study.
    Li F; Jin Y; Liu W; Rawat BPS; Cai P; Yu H
    JMIR Med Inform; 2019 Sep; 7(3):e14830. PubMed ID: 31516126

  • 10. exKidneyBERT: a language model for kidney transplant pathology reports and the crucial role of extended vocabularies.
    Yang T; Sucholutsky I; Jen KY; Schonlau M
    PeerJ Comput Sci; 2024; 10():e1888. PubMed ID: 38435545

  • 11. BioBERT: a pre-trained biomedical language representation model for biomedical text mining.
    Lee J; Yoon W; Kim S; Kim D; Kim S; So CH; Kang J
    Bioinformatics; 2020 Feb; 36(4):1234-1240. PubMed ID: 31501885

  • 12. Korean clinical entity recognition from diagnosis text using BERT.
    Kim YM; Lee TH
    BMC Med Inform Decis Mak; 2020 Sep; 20(Suppl 7):242. PubMed ID: 32998724

  • 13. Transformers-sklearn: a toolkit for medical language understanding with transformer-based models.
    Yang F; Wang X; Ma H; Li J
    BMC Med Inform Decis Mak; 2021 Jul; 21(Suppl 2):90. PubMed ID: 34330244

  • 14. Prediction of Personal Experience Tweets of Medication Use via Contextual Word Representations.
    Jiang K; Chen T; Calix RA; Bernard GR
    Annu Int Conf IEEE Eng Med Biol Soc; 2019 Jul; 2019():6093-6096. PubMed ID: 31947235

  • 15. Med-BERT: pretrained contextualized embeddings on large-scale structured electronic health records for disease prediction.
    Rasmy L; Xiang Y; Xie Z; Tao C; Zhi D
    NPJ Digit Med; 2021 May; 4(1):86. PubMed ID: 34017034

  • 16. BioBERT and Similar Approaches for Relation Extraction.
    Bhasuran B
    Methods Mol Biol; 2022; 2496():221-235. PubMed ID: 35713867

  • 17. Tweet Classification Toward Twitter-Based Disease Surveillance: New Data, Methods, and Evaluations.
    Wakamiya S; Morita M; Kano Y; Ohkuma T; Aramaki E
    J Med Internet Res; 2019 Feb; 21(2):e12783. PubMed ID: 30785407

  • 18. Comparison of pretrained transformer-based models for influenza and COVID-19 detection using social media text data in Saskatchewan, Canada.
    Tian Y; Zhang W; Duan L; McDonald W; Osgood N
    Front Digit Health; 2023; 5():1203874. PubMed ID: 37448834

  • 19. Adapting Bidirectional Encoder Representations from Transformers (BERT) to Assess Clinical Semantic Textual Similarity: Algorithm Development and Validation Study.
    Kades K; Sellner J; Koehler G; Full PM; Lai TYE; Kleesiek J; Maier-Hein KH
    JMIR Med Inform; 2021 Feb; 9(2):e22795. PubMed ID: 33533728

  • 20. Automatic text classification of actionable radiology reports of tinnitus patients using bidirectional encoder representations from transformer (BERT) and in-domain pre-training (IDPT).
    Li J; Lin Y; Zhao P; Liu W; Cai L; Sun J; Zhao L; Yang Z; Song H; Lv H; Wang Z
    BMC Med Inform Decis Mak; 2022 Jul; 22(1):200. PubMed ID: 35907966
