

BIOMARKERS

Molecular Biopsy of Human Tumors

- a resource for Precision Medicine

135 related articles for article (PubMed ID: 37128387)

  • 1. PathologyBERT - Pre-trained Vs. A New Transformer Language Model for Pathology Domain.
    Santos T; Tariq A; Das S; Vayalpati K; Smith GH; Trivedi H; Banerjee I
    AMIA Annu Symp Proc; 2022; 2022():962-971. PubMed ID: 37128387

  • 2. A comparative study of pre-trained language models for named entity recognition in clinical trial eligibility criteria from multiple corpora.
    Li J; Wei Q; Ghiasvand O; Chen M; Lobanov V; Weng C; Xu H
    BMC Med Inform Decis Mak; 2022 Sep; 22(Suppl 3):235. PubMed ID: 36068551

  • 3. BioBERT: a pre-trained biomedical language representation model for biomedical text mining.
    Lee J; Yoon W; Kim S; Kim D; Kim S; So CH; Kang J
    Bioinformatics; 2020 Feb; 36(4):1234-1240. PubMed ID: 31501885

  • 4. BioGPT: generative pre-trained transformer for biomedical text generation and mining.
    Luo R; Sun L; Xia Y; Qin T; Zhang S; Poon H; Liu TY
    Brief Bioinform; 2022 Nov; 23(6):. PubMed ID: 36156661

  • 5. CLIN-X: pre-trained language models and a study on cross-task transfer for concept extraction in the clinical domain.
    Lange L; Adel H; Strötgen J; Klakow D
    Bioinformatics; 2022 Jun; 38(12):3267-3274. PubMed ID: 35485748

  • 6. A Question-and-Answer System to Extract Data From Free-Text Oncological Pathology Reports (CancerBERT Network): Development Study.
    Mitchell JR; Szepietowski P; Howard R; Reisman P; Jones JD; Lewis P; Fridley BL; Rollison DE
    J Med Internet Res; 2022 Mar; 24(3):e27210. PubMed ID: 35319481

  • 7. exKidneyBERT: a language model for kidney transplant pathology reports and the crucial role of extended vocabularies.
    Yang T; Sucholutsky I; Jen KY; Schonlau M
    PeerJ Comput Sci; 2024; 10():e1888. PubMed ID: 38435545

  • 8. BioVAE: a pre-trained latent variable language model for biomedical text mining.
    Trieu HL; Miwa M; Ananiadou S
    Bioinformatics; 2022 Jan; 38(3):872-874. PubMed ID: 34636886

  • 9. Critical assessment of transformer-based AI models for German clinical notes.
    Lentzen M; Madan S; Lage-Rupprecht V; Kühnel L; Fluck J; Jacobs M; Mittermaier M; Witzenrath M; Brunecker P; Hofmann-Apitius M; Weber J; Fröhlich H
    JAMIA Open; 2022 Dec; 5(4):ooac087. PubMed ID: 36380848

  • 10. Discovering Thematically Coherent Biomedical Documents Using Contextualized Bidirectional Encoder Representations from Transformers-Based Clustering.
    Davagdorj K; Wang L; Li M; Pham VH; Ryu KH; Theera-Umpon N
    Int J Environ Res Public Health; 2022 May; 19(10):. PubMed ID: 35627429

  • 11. Localizing in-domain adaptation of transformer-based biomedical language models.
    Buonocore TM; Crema C; Redolfi A; Bellazzi R; Parimbelli E
    J Biomed Inform; 2023 Aug; 144():104431. PubMed ID: 37385327

  • 12. Predicting Semantic Similarity Between Clinical Sentence Pairs Using Transformer Models: Evaluation and Representational Analysis.
    Ormerod M; Martínez Del Rincón J; Devereux B
    JMIR Med Inform; 2021 May; 9(5):e23099. PubMed ID: 34037527

  • 13. Folic acid supplementation and malaria susceptibility and severity among people taking antifolate antimalarial drugs in endemic areas.
    Crider K; Williams J; Qi YP; Gutman J; Yeung L; Mai C; Finkelstein J; Mehta S; Pons-Duran C; Menéndez C; Moraleda C; Rogers L; Daniels K; Green P
    Cochrane Database Syst Rev; 2022 Feb; 2(2022):. PubMed ID: 36321557

  • 14. Transformer-based structuring of free-text radiology report databases.
    Nowak S; Biesner D; Layer YC; Theis M; Schneider H; Block W; Wulff B; Attenberger UI; Sifa R; Sprinkart AM
    Eur Radiol; 2023 Jun; 33(6):4228-4236. PubMed ID: 36905469

  • 15. On the Construction of Multilingual Corpora for Clinical Text Mining.
    Villena F; Eisenmann U; Knaup P; Dunstan J; Ganzinger M
    Stud Health Technol Inform; 2020 Jun; 270():347-351. PubMed ID: 32570404

  • 16. Transformers-sklearn: a toolkit for medical language understanding with transformer-based models.
    Yang F; Wang X; Ma H; Li J
    BMC Med Inform Decis Mak; 2021 Jul; 21(Suppl 2):90. PubMed ID: 34330244

  • 17. Investigation of improving the pre-training and fine-tuning of BERT model for biomedical relation extraction.
    Su P; Vijay-Shanker K
    BMC Bioinformatics; 2022 Apr; 23(1):120. PubMed ID: 35379166

  • 18. STonKGs: a sophisticated transformer trained on biomedical text and knowledge graphs.
    Balabin H; Hoyt CT; Birkenbihl C; Gyori BM; Bachman J; Kodamullil AT; Plöger PG; Hofmann-Apitius M; Domingo-Fernández D
    Bioinformatics; 2022 Mar; 38(6):1648-1656. PubMed ID: 34986221

  • 19. Transformer-based models for ICD-10 coding of death certificates with Portuguese text.
    Coutinho I; Martins B
    J Biomed Inform; 2022 Dec; 136():104232. PubMed ID: 36307020

  • 20. Ensemble of Deep Masked Language Models for Effective Named Entity Recognition in Health and Life Science Corpora.
    Naderi N; Knafou J; Copara J; Ruch P; Teodoro D
    Front Res Metr Anal; 2021; 6():689803. PubMed ID: 34870074
