

BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

122 related articles for article (PubMed ID: 38435559)

  • 1. Using Bidirectional Encoder Representations from Transformers (BERT) to predict criminal charges and sentences from Taiwanese court judgments.
    Peng YT; Lei CL
    PeerJ Comput Sci; 2024; 10():e1841. PubMed ID: 38435559

  • 2. Predicting Semantic Similarity Between Clinical Sentence Pairs Using Transformer Models: Evaluation and Representational Analysis.
    Ormerod M; Martínez Del Rincón J; Devereux B
    JMIR Med Inform; 2021 May; 9(5):e23099. PubMed ID: 34037527

  • 3. Adapting Bidirectional Encoder Representations from Transformers (BERT) to Assess Clinical Semantic Textual Similarity: Algorithm Development and Validation Study.
    Kades K; Sellner J; Koehler G; Full PM; Lai TYE; Kleesiek J; Maier-Hein KH
    JMIR Med Inform; 2021 Feb; 9(2):e22795. PubMed ID: 33533728

  • 4. BERT-GT: cross-sentence n-ary relation extraction with BERT and Graph Transformer.
    Lai PT; Lu Z
    Bioinformatics; 2021 Apr; 36(24):5678-5685. PubMed ID: 33416851

  • 5. Developing Artificial Intelligence Models for Extracting Oncologic Outcomes from Japanese Electronic Health Records.
    Araki K; Matsumoto N; Togo K; Yonemoto N; Ohki E; Xu L; Hasegawa Y; Satoh D; Takemoto R; Miyazaki T
    Adv Ther; 2023 Mar; 40(3):934-950. PubMed ID: 36547809

  • 6. Bidirectional Encoder Representations from Transformers in Radiology: A Systematic Review of Natural Language Processing Applications.
    Gorenstein L; Konen E; Green M; Klang E
    J Am Coll Radiol; 2024 Jun; 21(6):914-941. PubMed ID: 38302036

  • 7. Modified Bidirectional Encoder Representations From Transformers Extractive Summarization Model for Hospital Information Systems Based on Character-Level Tokens (AlphaBERT): Development and Performance Evaluation.
    Chen YP; Chen YY; Lin JJ; Huang CH; Lai F
    JMIR Med Inform; 2020 Apr; 8(4):e17787. PubMed ID: 32347806

  • 8. Multi-Label Classification in Patient-Doctor Dialogues With the RoBERTa-WWM-ext + CNN (Robustly Optimized Bidirectional Encoder Representations From Transformers Pretraining Approach With Whole Word Masking Extended Combining a Convolutional Neural Network) Model: Named Entity Study.
    Sun Y; Gao D; Shen X; Li M; Nan J; Zhang W
    JMIR Med Inform; 2022 Apr; 10(4):e35606. PubMed ID: 35451969

  • 9. Automatic text classification of actionable radiology reports of tinnitus patients using bidirectional encoder representations from transformer (BERT) and in-domain pre-training (IDPT).
    Li J; Lin Y; Zhao P; Liu W; Cai L; Sun J; Zhao L; Yang Z; Song H; Lv H; Wang Z
    BMC Med Inform Decis Mak; 2022 Jul; 22(1):200. PubMed ID: 35907966

  • 10. DR-BERT: A protein language model to annotate disordered regions.
    Nambiar A; Forsyth JM; Liu S; Maslov S
    Structure; 2024 Apr; ():. PubMed ID: 38701796

  • 11. Construction of a Multi-Label Classifier for Extracting Multiple Incident Factors From Medication Incident Reports in Residential Care Facilities: Natural Language Processing Approach.
    Kizaki H; Satoh H; Ebara S; Watabe S; Sawada Y; Imai S; Hori S
    JMIR Med Inform; 2024 Jul; 12():e58141. PubMed ID: 39042454

  • 12. RadBERT: Adapting Transformer-based Language Models to Radiology.
    Yan A; McAuley J; Lu X; Du J; Chang EY; Gentili A; Hsu CN
    Radiol Artif Intell; 2022 Jul; 4(4):e210258. PubMed ID: 35923376

  • 13. Deep Learning Approach for Negation and Speculation Detection for Automated Important Finding Flagging and Extraction in Radiology Report: Internal Validation and Technique Comparison Study.
    Weng KH; Liu CF; Chen CJ
    JMIR Med Inform; 2023 Apr; 11():e46348. PubMed ID: 37097731

  • 14. Predicting Patients' Satisfaction With Mental Health Drug Treatment Using Their Reviews: Unified Interchangeable Model Fusion Approach.
    Wang Y; Yu Y; Liu Y; Ma Y; Pang PC
    JMIR Ment Health; 2023 Dec; 10():e49894. PubMed ID: 38051580

  • 15. GT-Finder: Classify the family of glucose transporters with pre-trained BERT language models.
    Ali Shah SM; Taju SW; Ho QT; Nguyen TT; Ou YY
    Comput Biol Med; 2021 Apr; 131():104259. PubMed ID: 33581474

  • 16. Transfer Learning from BERT to Support Insertion of New Concepts into SNOMED CT.
    Liu H; Perl Y; Geller J
    AMIA Annu Symp Proc; 2019; 2019():1129-1138. PubMed ID: 32308910

  • 17. Knowledge Graph Completion for the Chinese Text of Cultural Relics Based on Bidirectional Encoder Representations from Transformers with Entity-Type Information.
    Zhang M; Geng G; Zeng S; Jia H
    Entropy (Basel); 2020 Oct; 22(10):. PubMed ID: 33286937

  • 18. RxBERT: Enhancing drug labeling text mining and analysis with AI language modeling.
    Wu L; Gray M; Dang O; Xu J; Fang H; Tong W
    Exp Biol Med (Maywood); 2023 Nov; 248(21):1937-1943. PubMed ID: 38166420

  • 19. BERT-Kgly: A Bidirectional Encoder Representations From Transformers (BERT)-Based Model for Predicting Lysine Glycation Site for Homo sapiens.
    Liu Y; Liu Y; Wang GA; Cheng Y; Bi S; Zhu X
    Front Bioinform; 2022; 2():834153. PubMed ID: 36304324

  • 20. A Natural Language Processing Model for COVID-19 Detection Based on Dutch General Practice Electronic Health Records by Using Bidirectional Encoder Representations From Transformers: Development and Validation Study.
    Homburg M; Meijer E; Berends M; Kupers T; Olde Hartman T; Muris J; de Schepper E; Velek P; Kuiper J; Berger M; Peters L
    J Med Internet Res; 2023 Oct; 25():e49944. PubMed ID: 37792444
