These tools are no longer maintained as of December 31, 2024; an archived website and the PubMed4Hh GitHub repository remain available.


BIOMARKERS

Molecular Biopsy of Human Tumors

- a resource for Precision Medicine

283 related articles for article (PubMed ID: 33416851)

  • 1. BERT-GT: cross-sentence n-ary relation extraction with BERT and Graph Transformer.
    Lai PT; Lu Z
    Bioinformatics; 2021 Apr; 36(24):5678-5685. PubMed ID: 33416851

  • 2. Extracting comprehensive clinical information for breast cancer using deep learning methods.
    Zhang X; Zhang Y; Zhang Q; Ren Y; Qiu T; Ma J; Sun Q
    Int J Med Inform; 2019 Dec; 132():103985. PubMed ID: 31627032

  • 3. BioBERT and Similar Approaches for Relation Extraction.
    Bhasuran B
    Methods Mol Biol; 2022; 2496():221-235. PubMed ID: 35713867

  • 4. Predicting Semantic Similarity Between Clinical Sentence Pairs Using Transformer Models: Evaluation and Representational Analysis.
    Ormerod M; Martínez Del Rincón J; Devereux B
    JMIR Med Inform; 2021 May; 9(5):e23099. PubMed ID: 34037527

  • 5. Adapting Bidirectional Encoder Representations from Transformers (BERT) to Assess Clinical Semantic Textual Similarity: Algorithm Development and Validation Study.
    Kades K; Sellner J; Koehler G; Full PM; Lai TYE; Kleesiek J; Maier-Hein KH
    JMIR Med Inform; 2021 Feb; 9(2):e22795. PubMed ID: 33533728

  • 6. Incorporating representation learning and multihead attention to improve biomedical cross-sentence n-ary relation extraction.
    Zhao D; Wang J; Zhang Y; Wang X; Lin H; Yang Z
    BMC Bioinformatics; 2020 Jul; 21(1):312. PubMed ID: 32677883

  • 7. BioBERT: a pre-trained biomedical language representation model for biomedical text mining.
    Lee J; Yoon W; Kim S; Kim D; Kim S; So CH; Kang J
    Bioinformatics; 2020 Feb; 36(4):1234-1240. PubMed ID: 31501885

  • 8. Investigation of improving the pre-training and fine-tuning of BERT model for biomedical relation extraction.
    Su P; Vijay-Shanker K
    BMC Bioinformatics; 2022 Apr; 23(1):120. PubMed ID: 35379166

  • 9. Clinical concept extraction using transformers.
    Yang X; Bian J; Hogan WR; Wu Y
    J Am Med Inform Assoc; 2020 Dec; 27(12):1935-1942. PubMed ID: 33120431

  • 10. Benchmarking for biomedical natural language processing tasks with a domain specific ALBERT.
    Naseem U; Dunn AG; Khushi M; Kim J
    BMC Bioinformatics; 2022 Apr; 23(1):144. PubMed ID: 35448946

  • 11. Identify diabetic retinopathy-related clinical concepts and their attributes using transformer-based natural language processing methods.
    Yu Z; Yang X; Sweeting GL; Ma Y; Stolte SE; Fang R; Wu Y
    BMC Med Inform Decis Mak; 2022 Sep; 22(Suppl 3):255. PubMed ID: 36167551

  • 12. Bioformer: an efficient transformer language model for biomedical text mining.
    Fang L; Chen Q; Wei CH; Lu Z; Wang K
ArXiv; 2023 Feb. PubMed ID: 36945685

  • 13. N-GPETS: Neural Attention Graph-Based Pretrained Statistical Model for Extractive Text Summarization.
    Umair M; Alam I; Khan A; Khan I; Ullah N; Momand MY
    Comput Intell Neurosci; 2022; 2022():6241373. PubMed ID: 36458230

  • 14. A span-based joint model for extracting entities and relations of bacteria biotopes.
    Zuo M; Zhang Y
    Bioinformatics; 2021 Dec; 38(1):220-227. PubMed ID: 34398194

  • 15. Extracting biomedical relation from cross-sentence text using syntactic dependency graph attention network.
    Zhou X; Fu Q; Chen J; Liu L; Wang Y; Lu Y; Wu H
    J Biomed Inform; 2023 Aug; 144():104445. PubMed ID: 37467835

  • 16. DocR-BERT: Document-Level R-BERT for Chemical-Induced Disease Relation Extraction via Gaussian Probability Distribution.
    Li Z; Chen H; Qi R; Lin H; Chen H
    IEEE J Biomed Health Inform; 2022 Mar; 26(3):1341-1352. PubMed ID: 34591774

  • 17. Transformers-sklearn: a toolkit for medical language understanding with transformer-based models.
    Yang F; Wang X; Ma H; Li J
    BMC Med Inform Decis Mak; 2021 Jul; 21(Suppl 2):90. PubMed ID: 34330244

  • 18. Do syntactic trees enhance Bidirectional Encoder Representations from Transformers (BERT) models for chemical-drug relation extraction?
    Tang A; Deléger L; Bossy R; Zweigenbaum P; Nédellec C
    Database (Oxford); 2022 Aug; 2022():. PubMed ID: 36006843

  • 19. BioRED: a rich biomedical relation extraction dataset.
    Luo L; Lai PT; Wei CH; Arighi CN; Lu Z
    Brief Bioinform; 2022 Sep; 23(5):. PubMed ID: 35849818

  • 20. Multi-Label Classification in Patient-Doctor Dialogues With the RoBERTa-WWM-ext + CNN (Robustly Optimized Bidirectional Encoder Representations From Transformers Pretraining Approach With Whole Word Masking Extended Combining a Convolutional Neural Network) Model: Named Entity Study.
    Sun Y; Gao D; Shen X; Li M; Nan J; Zhang W
    JMIR Med Inform; 2022 Apr; 10(4):e35606. PubMed ID: 35451969
