

BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

120 related articles for article (PubMed ID: 38940177)

  • 1. MolLM: a unified language model for integrating biomedical text with 2D and 3D molecular representations.
    Tang X; Tran A; Tan J; Gerstein MB
    Bioinformatics; 2024 Jun; 40(Supplement_1):i357-i368. PubMed ID: 38940177

  • 2. BioBERT: a pre-trained biomedical language representation model for biomedical text mining.
    Lee J; Yoon W; Kim S; Kim D; Kim S; So CH; Kang J
    Bioinformatics; 2020 Feb; 36(4):1234-1240. PubMed ID: 31501885

  • 3. STonKGs: a sophisticated transformer trained on biomedical text and knowledge graphs.
    Balabin H; Hoyt CT; Birkenbihl C; Gyori BM; Bachman J; Kodamullil AT; Plöger PG; Hofmann-Apitius M; Domingo-Fernández D
    Bioinformatics; 2022 Mar; 38(6):1648-1656. PubMed ID: 34986221

  • 4. Evaluating sentence representations for biomedical text: Methods and experimental results.
    Tawfik NS; Spruit MR
    J Biomed Inform; 2020 Apr; 104():103396. PubMed ID: 32147441

  • 5. Deep contextualized embeddings for quantifying the informative content in biomedical text summarization.
    Moradi M; Dorffner G; Samwald M
    Comput Methods Programs Biomed; 2020 Feb; 184():105117. PubMed ID: 31627150

  • 6. Improved biomedical word embeddings in the transformer era.
    Noh J; Kavuluru R
    J Biomed Inform; 2021 Aug; 120():103867. PubMed ID: 34284119

  • 7. BioVAE: a pre-trained latent variable language model for biomedical text mining.
    Trieu HL; Miwa M; Ananiadou S
    Bioinformatics; 2022 Jan; 38(3):872-874. PubMed ID: 34636886

  • 8. LBERT: Lexically aware Transformer-based Bidirectional Encoder Representation model for learning universal bio-entity relations.
    Warikoo N; Chang YC; Hsu WL
    Bioinformatics; 2021 Apr; 37(3):404-412. PubMed ID: 32810217

  • 9. MedCPT: Contrastive Pre-trained Transformers with large-scale PubMed search logs for zero-shot biomedical information retrieval.
    Jin Q; Kim W; Chen Q; Comeau DC; Yeganova L; Wilbur WJ; Lu Z
    Bioinformatics; 2023 Nov; 39(11):. PubMed ID: 37930897

  • 10. SwinCross: Cross-modal Swin transformer for head-and-neck tumor segmentation in PET/CT images.
    Li GY; Chen J; Jang SI; Gong K; Li Q
    Med Phys; 2024 Mar; 51(3):2096-2107. PubMed ID: 37776263

  • 11. FTMMR: Fusion Transformer for Integrating Multiple Molecular Representations.
    Son YH; Shin DH; Kam TE
    IEEE J Biomed Health Inform; 2024 Jul; 28(7):4361-4372. PubMed ID: 38551824

  • 12. Discovering Thematically Coherent Biomedical Documents Using Contextualized Bidirectional Encoder Representations from Transformers-Based Clustering.
    Davagdorj K; Wang L; Li M; Pham VH; Ryu KH; Theera-Umpon N
    Int J Environ Res Public Health; 2022 May; 19(10):. PubMed ID: 35627429

  • 13. LMCrot: an enhanced protein crotonylation site predictor by leveraging an interpretable window-level embedding from a transformer-based protein language model.
    Pratyush P; Bahmani S; Pokharel S; Ismail HD; Kc DB
    Bioinformatics; 2024 May; 40(5):. PubMed ID: 38662579

  • 14. Towards Transfer Learning Techniques-BERT, DistilBERT, BERTimbau, and DistilBERTimbau for Automatic Text Classification from Different Languages: A Case Study.
    Silva Barbon R; Akabane AT
    Sensors (Basel); 2022 Oct; 22(21):. PubMed ID: 36365883

  • 15. A comparative study on deep learning models for text classification of unstructured medical notes with various levels of class imbalance.
    Lu H; Ehwerhemuepha L; Rakovski C
    BMC Med Res Methodol; 2022 Jul; 22(1):181. PubMed ID: 35780100

  • 16. A comparison of word embeddings for the biomedical natural language processing.
    Wang Y; Liu S; Afzal N; Rastegar-Mojarad M; Wang L; Shen F; Kingsbury P; Liu H
    J Biomed Inform; 2018 Nov; 87():12-20. PubMed ID: 30217670

  • 17. Identification of Semantically Similar Sentences in Clinical Notes: Iterative Intermediate Training Using Multi-Task Learning.
    Mahajan D; Poddar A; Liang JJ; Lin YT; Prager JM; Suryanarayanan P; Raghavan P; Tsou CH
    JMIR Med Inform; 2020 Nov; 8(11):e22508. PubMed ID: 33245284

  • 18. GIT-Mol: A multi-modal large language model for molecular science with graph, image, and text.
    Liu P; Ren Y; Tao J; Ren Z
    Comput Biol Med; 2024 Mar; 171():108073. PubMed ID: 38359660

  • 19. ProteinBERT: a universal deep-learning model of protein sequence and function.
    Brandes N; Ofer D; Peleg Y; Rappoport N; Linial M
    Bioinformatics; 2022 Apr; 38(8):2102-2110. PubMed ID: 35020807

  • 20. A clinical text classification paradigm using weak supervision and deep representation.
    Wang Y; Sohn S; Liu S; Shen F; Wang L; Atkinson EJ; Amin S; Liu H
    BMC Med Inform Decis Mak; 2019 Jan; 19(1):1. PubMed ID: 30616584
