BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

306 related articles for article (PubMed ID: 36156661)

  • 1. BioGPT: generative pre-trained transformer for biomedical text generation and mining.
    Luo R; Sun L; Xia Y; Qin T; Zhang S; Poon H; Liu TY
    Brief Bioinform; 2022 Nov; 23(6):. PubMed ID: 36156661

  • 2. BioBERT: a pre-trained biomedical language representation model for biomedical text mining.
    Lee J; Yoon W; Kim S; Kim D; Kim S; So CH; Kang J
    Bioinformatics; 2020 Feb; 36(4):1234-1240. PubMed ID: 31501885

  • 3. BioVAE: a pre-trained latent variable language model for biomedical text mining.
    Trieu HL; Miwa M; Ananiadou S
    Bioinformatics; 2022 Jan; 38(3):872-874. PubMed ID: 34636886

  • 4. Evaluation of GPT and BERT-based models on identifying protein-protein interactions in biomedical text.
    Rehana H; Çam NB; Basmaci M; Zheng J; Jemiyo C; He Y; Özgür A; Hur J
    ArXiv; 2023 Dec; ():. PubMed ID: 38764593

  • 5. Leveraging pre-trained language models for mining microbiome-disease relationships.
    Karkera N; Acharya S; Palaniappan SK
    BMC Bioinformatics; 2023 Jul; 24(1):290. PubMed ID: 37468830

  • 6. Investigation of improving the pre-training and fine-tuning of BERT model for biomedical relation extraction.
    Su P; Vijay-Shanker K
    BMC Bioinformatics; 2022 Apr; 23(1):120. PubMed ID: 35379166

  • 7. Bioformer: an efficient transformer language model for biomedical text mining.
    Fang L; Chen Q; Wei CH; Lu Z; Wang K
    ArXiv; 2023 Feb; ():. PubMed ID: 36945685

  • 8. Discovering Thematically Coherent Biomedical Documents Using Contextualized Bidirectional Encoder Representations from Transformers-Based Clustering.
    Davagdorj K; Wang L; Li M; Pham VH; Ryu KH; Theera-Umpon N
    Int J Environ Res Public Health; 2022 May; 19(10):. PubMed ID: 35627429

  • 9. BioBERT and Similar Approaches for Relation Extraction.
    Bhasuran B
    Methods Mol Biol; 2022; 2496():221-235. PubMed ID: 35713867

  • 10. A Study of Biomedical Relation Extraction Using GPT Models.
    Zhang J; Wibert M; Zhou H; Peng X; Chen Q; Keloth VK; Hu Y; Zhang R; Xu H; Raja K
    AMIA Jt Summits Transl Sci Proc; 2024; 2024():391-400. PubMed ID: 38827097

  • 11. Benchmarking for biomedical natural language processing tasks with a domain specific ALBERT.
    Naseem U; Dunn AG; Khushi M; Kim J
    BMC Bioinformatics; 2022 Apr; 23(1):144. PubMed ID: 35448946

  • 12. A comparative study of pre-trained language models for named entity recognition in clinical trial eligibility criteria from multiple corpora.
    Li J; Wei Q; Ghiasvand O; Chen M; Lobanov V; Weng C; Xu H
    BMC Med Inform Decis Mak; 2022 Sep; 22(Suppl 3):235. PubMed ID: 36068551

  • 13. BactInt: A domain driven transfer learning approach for extracting inter-bacterial associations from biomedical text.
    Das Baksi K; Pokhrel V; Pudavar AE; Mande SS; Kuntal BK
    Comput Biol Chem; 2024 Apr; 109():108012. PubMed ID: 38198963

  • 14. Biomedical named entity recognition with the combined feature attention and fully-shared multi-task learning.
    Zhang Z; Chen ALP
    BMC Bioinformatics; 2022 Nov; 23(1):458. PubMed ID: 36329384

  • 15. Do syntactic trees enhance Bidirectional Encoder Representations from Transformers (BERT) models for chemical-drug relation extraction?
    Tang A; Deléger L; Bossy R; Zweigenbaum P; Nédellec C
    Database (Oxford); 2022 Aug; 2022():. PubMed ID: 36006843

  • 16. ChIP-GPT: a managed large language model for robust data extraction from biomedical database records.
    Cinquin O
    Brief Bioinform; 2024 Jan; 25(2):. PubMed ID: 38314912

  • 17. AMMU: A survey of transformer-based biomedical pretrained language models.
    Kalyan KS; Rajasekharan A; Sangeetha S
    J Biomed Inform; 2022 Feb; 126():103982. PubMed ID: 34974190

  • 18. A prefix and attention map discrimination fusion guided attention for biomedical named entity recognition.
    Guan Z; Zhou X
    BMC Bioinformatics; 2023 Feb; 24(1):42. PubMed ID: 36755230

  • 19. Biomedical generative pre-trained based transformer language model for age-related disease target discovery.
    Zagirova D; Pushkov S; Leung GHD; Liu BHM; Urban A; Sidorenko D; Kalashnikov A; Kozlova E; Naumov V; Pun FW; Ozerov IV; Aliper A; Zhavoronkov A
    Aging (Albany NY); 2023 Sep; 15(18):9293-9309. PubMed ID: 37742294

  • 20. STonKGs: a sophisticated transformer trained on biomedical text and knowledge graphs.
    Balabin H; Hoyt CT; Birkenbihl C; Gyori BM; Bachman J; Kodamullil AT; Plöger PG; Hofmann-Apitius M; Domingo-Fernández D
    Bioinformatics; 2022 Mar; 38(6):1648-1656. PubMed ID: 34986221
