176 related articles for article (PubMed ID: 33635801)
1. Limitations of Transformers on Clinical Text Classification.
Gao S; Alawad M; Young MT; Gounley J; Schaefferkoetter N; Yoon HJ; Wu XC; Durbin EB; Doherty J; Stroup A; Coyle L; Tourassi G
IEEE J Biomed Health Inform; 2021 Sep; 25(9):3596-3607. PubMed ID: 33635801
2. When BERT meets Bilbo: a learning curve analysis of pretrained language model on disease classification.
Li X; Yuan W; Peng D; Mei Q; Wang Y
BMC Med Inform Decis Mak; 2022 Apr; 21(Suppl 9):377. PubMed ID: 35382811
3. Oversampling effect in pretraining for bidirectional encoder representations from transformers (BERT) to localize medical BERT and enhance biomedical BERT.
Wada S; Takeda T; Okada K; Manabe S; Konishi S; Kamohara J; Matsumura Y
Artif Intell Med; 2024 Jul; 153():102889. PubMed ID: 38728811
4. Stacked DeBERT: All attention in incomplete data for text classification.
Cunha Sergio G; Lee M
Neural Netw; 2021 Apr; 136():87-96. PubMed ID: 33453522
5. Classifying social determinants of health from unstructured electronic health records using deep learning-based natural language processing.
Han S; Zhang RF; Shi L; Richie R; Liu H; Tseng A; Quan W; Ryan N; Brent D; Tsui FR
J Biomed Inform; 2022 Mar; 127():103984. PubMed ID: 35007754
6. Multi-Label Classification in Patient-Doctor Dialogues With the RoBERTa-WWM-ext + CNN (Robustly Optimized Bidirectional Encoder Representations From Transformers Pretraining Approach With Whole Word Masking Extended Combining a Convolutional Neural Network) Model: Named Entity Study.
Sun Y; Gao D; Shen X; Li M; Nan J; Zhang W
JMIR Med Inform; 2022 Apr; 10(4):e35606. PubMed ID: 35451969
7. Chinese text classification by combining Chinese-BERTology-wwm and GCN.
Xu X; Chang Y; An J; Du Y
PeerJ Comput Sci; 2023; 9():e1544. PubMed ID: 37705631
8. Use of BERT (Bidirectional Encoder Representations from Transformers)-Based Deep Learning Method for Extracting Evidences in Chinese Radiology Reports: Development of a Computer-Aided Liver Cancer Diagnosis Framework.
Liu H; Zhang Z; Xu Y; Wang N; Huang Y; Yang Z; Jiang R; Chen H
J Med Internet Res; 2021 Jan; 23(1):e19689. PubMed ID: 33433395
9. A Question-and-Answer System to Extract Data From Free-Text Oncological Pathology Reports (CancerBERT Network): Development Study.
Mitchell JR; Szepietowski P; Howard R; Reisman P; Jones JD; Lewis P; Fridley BL; Rollison DE
J Med Internet Res; 2022 Mar; 24(3):e27210. PubMed ID: 35319481
10. Korean clinical entity recognition from diagnosis text using BERT.
Kim YM; Lee TH
BMC Med Inform Decis Mak; 2020 Sep; 20(Suppl 7):242. PubMed ID: 32998724
11. Transfer Learning from BERT to Support Insertion of New Concepts into SNOMED CT.
Liu H; Perl Y; Geller J
AMIA Annu Symp Proc; 2019; 2019():1129-1138. PubMed ID: 32308910
12. Unified Medical Language System resources improve sieve-based generation and Bidirectional Encoder Representations from Transformers (BERT)-based ranking for concept normalization.
Xu D; Gopale M; Zhang J; Brown K; Begoli E; Bethard S
J Am Med Inform Assoc; 2020 Oct; 27(10):1510-1519. PubMed ID: 32719838
13. Extracting comprehensive clinical information for breast cancer using deep learning methods.
Zhang X; Zhang Y; Zhang Q; Ren Y; Qiu T; Ma J; Sun Q
Int J Med Inform; 2019 Dec; 132():103985. PubMed ID: 31627032
14. Automatic text classification of actionable radiology reports of tinnitus patients using bidirectional encoder representations from transformer (BERT) and in-domain pre-training (IDPT).
Li J; Lin Y; Zhao P; Liu W; Cai L; Sun J; Zhao L; Yang Z; Song H; Lv H; Wang Z
BMC Med Inform Decis Mak; 2022 Jul; 22(1):200. PubMed ID: 35907966
15. Comparing deep learning architectures for sentiment analysis on drug reviews.
Colón-Ruiz C; Segura-Bedmar I
J Biomed Inform; 2020 Oct; 110():103539. PubMed ID: 32818665
16. Bidirectional Encoder Representations from Transformers in Radiology: A Systematic Review of Natural Language Processing Applications.
Gorenstein L; Konen E; Green M; Klang E
J Am Coll Radiol; 2024 Jun; 21(6):914-941. PubMed ID: 38302036
17. Automatic detection of actionable radiology reports using bidirectional encoder representations from transformers.
Nakamura Y; Hanaoka S; Nomura Y; Nakao T; Miki S; Watadani T; Yoshikawa T; Hayashi N; Abe O
BMC Med Inform Decis Mak; 2021 Sep; 21(1):262. PubMed ID: 34511100
18. Highly accurate classification of chest radiographic reports using a deep learning natural language model pre-trained on 3.8 million text reports.
Bressem KK; Adams LC; Gaudin RA; Tröltzsch D; Hamm B; Makowski MR; Schüle CY; Vahldiek JL; Niehues SM
Bioinformatics; 2021 Jan; 36(21):5255-5261. PubMed ID: 32702106
19. Identifying the Perceived Severity of Patient-Generated Telemedical Queries Regarding COVID: Developing and Evaluating a Transfer Learning-Based Solution.
Gatto J; Seegmiller P; Johnston G; Preum SM
JMIR Med Inform; 2022 Sep; 10(9):e37770. PubMed ID: 35981230
20. TRP-BERT: Discrimination of transient receptor potential (TRP) channels using contextual representations from deep bidirectional transformer based on BERT.
Ali Shah SM; Ou YY
Comput Biol Med; 2021 Oct; 137():104821. PubMed ID: 34508974