463 related articles for article (PubMed ID: 35161598)
1. Investigating the Efficient Use of Word Embedding with Neural-Topic Models for Interpretable Topics from Short Texts. Murakami R, Chakraborty B. Sensors (Basel). 2022 Jan;22(3). PubMed ID: 35161598
2. A Topic Recognition Method of News Text Based on Word Embedding Enhancement. Du Q, Li N, Liu W, Sun D, Yang S, Yue F. Comput Intell Neurosci. 2022;2022:4582480. PubMed ID: 35222628
3. Evaluation of clustering and topic modeling methods over health-related tweets and emails. Lossio-Ventura JA, Gonzales S, Morzan J, Alatrista-Salas H, Hernandez-Boussard T, Bian J. Artif Intell Med. 2021 Jul;117:102096. PubMed ID: 34127235
4. Short text topic modelling using local and global word-context semantic correlation. Kinariwala S, Deshmukh S. Multimed Tools Appl. 2023 Feb:1-23. PubMed ID: 36747894
5. Combining Knowledge Graph and Word Embeddings for Spherical Topic Modeling. Ennajari H, Bouguila N, Bentahar J. IEEE Trans Neural Netw Learn Syst. 2023 Jul;34(7):3609-3623. PubMed ID: 34559665
7. Impact of word embedding models on text analytics in deep learning environment: a review. Asudani DS, Nagwani NK, Singh P. Artif Intell Rev. 2023 Feb:1-81. PubMed ID: 36844886
8. Clinical Context-Aware Biomedical Text Summarization Using Deep Neural Network: Model Development and Validation. Afzal M, Alam F, Malik KM, Malik GM. J Med Internet Res. 2020 Oct;22(10):e19810. PubMed ID: 33095174
9. TopicBERT: A Topic-Enhanced Neural Language Model Fine-Tuned for Sentiment Classification. Zhou Y, Liao L, Gao Y, Wang R, Huang H. IEEE Trans Neural Netw Learn Syst. 2023 Jan;34(1):380-393. PubMed ID: 34357867
10. A Method of Short Text Representation Based on the Feature Probability Embedded Vector. Zhou W, Wang H, Sun H, Sun T. Sensors (Basel). 2019 Aug;19(17). PubMed ID: 31466389
11. Intelligent diagnosis with Chinese electronic medical records based on convolutional neural networks. Li X, Wang H, He H, Du J, Chen J, Wu J. BMC Bioinformatics. 2019 Feb;20(1):62. PubMed ID: 30709336
12. A new word embedding model integrated with medical knowledge for deep learning-based sentiment classification. Khine AH, Wettayaprasit W, Duangsuwan J. Artif Intell Med. 2024 Feb;148:102758. PubMed ID: 38325934
13. Extracting information and inferences from a large text corpus. Avasthi S, Chauhan R, Acharjya DP. Int J Inf Technol. 2023;15(1):435-445. PubMed ID: 36440061
14. Identifying health related occupations of Twitter users through word embedding and deep neural networks. Zainab K, Srivastava G, Mago V. BMC Bioinformatics. 2022 Sep;22(Suppl 10):630. PubMed ID: 36171569
15. Using topic-noise models to generate domain-specific topics across data sources. Churchill R, Singh L. Knowl Inf Syst. 2023;65(5):2159-2186. PubMed ID: 36683608
16. Enhancing clinical concept extraction with contextual embeddings. Si Y, Wang J, Xu H, Roberts K. J Am Med Inform Assoc. 2019 Nov;26(11):1297-1304. PubMed ID: 31265066
18. Identifying Medication-Related Intents From a Bidirectional Text Messaging Platform for Hypertension Management Using an Unsupervised Learning Approach: Retrospective Observational Pilot Study. Davoudi A, Lee NS, Luong T, Delaney T, Asch E, Chaiyachati K, Mowery D. J Med Internet Res. 2022 Jun;24(6):e36151. PubMed ID: 35767327