3. English-Chinese Machine Translation Based on Transfer Learning and Chinese-English Corpus. Xu B. Comput Intell Neurosci. 2022;2022:1563731. PubMed ID: 36203725
4. Quantum neural network based machine translator for Hindi to English. Narayan R; Singh VP; Chakraverty S. ScientificWorldJournal. 2014;2014:485737. PubMed ID: 24977198
5. The neural machine translation models for the low-resource Kazakh-English language pair. Karyukin V; Rakhimova D; Karibayeva A; Turganbayeva A; Turarbek A. PeerJ Comput Sci. 2023;9:e1224. PubMed ID: 37346576
6. Video captioning based on vision transformer and reinforcement learning. Zhao H; Chen Z; Guo L; Han Z. PeerJ Comput Sci. 2022;8:e916. PubMed ID: 35494808
7. Automatic generation of conclusions from neuroradiology MRI reports through natural language processing. López-Úbeda P; Martín-Noguerol T; Escartín J; Luna A. Neuroradiology. 2024 Apr;66(4):477-485. PubMed ID: 38381144
8. Efficient incremental training using a novel NMT-SMT hybrid framework for translation of low-resource languages. Bhuvaneswari K; Varalakshmi M. Front Artif Intell. 2024;7:1381290. PubMed ID: 39386916
9. Improving neural machine translation with POS-tag features for low-resource language pairs. Hlaing ZZ; Thu YK; Supnithi T; Netisopakul P. Heliyon. 2022 Aug;8(8):e10375. PubMed ID: 36033261
10. Beyond the Transformer: A Novel Polynomial Inherent Attention (PIA) Model and Its Great Impact on Neural Machine Translation. ELAffendi M; Alrajhi K. Comput Intell Neurosci. 2022;2022:1912750. PubMed ID: 36188704
11. CADxReport: Chest x-ray report generation using co-attention mechanism and reinforcement learning. Kaur N; Mittal A. Comput Biol Med. 2022 Jun;145:105498. PubMed ID: 35585727
12. Evaluating the performance of multilingual models in answer extraction and question generation. Moreno-Cediel A; Del-Hoyo-Gabaldon JA; Garcia-Lopez E; Garcia-Cabot A; de-Fitero-Dominguez D. Sci Rep. 2024 Jul;14(1):15477. PubMed ID: 38969767
13. Cross Encoder-Decoder Transformer with Global-Local Visual Extractor for Medical Image Captioning. Lee H; Cho H; Park J; Chae J; Kim J. Sensors (Basel). 2022 Feb;22(4). PubMed ID: 35214330
14. A Transformer-Based Neural Machine Translation Model for Arabic Dialects That Utilizes Subword Units. Baniata LH; Ampomah IKE; Park S. Sensors (Basel). 2021 Sep;21(19). PubMed ID: 34640835
15. Adoption of Wireless Network and Artificial Intelligence Algorithm in Chinese-English Tense Translation. Li X. Comput Intell Neurosci. 2022;2022:1662311. PubMed ID: 35726286
17. An intelligent Chatbot using deep learning with Bidirectional RNN and attention model. Dhyani M; Kumar R. Mater Today Proc. 2021;34:817-824. PubMed ID: 32837917
18. PET: Parameter-efficient Knowledge Distillation on Transformer. Jeon H; Park S; Kim JG; Kang U. PLoS One. 2023;18(7):e0288060. PubMed ID: 37410716
19. News headline generation based on improved decoder from transformer. Li Z; Wu J; Miao J; Yu X. Sci Rep. 2022 Jul;12(1):11648. PubMed ID: 35804183
20. Towards Lightweight Transformer Via Group-Wise Transformation for Vision-and-Language Tasks. Luo G; Zhou Y; Sun X; Wang Y; Cao L; Wu Y; Huang F; Ji R. IEEE Trans Image Process. 2022;31:3386-3398. PubMed ID: 35471883