3. Cross-modal knowledge distillation for continuous sign language recognition. Gao L; Shi P; Hu L; Feng J; Zhu L; Wan L; Feng W. Neural Netw; 2024 Nov; 179:106587. PubMed ID: 39111160
4. Restructuring the Teacher and Student in Self-Distillation. Zheng Y; Wang C; Tao C; Lin S; Qian J; Wu J. IEEE Trans Image Process; 2024; 33:5551-5563. PubMed ID: 39316482
5. Cross-View Gait Recognition Method Based on Multi-Teacher Joint Knowledge Distillation. Li R; Yun L; Zhang M; Yang Y; Cheng F. Sensors (Basel); 2023 Nov; 23(22). PubMed ID: 38005675
6. Human Activity Recognition Using Cascaded Dual Attention CNN and Bi-Directional GRU Framework. Ullah H; Munir A. J Imaging; 2023 Jun; 9(7). PubMed ID: 37504807
7. Relation Knowledge Distillation by Auxiliary Learning for Object Detection. Wang H; Jia T; Wang Q; Zuo W. IEEE Trans Image Process; 2024; 33:4796-4810. PubMed ID: 39186414
8. A General Dynamic Knowledge Distillation Method for Visual Analytics. Tu Z; Liu X; Xiao X. IEEE Trans Image Process; 2022 Oct. PubMed ID: 36227819
9. DCCD: Reducing Neural Network Redundancy via Distillation. Liu Y; Chen J; Liu Y. IEEE Trans Neural Netw Learn Syst; 2024 Jul; 35(7):10006-10017. PubMed ID: 37022254
10. LHAR: Lightweight Human Activity Recognition on Knowledge Distillation. Deng S; Chen J; Teng D; Yang C; Chen D; Jia T; Wang H. IEEE J Biomed Health Inform; 2024 Nov; 28(11):6318-6328. PubMed ID: 37494155
11. Light-M: An efficient lightweight medical image segmentation framework for resource-constrained IoMT. Zhang Y; Chen Z; Yang X. Comput Biol Med; 2024 Mar; 170:108088. PubMed ID: 38320339
14. SSD-KD: A self-supervised diverse knowledge distillation method for lightweight skin lesion classification using dermoscopic images. Wang Y; Wang Y; Cai J; Lee TK; Miao C; Wang ZJ. Med Image Anal; 2023 Feb; 84:102693. PubMed ID: 36462373
15. Multistage feature fusion knowledge distillation. Li G; Wang K; Lv P; He P; Zhou Z; Xu C. Sci Rep; 2024 Jun; 14(1):13373. PubMed ID: 38862547
16. Self-architectural knowledge distillation for spiking neural networks. Qiu H; Ning M; Song Z; Fang W; Chen Y; Sun T; Ma Z; Yuan L; Tian Y. Neural Netw; 2024 Oct; 178:106475. PubMed ID: 38941738
17. Real-Time Correlation Tracking via Joint Model Compression and Transfer. Wang N; Zhou W; Song Y; Ma C; Li H. IEEE Trans Image Process; 2020 Apr. PubMed ID: 32356748
18. MSKD: Structured knowledge distillation for efficient medical image segmentation. Zhao L; Qian X; Guo Y; Song J; Hou J; Gong J. Comput Biol Med; 2023 Sep; 164:107284. PubMed ID: 37572439
19. A deep learning knowledge distillation framework using knee MRI and arthroscopy data for meniscus tear detection. Ying M; Wang Y; Yang K; Wang H; Liu X. Front Bioeng Biotechnol; 2023; 11:1326706. PubMed ID: 38292305
20. An ultra-fast deep-learning-based dose engine for prostate VMAT via knowledge distillation framework with limited patient data. Tseng W; Liu H; Yang Y; Liu C; Lu B. Phys Med Biol; 2022 Dec; 68(1). PubMed ID: 36533689