153 related articles for the article with PubMed ID 33954248
1. Knowledge distillation in deep learning and its applications. Alkhulaifi A; Alsahli F; Ahmad I. PeerJ Comput Sci. 2021;7:e474. PubMed ID: 33954248
2. Mitigating carbon footprint for knowledge distillation based deep learning model compression. Rafat K; Islam S; Mahfug AA; Hossain MI; Rahman F; Momen S; Rahman S; Mohammed N. PLoS One. 2023;18(5):e0285668. PubMed ID: 37186614
3. Teacher-student complementary sample contrastive distillation. Bao Z; Huang Z; Gou J; Du L; Liu K; Zhou J; Chen Y. Neural Netw. 2024 Feb;170:176-189. PubMed ID: 37989039
4. DCCD: Reducing Neural Network Redundancy via Distillation. Liu Y; Chen J; Liu Y. IEEE Trans Neural Netw Learn Syst. 2024 Jul;35(7):10006-10017. PubMed ID: 37022254
5. A 3DCNN-Based Knowledge Distillation Framework for Human Activity Recognition. Ullah H; Munir A. J Imaging. 2023 Apr;9(4). PubMed ID: 37103233
6. Generalized Knowledge Distillation via Relationship Matching. Ye HJ; Lu S; Zhan DC. IEEE Trans Pattern Anal Mach Intell. 2023 Feb;45(2):1817-1834. PubMed ID: 35298374
7. Knowledge Fusion Distillation: Improving Distillation with Multi-scale Attention Mechanisms. Li L; Su W; Liu F; He M; Liang X. Neural Process Lett. 2023 Jan:1-16. PubMed ID: 36619739
8. Resolution-Aware Knowledge Distillation for Efficient Inference. Feng Z; Lai J; Xie X. IEEE Trans Image Process. 2021;30:6985-6996. PubMed ID: 34347598
9. Classification of diabetic retinopathy using unlabeled data and knowledge distillation. Abbasi S; Hajabdollahi M; Khadivi P; Karimi N; Roshandel R; Shirani S; Samavi S. Artif Intell Med. 2021 Nov;121:102176. PubMed ID: 34763798
10. LAD: Layer-Wise Adaptive Distillation for BERT Model Compression. Lin YJ; Chen KY; Kao HY. Sensors (Basel). 2023 Jan;23(3). PubMed ID: 36772523
11. Complementary label learning based on knowledge distillation. Ying P; Li Z; Sun R; Xu X. Math Biosci Eng. 2023 Sep;20(10):17905-17918. PubMed ID: 38052542
12. Adversarial Distillation for Learning with Privileged Provisions. Wang X; Zhang R; Sun Y; Qi J. IEEE Trans Pattern Anal Mach Intell. 2021 Mar;43(3):786-797. PubMed ID: 31545712
13. Learning From Human Educational Wisdom: A Student-Centered Knowledge Distillation Method. Yang S; Yang J; Zhou M; Huang Z; Zheng WS; Yang X; Ren J. IEEE Trans Pattern Anal Mach Intell. 2024 Jun;46(6):4188-4205. PubMed ID: 38227419
14. Joint learning method with teacher-student knowledge distillation for on-device breast cancer image classification. Sepahvand M; Abdali-Mohammadi F. Comput Biol Med. 2023 Mar;155:106476. PubMed ID: 36841060
15. Collaborative Knowledge Distillation via Multiknowledge Transfer. Gou J; Sun L; Yu B; Du L; Ramamohanarao K; Tao D. IEEE Trans Neural Netw Learn Syst. 2024 May;35(5):6718-6730. PubMed ID: 36264723
16. Knowledge Distillation and Student-Teacher Learning for Visual Intelligence: A Review and New Outlooks. Wang L; Yoon KJ. IEEE Trans Pattern Anal Mach Intell. 2022 Jun;44(6):3048-3068. PubMed ID: 33513099