130 related articles for article (PubMed ID: 35254994)
1. Distilling a Powerful Student Model via Online Knowledge Distillation. Li S; Lin M; Wang Y; Wu Y; Tian Y; Shao L; Ji R. IEEE Trans Neural Netw Learn Syst; 2023 Nov; 34(11):8743-8752. PubMed ID: 35254994
2. Restructuring the Teacher and Student in Self-Distillation. Zheng Y; Wang C; Tao C; Lin S; Qian J; Wu J. IEEE Trans Image Process; 2024; 33():5551-5563. PubMed ID: 39316482
3. STKD: Distilling Knowledge From Synchronous Teaching for Efficient Model Compression. Su T; Zhang J; Yu Z; Wang G; Liu X. IEEE Trans Neural Netw Learn Syst; 2023 Dec; 34(12):10051-10064. PubMed ID: 35420989
4. Knowledge distillation based on multi-layer fusion features. Tan S; Guo R; Tang J; Jiang N; Zou J. PLoS One; 2023; 18(8):e0285901. PubMed ID: 37639443
6. Knowledge Fusion Distillation: Improving Distillation with Multi-scale Attention Mechanisms. Li L; Su W; Liu F; He M; Liang X. Neural Process Lett; 2023 Jan; ():1-16. PubMed ID: 36619739
7. Block-Wise Partner Learning for Model Compression. Zhang X; Xie W; Li Y; Lei J; Jiang K; Fang L; Du Q. IEEE Trans Neural Netw Learn Syst; 2023 Sep; PP(). PubMed ID: 37656638
8. ResKD: Residual-Guided Knowledge Distillation. Li X; Li S; Omar B; Wu F; Li X. IEEE Trans Image Process; 2021; 30():4735-4746. PubMed ID: 33739924
9. DCCD: Reducing Neural Network Redundancy via Distillation. Liu Y; Chen J; Liu Y. IEEE Trans Neural Netw Learn Syst; 2024 Jul; 35(7):10006-10017. PubMed ID: 37022254
10. Leveraging different learning styles for improved knowledge distillation in biomedical imaging. Niyaz U; Sambyal AS; Bathula DR. Comput Biol Med; 2024 Jan; 168():107764. PubMed ID: 38056210
11. Comprehensive learning and adaptive teaching: Distilling multi-modal knowledge for pathological glioma grading. Xing X; Zhu M; Chen Z; Yuan Y. Med Image Anal; 2024 Jan; 91():102990. PubMed ID: 37864912
12. Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution. Yang C; An Z; Cai L; Xu Y. IEEE Trans Neural Netw Learn Syst; 2024 Feb; 35(2):2094-2108. PubMed ID: 35820013
13. Towards efficient network compression via Few-Shot Slimming. He J; Ding Y; Zhang M; Li D. Neural Netw; 2022 Mar; 147():113-125. PubMed ID: 34999388
14. Spot-Adaptive Knowledge Distillation. Song J; Chen Y; Ye J; Song M. IEEE Trans Image Process; 2022; 31():3359-3370. PubMed ID: 35503832
15. Relation Knowledge Distillation by Auxiliary Learning for Object Detection. Wang H; Jia T; Wang Q; Zuo W. IEEE Trans Image Process; 2024; 33():4796-4810. PubMed ID: 39186414
16. Online Knowledge Distillation via Mutual Contrastive Learning for Visual Recognition. Yang C; An Z; Zhou H; Zhuang F; Xu Y; Zhang Q. IEEE Trans Pattern Anal Mach Intell; 2023 Aug; 45(8):10212-10227. PubMed ID: 37030723
17. A class-incremental learning approach for learning feature-compatible embeddings. An H; Yang J; Zhang X; Ruan X; Wu Y; Li S; Hu J. Neural Netw; 2024 Dec; 180():106685. PubMed ID: 39243512
18. Efficient Crowd Counting via Dual Knowledge Distillation. Wang R; Hao Y; Hu L; Li X; Chen M; Miao Y; Humar I. IEEE Trans Image Process; 2023 Dec; PP(). PubMed ID: 38127611
19. Localization Distillation for Object Detection. Zheng Z; Ye R; Hou Q; Ren D; Wang P; Zuo W; Cheng MM. IEEE Trans Pattern Anal Mach Intell; 2023 Aug; 45(8):10070-10083. PubMed ID: 37027640
20. Self-Distillation: Towards Efficient and Compact Neural Networks. Zhang L; Bao C; Ma K. IEEE Trans Pattern Anal Mach Intell; 2022 Aug; 44(8):4388-4403. PubMed ID: 33735074