205 related articles for article (PubMed ID: 32721909)
1. Highlight Every Step: Knowledge Distillation via Collaborative Teaching. Zhao H; Sun X; Dong J; Chen C; Dong Z. IEEE Trans Cybern; 2022 Apr; 52(4):2070-2081. PubMed ID: 32721909
2. STKD: Distilling Knowledge From Synchronous Teaching for Efficient Model Compression. Su T; Zhang J; Yu Z; Wang G; Liu X. IEEE Trans Neural Netw Learn Syst; 2023 Dec; 34(12):10051-10064. PubMed ID: 35420989
3. DCCD: Reducing Neural Network Redundancy via Distillation. Liu Y; Chen J; Liu Y. IEEE Trans Neural Netw Learn Syst; 2024 Jul; 35(7):10006-10017. PubMed ID: 37022254
4. ResKD: Residual-Guided Knowledge Distillation. Li X; Li S; Omar B; Wu F; Li X. IEEE Trans Image Process; 2021; 30:4735-4746. PubMed ID: 33739924
5. Mitigating carbon footprint for knowledge distillation based deep learning model compression. Rafat K; Islam S; Mahfug AA; Hossain MI; Rahman F; Momen S; Rahman S; Mohammed N. PLoS One; 2023; 18(5):e0285668. PubMed ID: 37186614
8. Learning Student Networks via Feature Embedding. Chen H; Wang Y; Xu C; Xu C; Tao D. IEEE Trans Neural Netw Learn Syst; 2021 Jan; 32(1):25-35. PubMed ID: 32092018
9. Memory-Replay Knowledge Distillation. Wang J; Zhang P; Li Y. Sensors (Basel); 2021 Apr; 21(8). PubMed ID: 33921068
10. Restructuring the Teacher and Student in Self-Distillation. Zheng Y; Wang C; Tao C; Lin S; Qian J; Wu J. IEEE Trans Image Process; 2024; 33:5551-5563. PubMed ID: 39316482
11. A Novel Approach to Classifying Breast Cancer Histopathology Biopsy Images Using Bilateral Knowledge Distillation and Label Smoothing Regularization. Chaudhury S; Shelke N; Sau K; Prasanalakshmi B; Shabaz M. Comput Math Methods Med; 2021; 2021:4019358. PubMed ID: 34721657
13. Light-M: An efficient lightweight medical image segmentation framework for resource-constrained IoMT. Zhang Y; Chen Z; Yang X. Comput Biol Med; 2024 Mar; 170:108088. PubMed ID: 38320339
14. Multi-teacher knowledge distillation based on joint Guidance of Probe and Adaptive Corrector. Shang R; Li W; Zhu S; Jiao L; Li Y. Neural Netw; 2023 Jul; 164:345-356. PubMed ID: 37163850
15. Collaborative Knowledge Distillation via Multiknowledge Transfer. Gou J; Sun L; Yu B; Du L; Ramamohanarao K; Tao D. IEEE Trans Neural Netw Learn Syst; 2024 May; 35(5):6718-6730. PubMed ID: 36264723
16. MSKD: Structured knowledge distillation for efficient medical image segmentation. Zhao L; Qian X; Guo Y; Song J; Hou J; Gong J. Comput Biol Med; 2023 Sep; 164:107284. PubMed ID: 37572439
17. Graph Flow: Cross-Layer Graph Flow Distillation for Dual Efficient Medical Image Segmentation. Zou W; Qi X; Zhou W; Sun M; Sun Z; Shan C. IEEE Trans Med Imaging; 2023 Apr; 42(4):1159-1171. PubMed ID: 36423314
18. Leveraging different learning styles for improved knowledge distillation in biomedical imaging. Niyaz U; Sambyal AS; Bathula DR. Comput Biol Med; 2024 Jan; 168:107764. PubMed ID: 38056210
19. A New Framework of Collaborative Learning for Adaptive Metric Distillation. Liu H; Ye M; Wang Y; Zhao S; Li P; Shen J. IEEE Trans Neural Netw Learn Syst; 2024 Jun; 35(6):8266-8277. PubMed ID: 37022854
20. Teacher-student complementary sample contrastive distillation. Bao Z; Huang Z; Gou J; Du L; Liu K; Zhou J; Chen Y. Neural Netw; 2024 Feb; 170:176-189. PubMed ID: 37989039