116 related articles for article (PubMed ID: 39316482)
1. Restructuring the Teacher and Student in Self-Distillation. Zheng Y; Wang C; Tao C; Lin S; Qian J; Wu J. IEEE Trans Image Process; 2024; 33:5551-5563. PubMed ID: 39316482
2. STKD: Distilling Knowledge From Synchronous Teaching for Efficient Model Compression. Su T; Zhang J; Yu Z; Wang G; Liu X. IEEE Trans Neural Netw Learn Syst; 2023 Dec; 34(12):10051-10064. PubMed ID: 35420989
3. Teacher-student complementary sample contrastive distillation. Bao Z; Huang Z; Gou J; Du L; Liu K; Zhou J; Chen Y. Neural Netw; 2024 Feb; 170:176-189. PubMed ID: 37989039
4. DCCD: Reducing Neural Network Redundancy via Distillation. Liu Y; Chen J; Liu Y. IEEE Trans Neural Netw Learn Syst; 2024 Jul; 35(7):10006-10017. PubMed ID: 37022254
5. ResKD: Residual-Guided Knowledge Distillation. Li X; Li S; Omar B; Wu F; Li X. IEEE Trans Image Process; 2021; 30:4735-4746. PubMed ID: 33739924
6. MSKD: Structured knowledge distillation for efficient medical image segmentation. Zhao L; Qian X; Guo Y; Song J; Hou J; Gong J. Comput Biol Med; 2023 Sep; 164:107284. PubMed ID: 37572439
7. Memory-Replay Knowledge Distillation. Wang J; Zhang P; Li Y. Sensors (Basel); 2021 Apr; 21(8). PubMed ID: 33921068
8. On Representation Knowledge Distillation for Graph Neural Networks. Joshi CK; Liu F; Xun X; Lin J; Foo CS. IEEE Trans Neural Netw Learn Syst; 2024 Apr; 35(4):4656-4667. PubMed ID: 36459610
9. Cosine similarity-guided knowledge distillation for robust object detectors. Park S; Kang D; Paik J. Sci Rep; 2024 Aug; 14(1):18888. PubMed ID: 39143179
14. A 3DCNN-Based Knowledge Distillation Framework for Human Activity Recognition. Ullah H; Munir A. J Imaging; 2023 Apr; 9(4). PubMed ID: 37103233
15. Highlight Every Step: Knowledge Distillation via Collaborative Teaching. Zhao H; Sun X; Dong J; Chen C; Dong Z. IEEE Trans Cybern; 2022 Apr; 52(4):2070-2081. PubMed ID: 32721909
16. Knowledge Fusion Distillation: Improving Distillation with Multi-scale Attention Mechanisms. Li L; Su W; Liu F; He M; Liang X. Neural Process Lett; 2023 Jan; 1-16. PubMed ID: 36619739
17. Distilling a Powerful Student Model via Online Knowledge Distillation. Li S; Lin M; Wang Y; Wu Y; Tian Y; Shao L; Ji R. IEEE Trans Neural Netw Learn Syst; 2023 Nov; 34(11):8743-8752. PubMed ID: 35254994
18. Light-M: An efficient lightweight medical image segmentation framework for resource-constrained IoMT. Zhang Y; Chen Z; Yang X. Comput Biol Med; 2024 Mar; 170:108088. PubMed ID: 38320339
19. A deep learning knowledge distillation framework using knee MRI and arthroscopy data for meniscus tear detection. Ying M; Wang Y; Yang K; Wang H; Liu X. Front Bioeng Biotechnol; 2023; 11:1326706. PubMed ID: 38292305
20. Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution. Yang C; An Z; Cai L; Xu Y. IEEE Trans Neural Netw Learn Syst; 2024 Feb; 35(2):2094-2108. PubMed ID: 35820013