169 related articles for article (PubMed ID: 37989039)
1. Teacher-student complementary sample contrastive distillation. Bao Z, Huang Z, Gou J, Du L, Liu K, Zhou J, Chen Y. Neural Netw. 2024 Feb;170:176-189. PubMed ID: 37989039
2. Leveraging different learning styles for improved knowledge distillation in biomedical imaging. Niyaz U, Sambyal AS, Bathula DR. Comput Biol Med. 2024 Jan;168:107764. PubMed ID: 38056210
3. Restructuring the Teacher and Student in Self-Distillation. Zheng Y, Wang C, Tao C, Lin S, Qian J, Wu J. IEEE Trans Image Process. 2024;33:5551-5563. PubMed ID: 39316482
4. ScribSD+: Scribble-supervised medical image segmentation based on simultaneous multi-scale knowledge distillation and class-wise contrastive regularization. Qu Y, Lu T, Zhang S, Wang G. Comput Med Imaging Graph. 2024 Sep;116:102416. PubMed ID: 39018640
5. Multi-teacher knowledge distillation based on joint Guidance of Probe and Adaptive Corrector. Shang R, Li W, Zhu S, Jiao L, Li Y. Neural Netw. 2023 Jul;164:345-356. PubMed ID: 37163850
6. DCCD: Reducing Neural Network Redundancy via Distillation. Liu Y, Chen J, Liu Y. IEEE Trans Neural Netw Learn Syst. 2024 Jul;35(7):10006-10017. PubMed ID: 37022254
7. Learning With Privileged Multimodal Knowledge for Unimodal Segmentation. Chen C, Dou Q, Jin Y, Liu Q, Heng PA. IEEE Trans Med Imaging. 2022 Mar;41(3):621-632. PubMed ID: 34633927
8. Joint learning method with teacher-student knowledge distillation for on-device breast cancer image classification. Sepahvand M, Abdali-Mohammadi F. Comput Biol Med. 2023 Mar;155:106476. PubMed ID: 36841060
9. Pea-KD: Parameter-efficient and accurate Knowledge Distillation on BERT. Cho I, Kang U. PLoS One. 2022;17(2):e0263592. PubMed ID: 35180258
10. Collaborative Knowledge Distillation via Multiknowledge Transfer. Gou J, Sun L, Yu B, Du L, Ramamohanarao K, Tao D. IEEE Trans Neural Netw Learn Syst. 2024 May;35(5):6718-6730. PubMed ID: 36264723
11. Light-M: An efficient lightweight medical image segmentation framework for resource-constrained IoMT. Zhang Y, Chen Z, Yang X. Comput Biol Med. 2024 Mar;170:108088. PubMed ID: 38320339
14. Relation Knowledge Distillation by Auxiliary Learning for Object Detection. Wang H, Jia T, Wang Q, Zuo W. IEEE Trans Image Process. 2024;33:4796-4810. PubMed ID: 39186414
15. Complementary label learning based on knowledge distillation. Ying P, Li Z, Sun R, Xu X. Math Biosci Eng. 2023 Sep;20(10):17905-17918. PubMed ID: 38052542
16. Memory-Replay Knowledge Distillation. Wang J, Zhang P, Li Y. Sensors (Basel). 2021 Apr;21(8). PubMed ID: 33921068
17. On Representation Knowledge Distillation for Graph Neural Networks. Joshi CK, Liu F, Xun X, Lin J, Foo CS. IEEE Trans Neural Netw Learn Syst. 2024 Apr;35(4):4656-4667. PubMed ID: 36459610
18. CReg-KD: Model refinement via confidence regularized knowledge distillation for brain imaging. Yang Y, Guo X, Ye C, Xiang Y, Ma T. Med Image Anal. 2023 Oct;89:102916. PubMed ID: 37549611
19. Learning From Human Educational Wisdom: A Student-Centered Knowledge Distillation Method. Yang S, Yang J, Zhou M, Huang Z, Zheng WS, Yang X, Ren J. IEEE Trans Pattern Anal Mach Intell. 2024 Jun;46(6):4188-4205. PubMed ID: 38227419
20. Learning to Explore Distillability and Sparsability: A Joint Framework for Model Compression. Liu Y, Cao J, Li B, Hu W, Maybank S. IEEE Trans Pattern Anal Mach Intell. 2023 Mar;45(3):3378-3395. PubMed ID: 35731774