These tools will no longer be maintained as of December 31, 2024. An archived version of the website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service if you have questions.
180 related articles for article (PubMed ID: 37549611)
1. CReg-KD: Model refinement via confidence regularized knowledge distillation for brain imaging. Yang Y; Guo X; Ye C; Xiang Y; Ma T. Med Image Anal. 2023 Oct;89:102916. PubMed ID: 37549611.
2. MSKD: Structured knowledge distillation for efficient medical image segmentation. Zhao L; Qian X; Guo Y; Song J; Hou J; Gong J. Comput Biol Med. 2023 Sep;164:107284. PubMed ID: 37572439.
3. Exploring Generalizable Distillation for Efficient Medical Image Segmentation. Qi X; Wu Z; Zou W; Ren M; Gao Y; Sun M; Zhang S; Shan C; Sun Z. IEEE J Biomed Health Inform. 2024 Jul;28(7):4170-4183. PubMed ID: 38954557.
4. Memory-Replay Knowledge Distillation. Wang J; Zhang P; Li Y. Sensors (Basel). 2021 Apr;21(8). PubMed ID: 33921068.
5. Classification of Alzheimer's disease in MRI images using knowledge distillation framework: an investigation. Li Y; Luo J; Zhang J. Int J Comput Assist Radiol Surg. 2022 Jul;17(7):1235-1243. PubMed ID: 35633492.
6. Teacher-student complementary sample contrastive distillation. Bao Z; Huang Z; Gou J; Du L; Liu K; Zhou J; Chen Y. Neural Netw. 2024 Feb;170:176-189. PubMed ID: 37989039.
7. Light-M: An efficient lightweight medical image segmentation framework for resource-constrained IoMT. Zhang Y; Chen Z; Yang X. Comput Biol Med. 2024 Mar;170:108088. PubMed ID: 38320339.
8. MT4MTL-KD: A Multi-Teacher Knowledge Distillation Framework for Triplet Recognition. Gui S; Wang Z; Chen J; Zhou X; Zhang C; Cao Y. IEEE Trans Med Imaging. 2024 Apr;43(4):1628-1639. PubMed ID: 38127608.
9. Leveraging different learning styles for improved knowledge distillation in biomedical imaging. Niyaz U; Sambyal AS; Bathula DR. Comput Biol Med. 2024 Jan;168:107764. PubMed ID: 38056210.
10. Relation Knowledge Distillation by Auxiliary Learning for Object Detection. Wang H; Jia T; Wang Q; Zuo W. IEEE Trans Image Process. 2024;33:4796-4810. PubMed ID: 39186414.
11. Learning With Privileged Multimodal Knowledge for Unimodal Segmentation. Chen C; Dou Q; Jin Y; Liu Q; Heng PA. IEEE Trans Med Imaging. 2022 Mar;41(3):621-632. PubMed ID: 34633927.
12. EPANet-KD: Efficient progressive attention network for fine-grained provincial village classification via knowledge distillation. Zhang C; Liu C; Gong H; Teng J. PLoS One. 2024;19(2):e0298452. PubMed ID: 38359020.
13. Generalized fused group lasso regularized multi-task feature learning for predicting cognitive outcomes in Alzheimer's disease. Cao P; Liu X; Liu H; Yang J; Zhao D; Huang M; Zaiane O. Comput Methods Programs Biomed. 2018 Aug;162:19-45. PubMed ID: 29903486.
15. A single stage knowledge distillation network for brain tumor segmentation on limited MR image modalities. Choi Y; Al-Masni MA; Jung KJ; Yoo RE; Lee SY; Kim DH. Comput Methods Programs Biomed. 2023 Oct;240:107644. PubMed ID: 37307766.
16. FCKDNet: A Feature Condensation Knowledge Distillation Network for Semantic Segmentation. Yuan W; Lu X; Zhang R; Liu Y. Entropy (Basel). 2023 Jan;25(1). PubMed ID: 36673266.
17. Collaborative Knowledge Distillation via Multiknowledge Transfer. Gou J; Sun L; Yu B; Du L; Ramamohanarao K; Tao D. IEEE Trans Neural Netw Learn Syst. 2024 May;35(5):6718-6730. PubMed ID: 36264723.
18. An ultra-fast deep-learning-based dose engine for prostate VMAT via knowledge distillation framework with limited patient data. Tseng W; Liu H; Yang Y; Liu C; Lu B. Phys Med Biol. 2022 Dec;68(1). PubMed ID: 36533689.
19. RoS-KD: A Robust Stochastic Knowledge Distillation Approach for Noisy Medical Imaging. Jaiswal A; Ashutosh K; Rousseau JF; Peng Y; Wang Z; Ding Y. Proc IEEE Int Conf Data Min. 2022;2022:981-986. PubMed ID: 37038389.
20. Layer-Specific Knowledge Distillation for Class Incremental Semantic Segmentation. Wang Q; Wu Y; Yang L; Zuo W; Hu Q. IEEE Trans Image Process. 2024;33:1977-1989. PubMed ID: 38451756.