175 related articles for article (PubMed ID: 34768246)
1. Efficient knowledge distillation for liver CT segmentation using growing assistant network. Xu P; Kim K; Koh J; Wu D; Lee YR; Park SY; Tak WY; Liu H; Li Q. Phys Med Biol; 2021 Nov; 66(23). PubMed ID: 34768246
2. G-MBRMD: Lightweight liver segmentation model based on guided teaching with multi-head boundary reconstruction mapping distillation. Huang B; Li H; Fujita H; Sun X; Fang Z; Wang H; Su B. Comput Biol Med; 2024 Aug; 178:108733. PubMed ID: 38897144
4. MSKD: Structured knowledge distillation for efficient medical image segmentation. Zhao L; Qian X; Guo Y; Song J; Hou J; Gong J. Comput Biol Med; 2023 Sep; 164:107284. PubMed ID: 37572439
5. Light-M: An efficient lightweight medical image segmentation framework for resource-constrained IoMT. Zhang Y; Chen Z; Yang X. Comput Biol Med; 2024 Mar; 170:108088. PubMed ID: 38320339
6. ABUS tumor segmentation via decouple contrastive knowledge distillation. Pan P; Li Y; Chen H; Sun J; Li X; Cheng L. Phys Med Biol; 2023 Dec; 69(1). PubMed ID: 38052091
7. FCKDNet: A Feature Condensation Knowledge Distillation Network for Semantic Segmentation. Yuan W; Lu X; Zhang R; Liu Y. Entropy (Basel); 2023 Jan; 25(1). PubMed ID: 36673266
8. Knowledge distillation with ensembles of convolutional neural networks for medical image segmentation. Noothout JMH; Lessmann N; van Eede MC; van Harten LD; Sogancioglu E; Heslinga FG; Veta M; van Ginneken B; IĆĄgum I. J Med Imaging (Bellingham); 2022 Sep; 9(5):052407. PubMed ID: 35692896
9. Leveraging different learning styles for improved knowledge distillation in biomedical imaging. Niyaz U; Sambyal AS; Bathula DR. Comput Biol Med; 2024 Jan; 168:107764. PubMed ID: 38056210
10. Resolution-Aware Knowledge Distillation for Efficient Inference. Feng Z; Lai J; Xie X. IEEE Trans Image Process; 2021; 30:6985-6996. PubMed ID: 34347598
11. A General Dynamic Knowledge Distillation Method for Visual Analytics. Tu Z; Liu X; Xiao X. IEEE Trans Image Process; 2022 Oct; PP (early access). PubMed ID: 36227819
12. Relation Knowledge Distillation by Auxiliary Learning for Object Detection. Wang H; Jia T; Wang Q; Zuo W. IEEE Trans Image Process; 2024; 33:4796-4810. PubMed ID: 39186414
14. Frameless Graph Knowledge Distillation. Shi D; Shao Z; Gao J; Wang Z; Guo Y. IEEE Trans Neural Netw Learn Syst; 2024 Sep; PP (early access). PubMed ID: 39231057
15. A 3DCNN-Based Knowledge Distillation Framework for Human Activity Recognition. Ullah H; Munir A. J Imaging; 2023 Apr; 9(4). PubMed ID: 37103233
16. Paced-curriculum distillation with prediction and label uncertainty for image segmentation. Islam M; Seenivasan L; Sharan SP; Viekash VK; Gupta B; Glocker B; Ren H. Int J Comput Assist Radiol Surg; 2023 Oct; 18(10):1875-1883. PubMed ID: 36862365
17. Learning With Privileged Multimodal Knowledge for Unimodal Segmentation. Chen C; Dou Q; Jin Y; Liu Q; Heng PA. IEEE Trans Med Imaging; 2022 Mar; 41(3):621-632. PubMed ID: 34633927
18. Knowledge Distillation for Semantic Segmentation Using Channel and Spatial Correlations and Adaptive Cross Entropy. Park S; Heo YS. Sensors (Basel); 2020 Aug; 20(16). PubMed ID: 32824456
19. Teacher-student complementary sample contrastive distillation. Bao Z; Huang Z; Gou J; Du L; Liu K; Zhou J; Chen Y. Neural Netw; 2024 Feb; 170:176-189. PubMed ID: 37989039
20. Adversarial Distillation for Learning with Privileged Provisions. Wang X; Zhang R; Sun Y; Qi J. IEEE Trans Pattern Anal Mach Intell; 2021 Mar; 43(3):786-797. PubMed ID: 31545712