Note: these tools are no longer maintained as of December 31, 2024. An archived version of the website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service with any questions.
122 related articles for the article with PubMed ID 37409081
1. A non-negative feedback self-distillation method for salient object detection. Chen L; Cao T; Zheng Y; Yang J; Wang Y; Wang Y; Zhang B. PeerJ Comput Sci; 2023; 9:e1435. PubMed ID: 37409081

2. Memory-Replay Knowledge Distillation. Wang J; Zhang P; Li Y. Sensors (Basel); 2021 Apr; 21(8). PubMed ID: 33921068

3. Structured Knowledge Distillation for Accurate and Efficient Object Detection. Zhang L; Ma K. IEEE Trans Pattern Anal Mach Intell; 2023 Dec; 45(12):15706-15724. PubMed ID: 37527292

4. Relation Knowledge Distillation by Auxiliary Learning for Object Detection. Wang H; Jia T; Wang Q; Zuo W. IEEE Trans Image Process; 2024; 33:4796-4810. PubMed ID: 39186414

5. EPANet-KD: Efficient progressive attention network for fine-grained provincial village classification via knowledge distillation. Zhang C; Liu C; Gong H; Teng J. PLoS One; 2024; 19(2):e0298452. PubMed ID: 38359020

6. Dynamic Knowledge Distillation with Noise Elimination for RGB-D Salient Object Detection. Ren G; Yu Y; Liu H; Stathaki T. Sensors (Basel); 2022 Aug; 22(16). PubMed ID: 36015947

7. Segmentation with mixed supervision: Confidence maximization helps knowledge distillation. Liu B; Desrosiers C; Ben Ayed I; Dolz J. Med Image Anal; 2023 Jan; 83:102670. PubMed ID: 36413905

8. FCKDNet: A Feature Condensation Knowledge Distillation Network for Semantic Segmentation. Yuan W; Lu X; Zhang R; Liu Y. Entropy (Basel); 2023 Jan; 25(1). PubMed ID: 36673266

9. NTCE-KD: Non-Target-Class-Enhanced Knowledge Distillation. Li C; Teng X; Ding Y; Lan L. Sensors (Basel); 2024 Jun; 24(11). PubMed ID: 38894408

10. A General Dynamic Knowledge Distillation Method for Visual Analytics. Tu Z; Liu X; Xiao X. IEEE Trans Image Process; 2022 Oct; PP. PubMed ID: 36227819

11. Person re-identification based on multi-branch visual transformer and self-distillation. Chen W; Yin K; Wu Y; Hu Y. Sci Prog; 2024; 107(1):368504231219172. PubMed ID: 38312037

12. WaveNet: Wavelet Network With Knowledge Distillation for RGB-T Salient Object Detection. Zhou W; Sun F; Jiang Q; Cong R; Hwang JN. IEEE Trans Image Process; 2023; 32:3027-3039. PubMed ID: 37192028

13. Fault detection in the distillation column process using Kullback-Leibler divergence. Aggoune L; Chetouani Y; Raïssi T. ISA Trans; 2016 Jul; 63:394-400. PubMed ID: 27020311

14. Restructuring the Teacher and Student in Self-Distillation. Zheng Y; Wang C; Tao C; Lin S; Qian J; Wu J. IEEE Trans Image Process; 2024; 33:5551-5563. PubMed ID: 39316482

15. Multi-teacher knowledge distillation based on joint Guidance of Probe and Adaptive Corrector. Shang R; Li W; Zhu S; Jiao L; Li Y. Neural Netw; 2023 Jul; 164:345-356. PubMed ID: 37163850

16. Pixel Distillation: Cost-Flexible Distillation Across Image Sizes and Heterogeneous Networks. Guo G; Zhang D; Han L; Liu N; Cheng MM; Han J. IEEE Trans Pattern Anal Mach Intell; 2024 Dec; 46(12):9536-9550. PubMed ID: 38949946

17. Deep Neural Network Self-Distillation Exploiting Data Representation Invariance. Xu TB; Liu CL. IEEE Trans Neural Netw Learn Syst; 2022 Jan; 33(1):257-269. PubMed ID: 33074828

18. Multilayer Semantic Features Adaptive Distillation for Object Detectors. Zhang Z; Liu J; Chen Y; Mei W; Huang F; Chen L. Sensors (Basel); 2023 Sep; 23(17). PubMed ID: 37688070

19. Knowledge Distillation for Semantic Segmentation Using Channel and Spatial Correlations and Adaptive Cross Entropy. Park S; Heo YS. Sensors (Basel); 2020 Aug; 20(16). PubMed ID: 32824456

20. Cervical Cell Image Classification-Based Knowledge Distillation. Gao W; Xu C; Li G; Zhang Y; Bai N; Li M. Biomimetics (Basel); 2022 Nov; 7(4). PubMed ID: 36412723