155 related articles for article (PubMed ID: 36619739)
1. Knowledge Fusion Distillation: Improving Distillation with Multi-scale Attention Mechanisms.
Li L; Su W; Liu F; He M; Liang X
Neural Process Lett; 2023 Jan; ():1-16. PubMed ID: 36619739
2. Knowledge distillation based on multi-layer fusion features.
Tan S; Guo R; Tang J; Jiang N; Zou J
PLoS One; 2023; 18(8):e0285901. PubMed ID: 37639443
3. Multistage feature fusion knowledge distillation.
Li G; Wang K; Lv P; He P; Zhou Z; Xu C
Sci Rep; 2024 Jun; 14(1):13373. PubMed ID: 38862547
4. Teacher-student complementary sample contrastive distillation.
Bao Z; Huang Z; Gou J; Du L; Liu K; Zhou J; Chen Y
Neural Netw; 2024 Feb; 170():176-189. PubMed ID: 37989039
5. Knowledge Distillation in Histology Landscape by Multi-Layer Features Supervision.
Javed S; Mahmood A; Qaiser T; Werghi N
IEEE J Biomed Health Inform; 2023 Jan; PP():. PubMed ID: 37021915
6. Collaborative Knowledge Distillation via Multiknowledge Transfer.
Gou J; Sun L; Yu B; Du L; Ramamohanarao K; Tao D
IEEE Trans Neural Netw Learn Syst; 2024 May; 35(5):6718-6730. PubMed ID: 36264723
7. Leveraging different learning styles for improved knowledge distillation in biomedical imaging.
Niyaz U; Sambyal AS; Bathula DR
Comput Biol Med; 2024 Jan; 168():107764. PubMed ID: 38056210
8. Light-M: An efficient lightweight medical image segmentation framework for resource-constrained IoMT.
Zhang Y; Chen Z; Yang X
Comput Biol Med; 2024 Mar; 170():108088. PubMed ID: 38320339
9. Efficient skin lesion segmentation with boundary distillation.
Zhang Z; Lu B
Med Biol Eng Comput; 2024 May; ():. PubMed ID: 38691269
10. FCKDNet: A Feature Condensation Knowledge Distillation Network for Semantic Segmentation.
Yuan W; Lu X; Zhang R; Liu Y
Entropy (Basel); 2023 Jan; 25(1):. PubMed ID: 36673266
11. Adversarial learning-based multi-level dense-transmission knowledge distillation for AP-ROP detection.
Xie H; Liu Y; Lei H; Song T; Yue G; Du Y; Wang T; Zhang G; Lei B
Med Image Anal; 2023 Feb; 84():102725. PubMed ID: 36527770
12. Self-knowledge distillation for surgical phase recognition.
Zhang J; Barbarisi S; Kadkhodamohammadi A; Stoyanov D; Luengo I
Int J Comput Assist Radiol Surg; 2024 Jan; 19(1):61-68. PubMed ID: 37340283
13. Cross-View Gait Recognition Method Based on Multi-Teacher Joint Knowledge Distillation.
Li R; Yun L; Zhang M; Yang Y; Cheng F
Sensors (Basel); 2023 Nov; 23(22):. PubMed ID: 38005675
14. Feature Map Distillation of Thin Nets for Low-Resolution Object Recognition.
Huang Z; Yang S; Zhou M; Li Z; Gong Z; Chen Y
IEEE Trans Image Process; 2022; 31():1364-1379. PubMed ID: 35025743
15. A 3DCNN-Based Knowledge Distillation Framework for Human Activity Recognition.
Ullah H; Munir A
J Imaging; 2023 Apr; 9(4):. PubMed ID: 37103233
16. Self-Distillation for Randomized Neural Networks.
Hu M; Gao R; Suganthan PN
IEEE Trans Neural Netw Learn Syst; 2023 Aug; PP():. PubMed ID: 37585327
17. Learning From Human Educational Wisdom: A Student-Centered Knowledge Distillation Method.
Yang S; Yang J; Zhou M; Huang Z; Zheng WS; Yang X; Ren J
IEEE Trans Pattern Anal Mach Intell; 2024 Jun; 46(6):4188-4205. PubMed ID: 38227419
18. Joint learning method with teacher-student knowledge distillation for on-device breast cancer image classification.
Sepahvand M; Abdali-Mohammadi F
Comput Biol Med; 2023 Mar; 155():106476. PubMed ID: 36841060
19. Distilling a Powerful Student Model via Online Knowledge Distillation.
Li S; Lin M; Wang Y; Wu Y; Tian Y; Shao L; Ji R
IEEE Trans Neural Netw Learn Syst; 2023 Nov; 34(11):8743-8752. PubMed ID: 35254994
20. A General Dynamic Knowledge Distillation Method for Visual Analytics.
Tu Z; Liu X; Xiao X
IEEE Trans Image Process; 2022 Oct; PP():. PubMed ID: 36227819