153 related articles for PubMed ID 38005675 (first 20 shown)
1. Cross-View Gait Recognition Method Based on Multi-Teacher Joint Knowledge Distillation.
Li R; Yun L; Zhang M; Yang Y; Cheng F
Sensors (Basel); 2023 Nov; 23(22). PubMed ID: 38005675
2. Joint learning method with teacher-student knowledge distillation for on-device breast cancer image classification.
Sepahvand M; Abdali-Mohammadi F
Comput Biol Med; 2023 Mar; 155():106476. PubMed ID: 36841060
3. Multistage feature fusion knowledge distillation.
Li G; Wang K; Lv P; He P; Zhou Z; Xu C
Sci Rep; 2024 Jun; 14(1):13373. PubMed ID: 38862547
4. A 3DCNN-Based Knowledge Distillation Framework for Human Activity Recognition.
Ullah H; Munir A
J Imaging; 2023 Apr; 9(4). PubMed ID: 37103233
5. Multi-view Teacher-Student Network.
Tian Y; Sun S; Tang J
Neural Netw; 2022 Feb; 146():69-84. PubMed ID: 34839092
6. Self-knowledge distillation for surgical phase recognition.
Zhang J; Barbarisi S; Kadkhodamohammadi A; Stoyanov D; Luengo I
Int J Comput Assist Radiol Surg; 2024 Jan; 19(1):61-68. PubMed ID: 37340283
7. LHAR: Lightweight Human Activity Recognition on Knowledge Distillation.
Deng S; Chen J; Teng D; Yang C; Chen D; Jia T; Wang H
IEEE J Biomed Health Inform; 2023 Jul; PP(). PubMed ID: 37494155
8. Knowledge Fusion Distillation: Improving Distillation with Multi-scale Attention Mechanisms.
Li L; Su W; Liu F; He M; Liang X
Neural Process Lett; 2023 Jan; ():1-16. PubMed ID: 36619739
9. MSKD: Structured knowledge distillation for efficient medical image segmentation.
Zhao L; Qian X; Guo Y; Song J; Hou J; Gong J
Comput Biol Med; 2023 Sep; 164():107284. PubMed ID: 37572439
10. Adversarial Distillation for Learning with Privileged Provisions.
Wang X; Zhang R; Sun Y; Qi J
IEEE Trans Pattern Anal Mach Intell; 2021 Mar; 43(3):786-797. PubMed ID: 31545712
11. Modulation format recognition in a UVLC system based on an ultra-lightweight model with communication-informed knowledge distillation.
Yao L; Li F; Zhang H; Zhou Y; Wei Y; Li Z; Shi J; Zhang J; Shen C; Chi N
Opt Express; 2024 Apr; 32(8):13095-13110. PubMed ID: 38859288
12. Surface Defect Detection System for Carrot Combine Harvest Based on Multi-Stage Knowledge Distillation.
Zhou W; Song C; Song K; Wen N; Sun X; Gao P
Foods; 2023 Feb; 12(4). PubMed ID: 36832869
13. Research on a lightweight electronic component detection method based on knowledge distillation.
Xia Z; Gu J; Wang W; Huang Z
Math Biosci Eng; 2023 Nov; 20(12):20971-20994. PubMed ID: 38124584
14. Knowledge Distillation in Histology Landscape by Multi-Layer Features Supervision.
Javed S; Mahmood A; Qaiser T; Werghi N
IEEE J Biomed Health Inform; 2023 Jan; PP(). PubMed ID: 37021915
15. SSD-KD: A self-supervised diverse knowledge distillation method for lightweight skin lesion classification using dermoscopic images.
Wang Y; Wang Y; Cai J; Lee TK; Miao C; Wang ZJ
Med Image Anal; 2023 Feb; 84():102693. PubMed ID: 36462373
16. A General Dynamic Knowledge Distillation Method for Visual Analytics.
Tu Z; Liu X; Xiao X
IEEE Trans Image Process; 2022 Oct; PP(). PubMed ID: 36227819
17. Adversarial learning-based multi-level dense-transmission knowledge distillation for AP-ROP detection.
Xie H; Liu Y; Lei H; Song T; Yue G; Du Y; Wang T; Zhang G; Lei B
Med Image Anal; 2023 Feb; 84():102725. PubMed ID: 36527770
18. DCCD: Reducing Neural Network Redundancy via Distillation.
Liu Y; Chen J; Liu Y
IEEE Trans Neural Netw Learn Syst; 2023 Jan; PP(). PubMed ID: 37022254
19. Multi-teacher knowledge distillation based on joint Guidance of Probe and Adaptive Corrector.
Shang R; Li W; Zhu S; Jiao L; Li Y
Neural Netw; 2023 Jul; 164():345-356. PubMed ID: 37163850
20. Knowledge distillation based on multi-layer fusion features.
Tan S; Guo R; Tang J; Jiang N; Zou J
PLoS One; 2023; 18(8):e0285901. PubMed ID: 37639443