120 related articles for PubMed ID 38862547
1. Multistage feature fusion knowledge distillation.
Li G; Wang K; Lv P; He P; Zhou Z; Xu C
Sci Rep; 2024 Jun; 14(1):13373. PubMed ID: 38862547
2. Knowledge distillation based on multi-layer fusion features.
Tan S; Guo R; Tang J; Jiang N; Zou J
PLoS One; 2023; 18(8):e0285901. PubMed ID: 37639443
3. Cross-View Gait Recognition Method Based on Multi-Teacher Joint Knowledge Distillation.
Li R; Yun L; Zhang M; Yang Y; Cheng F
Sensors (Basel); 2023 Nov; 23(22):. PubMed ID: 38005675
4. DCCD: Reducing Neural Network Redundancy via Distillation.
Liu Y; Chen J; Liu Y
IEEE Trans Neural Netw Learn Syst; 2023 Jan; PP():. PubMed ID: 37022254
5. Knowledge Fusion Distillation: Improving Distillation with Multi-scale Attention Mechanisms.
Li L; Su W; Liu F; He M; Liang X
Neural Process Lett; 2023 Jan; ():1-16. PubMed ID: 36619739
6. A 3DCNN-Based Knowledge Distillation Framework for Human Activity Recognition.
Ullah H; Munir A
J Imaging; 2023 Apr; 9(4):. PubMed ID: 37103233
7. Mitigating carbon footprint for knowledge distillation based deep learning model compression.
Rafat K; Islam S; Mahfug AA; Hossain MI; Rahman F; Momen S; Rahman S; Mohammed N
PLoS One; 2023; 18(5):e0285668. PubMed ID: 37186614
8. Self-knowledge distillation for surgical phase recognition.
Zhang J; Barbarisi S; Kadkhodamohammadi A; Stoyanov D; Luengo I
Int J Comput Assist Radiol Surg; 2024 Jan; 19(1):61-68. PubMed ID: 37340283
9. Light-M: An efficient lightweight medical image segmentation framework for resource-constrained IoMT.
Zhang Y; Chen Z; Yang X
Comput Biol Med; 2024 Mar; 170():108088. PubMed ID: 38320339
10. Research on a lightweight electronic component detection method based on knowledge distillation.
Xia Z; Gu J; Wang W; Huang Z
Math Biosci Eng; 2023 Nov; 20(12):20971-20994. PubMed ID: 38124584
11. EPANet-KD: Efficient progressive attention network for fine-grained provincial village classification via knowledge distillation.
Zhang C; Liu C; Gong H; Teng J
PLoS One; 2024; 19(2):e0298452. PubMed ID: 38359020
12. A lightweight speech recognition method with target-swap knowledge distillation for Mandarin air traffic control communications.
Ren J; Yang S; Shi Y; Yang J
PeerJ Comput Sci; 2023; 9():e1650. PubMed ID: 38077570
13. Multiscale knowledge distillation with attention based fusion for robust human activity recognition.
Yuan Z; Yang Z; Ning H; Tang X
Sci Rep; 2024 May; 14(1):12411. PubMed ID: 38816446
14. ABUS tumor segmentation via decouple contrastive knowledge distillation.
Pan P; Li Y; Chen H; Sun J; Li X; Cheng L
Phys Med Biol; 2023 Dec; 69(1):. PubMed ID: 38052091
15. Online Knowledge Distillation via Mutual Contrastive Learning for Visual Recognition.
Yang C; An Z; Zhou H; Zhuang F; Xu Y; Zhang Q
IEEE Trans Pattern Anal Mach Intell; 2023 Aug; 45(8):10212-10227. PubMed ID: 37030723
16. Surface Defect Detection System for Carrot Combine Harvest Based on Multi-Stage Knowledge Distillation.
Zhou W; Song C; Song K; Wen N; Sun X; Gao P
Foods; 2023 Feb; 12(4):. PubMed ID: 36832869
17. Feature Map Distillation of Thin Nets for Low-Resolution Object Recognition.
Huang Z; Yang S; Zhou M; Li Z; Gong Z; Chen Y
IEEE Trans Image Process; 2022; 31():1364-1379. PubMed ID: 35025743
18. Efficient Crowd Counting via Dual Knowledge Distillation.
Wang R; Hao Y; Hu L; Li X; Chen M; Miao Y; Humar I
IEEE Trans Image Process; 2023 Dec; PP():. PubMed ID: 38127611
19. Joint learning method with teacher-student knowledge distillation for on-device breast cancer image classification.
Sepahvand M; Abdali-Mohammadi F
Comput Biol Med; 2023 Mar; 155():106476. PubMed ID: 36841060
20. LHAR: Lightweight Human Activity Recognition on Knowledge Distillation.
Deng S; Chen J; Teng D; Yang C; Chen D; Jia T; Wang H
IEEE J Biomed Health Inform; 2023 Jul; PP():. PubMed ID: 37494155