226 related articles for article (PubMed ID: 33921068)
1. Memory-Replay Knowledge Distillation.
Wang J; Zhang P; Li Y
Sensors (Basel); 2021 Apr; 21(8). PubMed ID: 33921068
2. Multi-teacher knowledge distillation based on joint Guidance of Probe and Adaptive Corrector.
Shang R; Li W; Zhu S; Jiao L; Li Y
Neural Netw; 2023 Jul; 164:345-356. PubMed ID: 37163850
3. NTCE-KD: Non-Target-Class-Enhanced Knowledge Distillation.
Li C; Teng X; Ding Y; Lan L
Sensors (Basel); 2024 Jun; 24(11). PubMed ID: 38894408
4. EPANet-KD: Efficient progressive attention network for fine-grained provincial village classification via knowledge distillation.
Zhang C; Liu C; Gong H; Teng J
PLoS One; 2024; 19(2):e0298452. PubMed ID: 38359020
5. Pea-KD: Parameter-efficient and accurate Knowledge Distillation on BERT.
Cho I; Kang U
PLoS One; 2022; 17(2):e0263592. PubMed ID: 35180258
6. A non-negative feedback self-distillation method for salient object detection.
Chen L; Cao T; Zheng Y; Yang J; Wang Y; Wang Y; Zhang B
PeerJ Comput Sci; 2023; 9:e1435. PubMed ID: 37409081
7. Mitigating carbon footprint for knowledge distillation based deep learning model compression.
Rafat K; Islam S; Mahfug AA; Hossain MI; Rahman F; Momen S; Rahman S; Mohammed N
PLoS One; 2023; 18(5):e0285668. PubMed ID: 37186614
8. Highlight Every Step: Knowledge Distillation via Collaborative Teaching.
Zhao H; Sun X; Dong J; Chen C; Dong Z
IEEE Trans Cybern; 2022 Apr; 52(4):2070-2081. PubMed ID: 32721909
9. CReg-KD: Model refinement via confidence regularized knowledge distillation for brain imaging.
Yang Y; Guo X; Ye C; Xiang Y; Ma T
Med Image Anal; 2023 Oct; 89:102916. PubMed ID: 37549611
10. Teacher-student complementary sample contrastive distillation.
Bao Z; Huang Z; Gou J; Du L; Liu K; Zhou J; Chen Y
Neural Netw; 2024 Feb; 170:176-189. PubMed ID: 37989039
11. A General Dynamic Knowledge Distillation Method for Visual Analytics.
Tu Z; Liu X; Xiao X
IEEE Trans Image Process; 2022 Oct; PP. PubMed ID: 36227819
12. Multi-view Teacher-Student Network.
Tian Y; Sun S; Tang J
Neural Netw; 2022 Feb; 146:69-84. PubMed ID: 34839092
13. DCCD: Reducing Neural Network Redundancy via Distillation.
Liu Y; Chen J; Liu Y
IEEE Trans Neural Netw Learn Syst; 2023 Jan; PP. PubMed ID: 37022254
14. FCKDNet: A Feature Condensation Knowledge Distillation Network for Semantic Segmentation.
Yuan W; Lu X; Zhang R; Liu Y
Entropy (Basel); 2023 Jan; 25(1). PubMed ID: 36673266
15. SSD-KD: A self-supervised diverse knowledge distillation method for lightweight skin lesion classification using dermoscopic images.
Wang Y; Wang Y; Cai J; Lee TK; Miao C; Wang ZJ
Med Image Anal; 2023 Feb; 84:102693. PubMed ID: 36462373
16. STKD: Distilling Knowledge From Synchronous Teaching for Efficient Model Compression.
Su T; Zhang J; Yu Z; Wang G; Liu X
IEEE Trans Neural Netw Learn Syst; 2023 Dec; 34(12):10051-10064. PubMed ID: 35420989
17. Knowledge distillation based on multi-layer fusion features.
Tan S; Guo R; Tang J; Jiang N; Zou J
PLoS One; 2023; 18(8):e0285901. PubMed ID: 37639443
18. CKDF: Cascaded Knowledge Distillation Framework for Robust Incremental Learning.
Li K; Wan J; Yu S
IEEE Trans Image Process; 2022; 31:3825-3837. PubMed ID: 35609094
19. Deep Neural Network Self-Distillation Exploiting Data Representation Invariance.
Xu TB; Liu CL
IEEE Trans Neural Netw Learn Syst; 2022 Jan; 33(1):257-269. PubMed ID: 33074828
20. Building a Compact Convolutional Neural Network for Embedded Intelligent Sensor Systems Using Group Sparsity and Knowledge Distillation.
Cho J; Lee M
Sensors (Basel); 2019 Oct; 19(19). PubMed ID: 31590266