22 related articles for article (PubMed ID: 38451756)
1. Rethinking Intermediate Layers Design in Knowledge Distillation for Kidney and Liver Tumor Segmentation.
Gorade V; Mittal S; Jha D; Bagci U
ArXiv; 2024 May; ():. PubMed ID: 38855539
2. Localization Distillation for Object Detection.
Zheng Z; Ye R; Hou Q; Ren D; Wang P; Zuo W; Cheng MM
IEEE Trans Pattern Anal Mach Intell; 2023 Aug; 45(8):10070-10083. PubMed ID: 37027640
3. Multistage feature fusion knowledge distillation.
Li G; Wang K; Lv P; He P; Zhou Z; Xu C
Sci Rep; 2024 Jun; 14(1):13373. PubMed ID: 38862547
4. Online Knowledge Distillation via Mutual Contrastive Learning for Visual Recognition.
Yang C; An Z; Zhou H; Zhuang F; Xu Y; Zhang Q
IEEE Trans Pattern Anal Mach Intell; 2023 Aug; 45(8):10212-10227. PubMed ID: 37030723
5. When Object Detection Meets Knowledge Distillation: A Survey.
Li Z; Xu P; Chang X; Yang L; Zhang Y; Yao L; Chen X
IEEE Trans Pattern Anal Mach Intell; 2023 Aug; 45(8):10555-10579. PubMed ID: 37028387
6. NTCE-KD: Non-Target-Class-Enhanced Knowledge Distillation.
Li C; Teng X; Ding Y; Lan L
Sensors (Basel); 2024 Jun; 24(11):. PubMed ID: 38894408
7. Prototype-based sample-weighted distillation unified framework adapted to missing modality sentiment analysis.
Zhang Y; Liu F; Zhuang X; Hou Y; Zhang Y
Neural Netw; 2024 Sep; 177():106397. PubMed ID: 38805799
8. Deep Transfer Learning Method Using Self-Pixel and Global Channel Attentive Regularization.
Kang C; Kang SU
Sensors (Basel); 2024 May; 24(11):. PubMed ID: 38894313
9. FCKDNet: A Feature Condensation Knowledge Distillation Network for Semantic Segmentation.
Yuan W; Lu X; Zhang R; Liu Y
Entropy (Basel); 2023 Jan; 25(1):. PubMed ID: 36673266
10. Layer-Specific Knowledge Distillation for Class Incremental Semantic Segmentation.
Wang Q; Wu Y; Yang L; Zuo W; Hu Q
IEEE Trans Image Process; 2024; 33():1977-1989. PubMed ID: 38451756
11. Inherit With Distillation and Evolve With Contrast: Exploring Class Incremental Semantic Segmentation Without Exemplar Memory.
Zhao D; Yuan B; Shi Z
IEEE Trans Pattern Anal Mach Intell; 2023 Oct; 45(10):11932-11947. PubMed ID: 37155379
12. Scale-Hybrid Group Distillation with Knowledge Disentangling for Continual Semantic Segmentation.
Song Z; Zhang X; Shi Z
Sensors (Basel); 2023 Sep; 23(18):. PubMed ID: 37765877
13. Double Similarity Distillation for Semantic Image Segmentation.
Feng Y; Sun X; Diao W; Li J; Gao X
IEEE Trans Image Process; 2021; 30():5363-5376. PubMed ID: 34048345
14. Uncertainty-Aware Contrastive Distillation for Incremental Semantic Segmentation.
Yang G; Fini E; Xu D; Rota P; Ding M; Nabi M; Alameda-Pineda X; Ricci E
IEEE Trans Pattern Anal Mach Intell; 2023 Feb; 45(2):2567-2581. PubMed ID: 35358042