133 related articles for article (PubMed ID: 38569460)
21. Knowledge Distillation and Student-Teacher Learning for Visual Intelligence: A Review and New Outlooks.
Wang L; Yoon KJ
IEEE Trans Pattern Anal Mach Intell; 2022 Jun; 44(6):3048-3068. PubMed ID: 33513099
22. STKD: Distilling Knowledge From Synchronous Teaching for Efficient Model Compression.
Su T; Zhang J; Yu Z; Wang G; Liu X
IEEE Trans Neural Netw Learn Syst; 2023 Dec; 34(12):10051-10064. PubMed ID: 35420989
23. Online Knowledge Distillation via Mutual Contrastive Learning for Visual Recognition.
Yang C; An Z; Zhou H; Zhuang F; Xu Y; Zhang Q
IEEE Trans Pattern Anal Mach Intell; 2023 Aug; 45(8):10212-10227. PubMed ID: 37030723
24. Efficient Crowd Counting via Dual Knowledge Distillation.
Wang R; Hao Y; Hu L; Li X; Chen M; Miao Y; Humar I
IEEE Trans Image Process; 2023 Dec; PP():. PubMed ID: 38127611
25. Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution.
Yang C; An Z; Cai L; Xu Y
IEEE Trans Neural Netw Learn Syst; 2024 Feb; 35(2):2094-2108. PubMed ID: 35820013
26. MSKD: Structured knowledge distillation for efficient medical image segmentation.
Zhao L; Qian X; Guo Y; Song J; Hou J; Gong J
Comput Biol Med; 2023 Sep; 164():107284. PubMed ID: 37572439
27. A deep learning knowledge distillation framework using knee MRI and arthroscopy data for meniscus tear detection.
Ying M; Wang Y; Yang K; Wang H; Liu X
Front Bioeng Biotechnol; 2023; 11():1326706. PubMed ID: 38292305
28. Template-Driven Knowledge Distillation for Compact and Accurate Periocular Biometrics Deep-Learning Models.
Boutros F; Damer N; Raja K; Kirchbuchner F; Kuijper A
Sensors (Basel); 2022 Mar; 22(5):. PubMed ID: 35271074
29. DCCD: Reducing Neural Network Redundancy via Distillation.
Liu Y; Chen J; Liu Y
IEEE Trans Neural Netw Learn Syst; 2023 Jan; PP():. PubMed ID: 37022254
30. Leveraging Symbolic Knowledge Bases for Commonsense Natural Language Inference Using Pattern Theory.
Aakur SN; Sarkar S
IEEE Trans Pattern Anal Mach Intell; 2023 Nov; 45(11):13185-13202. PubMed ID: 37339033
31. Localization Distillation for Object Detection.
Zheng Z; Ye R; Hou Q; Ren D; Wang P; Zuo W; Cheng MM
IEEE Trans Pattern Anal Mach Intell; 2023 Aug; 45(8):10070-10083. PubMed ID: 37027640
32. Leveraging different learning styles for improved knowledge distillation in biomedical imaging.
Niyaz U; Sambyal AS; Bathula DR
Comput Biol Med; 2024 Jan; 168():107764. PubMed ID: 38056210
33. SSD-KD: A self-supervised diverse knowledge distillation method for lightweight skin lesion classification using dermoscopic images.
Wang Y; Wang Y; Cai J; Lee TK; Miao C; Wang ZJ
Med Image Anal; 2023 Feb; 84():102693. PubMed ID: 36462373
34. Classification of diabetic retinopathy using unlabeled data and knowledge distillation.
Abbasi S; Hajabdollahi M; Khadivi P; Karimi N; Roshandel R; Shirani S; Samavi S
Artif Intell Med; 2021 Nov; 121():102176. PubMed ID: 34763798
35. FCKDNet: A Feature Condensation Knowledge Distillation Network for Semantic Segmentation.
Yuan W; Lu X; Zhang R; Liu Y
Entropy (Basel); 2023 Jan; 25(1):. PubMed ID: 36673266
36. Improving Knowledge Distillation With a Customized Teacher.
Tan C; Liu J
IEEE Trans Neural Netw Learn Syst; 2024 Feb; 35(2):2290-2299. PubMed ID: 35877790
37. Vision-Language-Knowledge Co-Embedding for Visual Commonsense Reasoning.
Lee J; Kim I
Sensors (Basel); 2021 Apr; 21(9):. PubMed ID: 33919196
38. Rethinking Intermediate Layers Design in Knowledge Distillation for Kidney and Liver Tumor Segmentation.
Gorade V; Mittal S; Jha D; Bagci U
ArXiv; 2024 May; ():. PubMed ID: 38855539
39. Joint learning method with teacher-student knowledge distillation for on-device breast cancer image classification.
Sepahvand M; Abdali-Mohammadi F
Comput Biol Med; 2023 Mar; 155():106476. PubMed ID: 36841060
40. On Representation Knowledge Distillation for Graph Neural Networks.
Joshi CK; Liu F; Xun X; Lin J; Foo CS
IEEE Trans Neural Netw Learn Syst; 2024 Apr; 35(4):4656-4667. PubMed ID: 36459610