191 related articles for article (PubMed ID: 37021915)
21. Light-M: An efficient lightweight medical image segmentation framework for resource-constrained IoMT.
Zhang Y; Chen Z; Yang X
Comput Biol Med; 2024 Mar; 170():108088. PubMed ID: 38320339
22. Data-Efficient Sensor Upgrade Path Using Knowledge Distillation.
Van Molle P; De Boom C; Verbelen T; Vankeirsbilck B; De Vylder J; Diricx B; Simoens P; Dhoedt B
Sensors (Basel); 2021 Sep; 21(19):. PubMed ID: 34640843
23. Cross-View Gait Recognition Method Based on Multi-Teacher Joint Knowledge Distillation.
Li R; Yun L; Zhang M; Yang Y; Cheng F
Sensors (Basel); 2023 Nov; 23(22):. PubMed ID: 38005675
24. Complementary label learning based on knowledge distillation.
Ying P; Li Z; Sun R; Xu X
Math Biosci Eng; 2023 Sep; 20(10):17905-17918. PubMed ID: 38052542
25. Joint learning method with teacher-student knowledge distillation for on-device breast cancer image classification.
Sepahvand M; Abdali-Mohammadi F
Comput Biol Med; 2023 Mar; 155():106476. PubMed ID: 36841060
26. Segmentation with mixed supervision: Confidence maximization helps knowledge distillation.
Liu B; Desrosiers C; Ben Ayed I; Dolz J
Med Image Anal; 2023 Jan; 83():102670. PubMed ID: 36413905
27. Generalized Knowledge Distillation via Relationship Matching.
Ye HJ; Lu S; Zhan DC
IEEE Trans Pattern Anal Mach Intell; 2023 Feb; 45(2):1817-1834. PubMed ID: 35298374
28. A 3DCNN-Based Knowledge Distillation Framework for Human Activity Recognition.
Ullah H; Munir A
J Imaging; 2023 Apr; 9(4):. PubMed ID: 37103233
29. Determining Top Fully Connected Layer's Hidden Neuron Count for Transfer Learning, Using Knowledge Distillation: a Case Study on Chest X-Ray Classification of Pneumonia and COVID-19.
Ghosh R
J Digit Imaging; 2021 Dec; 34(6):1349-1358. PubMed ID: 34590199
30. Distilling Knowledge by Mimicking Features.
Wang GH; Ge Y; Wu J
IEEE Trans Pattern Anal Mach Intell; 2022 Nov; 44(11):8183-8195. PubMed ID: 34379588
31. Efficient skin lesion segmentation with boundary distillation.
Zhang Z; Lu B
Med Biol Eng Comput; 2024 May; ():. PubMed ID: 38691269
32. Template-Driven Knowledge Distillation for Compact and Accurate Periocular Biometrics Deep-Learning Models.
Boutros F; Damer N; Raja K; Kirchbuchner F; Kuijper A
Sensors (Basel); 2022 Mar; 22(5):. PubMed ID: 35271074
33. LAD: Layer-Wise Adaptive Distillation for BERT Model Compression.
Lin YJ; Chen KY; Kao HY
Sensors (Basel); 2023 Jan; 23(3):. PubMed ID: 36772523
34. A deep dive into understanding tumor foci classification using multiparametric MRI based on convolutional neural network.
Zong W; Lee JK; Liu C; Carver EN; Feldman AM; Janic B; Elshaikh MA; Pantelic MV; Hearshen D; Chetty IJ; Movsas B; Wen N
Med Phys; 2020 Sep; 47(9):4077-4086. PubMed ID: 32449176
35. Knowledge Distillation for Semantic Segmentation Using Channel and Spatial Correlations and Adaptive Cross Entropy.
Park S; Heo YS
Sensors (Basel); 2020 Aug; 20(16):. PubMed ID: 32824456
36. A General Dynamic Knowledge Distillation Method for Visual Analytics.
Tu Z; Liu X; Xiao X
IEEE Trans Image Process; 2022 Oct; PP():. PubMed ID: 36227819
37. Multiplex Cellular Communities in Multi-Gigapixel Colorectal Cancer Histology Images for Tissue Phenotyping.
Javed S; Mahmood A; Werghi N; Benes K; Rajpoot N
IEEE Trans Image Process; 2020 Sep; PP():. PubMed ID: 32966218
38. Knowledge distillation with ensembles of convolutional neural networks for medical image segmentation.
Noothout JMH; Lessmann N; van Eede MC; van Harten LD; Sogancioglu E; Heslinga FG; Veta M; van Ginneken B; IĆĄgum I
J Med Imaging (Bellingham); 2022 Sep; 9(5):052407. PubMed ID: 35692896
39. Physical-model guided self-distillation network for single image dehazing.
Lan Y; Cui Z; Su Y; Wang N; Li A; Han D
Front Neurorobot; 2022; 16():1036465. PubMed ID: 36531917
40. Multi-teacher knowledge distillation based on joint Guidance of Probe and Adaptive Corrector.
Shang R; Li W; Zhu S; Jiao L; Li Y
Neural Netw; 2023 Jul; 164():345-356. PubMed ID: 37163850