138 related articles for article (PubMed ID: 37038389)
41. Distilling Knowledge by Mimicking Features.
Wang GH; Ge Y; Wu J
IEEE Trans Pattern Anal Mach Intell; 2022 Nov; 44(11):8183-8195. PubMed ID: 34379588
42. Knowledge Distillation in Histology Landscape by Multi-Layer Features Supervision.
Javed S; Mahmood A; Qaiser T; Werghi N
IEEE J Biomed Health Inform; 2023 Jan; PP():. PubMed ID: 37021915
43. Using ensembles and distillation to optimize the deployment of deep learning models for the classification of electronic cancer pathology reports.
De Angeli K; Gao S; Blanchard A; Durbin EB; Wu XC; Stroup A; Doherty J; Schwartz SM; Wiggins C; Coyle L; Penberthy L; Tourassi G; Yoon HJ
JAMIA Open; 2022 Oct; 5(3):ooac075. PubMed ID: 36110150
44. Learning Privacy-Preserving Student Networks via Discriminative-Generative Distillation.
Ge S; Liu B; Wang P; Li Y; Zeng D
IEEE Trans Image Process; 2022 Dec; PP():. PubMed ID: 37015525
45. Multiple skin lesions diagnostics via integrated deep convolutional networks for segmentation and classification.
Al-Masni MA; Kim DH; Kim TS
Comput Methods Programs Biomed; 2020 Jul; 190():105351. PubMed ID: 32028084
46. The self-distillation trained multitask dense-attention network for diagnosing lung cancers based on CT scans.
Chen L; Zhang Z
Med Phys; 2024 Mar; 51(3):1738-1753. PubMed ID: 37715993
47. One-shot Federated Learning on Medical Data using Knowledge Distillation with Image Synthesis and Client Model Adaptation.
Kang M; Chikontwe P; Kim S; Jin KH; Adeli E; Pohl KM; Park SH
Med Image Comput Comput Assist Interv; 2023 Oct; 14221():521-531. PubMed ID: 38204983
48. Multi-view Teacher-Student Network.
Tian Y; Sun S; Tang J
Neural Netw; 2022 Feb; 146():69-84. PubMed ID: 34839092
49. MetaLabelNet: Learning to Generate Soft-Labels From Noisy-Labels.
Algan G; Ulusoy I
IEEE Trans Image Process; 2022; 31():4352-4362. PubMed ID: 35731778
50. Improving adversarial robustness of medical imaging systems via adding global attention noise.
Dai Y; Qian Y; Lu F; Wang B; Gu Z; Wang W; Wan J; Zhang Y
Comput Biol Med; 2023 Sep; 164():107251. PubMed ID: 37480679
51. FCKDNet: A Feature Condensation Knowledge Distillation Network for Semantic Segmentation.
Yuan W; Lu X; Zhang R; Liu Y
Entropy (Basel); 2023 Jan; 25(1):. PubMed ID: 36673266
52. SimCVD: Simple Contrastive Voxel-Wise Representation Distillation for Semi-Supervised Medical Image Segmentation.
You C; Zhou Y; Zhao R; Staib L; Duncan JS
IEEE Trans Med Imaging; 2022 Sep; 41(9):2228-2237. PubMed ID: 35320095
53. Importance-aware adaptive dataset distillation.
Li G; Togo R; Ogawa T; Haseyama M
Neural Netw; 2024 Apr; 172():106154. PubMed ID: 38309137
54. NTCE-KD: Non-Target-Class-Enhanced Knowledge Distillation.
Li C; Teng X; Ding Y; Lan L
Sensors (Basel); 2024 Jun; 24(11):. PubMed ID: 38894408
55. Topological Knowledge Distillation for Wearable Sensor Data.
Jeon ES; Choi H; Shukla A; Wang Y; Buman MP; Turaga P
Conf Rec Asilomar Conf Signals Syst Comput; 2022; 2022():837-842. PubMed ID: 37583442
56. Interpolated Joint Space Adversarial Training for Robust and Generalizable Defenses.
Lau CP; Liu J; Souri H; Lin WA; Feizi S; Chellappa R
IEEE Trans Pattern Anal Mach Intell; 2023 Nov; 45(11):13054-13067. PubMed ID: 37335791
57. Learning With Noisy Labels Over Imbalanced Subpopulations.
Chen M; Zhao Y; He B; Han Z; Huang J; Wu B; Yao J
IEEE Trans Neural Netw Learn Syst; 2024 May; PP():. PubMed ID: 38691432
58. Point Cloud Instance Segmentation with Inaccurate Bounding-Box Annotations.
Peng Y; Feng H; Chen T; Hu B
Sensors (Basel); 2023 Feb; 23(4):. PubMed ID: 36850943
59. CKDF: Cascaded Knowledge Distillation Framework for Robust Incremental Learning.
Li K; Wan J; Yu S
IEEE Trans Image Process; 2022; 31():3825-3837. PubMed ID: 35609094
60. Learning With Auxiliary Less-Noisy Labels.
Duan Y; Wu O
IEEE Trans Neural Netw Learn Syst; 2017 Jul; 28(7):1716-1721. PubMed ID: 27071201