204 related articles for PubMed ID 31545712
1. Adversarial Distillation for Learning with Privileged Provisions. Wang X; Zhang R; Sun Y; Qi J. IEEE Trans Pattern Anal Mach Intell; 2021 Mar; 43(3):786-797. PubMed ID: 31545712
2. Relation Knowledge Distillation by Auxiliary Learning for Object Detection. Wang H; Jia T; Wang Q; Zuo W. IEEE Trans Image Process; 2024; 33:4796-4810. PubMed ID: 39186414
3. Learning Privacy-Preserving Student Networks via Discriminative-Generative Distillation. Ge S; Liu B; Wang P; Li Y; Zeng D. IEEE Trans Image Process; 2022 Dec; PP. PubMed ID: 37015525
4. Adversarial learning-based multi-level dense-transmission knowledge distillation for AP-ROP detection. Xie H; Liu Y; Lei H; Song T; Yue G; Du Y; Wang T; Zhang G; Lei B. Med Image Anal; 2023 Feb; 84:102725. PubMed ID: 36527770
5. Efficient Crowd Counting via Dual Knowledge Distillation. Wang R; Hao Y; Hu L; Li X; Chen M; Miao Y; Humar I. IEEE Trans Image Process; 2023 Dec; PP. PubMed ID: 38127611
6. Mitigating Accuracy-Robustness Trade-Off via Balanced Multi-Teacher Adversarial Distillation. Zhao S; Wang X; Wei X. IEEE Trans Pattern Anal Mach Intell; 2024 Dec; 46(12):9338-9352. PubMed ID: 38889035
7. Dual Distillation Discriminator Networks for Domain Adaptive Few-Shot Learning. Liu X; Ji Z; Pang Y; Han Z. Neural Netw; 2023 Aug; 165:625-633. PubMed ID: 37364472
8. Resolution-Aware Knowledge Distillation for Efficient Inference. Feng Z; Lai J; Xie X. IEEE Trans Image Process; 2021; 30:6985-6996. PubMed ID: 34347598
10. Adversarial Multi-Teacher Distillation for Semi-Supervised Relation Extraction. Li W; Qian T; Li X; Zou L. IEEE Trans Neural Netw Learn Syst; 2024 Aug; 35(8):11291-11301. PubMed ID: 37030799
11. Restructuring the Teacher and Student in Self-Distillation. Zheng Y; Wang C; Tao C; Lin S; Qian J; Wu J. IEEE Trans Image Process; 2024; 33:5551-5563. PubMed ID: 39316482
12. A 3DCNN-Based Knowledge Distillation Framework for Human Activity Recognition. Ullah H; Munir A. J Imaging; 2023 Apr; 9(4). PubMed ID: 37103233
13. Knowledge Distillation for Face Photo-Sketch Synthesis. Zhu M; Li J; Wang N; Gao X. IEEE Trans Neural Netw Learn Syst; 2022 Feb; 33(2):893-906. PubMed ID: 33108298
14. Cross-View Gait Recognition Method Based on Multi-Teacher Joint Knowledge Distillation. Li R; Yun L; Zhang M; Yang Y; Cheng F. Sensors (Basel); 2023 Nov; 23(22). PubMed ID: 38005675
15. Light-M: An efficient lightweight medical image segmentation framework for resource-constrained IoMT. Zhang Y; Chen Z; Yang X. Comput Biol Med; 2024 Mar; 170:108088. PubMed ID: 38320339
16. Teacher-student complementary sample contrastive distillation. Bao Z; Huang Z; Gou J; Du L; Liu K; Zhou J; Chen Y. Neural Netw; 2024 Feb; 170:176-189. PubMed ID: 37989039
17. A General Dynamic Knowledge Distillation Method for Visual Analytics. Tu Z; Liu X; Xiao X. IEEE Trans Image Process; 2022 Oct; PP. PubMed ID: 36227819
18. Learning With Privileged Multimodal Knowledge for Unimodal Segmentation. Chen C; Dou Q; Jin Y; Liu Q; Heng PA. IEEE Trans Med Imaging; 2022 Mar; 41(3):621-632. PubMed ID: 34633927
19. Generalized Knowledge Distillation via Relationship Matching. Ye HJ; Lu S; Zhan DC. IEEE Trans Pattern Anal Mach Intell; 2023 Feb; 45(2):1817-1834. PubMed ID: 35298374
20. Cross-modal knowledge distillation for continuous sign language recognition. Gao L; Shi P; Hu L; Feng J; Zhu L; Wan L; Feng W. Neural Netw; 2024 Nov; 179:106587. PubMed ID: 39111160