These tools are no longer maintained as of December 31, 2024. The archived website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

165 related articles for article (PubMed ID: 36227819)

  • 1. A General Dynamic Knowledge Distillation Method for Visual Analytics.
    Tu Z; Liu X; Xiao X
    IEEE Trans Image Process; 2022 Oct; PP():. PubMed ID: 36227819

  • 2. Relation Knowledge Distillation by Auxiliary Learning for Object Detection.
    Wang H; Jia T; Wang Q; Zuo W
    IEEE Trans Image Process; 2024; 33():4796-4810. PubMed ID: 39186414

  • 3. Leveraging different learning styles for improved knowledge distillation in biomedical imaging.
    Niyaz U; Sambyal AS; Bathula DR
    Comput Biol Med; 2024 Jan; 168():107764. PubMed ID: 38056210

  • 4. Multi-view Teacher-Student Network.
    Tian Y; Sun S; Tang J
    Neural Netw; 2022 Feb; 146():69-84. PubMed ID: 34839092

  • 5. A 3DCNN-Based Knowledge Distillation Framework for Human Activity Recognition.
    Ullah H; Munir A
    J Imaging; 2023 Apr; 9(4):. PubMed ID: 37103233

  • 6. Memory-Replay Knowledge Distillation.
    Wang J; Zhang P; Li Y
    Sensors (Basel); 2021 Apr; 21(8):. PubMed ID: 33921068

  • 7. Collaborative Knowledge Distillation via Multiknowledge Transfer.
    Gou J; Sun L; Yu B; Du L; Ramamohanarao K; Tao D
    IEEE Trans Neural Netw Learn Syst; 2024 May; 35(5):6718-6730. PubMed ID: 36264723

  • 8. Feature Map Distillation of Thin Nets for Low-Resolution Object Recognition.
    Huang Z; Yang S; Zhou M; Li Z; Gong Z; Chen Y
    IEEE Trans Image Process; 2022; 31():1364-1379. PubMed ID: 35025743

  • 9. Learning From Human Educational Wisdom: A Student-Centered Knowledge Distillation Method.
    Yang S; Yang J; Zhou M; Huang Z; Zheng WS; Yang X; Ren J
    IEEE Trans Pattern Anal Mach Intell; 2024 Jun; 46(6):4188-4205. PubMed ID: 38227419

  • 10. FCKDNet: A Feature Condensation Knowledge Distillation Network for Semantic Segmentation.
    Yuan W; Lu X; Zhang R; Liu Y
    Entropy (Basel); 2023 Jan; 25(1):. PubMed ID: 36673266

  • 11. Cross-modal knowledge distillation for continuous sign language recognition.
    Gao L; Shi P; Hu L; Feng J; Zhu L; Wan L; Feng W
    Neural Netw; 2024 Nov; 179():106587. PubMed ID: 39111160

  • 12. Efficient Crowd Counting via Dual Knowledge Distillation.
    Wang R; Hao Y; Hu L; Li X; Chen M; Miao Y; Humar I
    IEEE Trans Image Process; 2023 Dec; PP():. PubMed ID: 38127611

  • 13. Restructuring the Teacher and Student in Self-Distillation.
    Zheng Y; Wang C; Tao C; Lin S; Qian J; Wu J
    IEEE Trans Image Process; 2024; 33():5551-5563. PubMed ID: 39316482

  • 14. Structured Knowledge Distillation for Accurate and Efficient Object Detection.
    Zhang L; Ma K
    IEEE Trans Pattern Anal Mach Intell; 2023 Dec; 45(12):15706-15724. PubMed ID: 37527292

  • 15. Spot-Adaptive Knowledge Distillation.
    Song J; Chen Y; Ye J; Song M
    IEEE Trans Image Process; 2022; 31():3359-3370. PubMed ID: 35503832

  • 16. Dynamic Knowledge Distillation with Noise Elimination for RGB-D Salient Object Detection.
    Ren G; Yu Y; Liu H; Stathaki T
    Sensors (Basel); 2022 Aug; 22(16):. PubMed ID: 36015947

  • 17. Real-Time Correlation Tracking via Joint Model Compression and Transfer.
    Wang N; Zhou W; Song Y; Ma C; Li H
    IEEE Trans Image Process; 2020 Apr; ():. PubMed ID: 32356748

  • 18. DCCD: Reducing Neural Network Redundancy via Distillation.
    Liu Y; Chen J; Liu Y
    IEEE Trans Neural Netw Learn Syst; 2024 Jul; 35(7):10006-10017. PubMed ID: 37022254

  • 19. Teacher-student complementary sample contrastive distillation.
    Bao Z; Huang Z; Gou J; Du L; Liu K; Zhou J; Chen Y
    Neural Netw; 2024 Feb; 170():176-189. PubMed ID: 37989039

  • 20. Online Knowledge Distillation via Mutual Contrastive Learning for Visual Recognition.
    Yang C; An Z; Zhou H; Zhuang F; Xu Y; Zhang Q
    IEEE Trans Pattern Anal Mach Intell; 2023 Aug; 45(8):10212-10227. PubMed ID: 37030723
