BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

134 related articles for PubMed ID 37038389 (items 21-40 shown below).

  • 21. Knowledge Distillation for Face Photo-Sketch Synthesis.
    Zhu M; Li J; Wang N; Gao X
    IEEE Trans Neural Netw Learn Syst; 2022 Feb; 33(2):893-906. PubMed ID: 33108298

  • 22. STKD: Distilling Knowledge From Synchronous Teaching for Efficient Model Compression.
    Su T; Zhang J; Yu Z; Wang G; Liu X
    IEEE Trans Neural Netw Learn Syst; 2023 Dec; 34(12):10051-10064. PubMed ID: 35420989

  • 23. Collaborative Knowledge Distillation via Multiknowledge Transfer.
    Gou J; Sun L; Yu B; Du L; Ramamohanarao K; Tao D
    IEEE Trans Neural Netw Learn Syst; 2024 May; 35(5):6718-6730. PubMed ID: 36264723

  • 24. Dual Distillation Discriminator Networks for Domain Adaptive Few-Shot Learning.
    Liu X; Ji Z; Pang Y; Han Z
    Neural Netw; 2023 Aug; 165:625-633. PubMed ID: 37364472

  • 25. Rethinking Intermediate Layers Design in Knowledge Distillation for Kidney and Liver Tumor Segmentation.
    Gorade V; Mittal S; Jha D; Bagci U
    ArXiv; 2024 May. PubMed ID: 38855539

  • 26. Highlight Every Step: Knowledge Distillation via Collaborative Teaching.
    Zhao H; Sun X; Dong J; Chen C; Dong Z
    IEEE Trans Cybern; 2022 Apr; 52(4):2070-2081. PubMed ID: 32721909

  • 27. Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution.
    Yang C; An Z; Cai L; Xu Y
    IEEE Trans Neural Netw Learn Syst; 2024 Feb; 35(2):2094-2108. PubMed ID: 35820013

  • 28. Explainable Knowledge Distillation for On-Device Chest X-Ray Classification.
    Termritthikun C; Umer A; Suwanwimolkul S; Xia F; Lee I
    IEEE/ACM Trans Comput Biol Bioinform; 2023 May. PubMed ID: 37130250

  • 29. Paced-curriculum distillation with prediction and label uncertainty for image segmentation.
    Islam M; Seenivasan L; Sharan SP; Viekash VK; Gupta B; Glocker B; Ren H
    Int J Comput Assist Radiol Surg; 2023 Oct; 18(10):1875-1883. PubMed ID: 36862365

  • 30. DSP-KD: Dual-Stage Progressive Knowledge Distillation for Skin Disease Classification.
    Zeng X; Ji Z; Zhang H; Chen R; Liao Q; Wang J; Lyu T; Zhao L
    Bioengineering (Basel); 2024 Jan; 11(1). PubMed ID: 38247947

  • 31. Teacher-student complementary sample contrastive distillation.
    Bao Z; Huang Z; Gou J; Du L; Liu K; Zhou J; Chen Y
    Neural Netw; 2024 Feb; 170:176-189. PubMed ID: 37989039

  • 32. Graph Flow: Cross-Layer Graph Flow Distillation for Dual Efficient Medical Image Segmentation.
    Zou W; Qi X; Zhou W; Sun M; Sun Z; Shan C
    IEEE Trans Med Imaging; 2023 Apr; 42(4):1159-1171. PubMed ID: 36423314

  • 33. Adversarial learning-based multi-level dense-transmission knowledge distillation for AP-ROP detection.
    Xie H; Liu Y; Lei H; Song T; Yue G; Du Y; Wang T; Zhang G; Lei B
    Med Image Anal; 2023 Feb; 84:102725. PubMed ID: 36527770

  • 34. Knowledge Distillation Classifier Generation Network for Zero-Shot Learning.
    Yu Y; Li B; Ji Z; Han J; Zhang Z
    IEEE Trans Neural Netw Learn Syst; 2023 Jun; 34(6):3183-3194. PubMed ID: 34587096

  • 35. MSKD: Structured knowledge distillation for efficient medical image segmentation.
    Zhao L; Qian X; Guo Y; Song J; Hou J; Gong J
    Comput Biol Med; 2023 Sep; 164:107284. PubMed ID: 37572439

  • 36. Robust Point Cloud Segmentation With Noisy Annotations.
    Ye S; Chen D; Han S; Liao J
    IEEE Trans Pattern Anal Mach Intell; 2023 Jun; 45(6):7696-7710. PubMed ID: 36449593

  • 37. Learning with Privileged Information via Adversarial Discriminative Modality Distillation.
    Garcia NC; Morerio P; Murino V
    IEEE Trans Pattern Anal Mach Intell; 2020 Oct; 42(10):2581-2593. PubMed ID: 31331879

  • 38. Distilling Knowledge by Mimicking Features.
    Wang GH; Ge Y; Wu J
    IEEE Trans Pattern Anal Mach Intell; 2022 Nov; 44(11):8183-8195. PubMed ID: 34379588

  • 39. Knowledge Distillation in Histology Landscape by Multi-Layer Features Supervision.
    Javed S; Mahmood A; Qaiser T; Werghi N
    IEEE J Biomed Health Inform; 2023 Jan. PubMed ID: 37021915

  • 40. Using ensembles and distillation to optimize the deployment of deep learning models for the classification of electronic cancer pathology reports.
    De Angeli K; Gao S; Blanchard A; Durbin EB; Wu XC; Stroup A; Doherty J; Schwartz SM; Wiggins C; Coyle L; Penberthy L; Tourassi G; Yoon HJ
    JAMIA Open; 2022 Oct; 5(3):ooac075. PubMed ID: 36110150
