134 related articles for PubMed ID 37038389

  • 1. RoS-KD: A Robust Stochastic Knowledge Distillation Approach for Noisy Medical Imaging.
    Jaiswal A; Ashutosh K; Rousseau JF; Peng Y; Wang Z; Ding Y
    Proc IEEE Int Conf Data Min; 2022; 2022():981-986. PubMed ID: 37038389

  • 2. Sample self-selection using dual teacher networks for pathological image classification with noisy labels.
    Han G; Guo W; Zhang H; Jin J; Gan X; Zhao X
    Comput Biol Med; 2024 May; 174():108489. PubMed ID: 38640633

  • 3. Knowledge Distillation Meets Label Noise Learning: Ambiguity-Guided Mutual Label Refinery.
    Jiang R; Yan Y; Xue JH; Chen S; Wang N; Wang H
    IEEE Trans Neural Netw Learn Syst; 2023 Nov; PP():. PubMed ID: 38019631

  • 4. DCCD: Reducing Neural Network Redundancy via Distillation.
    Liu Y; Chen J; Liu Y
    IEEE Trans Neural Netw Learn Syst; 2023 Jan; PP():. PubMed ID: 37022254

  • 5. S-CUDA: Self-cleansing unsupervised domain adaptation for medical image segmentation.
    Liu L; Zhang Z; Li S; Ma K; Zheng Y
    Med Image Anal; 2021 Dec; 74():102214. PubMed ID: 34464837

  • 6. Breast tumor classification through learning from noisy labeled ultrasound images.
    Cao Z; Yang G; Chen Q; Chen X; Lv F
    Med Phys; 2020 Mar; 47(3):1048-1057. PubMed ID: 31837239

  • 7. CReg-KD: Model refinement via confidence regularized knowledge distillation for brain imaging.
    Yang Y; Guo X; Ye C; Xiang Y; Ma T
    Med Image Anal; 2023 Oct; 89():102916. PubMed ID: 37549611

  • 8. Robust co-teaching learning with consistency-based noisy label correction for medical image classification.
    Zhu M; Zhang L; Wang L; Li D; Zhang J; Yi Z
    Int J Comput Assist Radiol Surg; 2023 Apr; 18(4):675-683. PubMed ID: 36437387

  • 9. Robust Medical Image Classification From Noisy Labeled Data With Global and Local Representation Guided Co-Training.
    Xue C; Yu L; Chen P; Dou Q; Heng PA
    IEEE Trans Med Imaging; 2022 Jun; 41(6):1371-1382. PubMed ID: 34982680

  • 10. Complementary label learning based on knowledge distillation.
    Ying P; Li Z; Sun R; Xu X
    Math Biosci Eng; 2023 Sep; 20(10):17905-17918. PubMed ID: 38052542

  • 11. Mitigating carbon footprint for knowledge distillation based deep learning model compression.
    Rafat K; Islam S; Mahfug AA; Hossain MI; Rahman F; Momen S; Rahman S; Mohammed N
    PLoS One; 2023; 18(5):e0285668. PubMed ID: 37186614

  • 12. SSD-KD: A self-supervised diverse knowledge distillation method for lightweight skin lesion classification using dermoscopic images.
    Wang Y; Wang Y; Cai J; Lee TK; Miao C; Wang ZJ
    Med Image Anal; 2023 Feb; 84():102693. PubMed ID: 36462373

  • 13. Template-Driven Knowledge Distillation for Compact and Accurate Periocular Biometrics Deep-Learning Models.
    Boutros F; Damer N; Raja K; Kirchbuchner F; Kuijper A
    Sensors (Basel); 2022 Mar; 22(5):. PubMed ID: 35271074

  • 14. Leveraging different learning styles for improved knowledge distillation in biomedical imaging.
    Niyaz U; Sambyal AS; Bathula DR
    Comput Biol Med; 2024 Jan; 168():107764. PubMed ID: 38056210

  • 15. Mitigating Accuracy-Robustness Trade-Off Via Balanced Multi-Teacher Adversarial Distillation.
    Zhao S; Wang X; Wei X
    IEEE Trans Pattern Anal Mach Intell; 2024 Jun; PP():. PubMed ID: 38889035

  • 16. Topological Persistence Guided Knowledge Distillation for Wearable Sensor Data.
    Jeon ES; Choi H; Shukla A; Wang Y; Lee H; Buman MP; Turaga P
    Eng Appl Artif Intell; 2024 Apr; 130():. PubMed ID: 38282698

  • 17. Self-knowledge distillation for surgical phase recognition.
    Zhang J; Barbarisi S; Kadkhodamohammadi A; Stoyanov D; Luengo I
    Int J Comput Assist Radiol Surg; 2024 Jan; 19(1):61-68. PubMed ID: 37340283

  • 18. Memory-Replay Knowledge Distillation.
    Wang J; Zhang P; Li Y
    Sensors (Basel); 2021 Apr; 21(8):. PubMed ID: 33921068

  • 19. Multistructure-Based Collaborative Online Distillation.
    Gao L; Lan X; Mi H; Feng D; Xu K; Peng Y
    Entropy (Basel); 2019 Apr; 21(4):. PubMed ID: 33267071

  • 20. GC
    Bayasi N; Hamarneh G; Garbi R
    IEEE Trans Med Imaging; 2024 May; PP():. PubMed ID: 38717881
