BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

174 articles related to the article with PubMed ID 34721657

  • 21. SSD-KD: A self-supervised diverse knowledge distillation method for lightweight skin lesion classification using dermoscopic images.
    Wang Y; Wang Y; Cai J; Lee TK; Miao C; Wang ZJ
    Med Image Anal; 2023 Feb; 84():102693. PubMed ID: 36462373

  • 22. Breast tumor classification through learning from noisy labeled ultrasound images.
    Cao Z; Yang G; Chen Q; Chen X; Lv F
    Med Phys; 2020 Mar; 47(3):1048-1057. PubMed ID: 31837239

  • 23. Generalized Knowledge Distillation via Relationship Matching.
    Ye HJ; Lu S; Zhan DC
    IEEE Trans Pattern Anal Mach Intell; 2023 Feb; 45(2):1817-1834. PubMed ID: 35298374

  • 24. Paced-curriculum distillation with prediction and label uncertainty for image segmentation.
    Islam M; Seenivasan L; Sharan SP; Viekash VK; Gupta B; Glocker B; Ren H
    Int J Comput Assist Radiol Surg; 2023 Oct; 18(10):1875-1883. PubMed ID: 36862365

  • 25. EPANet-KD: Efficient progressive attention network for fine-grained provincial village classification via knowledge distillation.
    Zhang C; Liu C; Gong H; Teng J
    PLoS One; 2024; 19(2):e0298452. PubMed ID: 38359020

  • 26. Learning With Privileged Multimodal Knowledge for Unimodal Segmentation.
    Chen C; Dou Q; Jin Y; Liu Q; Heng PA
    IEEE Trans Med Imaging; 2022 Mar; 41(3):621-632. PubMed ID: 34633927

  • 27. Self-Distillation: Towards Efficient and Compact Neural Networks.
    Zhang L; Bao C; Ma K
    IEEE Trans Pattern Anal Mach Intell; 2022 Aug; 44(8):4388-4403. PubMed ID: 33735074

  • 28. Data-Efficient Sensor Upgrade Path Using Knowledge Distillation.
    Van Molle P; De Boom C; Verbelen T; Vankeirsbilck B; De Vylder J; Diricx B; Simoens P; Dhoedt B
    Sensors (Basel); 2021 Sep; 21(19):. PubMed ID: 34640843

  • 29. Self-knowledge distillation for surgical phase recognition.
    Zhang J; Barbarisi S; Kadkhodamohammadi A; Stoyanov D; Luengo I
    Int J Comput Assist Radiol Surg; 2024 Jan; 19(1):61-68. PubMed ID: 37340283

  • 30. A 3DCNN-Based Knowledge Distillation Framework for Human Activity Recognition.
    Ullah H; Munir A
    J Imaging; 2023 Apr; 9(4):. PubMed ID: 37103233

  • 31. Compatible-domain Transfer Learning for Breast Cancer Classification with Limited Annotated Data.
    Shamshiri MA; Krzyżak A; Kowal M; Korbicz J
    Comput Biol Med; 2023 Mar; 154():106575. PubMed ID: 36758326

  • 32. Learning From Human Educational Wisdom: A Student-Centered Knowledge Distillation Method.
    Yang S; Yang J; Zhou M; Huang Z; Zheng WS; Yang X; Ren J
    IEEE Trans Pattern Anal Mach Intell; 2024 Jun; 46(6):4188-4205. PubMed ID: 38227419

  • 33. AdaDFKD: Exploring adaptive inter-sample relationship in data-free knowledge distillation.
    Li J; Zhou S; Li L; Wang H; Bu J; Yu Z
    Neural Netw; 2024 Sep; 177():106386. PubMed ID: 38776761

  • 34. CReg-KD: Model refinement via confidence regularized knowledge distillation for brain imaging.
    Yang Y; Guo X; Ye C; Xiang Y; Ma T
    Med Image Anal; 2023 Oct; 89():102916. PubMed ID: 37549611

  • 35. Knowledge distillation based on multi-layer fusion features.
    Tan S; Guo R; Tang J; Jiang N; Zou J
    PLoS One; 2023; 18(8):e0285901. PubMed ID: 37639443

  • 36. Semi-supervised training of deep convolutional neural networks with heterogeneous data and few local annotations: An experiment on prostate histopathology image classification.
    Marini N; Otálora S; Müller H; Atzori M
    Med Image Anal; 2021 Oct; 73():102165. PubMed ID: 34303169

  • 37. AAWS-Net: Anatomy-aware weakly-supervised learning network for breast mass segmentation.
    Sun Y; Ji Y
    PLoS One; 2021; 16(8):e0256830. PubMed ID: 34460852

  • 38. Complementary label learning based on knowledge distillation.
    Ying P; Li Z; Sun R; Xu X
    Math Biosci Eng; 2023 Sep; 20(10):17905-17918. PubMed ID: 38052542

  • 39. Knowledge Distillation Meets Label Noise Learning: Ambiguity-Guided Mutual Label Refinery.
    Jiang R; Yan Y; Xue JH; Chen S; Wang N; Wang H
    IEEE Trans Neural Netw Learn Syst; 2023 Nov; PP():. PubMed ID: 38019631

  • 40. Multistage feature fusion knowledge distillation.
    Li G; Wang K; Lv P; He P; Zhou Z; Xu C
    Sci Rep; 2024 Jun; 14(1):13373. PubMed ID: 38862547
