These tools are no longer maintained as of December 31, 2024. An archived version of the website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service with any questions.


BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

168 related articles for article (PubMed ID: 35298374)

  • 1. Generalized Knowledge Distillation via Relationship Matching.
    Ye HJ; Lu S; Zhan DC
    IEEE Trans Pattern Anal Mach Intell; 2023 Feb; 45(2):1817-1834. PubMed ID: 35298374

  • 2. Teacher-student complementary sample contrastive distillation.
    Bao Z; Huang Z; Gou J; Du L; Liu K; Zhou J; Chen Y
    Neural Netw; 2024 Feb; 170():176-189. PubMed ID: 37989039

  • 3. Multi-view Teacher-Student Network.
    Tian Y; Sun S; Tang J
    Neural Netw; 2022 Feb; 146():69-84. PubMed ID: 34839092

  • 4. Learning From Human Educational Wisdom: A Student-Centered Knowledge Distillation Method.
    Yang S; Yang J; Zhou M; Huang Z; Zheng WS; Yang X; Ren J
    IEEE Trans Pattern Anal Mach Intell; 2024 Jun; 46(6):4188-4205. PubMed ID: 38227419

  • 5. Complementary label learning based on knowledge distillation.
    Ying P; Li Z; Sun R; Xu X
    Math Biosci Eng; 2023 Sep; 20(10):17905-17918. PubMed ID: 38052542

  • 6. Multi-teacher knowledge distillation based on joint Guidance of Probe and Adaptive Corrector.
    Shang R; Li W; Zhu S; Jiao L; Li Y
    Neural Netw; 2023 Jul; 164():345-356. PubMed ID: 37163850

  • 7. Learning Student Networks via Feature Embedding.
    Chen H; Wang Y; Xu C; Xu C; Tao D
    IEEE Trans Neural Netw Learn Syst; 2021 Jan; 32(1):25-35. PubMed ID: 32092018

  • 8. Leveraging different learning styles for improved knowledge distillation in biomedical imaging.
    Niyaz U; Sambyal AS; Bathula DR
    Comput Biol Med; 2024 Jan; 168():107764. PubMed ID: 38056210

  • 9. Dual Distillation Discriminator Networks for Domain Adaptive Few-Shot Learning.
    Liu X; Ji Z; Pang Y; Han Z
    Neural Netw; 2023 Aug; 165():625-633. PubMed ID: 37364472

  • 10. Improving Knowledge Distillation With a Customized Teacher.
    Tan C; Liu J
    IEEE Trans Neural Netw Learn Syst; 2024 Feb; 35(2):2290-2299. PubMed ID: 35877790

  • 11. Few-Shot Face Stylization via GAN Prior Distillation.
    Zhao R; Zhu M; Wang N; Gao X
    IEEE Trans Neural Netw Learn Syst; 2024 Mar; PP():. PubMed ID: 38536698

  • 12. Cosine similarity knowledge distillation for surface anomaly detection.
    Sheng S; Jing J; Wang Z; Zhang H
    Sci Rep; 2024 Apr; 14(1):8150. PubMed ID: 38589492

  • 13. Resolution-based distillation for efficient histology image classification.
    DiPalma J; Suriawinata AA; Tafe LJ; Torresani L; Hassanpour S
    Artif Intell Med; 2021 Sep; 119():102136. PubMed ID: 34531005

  • 14. A General Dynamic Knowledge Distillation Method for Visual Analytics.
    Tu Z; Liu X; Xiao X
    IEEE Trans Image Process; 2022 Oct; PP():. PubMed ID: 36227819

  • 15. On Representation Knowledge Distillation for Graph Neural Networks.
    Joshi CK; Liu F; Xun X; Lin J; Foo CS
    IEEE Trans Neural Netw Learn Syst; 2024 Apr; 35(4):4656-4667. PubMed ID: 36459610

  • 16. Knowledge Distillation in Histology Landscape by Multi-Layer Features Supervision.
    Javed S; Mahmood A; Qaiser T; Werghi N
    IEEE J Biomed Health Inform; 2023 Jan; PP():. PubMed ID: 37021915

  • 17. Adversarial Distillation for Learning with Privileged Provisions.
    Wang X; Zhang R; Sun Y; Qi J
    IEEE Trans Pattern Anal Mach Intell; 2021 Mar; 43(3):786-797. PubMed ID: 31545712

  • 18. A 3DCNN-Based Knowledge Distillation Framework for Human Activity Recognition.
    Ullah H; Munir A
    J Imaging; 2023 Apr; 9(4):. PubMed ID: 37103233

  • 19. Spot-Adaptive Knowledge Distillation.
    Song J; Chen Y; Ye J; Song M
    IEEE Trans Image Process; 2022; 31():3359-3370. PubMed ID: 35503832

  • 20. DCCD: Reducing Neural Network Redundancy via Distillation.
    Liu Y; Chen J; Liu Y
    IEEE Trans Neural Netw Learn Syst; 2024 Jul; 35(7):10006-10017. PubMed ID: 37022254

    Showing 20 of 168 related articles (9 pages of results).