BIOMARKERS

Molecular Biopsy of Human Tumors: a resource for Precision Medicine

157 related articles for article (PubMed ID: 38005675)

  • 21. Multi-teacher knowledge distillation based on joint Guidance of Probe and Adaptive Corrector.
    Shang R; Li W; Zhu S; Jiao L; Li Y
    Neural Netw; 2023 Jul; 164():345-356. PubMed ID: 37163850

  • 22. Knowledge distillation based on multi-layer fusion features.
    Tan S; Guo R; Tang J; Jiang N; Zou J
    PLoS One; 2023; 18(8):e0285901. PubMed ID: 37639443

  • 23. A lightweight speech recognition method with target-swap knowledge distillation for Mandarin air traffic control communications.
    Ren J; Yang S; Shi Y; Yang J
    PeerJ Comput Sci; 2023; 9():e1650. PubMed ID: 38077570

  • 24. Teacher-student complementary sample contrastive distillation.
    Bao Z; Huang Z; Gou J; Du L; Liu K; Zhou J; Chen Y
    Neural Netw; 2024 Feb; 170():176-189. PubMed ID: 37989039

  • 25. An Efficient Approach Using Knowledge Distillation Methods to Stabilize Performance in a Lightweight Top-Down Posture Estimation Network.
    Park C; Lee HS; Kim WJ; Bae HB; Lee J; Lee S
    Sensors (Basel); 2021 Nov; 21(22):. PubMed ID: 34833717

  • 26. A single stage knowledge distillation network for brain tumor segmentation on limited MR image modalities.
    Choi Y; Al-Masni MA; Jung KJ; Yoo RE; Lee SY; Kim DH
    Comput Methods Programs Biomed; 2023 Oct; 240():107644. PubMed ID: 37307766

  • 27. Learning to Explore Distillability and Sparsability: A Joint Framework for Model Compression.
    Liu Y; Cao J; Li B; Hu W; Maybank S
    IEEE Trans Pattern Anal Mach Intell; 2023 Mar; 45(3):3378-3395. PubMed ID: 35731774

  • 28. Learning From Human Educational Wisdom: A Student-Centered Knowledge Distillation Method.
    Yang S; Yang J; Zhou M; Huang Z; Zheng WS; Yang X; Ren J
    IEEE Trans Pattern Anal Mach Intell; 2024 Jun; 46(6):4188-4205. PubMed ID: 38227419

  • 29. Low-resolution Face Recognition in the Wild via Selective Knowledge Distillation.
    Ge S; Zhao S; Li C; Li J
    IEEE Trans Image Process; 2018 Nov; ():. PubMed ID: 30507531

  • 30. Learning Student Networks via Feature Embedding.
    Chen H; Wang Y; Xu C; Xu C; Tao D
    IEEE Trans Neural Netw Learn Syst; 2021 Jan; 32(1):25-35. PubMed ID: 32092018

  • 31. Cross View Gait Recognition Using Joint-Direct Linear Discriminant Analysis.
    Portillo-Portillo J; Leyva R; Sanchez V; Sanchez-Perez G; Perez-Meana H; Olivares-Mercado J; Toscano-Medina K; Nakano-Miyatake M
    Sensors (Basel); 2016 Dec; 17(1):. PubMed ID: 28025484

  • 32. Highlight Every Step: Knowledge Distillation via Collaborative Teaching.
    Zhao H; Sun X; Dong J; Chen C; Dong Z
    IEEE Trans Cybern; 2022 Apr; 52(4):2070-2081. PubMed ID: 32721909

  • 33. Real-Time Correlation Tracking via Joint Model Compression and Transfer.
    Wang N; Zhou W; Song Y; Ma C; Li H
    IEEE Trans Image Process; 2020 Apr; ():. PubMed ID: 32356748

  • 34. FCKDNet: A Feature Condensation Knowledge Distillation Network for Semantic Segmentation.
    Yuan W; Lu X; Zhang R; Liu Y
    Entropy (Basel); 2023 Jan; 25(1):. PubMed ID: 36673266

  • 35. EPANet-KD: Efficient progressive attention network for fine-grained provincial village classification via knowledge distillation.
    Zhang C; Liu C; Gong H; Teng J
    PLoS One; 2024; 19(2):e0298452. PubMed ID: 38359020

  • 36. Leveraging different learning styles for improved knowledge distillation in biomedical imaging.
    Niyaz U; Sambyal AS; Bathula DR
    Comput Biol Med; 2024 Jan; 168():107764. PubMed ID: 38056210

  • 37. Classification of Alzheimer's disease in MRI images using knowledge distillation framework: an investigation.
    Li Y; Luo J; Zhang J
    Int J Comput Assist Radiol Surg; 2022 Jul; 17(7):1235-1243. PubMed ID: 35633492

  • 38. Self-Distillation for Randomized Neural Networks.
    Hu M; Gao R; Suganthan PN
    IEEE Trans Neural Netw Learn Syst; 2023 Aug; PP():. PubMed ID: 37585327

  • 39. Mitigating carbon footprint for knowledge distillation based deep learning model compression.
    Rafat K; Islam S; Mahfug AA; Hossain MI; Rahman F; Momen S; Rahman S; Mohammed N
    PLoS One; 2023; 18(5):e0285668. PubMed ID: 37186614

  • 40. Unsupervised Anomaly Detection with Distillated Teacher-Student Network Ensemble.
    Xiao Q; Wang J; Lin Y; Gongsa W; Hu G; Li M; Wang F
    Entropy (Basel); 2021 Feb; 23(2):. PubMed ID: 33561954
