105 related articles for article (PubMed ID: 38309137)

  • 1. Importance-aware adaptive dataset distillation.
    Li G; Togo R; Ogawa T; Haseyama M
    Neural Netw; 2024 Apr; 172():106154. PubMed ID: 38309137

  • 2. Dataset Distillation: A Comprehensive Review.
    Yu R; Liu S; Wang X
    IEEE Trans Pattern Anal Mach Intell; 2024 Jan; 46(1):150-170. PubMed ID: 37815974

  • 3. Compressed gastric image generation based on soft-label dataset distillation for medical data sharing.
    Li G; Togo R; Ogawa T; Haseyama M
    Comput Methods Programs Biomed; 2022 Dec; 227():107189. PubMed ID: 36323177

  • 4. Self-supervised learning with self-distillation on COVID-19 medical image classification.
    Tan Z; Yu Y; Meng J; Liu S; Li W
    Comput Methods Programs Biomed; 2024 Jan; 243():107876. PubMed ID: 37875036

  • 5. DDK: Dynamic structure pruning based on differentiable search and recursive knowledge distillation for BERT.
    Zhang Z; Lu Y; Wang T; Wei X; Wei Z
    Neural Netw; 2024 May; 173():106164. PubMed ID: 38367353

  • 6. Mitigating carbon footprint for knowledge distillation based deep learning model compression.
    Rafat K; Islam S; Mahfug AA; Hossain MI; Rahman F; Momen S; Rahman S; Mohammed N
    PLoS One; 2023; 18(5):e0285668. PubMed ID: 37186614

  • 7. Multistructure-Based Collaborative Online Distillation.
    Gao L; Lan X; Mi H; Feng D; Xu K; Peng Y
    Entropy (Basel); 2019 Apr; 21(4). PubMed ID: 33267071

  • 8. TEM virus images: Benchmark dataset and deep learning classification.
    Matuszewski DJ; Sintorn IM
    Comput Methods Programs Biomed; 2021 Sep; 209():106318. PubMed ID: 34375851

  • 9. A Comprehensive Survey of Dataset Distillation.
    Lei S; Tao D
    IEEE Trans Pattern Anal Mach Intell; 2024 Jan; 46(1):17-32. PubMed ID: 37801377

  • 10. Self-knowledge distillation for surgical phase recognition.
    Zhang J; Barbarisi S; Kadkhodamohammadi A; Stoyanov D; Luengo I
    Int J Comput Assist Radiol Surg; 2024 Jan; 19(1):61-68. PubMed ID: 37340283

  • 11. Boosting knowledge diversity, accuracy, and stability via tri-enhanced distillation for domain continual medical image segmentation.
    Zhu Z; Ma X; Wang W; Dong S; Wang K; Wu L; Luo G; Wang G; Li S
    Med Image Anal; 2024 May; 94():103112. PubMed ID: 38401270

  • 12. Multi-granularity knowledge distillation and prototype consistency regularization for class-incremental learning.
    Shi Y; Shi D; Qiao Z; Wang Z; Zhang Y; Yang S; Qiu C
    Neural Netw; 2023 Jul; 164():617-630. PubMed ID: 37245476

  • 13. EPANet-KD: Efficient progressive attention network for fine-grained provincial village classification via knowledge distillation.
    Zhang C; Liu C; Gong H; Teng J
    PLoS One; 2024; 19(2):e0298452. PubMed ID: 38359020

  • 14. Relieving the Incompatibility of Network Representation and Classification for Long-Tailed Data Distribution.
    Hu H; Gao M; Wu M
    Comput Intell Neurosci; 2021; 2021():6702625. PubMed ID: 34987568

  • 15. A novel adaptive cubic quasi-Newton optimizer for deep learning based medical image analysis tasks, validated on detection of COVID-19 and segmentation for COVID-19 lung infection, liver tumor, and optic disc/cup.
    Liu Y; Zhang M; Zhong Z; Zeng X
    Med Phys; 2023 Mar; 50(3):1528-1538. PubMed ID: 36057788

  • 16. Self-Distillation for Randomized Neural Networks.
    Hu M; Gao R; Suganthan PN
    IEEE Trans Neural Netw Learn Syst; 2024 Nov; 35(11):16119-16128. PubMed ID: 37585327

  • 17. Improving Differentiable Architecture Search via self-distillation.
    Zhu X; Li J; Liu Y; Wang W
    Neural Netw; 2023 Oct; 167():656-667. PubMed ID: 37717323

  • 18. Graph Flow: Cross-Layer Graph Flow Distillation for Dual Efficient Medical Image Segmentation.
    Zou W; Qi X; Zhou W; Sun M; Sun Z; Shan C
    IEEE Trans Med Imaging; 2023 Apr; 42(4):1159-1171. PubMed ID: 36423314

  • 19. Hierarchical Knowledge Propagation and Distillation for Few-Shot Learning.
    Zhou C; Wang H; Zhou S; Yu Z; Bandara D; Bu J
    Neural Netw; 2023 Oct; 167():615-625. PubMed ID: 37713767

  • 20. Knowledge Fusion Distillation: Improving Distillation with Multi-scale Attention Mechanisms.
    Li L; Su W; Liu F; He M; Liang X
    Neural Process Lett; 2023 Jan; ():1-16. PubMed ID: 36619739
