BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

181 related articles for article (PubMed ID: 38359020)

  • 1. EPANet-KD: Efficient progressive attention network for fine-grained provincial village classification via knowledge distillation.
    Zhang C; Liu C; Gong H; Teng J
    PLoS One; 2024; 19(2):e0298452. PubMed ID: 38359020

  • 2. Feature relocation network for fine-grained image classification.
    Zhao P; Li Y; Tang B; Liu H; Yao S
    Neural Netw; 2023 Apr; 161():306-317. PubMed ID: 36774868

  • 3. Fine-grained image classification method based on hybrid attention module.
    Lu W; Yang Y; Yang L
    Front Neurorobot; 2024; 18():1391791. PubMed ID: 38765871

  • 4. SSD-KD: A self-supervised diverse knowledge distillation method for lightweight skin lesion classification using dermoscopic images.
    Wang Y; Wang Y; Cai J; Lee TK; Miao C; Wang ZJ
    Med Image Anal; 2023 Feb; 84():102693. PubMed ID: 36462373

  • 5. Memory-Replay Knowledge Distillation.
    Wang J; Zhang P; Li Y
    Sensors (Basel); 2021 Apr; 21(8):. PubMed ID: 33921068

  • 6. Centralized contrastive loss with weakly supervised progressive feature extraction for fine-grained common thorax disease retrieval in chest x-ray.
    Chen F; You L; Zhao W; Zhou X
    Med Phys; 2023 Jun; 50(6):3560-3572. PubMed ID: 36515554

  • 7. UCFNNet: Ulcerative colitis evaluation based on fine-grained lesion learner and noise suppression gating.
    Li H; Wang Z; Guan Z; Miao J; Li W; Yu P; Molina Jimenez C
    Comput Methods Programs Biomed; 2024 Apr; 247():108080. PubMed ID: 38382306

  • 8. Knowledge distillation based on multi-layer fusion features.
    Tan S; Guo R; Tang J; Jiang N; Zou J
    PLoS One; 2023; 18(8):e0285901. PubMed ID: 37639443

  • 9. A Fine-Grained Bird Classification Method Based on Attention and Decoupled Knowledge Distillation.
    Wang K; Yang F; Chen Z; Chen Y; Zhang Y
    Animals (Basel); 2023 Jan; 13(2):. PubMed ID: 36670805

  • 10. Attention and feature transfer based knowledge distillation.
    Yang G; Yu S; Sheng Y; Yang H
    Sci Rep; 2023 Oct; 13(1):18369. PubMed ID: 37884556

  • 11. Transformer guided self-adaptive network for multi-scale skin lesion image segmentation.
    Xin C; Liu Z; Ma Y; Wang D; Zhang J; Li L; Zhou Q; Xu S; Zhang Y
    Comput Biol Med; 2024 Feb; 169():107846. PubMed ID: 38184865

  • 12. Fine-Grained 3D Shape Classification With Hierarchical Part-View Attention.
    Liu X; Han Z; Liu YS; Zwicker M
    IEEE Trans Image Process; 2021; 30():1744-1758. PubMed ID: 33417547

  • 13. Latent Space Semantic Supervision Based on Knowledge Distillation for Cross-Modal Retrieval.
    Zhang L; Wu X
    IEEE Trans Image Process; 2022; 31():7154-7164. PubMed ID: 36355734

  • 14. Leveraging different learning styles for improved knowledge distillation in biomedical imaging.
    Niyaz U; Sambyal AS; Bathula DR
    Comput Biol Med; 2024 Jan; 168():107764. PubMed ID: 38056210

  • 15. FCKDNet: A Feature Condensation Knowledge Distillation Network for Semantic Segmentation.
    Yuan W; Lu X; Zhang R; Liu Y
    Entropy (Basel); 2023 Jan; 25(1):. PubMed ID: 36673266

  • 16. DCCD: Reducing Neural Network Redundancy via Distillation.
    Liu Y; Chen J; Liu Y
    IEEE Trans Neural Netw Learn Syst; 2023 Jan; PP():. PubMed ID: 37022254

  • 17. CReg-KD: Model refinement via confidence regularized knowledge distillation for brain imaging.
    Yang Y; Guo X; Ye C; Xiang Y; Ma T
    Med Image Anal; 2023 Oct; 89():102916. PubMed ID: 37549611

  • 18. Multi-teacher knowledge distillation based on joint Guidance of Probe and Adaptive Corrector.
    Shang R; Li W; Zhu S; Jiao L; Li Y
    Neural Netw; 2023 Jul; 164():345-356. PubMed ID: 37163850

  • 19. Mitigating carbon footprint for knowledge distillation based deep learning model compression.
    Rafat K; Islam S; Mahfug AA; Hossain MI; Rahman F; Momen S; Rahman S; Mohammed N
    PLoS One; 2023; 18(5):e0285668. PubMed ID: 37186614

  • 20. Efficient Multi-Organ Segmentation From 3D Abdominal CT Images With Lightweight Network and Knowledge Distillation.
    Zhao Q; Zhong L; Xiao J; Zhang J; Chen Y; Liao W; Zhang S; Wang G
    IEEE Trans Med Imaging; 2023 Sep; 42(9):2513-2523. PubMed ID: 37030798
