These tools are no longer maintained as of December 31, 2024. The archived website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

140 related articles for article (PubMed ID: 38124584)

  • 1. Research on a lightweight electronic component detection method based on knowledge distillation.
    Xia Z; Gu J; Wang W; Huang Z
    Math Biosci Eng; 2023 Nov; 20(12):20971-20994. PubMed ID: 38124584

  • 2. Lightweight model-based sheep face recognition via face image recording channel.
    Zhang X; Xuan C; Ma Y; Liu H; Xue J
    J Anim Sci; 2024 Jan; 102():. PubMed ID: 38477672

  • 3. Cosine similarity-guided knowledge distillation for robust object detectors.
    Park S; Kang D; Paik J
    Sci Rep; 2024 Aug; 14(1):18888. PubMed ID: 39143179

  • 4. Structured Knowledge Distillation for Accurate and Efficient Object Detection.
    Zhang L; Ma K
    IEEE Trans Pattern Anal Mach Intell; 2023 Dec; 45(12):15706-15724. PubMed ID: 37527292

  • 5. Mitigating carbon footprint for knowledge distillation based deep learning model compression.
    Rafat K; Islam S; Mahfug AA; Hossain MI; Rahman F; Momen S; Rahman S; Mohammed N
    PLoS One; 2023; 18(5):e0285668. PubMed ID: 37186614

  • 6. Inferior and Coordinate Distillation for Object Detectors.
    Zhang Y; Li Y; Pan Z
    Sensors (Basel); 2022 Jul; 22(15):. PubMed ID: 35957276

  • 7. Expanding and Refining Hybrid Compressors for Efficient Object Re-Identification.
    Xie Y; Wu H; Zhu J; Zeng H; Zhang J
    IEEE Trans Image Process; 2024; 33():3793-3808. PubMed ID: 38865219

  • 8. FCKDNet: A Feature Condensation Knowledge Distillation Network for Semantic Segmentation.
    Yuan W; Lu X; Zhang R; Liu Y
    Entropy (Basel); 2023 Jan; 25(1):. PubMed ID: 36673266

  • 9. Multistage feature fusion knowledge distillation.
    Li G; Wang K; Lv P; He P; Zhou Z; Xu C
    Sci Rep; 2024 Jun; 14(1):13373. PubMed ID: 38862547

  • 10. Pea-KD: Parameter-efficient and accurate Knowledge Distillation on BERT.
    Cho I; Kang U
    PLoS One; 2022; 17(2):e0263592. PubMed ID: 35180258

  • 11. Cross-View Gait Recognition Method Based on Multi-Teacher Joint Knowledge Distillation.
    Li R; Yun L; Zhang M; Yang Y; Cheng F
    Sensors (Basel); 2023 Nov; 23(22):. PubMed ID: 38005675

  • 12. Surface Defect Detection System for Carrot Combine Harvest Based on Multi-Stage Knowledge Distillation.
    Zhou W; Song C; Song K; Wen N; Sun X; Gao P
    Foods; 2023 Feb; 12(4):. PubMed ID: 36832869

  • 13. Light-M: An efficient lightweight medical image segmentation framework for resource-constrained IoMT.
    Zhang Y; Chen Z; Yang X
    Comput Biol Med; 2024 Mar; 170():108088. PubMed ID: 38320339

  • 14. Knowledge Transfer via Decomposing Essential Information in Convolutional Neural Networks.
    Lee S; Song BC
    IEEE Trans Neural Netw Learn Syst; 2022 Jan; 33(1):366-377. PubMed ID: 33048771

  • 15. EPANet-KD: Efficient progressive attention network for fine-grained provincial village classification via knowledge distillation.
    Zhang C; Liu C; Gong H; Teng J
    PLoS One; 2024; 19(2):e0298452. PubMed ID: 38359020

  • 16. A 3DCNN-Based Knowledge Distillation Framework for Human Activity Recognition.
    Ullah H; Munir A
    J Imaging; 2023 Apr; 9(4):. PubMed ID: 37103233

  • 17. A lightweight Color-changing melon ripeness detection algorithm based on model pruning and knowledge distillation: leveraging dilated residual and multi-screening path aggregation.
    Chen G; Hou Y; Chen H; Cao L; Yuan J
    Front Plant Sci; 2024; 15():1406593. PubMed ID: 39109070

  • 18. Improving Knowledge Distillation With a Customized Teacher.
    Tan C; Liu J
    IEEE Trans Neural Netw Learn Syst; 2024 Feb; 35(2):2290-2299. PubMed ID: 35877790

  • 19. A Lightweight and High-Precision Passion Fruit YOLO Detection Model for Deployment in Embedded Devices.
    Sun Q; Li P; He C; Song Q; Chen J; Kong X; Luo Z
    Sensors (Basel); 2024 Jul; 24(15):. PubMed ID: 39123989

  • 20. Feature Map Distillation of Thin Nets for Low-Resolution Object Recognition.
    Huang Z; Yang S; Zhou M; Li Z; Gong Z; Chen Y
    IEEE Trans Image Process; 2022; 31():1364-1379. PubMed ID: 35025743

Page 1 of 7.