These tools are no longer maintained as of December 31, 2024. The archived website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

122 related articles for article (PubMed ID: 37494155)

  • 1. LHAR: Lightweight Human Activity Recognition on Knowledge Distillation.
    Deng S; Chen J; Teng D; Yang C; Chen D; Jia T; Wang H
    IEEE J Biomed Health Inform; 2023 Jul; PP():. PubMed ID: 37494155

  • 2. A 3DCNN-Based Knowledge Distillation Framework for Human Activity Recognition.
    Ullah H; Munir A
    J Imaging; 2023 Apr; 9(4):. PubMed ID: 37103233

  • 3. Joint learning method with teacher-student knowledge distillation for on-device breast cancer image classification.
    Sepahvand M; Abdali-Mohammadi F
    Comput Biol Med; 2023 Mar; 155():106476. PubMed ID: 36841060

  • 4. Cross-View Gait Recognition Method Based on Multi-Teacher Joint Knowledge Distillation.
    Li R; Yun L; Zhang M; Yang Y; Cheng F
    Sensors (Basel); 2023 Nov; 23(22):. PubMed ID: 38005675

  • 5. Light-M: An efficient lightweight medical image segmentation framework for resource-constrained IoMT.
    Zhang Y; Chen Z; Yang X
    Comput Biol Med; 2024 Mar; 170():108088. PubMed ID: 38320339

  • 6. Efficient image classification through collaborative knowledge distillation: A novel AlexNet modification approach.
    Kuldashboy A; Umirzakova S; Allaberdiev S; Nasimov R; Abdusalomov A; Cho YI
    Heliyon; 2024 Jul; 10(14):e34376. PubMed ID: 39113984

  • 7. DCCD: Reducing Neural Network Redundancy via Distillation.
    Liu Y; Chen J; Liu Y
    IEEE Trans Neural Netw Learn Syst; 2024 Jul; 35(7):10006-10017. PubMed ID: 37022254

  • 8. Self-knowledge distillation for surgical phase recognition.
    Zhang J; Barbarisi S; Kadkhodamohammadi A; Stoyanov D; Luengo I
    Int J Comput Assist Radiol Surg; 2024 Jan; 19(1):61-68. PubMed ID: 37340283

  • 9. Selective Ensemble Based on Extreme Learning Machine for Sensor-Based Human Activity Recognition.
    Tian Y; Zhang J; Chen L; Geng Y; Wang X
    Sensors (Basel); 2019 Aug; 19(16):. PubMed ID: 31398938

  • 10. Multistage feature fusion knowledge distillation.
    Li G; Wang K; Lv P; He P; Zhou Z; Xu C
    Sci Rep; 2024 Jun; 14(1):13373. PubMed ID: 38862547

  • 11. SSD-KD: A self-supervised diverse knowledge distillation method for lightweight skin lesion classification using dermoscopic images.
    Wang Y; Wang Y; Cai J; Lee TK; Miao C; Wang ZJ
    Med Image Anal; 2023 Feb; 84():102693. PubMed ID: 36462373

  • 12. Adversarial learning-based multi-level dense-transmission knowledge distillation for AP-ROP detection.
    Xie H; Liu Y; Lei H; Song T; Yue G; Du Y; Wang T; Zhang G; Lei B
    Med Image Anal; 2023 Feb; 84():102725. PubMed ID: 36527770

  • 13. Mitigating carbon footprint for knowledge distillation based deep learning model compression.
    Rafat K; Islam S; Mahfug AA; Hossain MI; Rahman F; Momen S; Rahman S; Mohammed N
    PLoS One; 2023; 18(5):e0285668. PubMed ID: 37186614

  • 14. TCN-attention-HAR: human activity recognition based on attention mechanism time convolutional network.
    Wei X; Wang Z
    Sci Rep; 2024 Mar; 14(1):7414. PubMed ID: 38548859

  • 15. Learning to Explore Distillability and Sparsability: A Joint Framework for Model Compression.
    Liu Y; Cao J; Li B; Hu W; Maybank S
    IEEE Trans Pattern Anal Mach Intell; 2023 Mar; 45(3):3378-3395. PubMed ID: 35731774

  • 16. Teacher-student complementary sample contrastive distillation.
    Bao Z; Huang Z; Gou J; Du L; Liu K; Zhou J; Chen Y
    Neural Netw; 2024 Feb; 170():176-189. PubMed ID: 37989039

  • 17. A deep learning knowledge distillation framework using knee MRI and arthroscopy data for meniscus tear detection.
    Ying M; Wang Y; Yang K; Wang H; Liu X
    Front Bioeng Biotechnol; 2023; 11():1326706. PubMed ID: 38292305

  • 18. A Lightweight Dangerous Liquid Detection Method Based on Depthwise Separable Convolution for X-Ray Security Inspection.
    Liu D; Liu J; Yuan P; Yu F
    Comput Intell Neurosci; 2022; 2022():5371350. PubMed ID: 35087581

  • 19. Cross-modal knowledge distillation for continuous sign language recognition.
    Gao L; Shi P; Hu L; Feng J; Zhu L; Wan L; Feng W
    Neural Netw; 2024 Nov; 179():106587. PubMed ID: 39111160

  • 20. An ultra-fast deep-learning-based dose engine for prostate VMAT via knowledge distillation framework with limited patient data.
    Tseng W; Liu H; Yang Y; Liu C; Lu B
    Phys Med Biol; 2022 Dec; 68(1):. PubMed ID: 36533689
