These tools will no longer be maintained as of December 31, 2024. The archived website can be found here. The PubMed4Hh GitHub repository can be found here. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

140 related articles for article (PubMed ID: 38124584)

  • 21. Lightweight Depth Completion Network with Local Similarity-Preserving Knowledge Distillation.
    Jeong Y; Park J; Cho D; Hwang Y; Choi SB; Kweon IS
    Sensors (Basel); 2022 Sep; 22(19):. PubMed ID: 36236485

  • 22. Scale-Hybrid Group Distillation with Knowledge Disentangling for Continual Semantic Segmentation.
    Song Z; Zhang X; Shi Z
    Sensors (Basel); 2023 Sep; 23(18):. PubMed ID: 37765877

  • 23. ResKD: Residual-Guided Knowledge Distillation.
    Li X; Li S; Omar B; Wu F; Li X
    IEEE Trans Image Process; 2021; 30():4735-4746. PubMed ID: 33739924

  • 24. A Novel Deep Learning Model for Accurate Pest Detection and Edge Computing Deployment.
    Kang H; Ai L; Zhen Z; Lu B; Man Z; Yi P; Li M; Lin L
    Insects; 2023 Jul; 14(7):. PubMed ID: 37504666

  • 25. MSKD: Structured knowledge distillation for efficient medical image segmentation.
    Zhao L; Qian X; Guo Y; Song J; Hou J; Gong J
    Comput Biol Med; 2023 Sep; 164():107284. PubMed ID: 37572439

  • 26. Multi-teacher knowledge distillation based on joint Guidance of Probe and Adaptive Corrector.
    Shang R; Li W; Zhu S; Jiao L; Li Y
    Neural Netw; 2023 Jul; 164():345-356. PubMed ID: 37163850

  • 27. Distilling Knowledge by Mimicking Features.
    Wang GH; Ge Y; Wu J
    IEEE Trans Pattern Anal Mach Intell; 2022 Nov; 44(11):8183-8195. PubMed ID: 34379588

  • 28. Localization Distillation for Object Detection.
    Zheng Z; Ye R; Hou Q; Ren D; Wang P; Zuo W; Cheng MM
    IEEE Trans Pattern Anal Mach Intell; 2023 Aug; 45(8):10070-10083. PubMed ID: 37027640

  • 29. Lightweight Knowledge Distillation-Based Transfer Learning Framework for Rolling Bearing Fault Diagnosis.
    Lu R; Liu S; Gong Z; Xu C; Ma Z; Zhong Y; Li B
    Sensors (Basel); 2024 Mar; 24(6):. PubMed ID: 38544021

  • 30. Multilayer Semantic Features Adaptive Distillation for Object Detectors.
    Zhang Z; Liu J; Chen Y; Mei W; Huang F; Chen L
    Sensors (Basel); 2023 Sep; 23(17):. PubMed ID: 37688070

  • 31. A lightweight speech recognition method with target-swap knowledge distillation for Mandarin air traffic control communications.
    Ren J; Yang S; Shi Y; Yang J
    PeerJ Comput Sci; 2023; 9():e1650. PubMed ID: 38077570

  • 32. Knowledge distillation based on multi-layer fusion features.
    Tan S; Guo R; Tang J; Jiang N; Zou J
    PLoS One; 2023; 18(8):e0285901. PubMed ID: 37639443

  • 33. A General Dynamic Knowledge Distillation Method for Visual Analytics.
    Tu Z; Liu X; Xiao X
    IEEE Trans Image Process; 2022 Oct; PP():. PubMed ID: 36227819

  • 34. Adversarial learning-based multi-level dense-transmission knowledge distillation for AP-ROP detection.
    Xie H; Liu Y; Lei H; Song T; Yue G; Du Y; Wang T; Zhang G; Lei B
    Med Image Anal; 2023 Feb; 84():102725. PubMed ID: 36527770

  • 35. Teacher-student complementary sample contrastive distillation.
    Bao Z; Huang Z; Gou J; Du L; Liu K; Zhou J; Chen Y
    Neural Netw; 2024 Feb; 170():176-189. PubMed ID: 37989039

  • 36. Pixel Distillation: Cost-flexible Distillation across Image Sizes and Heterogeneous Networks.
    Guo G; Zhang D; Han L; Liu N; Cheng MM; Han J
    IEEE Trans Pattern Anal Mach Intell; 2024 Jul; PP():. PubMed ID: 38949946

  • 37. Memory-Replay Knowledge Distillation.
    Wang J; Zhang P; Li Y
    Sensors (Basel); 2021 Apr; 21(8):. PubMed ID: 33921068

  • 38. Lightweight Feature Enhancement Network for Single-Shot Object Detection.
    Jia P; Liu F
    Sensors (Basel); 2021 Feb; 21(4):. PubMed ID: 33557216

  • 39. Dynamic Knowledge Distillation with Noise Elimination for RGB-D Salient Object Detection.
    Ren G; Yu Y; Liu H; Stathaki T
    Sensors (Basel); 2022 Aug; 22(16):. PubMed ID: 36015947

  • 40. Joint learning method with teacher-student knowledge distillation for on-device breast cancer image classification.
    Sepahvand M; Abdali-Mohammadi F
    Comput Biol Med; 2023 Mar; 155():106476. PubMed ID: 36841060
