These tools will no longer be maintained as of December 31, 2024. The archived website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors: a resource for Precision Medicine

143 related articles for article (PubMed ID: 38816446)

  • 1. Multiscale knowledge distillation with attention based fusion for robust human activity recognition.
    Yuan Z; Yang Z; Ning H; Tang X
    Sci Rep; 2024 May; 14(1):12411. PubMed ID: 38816446

  • 2. Semantics-Aware Adaptive Knowledge Distillation for Sensor-to-Vision Action Recognition.
    Liu Y; Wang K; Li G; Lin L
    IEEE Trans Image Process; 2021; 30():5573-5588. PubMed ID: 34110991

  • 3. Multi-view Teacher-Student Network.
    Tian Y; Sun S; Tang J
    Neural Netw; 2022 Feb; 146():69-84. PubMed ID: 34839092

  • 4. Adaptive Multi-Modal Fusion Framework for Activity Monitoring of People With Mobility Disability.
    Lin F; Wang Z; Zhao H; Qiu S; Shi X; Wu L; Gravina R; Fortino G
    IEEE J Biomed Health Inform; 2022 Aug; 26(8):4314-4324. PubMed ID: 35439149

  • 5. MHD-Net: Memory-Aware Hetero-Modal Distillation Network for Thymic Epithelial Tumor Typing With Missing Pathology Modality.
    Zhang H; Liu J; Liu W; Chen H; Yu Z; Yuan Y; Wang P; Qin J
    IEEE J Biomed Health Inform; 2024 May; 28(5):3003-3014. PubMed ID: 38470599

  • 6. MSKD: Structured knowledge distillation for efficient medical image segmentation.
    Zhao L; Qian X; Guo Y; Song J; Hou J; Gong J
    Comput Biol Med; 2023 Sep; 164():107284. PubMed ID: 37572439

  • 7. Classification of diabetic retinopathy using unlabeled data and knowledge distillation.
    Abbasi S; Hajabdollahi M; Khadivi P; Karimi N; Roshandel R; Shirani S; Samavi S
    Artif Intell Med; 2021 Nov; 121():102176. PubMed ID: 34763798

  • 8. Learning From Human Educational Wisdom: A Student-Centered Knowledge Distillation Method.
    Yang S; Yang J; Zhou M; Huang Z; Zheng WS; Yang X; Ren J
    IEEE Trans Pattern Anal Mach Intell; 2024 Jun; 46(6):4188-4205. PubMed ID: 38227419

  • 9. Prototype-based sample-weighted distillation unified framework adapted to missing modality sentiment analysis.
    Zhang Y; Liu F; Zhuang X; Hou Y; Zhang Y
    Neural Netw; 2024 Sep; 177():106397. PubMed ID: 38805799

  • 10. Graph Flow: Cross-Layer Graph Flow Distillation for Dual Efficient Medical Image Segmentation.
    Zou W; Qi X; Zhou W; Sun M; Sun Z; Shan C
    IEEE Trans Med Imaging; 2023 Apr; 42(4):1159-1171. PubMed ID: 36423314

  • 11. Automated multi-modal Transformer network (AMTNet) for 3D medical images segmentation.
    Zheng S; Tan J; Jiang C; Li L
    Phys Med Biol; 2023 Jan; 68(2):. PubMed ID: 36595252

  • 12. Learning With Privileged Multimodal Knowledge for Unimodal Segmentation.
    Chen C; Dou Q; Jin Y; Liu Q; Heng PA
    IEEE Trans Med Imaging; 2022 Mar; 41(3):621-632. PubMed ID: 34633927

  • 13. Layerwised multimodal knowledge distillation for vision-language pretrained model.
    Wang J; Liao D; Zhang Y; Xu D; Zhang X
    Neural Netw; 2024 Jul; 175():106272. PubMed ID: 38569460

  • 14. SwinCross: Cross-modal Swin transformer for head-and-neck tumor segmentation in PET/CT images.
    Li GY; Chen J; Jang SI; Gong K; Li Q
    Med Phys; 2024 Mar; 51(3):2096-2107. PubMed ID: 37776263

  • 15. TCN-attention-HAR: human activity recognition based on attention mechanism time convolutional network.
    Wei X; Wang Z
    Sci Rep; 2024 Mar; 14(1):7414. PubMed ID: 38548859

  • 16. Self-Distillation: Towards Efficient and Compact Neural Networks.
    Zhang L; Bao C; Ma K
    IEEE Trans Pattern Anal Mach Intell; 2022 Aug; 44(8):4388-4403. PubMed ID: 33735074

  • 17. Multistage feature fusion knowledge distillation.
    Li G; Wang K; Lv P; He P; Zhou Z; Xu C
    Sci Rep; 2024 Jun; 14(1):13373. PubMed ID: 38862547

  • 18. Knowledge Fusion Distillation: Improving Distillation with Multi-scale Attention Mechanisms.
    Li L; Su W; Liu F; He M; Liang X
    Neural Process Lett; 2023 Jan; ():1-16. PubMed ID: 36619739

  • 19. Dynamic Edge Convolutional Neural Network for Skeleton-Based Human Action Recognition.
    Tasnim N; Baek JH
    Sensors (Basel); 2023 Jan; 23(2):. PubMed ID: 36679576

  • 20. Comprehensive learning and adaptive teaching: Distilling multi-modal knowledge for pathological glioma grading.
    Xing X; Zhu M; Chen Z; Yuan Y
    Med Image Anal; 2024 Jan; 91():102990. PubMed ID: 37864912
