


Molecular Biopsy of Human Tumors - a resource for Precision Medicine

117 related articles for article (PubMed ID: 39111160)

  • 1. Cross-modal knowledge distillation for continuous sign language recognition.
    Gao L; Shi P; Hu L; Feng J; Zhu L; Wan L; Feng W
    Neural Netw; 2024 Nov; 179():106587. PubMed ID: 39111160

  • 2. Continuous Sign Language Recognition through a Context-Aware Generative Adversarial Network.
    Papastratis I; Dimitropoulos K; Daras P
    Sensors (Basel); 2021 Apr; 21(7):. PubMed ID: 33916231

  • 3. Novel Spatio-Temporal Continuous Sign Language Recognition Using an Attentive Multi-Feature Network.
    Aditya W; Shih TK; Thaipisutikul T; Fitriajie AS; Gochoo M; Utaminingrum F; Lin CY
    Sensors (Basel); 2022 Aug; 22(17):. PubMed ID: 36080911

  • 4. A 3DCNN-Based Knowledge Distillation Framework for Human Activity Recognition.
    Ullah H; Munir A
    J Imaging; 2023 Apr; 9(4):. PubMed ID: 37103233

  • 5. Comprehensive learning and adaptive teaching: Distilling multi-modal knowledge for pathological glioma grading.
    Xing X; Zhu M; Chen Z; Yuan Y
    Med Image Anal; 2024 Jan; 91():102990. PubMed ID: 37864912

  • 6. Deep Unsupervised Hashing for Large-Scale Cross-Modal Retrieval Using Knowledge Distillation Model.
    Li M; Li Q; Tang L; Peng S; Ma Y; Yang D
    Comput Intell Neurosci; 2021; 2021():5107034. PubMed ID: 34326867

  • 7. Complementary label learning based on knowledge distillation.
    Ying P; Li Z; Sun R; Xu X
    Math Biosci Eng; 2023 Sep; 20(10):17905-17918. PubMed ID: 38052542

  • 8. A General Dynamic Knowledge Distillation Method for Visual Analytics.
    Tu Z; Liu X; Xiao X
    IEEE Trans Image Process; 2022 Oct; PP():. PubMed ID: 36227819

  • 9. Cross-View Gait Recognition Method Based on Multi-Teacher Joint Knowledge Distillation.
    Li R; Yun L; Zhang M; Yang Y; Cheng F
    Sensors (Basel); 2023 Nov; 23(22):. PubMed ID: 38005675

  • 10. A Portable Sign Language Collection and Translation Platform with Smart Watches Using a BLSTM-Based Multi-Feature Framework.
    Zhou Z; Tam VWL; Lam EY
    Micromachines (Basel); 2022 Feb; 13(2):. PubMed ID: 35208457

  • 11. MHD-Net: Memory-Aware Hetero-Modal Distillation Network for Thymic Epithelial Tumor Typing With Missing Pathology Modality.
    Zhang H; Liu J; Liu W; Chen H; Yu Z; Yuan Y; Wang P; Qin J
    IEEE J Biomed Health Inform; 2024 May; 28(5):3003-3014. PubMed ID: 38470599

  • 12. A Mutual Knowledge Distillation-Empowered AI Framework for Early Detection of Alzheimer's Disease Using Incomplete Multi-Modal Images.
    Kwak MG; Su Y; Chen K; Weidman D; Wu T; Lure F; Li J
    medRxiv; 2023 Aug; ():. PubMed ID: 37662267

  • 13. Multi-Modality Self-Distillation for Weakly Supervised Temporal Action Localization.
    Huang L; Wang L; Li H
    IEEE Trans Image Process; 2022; 31():1504-1519. PubMed ID: 35050854

  • 14. Self-knowledge distillation for surgical phase recognition.
    Zhang J; Barbarisi S; Kadkhodamohammadi A; Stoyanov D; Luengo I
    Int J Comput Assist Radiol Surg; 2024 Jan; 19(1):61-68. PubMed ID: 37340283

  • 15. Folic acid supplementation and malaria susceptibility and severity among people taking antifolate antimalarial drugs in endemic areas.
Crider K; Williams J; Qi YP; Gutman J; Yeung L; Mai C; Finkelstein J; Mehta S; Pons-Duran C; Menéndez C; Moraleda C; Rogers L; Daniels K; Green P
    Cochrane Database Syst Rev; 2022 Feb; 2(2022):. PubMed ID: 36321557

  • 16. Uninformed Teacher-Student for hard-samples distillation in weakly supervised mitosis localization.
    Fernandez-Martín C; Silva-Rodriguez J; Kiraz U; Morales S; Janssen EAM; Naranjo V
    Comput Med Imaging Graph; 2024 Mar; 112():102328. PubMed ID: 38244279

  • 17. Multiscale knowledge distillation with attention based fusion for robust human activity recognition.
    Yuan Z; Yang Z; Ning H; Tang X
    Sci Rep; 2024 May; 14(1):12411. PubMed ID: 38816446

  • 18. Semantics-Aware Adaptive Knowledge Distillation for Sensor-to-Vision Action Recognition.
    Liu Y; Wang K; Li G; Lin L
    IEEE Trans Image Process; 2021; 30():5573-5588. PubMed ID: 34110991

  • 19. Spot-Adaptive Knowledge Distillation.
    Song J; Chen Y; Ye J; Song M
    IEEE Trans Image Process; 2022; 31():3359-3370. PubMed ID: 35503832

  • 20. Teacher-student complementary sample contrastive distillation.
    Bao Z; Huang Z; Gou J; Du L; Liu K; Zhou J; Chen Y
    Neural Netw; 2024 Feb; 170():176-189. PubMed ID: 37989039

    Page 1 of 6.