BIOMARKERS: Molecular Biopsy of Human Tumors, a resource for Precision Medicine

118 related articles for article (PubMed ID: 38127608)

  • 1. MT4MTL-KD: A Multi-Teacher Knowledge Distillation Framework for Triplet Recognition.
    Gui S; Wang Z; Chen J; Zhou X; Zhang C; Cao Y
    IEEE Trans Med Imaging; 2024 Apr; 43(4):1628-1639. PubMed ID: 38127608

  • 2. CReg-KD: Model refinement via confidence regularized knowledge distillation for brain imaging.
    Yang Y; Guo X; Ye C; Xiang Y; Ma T
    Med Image Anal; 2023 Oct; 89():102916. PubMed ID: 37549611

  • 3. Rendezvous: Attention mechanisms for the recognition of surgical action triplets in endoscopic videos.
    Nwoye CI; Yu T; Gonzalez C; Seeliger B; Mascagni P; Mutter D; Marescaux J; Padoy N
    Med Image Anal; 2022 May; 78():102433. PubMed ID: 35398658

  • 4. MT-FiST: A Multi-Task Fine-Grained Spatial-Temporal Framework for Surgical Action Triplet Recognition.
    Li Y; Xia T; Luo H; He B; Jia F
    IEEE J Biomed Health Inform; 2023 Oct; 27(10):4983-4994. PubMed ID: 37498758

  • 5. Multi-view Teacher-Student Network.
    Tian Y; Sun S; Tang J
    Neural Netw; 2022 Feb; 146():69-84. PubMed ID: 34839092

  • 6. Self-knowledge distillation for surgical phase recognition.
    Zhang J; Barbarisi S; Kadkhodamohammadi A; Stoyanov D; Luengo I
    Int J Comput Assist Radiol Surg; 2024 Jan; 19(1):61-68. PubMed ID: 37340283

  • 7. Rendezvous in time: an attention-based temporal fusion approach for surgical triplet recognition.
    Sharma S; Nwoye CI; Mutter D; Padoy N
    Int J Comput Assist Radiol Surg; 2023 Jun; 18(6):1053-1059. PubMed ID: 37097518

  • 8. Teacher-student training and triplet loss to reduce the effect of drastic face occlusion: Application to emotion recognition, gender identification and age estimation.
Georgescu MI; Duţă GE; Ionescu RT
    Mach Vis Appl; 2022; 33(1):12. PubMed ID: 34955610

  • 9. Leveraging different learning styles for improved knowledge distillation in biomedical imaging.
    Niyaz U; Sambyal AS; Bathula DR
    Comput Biol Med; 2024 Jan; 168():107764. PubMed ID: 38056210

  • 10. Light-M: An efficient lightweight medical image segmentation framework for resource-constrained IoMT.
    Zhang Y; Chen Z; Yang X
    Comput Biol Med; 2024 Mar; 170():108088. PubMed ID: 38320339

  • 11. MSKD: Structured knowledge distillation for efficient medical image segmentation.
    Zhao L; Qian X; Guo Y; Song J; Hou J; Gong J
    Comput Biol Med; 2023 Sep; 164():107284. PubMed ID: 37572439

  • 12. Teacher-student complementary sample contrastive distillation.
    Bao Z; Huang Z; Gou J; Du L; Liu K; Zhou J; Chen Y
    Neural Netw; 2024 Feb; 170():176-189. PubMed ID: 37989039

  • 13. EPANet-KD: Efficient progressive attention network for fine-grained provincial village classification via knowledge distillation.
    Zhang C; Liu C; Gong H; Teng J
    PLoS One; 2024; 19(2):e0298452. PubMed ID: 38359020

  • 14. Cross-View Gait Recognition Method Based on Multi-Teacher Joint Knowledge Distillation.
    Li R; Yun L; Zhang M; Yang Y; Cheng F
    Sensors (Basel); 2023 Nov; 23(22):. PubMed ID: 38005675

  • 15. Semantics-Aware Adaptive Knowledge Distillation for Sensor-to-Vision Action Recognition.
    Liu Y; Wang K; Li G; Lin L
    IEEE Trans Image Process; 2021; 30():5573-5588. PubMed ID: 34110991

  • 16. Comprehensive learning and adaptive teaching: Distilling multi-modal knowledge for pathological glioma grading.
    Xing X; Zhu M; Chen Z; Yuan Y
    Med Image Anal; 2024 Jan; 91():102990. PubMed ID: 37864912

  • 17. CKDF: Cascaded Knowledge Distillation Framework for Robust Incremental Learning.
    Li K; Wan J; Yu S
    IEEE Trans Image Process; 2022; 31():3825-3837. PubMed ID: 35609094

  • 18. FCKDNet: A Feature Condensation Knowledge Distillation Network for Semantic Segmentation.
    Yuan W; Lu X; Zhang R; Liu Y
    Entropy (Basel); 2023 Jan; 25(1):. PubMed ID: 36673266

  • 19. Multiscale knowledge distillation with attention based fusion for robust human activity recognition.
    Yuan Z; Yang Z; Ning H; Tang X
    Sci Rep; 2024 May; 14(1):12411. PubMed ID: 38816446

  • 20. Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution.
    Yang C; An Z; Cai L; Xu Y
    IEEE Trans Neural Netw Learn Syst; 2024 Feb; 35(2):2094-2108. PubMed ID: 35820013
