170 related articles for PubMed ID 33513099 (Knowledge Distillation and Student-Teacher Learning for Visual Intelligence: A Review and New Outlooks)
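
As background for the entries below, here is a minimal sketch of the vanilla teacher-student distillation objective (softened-softmax matching plus a hard-label term, in the spirit of Hinton et al.'s formulation) that most of the listed papers extend. The function name, argument names, and hyperparameter defaults are illustrative assumptions, not taken from any of the cited works.

```python
# Minimal sketch of vanilla teacher-student knowledge distillation.
# All names here are placeholders for illustration only.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Weighted sum of a soft loss (KL divergence between temperature-scaled
    class distributions) and a hard cross-entropy loss on ground-truth labels."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),   # student log-probs at temperature T
        F.softmax(teacher_logits / T, dim=1),       # teacher probs at temperature T
        reduction="batchmean",
    ) * (T * T)  # rescale so soft-loss gradients match the hard-loss magnitude
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

In a typical training loop, teacher and student see the same batch; the teacher runs under torch.no_grad() and only the student's parameters are updated.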

  • 1. Knowledge Distillation and Student-Teacher Learning for Visual Intelligence: A Review and New Outlooks.
    Wang L; Yoon KJ
    IEEE Trans Pattern Anal Mach Intell; 2022 Jun; 44(6):3048-3068. PubMed ID: 33513099

  • 2. Multi-view Teacher-Student Network.
    Tian Y; Sun S; Tang J
    Neural Netw; 2022 Feb; 146():69-84. PubMed ID: 34839092

  • 3. When Object Detection Meets Knowledge Distillation: A Survey.
    Li Z; Xu P; Chang X; Yang L; Zhang Y; Yao L; Chen X
    IEEE Trans Pattern Anal Mach Intell; 2023 Aug; 45(8):10555-10579. PubMed ID: 37028387

  • 4. Highlight Every Step: Knowledge Distillation via Collaborative Teaching.
    Zhao H; Sun X; Dong J; Chen C; Dong Z
    IEEE Trans Cybern; 2022 Apr; 52(4):2070-2081. PubMed ID: 32721909

  • 5. DCCD: Reducing Neural Network Redundancy via Distillation.
    Liu Y; Chen J; Liu Y
    IEEE Trans Neural Netw Learn Syst; 2023 Jan; [Epub ahead of print]. PubMed ID: 37022254

  • 6. Leveraging different learning styles for improved knowledge distillation in biomedical imaging.
    Niyaz U; Sambyal AS; Bathula DR
    Comput Biol Med; 2024 Jan; 168():107764. PubMed ID: 38056210

  • 7. Self-Distillation: Towards Efficient and Compact Neural Networks.
    Zhang L; Bao C; Ma K
    IEEE Trans Pattern Anal Mach Intell; 2022 Aug; 44(8):4388-4403. PubMed ID: 33735074

  • 8. Collaborative Knowledge Distillation via Multiknowledge Transfer.
    Gou J; Sun L; Yu B; Du L; Ramamohanarao K; Tao D
    IEEE Trans Neural Netw Learn Syst; 2024 May; 35(5):6718-6730. PubMed ID: 36264723

  • 9. Template-Driven Knowledge Distillation for Compact and Accurate Periocular Biometrics Deep-Learning Models.
    Boutros F; Damer N; Raja K; Kirchbuchner F; Kuijper A
    Sensors (Basel); 2022 Mar; 22(5). PubMed ID: 35271074

  • 10. Pea-KD: Parameter-efficient and accurate Knowledge Distillation on BERT.
    Cho I; Kang U
    PLoS One; 2022; 17(2):e0263592. PubMed ID: 35180258

  • 11. Multi-teacher knowledge distillation based on joint Guidance of Probe and Adaptive Corrector.
    Shang R; Li W; Zhu S; Jiao L; Li Y
    Neural Netw; 2023 Jul; 164():345-356. PubMed ID: 37163850

  • 12. Teacher-student complementary sample contrastive distillation.
    Bao Z; Huang Z; Gou J; Du L; Liu K; Zhou J; Chen Y
    Neural Netw; 2024 Feb; 170():176-189. PubMed ID: 37989039

  • 13. Robust Student Network Learning.
    Guo T; Xu C; He S; Shi B; Xu C; Tao D
    IEEE Trans Neural Netw Learn Syst; 2020 Jul; 31(7):2455-2468. PubMed ID: 31425124

  • 14. Learning Student Networks via Feature Embedding.
    Chen H; Wang Y; Xu C; Xu C; Tao D
    IEEE Trans Neural Netw Learn Syst; 2021 Jan; 32(1):25-35. PubMed ID: 32092018

  • 15. Mitigating carbon footprint for knowledge distillation based deep learning model compression.
    Rafat K; Islam S; Mahfug AA; Hossain MI; Rahman F; Momen S; Rahman S; Mohammed N
    PLoS One; 2023; 18(5):e0285668. PubMed ID: 37186614

  • 16. FCKDNet: A Feature Condensation Knowledge Distillation Network for Semantic Segmentation.
    Yuan W; Lu X; Zhang R; Liu Y
    Entropy (Basel); 2023 Jan; 25(1). PubMed ID: 36673266

  • 17. Classification of diabetic retinopathy using unlabeled data and knowledge distillation.
    Abbasi S; Hajabdollahi M; Khadivi P; Karimi N; Roshandel R; Shirani S; Samavi S
    Artif Intell Med; 2021 Nov; 121():102176. PubMed ID: 34763798

  • 18. Knowledge distillation in deep learning and its applications.
    Alkhulaifi A; Alsahli F; Ahmad I
    PeerJ Comput Sci; 2021; 7():e474. PubMed ID: 33954248

  • 19. Memory-Replay Knowledge Distillation.
    Wang J; Zhang P; Li Y
    Sensors (Basel); 2021 Apr; 21(8). PubMed ID: 33921068

  • 20. Distilled Siamese Networks for Visual Tracking.
    Shen J; Liu Y; Dong X; Lu X; Khan FS; Hoi S
    IEEE Trans Pattern Anal Mach Intell; 2022 Dec; 44(12):8896-8909. PubMed ID: 34762585
