These tools will no longer be maintained as of December 31, 2024. The archived website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors: a resource for Precision Medicine

225 related articles for article (PubMed ID: 33921068)

  • 21. Knowledge distillation based on multi-layer fusion features.
    Tan S; Guo R; Tang J; Jiang N; Zou J
    PLoS One; 2023; 18(8):e0285901. PubMed ID: 37639443

  • 22. CKDF: Cascaded Knowledge Distillation Framework for Robust Incremental Learning.
    Li K; Wan J; Yu S
    IEEE Trans Image Process; 2022; 31():3825-3837. PubMed ID: 35609094

  • 23. Deep Neural Network Self-Distillation Exploiting Data Representation Invariance.
    Xu TB; Liu CL
    IEEE Trans Neural Netw Learn Syst; 2022 Jan; 33(1):257-269. PubMed ID: 33074828

  • 24. Building a Compact Convolutional Neural Network for Embedded Intelligent Sensor Systems Using Group Sparsity and Knowledge Distillation.
    Cho J; Lee M
    Sensors (Basel); 2019 Oct; 19(19):. PubMed ID: 31590266

  • 25. Self-Distillation for Randomized Neural Networks.
    Hu M; Gao R; Suganthan PN
    IEEE Trans Neural Netw Learn Syst; 2024 Nov; 35(11):16119-16128. PubMed ID: 37585327

  • 26. Cervical Cell Image Classification-Based Knowledge Distillation.
    Gao W; Xu C; Li G; Zhang Y; Bai N; Li M
    Biomimetics (Basel); 2022 Nov; 7(4):. PubMed ID: 36412723

  • 27. Leveraging different learning styles for improved knowledge distillation in biomedical imaging.
    Niyaz U; Sambyal AS; Bathula DR
    Comput Biol Med; 2024 Jan; 168():107764. PubMed ID: 38056210

  • 28. Densely Distilled Flow-Based Knowledge Transfer in Teacher-Student Framework for Image Classification.
    Bae JH; Yeo D; Yim J; Kim NS; Pyo CS; Kim J
    IEEE Trans Image Process; 2020 Apr; ():. PubMed ID: 32286978

  • 29. Improving Knowledge Distillation With a Customized Teacher.
    Tan C; Liu J
    IEEE Trans Neural Netw Learn Syst; 2024 Feb; 35(2):2290-2299. PubMed ID: 35877790

  • 30. Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution.
    Yang C; An Z; Cai L; Xu Y
    IEEE Trans Neural Netw Learn Syst; 2024 Feb; 35(2):2094-2108. PubMed ID: 35820013

  • 31. ResKD: Residual-Guided Knowledge Distillation.
    Li X; Li S; Omar B; Wu F; Li X
    IEEE Trans Image Process; 2021; 30():4735-4746. PubMed ID: 33739924

  • 32. Knowledge Fusion Distillation: Improving Distillation with Multi-scale Attention Mechanisms.
    Li L; Su W; Liu F; He M; Liang X
    Neural Process Lett; 2023 Jan; ():1-16. PubMed ID: 36619739

  • 33. Fine-Grained Learning Behavior-Oriented Knowledge Distillation for Graph Neural Networks.
    Liu K; Huang Z; Wang CD; Gao B; Chen Y
    IEEE Trans Neural Netw Learn Syst; 2024 Jul; PP():. PubMed ID: 39012738

  • 34. Resolution-based distillation for efficient histology image classification.
    DiPalma J; Suriawinata AA; Tafe LJ; Torresani L; Hassanpour S
    Artif Intell Med; 2021 Sep; 119():102136. PubMed ID: 34531005

  • 35. Template-Driven Knowledge Distillation for Compact and Accurate Periocular Biometrics Deep-Learning Models.
    Boutros F; Damer N; Raja K; Kirchbuchner F; Kuijper A
    Sensors (Basel); 2022 Mar; 22(5):. PubMed ID: 35271074

  • 36. Learning With Privileged Multimodal Knowledge for Unimodal Segmentation.
    Chen C; Dou Q; Jin Y; Liu Q; Heng PA
    IEEE Trans Med Imaging; 2022 Mar; 41(3):621-632. PubMed ID: 34633927

  • 37. Spot-Adaptive Knowledge Distillation.
    Song J; Chen Y; Ye J; Song M
    IEEE Trans Image Process; 2022; 31():3359-3370. PubMed ID: 35503832

  • 38. Collaborative Knowledge Distillation via Multiknowledge Transfer.
    Gou J; Sun L; Yu B; Du L; Ramamohanarao K; Tao D
    IEEE Trans Neural Netw Learn Syst; 2024 May; 35(5):6718-6730. PubMed ID: 36264723

  • 39. Overcoming limitation of dissociation between MD and MI classifications of breast cancer histopathological images through a novel decomposed feature-based knowledge distillation method.
    Sepahvand M; Abdali-Mohammadi F
    Comput Biol Med; 2022 Jun; 145():105413. PubMed ID: 35325731

  • 40. Generalized Knowledge Distillation via Relationship Matching.
    Ye HJ; Lu S; Zhan DC
    IEEE Trans Pattern Anal Mach Intell; 2023 Feb; 45(2):1817-1834. PubMed ID: 35298374
