BIOMARKERS

Molecular Biopsy of Human Tumors: a resource for Precision Medicine

191 related articles for article (PubMed ID: 37021915)

  • 41. MHD-Net: Memory-Aware Hetero-Modal Distillation Network for Thymic Epithelial Tumor Typing With Missing Pathology Modality.
    Zhang H; Liu J; Liu W; Chen H; Yu Z; Yuan Y; Wang P; Qin J
    IEEE J Biomed Health Inform; 2024 May; 28(5):3003-3014. PubMed ID: 38470599

  • 42. Real-Time Correlation Tracking via Joint Model Compression and Transfer.
    Wang N; Zhou W; Song Y; Ma C; Li H
    IEEE Trans Image Process; 2020 Apr. PubMed ID: 32356748

  • 43. Classification of diabetic retinopathy using unlabeled data and knowledge distillation.
    Abbasi S; Hajabdollahi M; Khadivi P; Karimi N; Roshandel R; Shirani S; Samavi S
    Artif Intell Med; 2021 Nov; 121:102176. PubMed ID: 34763798

  • 44. EPANet-KD: Efficient progressive attention network for fine-grained provincial village classification via knowledge distillation.
    Zhang C; Liu C; Gong H; Teng J
    PLoS One; 2024; 19(2):e0298452. PubMed ID: 38359020

  • 45. Compressing deep graph convolution network with multi-staged knowledge distillation.
    Kim J; Jung J; Kang U
    PLoS One; 2021; 16(8):e0256187. PubMed ID: 34388224

  • 46. Resolution-Aware Knowledge Distillation for Efficient Inference.
    Feng Z; Lai J; Xie X
    IEEE Trans Image Process; 2021; 30:6985-6996. PubMed ID: 34347598

  • 47. Highlight Every Step: Knowledge Distillation via Collaborative Teaching.
    Zhao H; Sun X; Dong J; Chen C; Dong Z
    IEEE Trans Cybern; 2022 Apr; 52(4):2070-2081. PubMed ID: 32721909

  • 48. Learning From Human Educational Wisdom: A Student-Centered Knowledge Distillation Method.
    Yang S; Yang J; Zhou M; Huang Z; Zheng WS; Yang X; Ren J
    IEEE Trans Pattern Anal Mach Intell; 2024 Jun; 46(6):4188-4205. PubMed ID: 38227419

  • 49. Efficient Crowd Counting via Dual Knowledge Distillation.
    Wang R; Hao Y; Hu L; Li X; Chen M; Miao Y; Humar I
    IEEE Trans Image Process; 2023 Dec; PP. PubMed ID: 38127611

  • 50. Overcoming limitation of dissociation between MD and MI classifications of breast cancer histopathological images through a novel decomposed feature-based knowledge distillation method.
    Sepahvand M; Abdali-Mohammadi F
    Comput Biol Med; 2022 Jun; 145:105413. PubMed ID: 35325731

  • 51. Knowledge Distillation for Face Photo-Sketch Synthesis.
    Zhu M; Li J; Wang N; Gao X
    IEEE Trans Neural Netw Learn Syst; 2022 Feb; 33(2):893-906. PubMed ID: 33108298

  • 52. Inferior and Coordinate Distillation for Object Detectors.
    Zhang Y; Li Y; Pan Z
    Sensors (Basel); 2022 Jul; 22(15). PubMed ID: 35957276

  • 53. Cervical Cell Image Classification-Based Knowledge Distillation.
    Gao W; Xu C; Li G; Zhang Y; Bai N; Li M
    Biomimetics (Basel); 2022 Nov; 7(4). PubMed ID: 36412723

  • 54. Self-knowledge distillation for surgical phase recognition.
    Zhang J; Barbarisi S; Kadkhodamohammadi A; Stoyanov D; Luengo I
    Int J Comput Assist Radiol Surg; 2024 Jan; 19(1):61-68. PubMed ID: 37340283

  • 55. Multiscale knowledge distillation with attention based fusion for robust human activity recognition.
    Yuan Z; Yang Z; Ning H; Tang X
    Sci Rep; 2024 May; 14(1):12411. PubMed ID: 38816446

  • 56. Learning With Privileged Multimodal Knowledge for Unimodal Segmentation.
    Chen C; Dou Q; Jin Y; Liu Q; Heng PA
    IEEE Trans Med Imaging; 2022 Mar; 41(3):621-632. PubMed ID: 34633927

  • 57. A New Framework of Collaborative Learning for Adaptive Metric Distillation.
    Liu H; Ye M; Wang Y; Zhao S; Li P; Shen J
    IEEE Trans Neural Netw Learn Syst; 2024 Jun; 35(6):8266-8277. PubMed ID: 37022854

  • 58. Distilling a Powerful Student Model via Online Knowledge Distillation.
    Li S; Lin M; Wang Y; Wu Y; Tian Y; Shao L; Ji R
    IEEE Trans Neural Netw Learn Syst; 2023 Nov; 34(11):8743-8752. PubMed ID: 35254994

  • 59. TSFD-Net: Tissue specific feature distillation network for nuclei segmentation and classification.
    Ilyas T; Mannan ZI; Khan A; Azam S; Kim H; De Boer F
    Neural Netw; 2022 Jul; 151:1-15. PubMed ID: 35367734

  • 60. A Novel Approach to Classifying Breast Cancer Histopathology Biopsy Images Using Bilateral Knowledge Distillation and Label Smoothing Regularization.
    Chaudhury S; Shelke N; Sau K; Prasanalakshmi B; Shabaz M
    Comput Math Methods Med; 2021; 2021:4019358. PubMed ID: 34721657
