BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

136 related articles for article (PubMed ID: 34506549)

  • 1. BERTtoCNN: Similarity-preserving enhanced knowledge distillation for stance detection.
    Li Y; Sun Y; Zhu N
    PLoS One; 2021; 16(9):e0257130. PubMed ID: 34506549

  • 2. LAD: Layer-Wise Adaptive Distillation for BERT Model Compression.
    Lin YJ; Chen KY; Kao HY
    Sensors (Basel); 2023 Jan; 23(3). PubMed ID: 36772523

  • 3. Leveraging Symbolic Knowledge Bases for Commonsense Natural Language Inference Using Pattern Theory.
    Aakur SN; Sarkar S
    IEEE Trans Pattern Anal Mach Intell; 2023 Nov; 45(11):13185-13202. PubMed ID: 37339033

  • 4. DDK: Dynamic structure pruning based on differentiable search and recursive knowledge distillation for BERT.
    Zhang Z; Lu Y; Wang T; Wei X; Wei Z
    Neural Netw; 2024 May; 173:106164. PubMed ID: 38367353

  • 5. Multi-view Teacher-Student Network.
    Tian Y; Sun S; Tang J
    Neural Netw; 2022 Feb; 146:69-84. PubMed ID: 34839092

  • 6. Layerwised multimodal knowledge distillation for vision-language pretrained model.
    Wang J; Liao D; Zhang Y; Xu D; Zhang X
    Neural Netw; 2024 Jul; 175:106272. PubMed ID: 38569460

  • 7. Highlight Every Step: Knowledge Distillation via Collaborative Teaching.
    Zhao H; Sun X; Dong J; Chen C; Dong Z
    IEEE Trans Cybern; 2022 Apr; 52(4):2070-2081. PubMed ID: 32721909

  • 8. Self-Distillation: Towards Efficient and Compact Neural Networks.
    Zhang L; Bao C; Ma K
    IEEE Trans Pattern Anal Mach Intell; 2022 Aug; 44(8):4388-4403. PubMed ID: 33735074

  • 9. Learning With Privileged Multimodal Knowledge for Unimodal Segmentation.
    Chen C; Dou Q; Jin Y; Liu Q; Heng PA
    IEEE Trans Med Imaging; 2022 Mar; 41(3):621-632. PubMed ID: 34633927

  • 10. Learning Student Networks via Feature Embedding.
    Chen H; Wang Y; Xu C; Xu C; Tao D
    IEEE Trans Neural Netw Learn Syst; 2021 Jan; 32(1):25-35. PubMed ID: 32092018

  • 11. Generalized Knowledge Distillation via Relationship Matching.
    Ye HJ; Lu S; Zhan DC
    IEEE Trans Pattern Anal Mach Intell; 2023 Feb; 45(2):1817-1834. PubMed ID: 35298374

  • 12. Self-knowledge distillation for surgical phase recognition.
    Zhang J; Barbarisi S; Kadkhodamohammadi A; Stoyanov D; Luengo I
    Int J Comput Assist Radiol Surg; 2024 Jan; 19(1):61-68. PubMed ID: 37340283

  • 13. A 3DCNN-Based Knowledge Distillation Framework for Human Activity Recognition.
    Ullah H; Munir A
    J Imaging; 2023 Apr; 9(4). PubMed ID: 37103233

  • 14. On Representation Knowledge Distillation for Graph Neural Networks.
    Joshi CK; Liu F; Xun X; Lin J; Foo CS
    IEEE Trans Neural Netw Learn Syst; 2024 Apr; 35(4):4656-4667. PubMed ID: 36459610

  • 15. Resolution-Aware Knowledge Distillation for Efficient Inference.
    Feng Z; Lai J; Xie X
    IEEE Trans Image Process; 2021; 30:6985-6996. PubMed ID: 34347598

  • 16. Knowledge Distillation for Semantic Segmentation Using Channel and Spatial Correlations and Adaptive Cross Entropy.
    Park S; Heo YS
    Sensors (Basel); 2020 Aug; 20(16). PubMed ID: 32824456

  • 17. A Novel Approach to Classifying Breast Cancer Histopathology Biopsy Images Using Bilateral Knowledge Distillation and Label Smoothing Regularization.
    Chaudhury S; Shelke N; Sau K; Prasanalakshmi B; Shabaz M
    Comput Math Methods Med; 2021; 2021:4019358. PubMed ID: 34721657

  • 18. Cosine similarity knowledge distillation for surface anomaly detection.
    Sheng S; Jing J; Wang Z; Zhang H
    Sci Rep; 2024 Apr; 14(1):8150. PubMed ID: 38589492

  • 19. Unsupervised Anomaly Detection with Distillated Teacher-Student Network Ensemble.
    Xiao Q; Wang J; Lin Y; Gongsa W; Hu G; Li M; Wang F
    Entropy (Basel); 2021 Feb; 23(2). PubMed ID: 33561954

  • 20. Classification of diabetic retinopathy using unlabeled data and knowledge distillation.
    Abbasi S; Hajabdollahi M; Khadivi P; Karimi N; Roshandel R; Shirani S; Samavi S
    Artif Intell Med; 2021 Nov; 121:102176. PubMed ID: 34763798
