These tools are no longer maintained as of December 31, 2024. The archived website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors

- a resource for Precision Medicine

129 related articles for article (PubMed ID: 34640843)

  • 21. Knowledge distillation based on multi-layer fusion features.
    Tan S; Guo R; Tang J; Jiang N; Zou J
    PLoS One; 2023; 18(8):e0285901. PubMed ID: 37639443
    [TBL] [Abstract][Full Text] [Related]  

  • 22. Sample self-selection using dual teacher networks for pathological image classification with noisy labels.
    Han G; Guo W; Zhang H; Jin J; Gan X; Zhao X
    Comput Biol Med; 2024 May; 174():108489. PubMed ID: 38640633
    [TBL] [Abstract][Full Text] [Related]  

  • 23. Semi-supervised training of deep convolutional neural networks with heterogeneous data and few local annotations: An experiment on prostate histopathology image classification.
    Marini N; Otálora S; Müller H; Atzori M
    Med Image Anal; 2021 Oct; 73():102165. PubMed ID: 34303169
    [TBL] [Abstract][Full Text] [Related]  

  • 24. A Novel Approach to Classifying Breast Cancer Histopathology Biopsy Images Using Bilateral Knowledge Distillation and Label Smoothing Regularization.
    Chaudhury S; Shelke N; Sau K; Prasanalakshmi B; Shabaz M
    Comput Math Methods Med; 2021; 2021():4019358. PubMed ID: 34721657
    [TBL] [Abstract][Full Text] [Related]  

  • 25. Relieving the Incompatibility of Network Representation and Classification for Long-Tailed Data Distribution.
    Hu H; Gao M; Wu M
    Comput Intell Neurosci; 2021; 2021():6702625. PubMed ID: 34987568
    [TBL] [Abstract][Full Text] [Related]  

  • 26. Classification of diabetic retinopathy using unlabeled data and knowledge distillation.
    Abbasi S; Hajabdollahi M; Khadivi P; Karimi N; Roshandel R; Shirani S; Samavi S
    Artif Intell Med; 2021 Nov; 121():102176. PubMed ID: 34763798
    [TBL] [Abstract][Full Text] [Related]  

  • 27. Memory-Replay Knowledge Distillation.
    Wang J; Zhang P; Li Y
    Sensors (Basel); 2021 Apr; 21(8):. PubMed ID: 33921068
    [TBL] [Abstract][Full Text] [Related]  

  • 28. Resolution-Aware Knowledge Distillation for Efficient Inference.
    Feng Z; Lai J; Xie X
    IEEE Trans Image Process; 2021; 30():6985-6996. PubMed ID: 34347598
    [TBL] [Abstract][Full Text] [Related]  

  • 29. Self-knowledge distillation for surgical phase recognition.
    Zhang J; Barbarisi S; Kadkhodamohammadi A; Stoyanov D; Luengo I
    Int J Comput Assist Radiol Surg; 2024 Jan; 19(1):61-68. PubMed ID: 37340283
    [TBL] [Abstract][Full Text] [Related]  

  • 30. Teacher-student guided knowledge distillation for unsupervised convolutional neural network-based speckle tracking in ultrasound strain elastography.
    Xiang T; Li Y; Deng H; Tian C; Peng B; Jiang J
    Med Biol Eng Comput; 2024 Aug; 62(8):2265-2279. PubMed ID: 38627356
    [TBL] [Abstract][Full Text] [Related]  

  • 31. Graph Flow: Cross-Layer Graph Flow Distillation for Dual Efficient Medical Image Segmentation.
    Zou W; Qi X; Zhou W; Sun M; Sun Z; Shan C
    IEEE Trans Med Imaging; 2023 Apr; 42(4):1159-1171. PubMed ID: 36423314
    [TBL] [Abstract][Full Text] [Related]  

  • 32. Efficient Combination of CNN and Transformer for Dual-Teacher Uncertainty-guided Semi-supervised Medical Image Segmentation.
    Xiao Z; Su Y; Deng Z; Zhang W
    Comput Methods Programs Biomed; 2022 Nov; 226():107099. PubMed ID: 36116398
    [TBL] [Abstract][Full Text] [Related]  

  • 33. MABAL: a Novel Deep-Learning Architecture for Machine-Assisted Bone Age Labeling.
    Mutasa S; Chang PD; Ruzal-Shapiro C; Ayyala R
    J Digit Imaging; 2018 Aug; 31(4):513-519. PubMed ID: 29404850
    [TBL] [Abstract][Full Text] [Related]  

  • 34. Prototype-based sample-weighted distillation unified framework adapted to missing modality sentiment analysis.
    Zhang Y; Liu F; Zhuang X; Hou Y; Zhang Y
    Neural Netw; 2024 Sep; 177():106397. PubMed ID: 38805799
    [TBL] [Abstract][Full Text] [Related]  

  • 35. Self-Distillation: Towards Efficient and Compact Neural Networks.
    Zhang L; Bao C; Ma K
    IEEE Trans Pattern Anal Mach Intell; 2022 Aug; 44(8):4388-4403. PubMed ID: 33735074
    [TBL] [Abstract][Full Text] [Related]  

  • 36. Building a Compact Convolutional Neural Network for Embedded Intelligent Sensor Systems Using Group Sparsity and Knowledge Distillation.
    Cho J; Lee M
    Sensors (Basel); 2019 Oct; 19(19):. PubMed ID: 31590266
    [TBL] [Abstract][Full Text] [Related]  

  • 37. Semantics-Aware Adaptive Knowledge Distillation for Sensor-to-Vision Action Recognition.
    Liu Y; Wang K; Li G; Lin L
    IEEE Trans Image Process; 2021; 30():5573-5588. PubMed ID: 34110991
    [TBL] [Abstract][Full Text] [Related]  

  • 38. A Preliminary Study of Deep Learning Sensor Fusion for Pedestrian Detection.
    Plascencia AC; García-Gómez P; Perez EB; DeMas-Giménez G; Casas JR; Royo S
    Sensors (Basel); 2023 Apr; 23(8):. PubMed ID: 37112506
    [TBL] [Abstract][Full Text] [Related]  

  • 39. Monocular Depth Estimation from a Fisheye Camera Based on Knowledge Distillation.
    Son E; Choi J; Song J; Jin Y; Lee SJ
    Sensors (Basel); 2023 Dec; 23(24):. PubMed ID: 38139714
    [TBL] [Abstract][Full Text] [Related]  

  • 40. EPANet-KD: Efficient progressive attention network for fine-grained provincial village classification via knowledge distillation.
    Zhang C; Liu C; Gong H; Teng J
    PLoS One; 2024; 19(2):e0298452. PubMed ID: 38359020
    [TBL] [Abstract][Full Text] [Related]  
