

BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

125 related articles for article (PubMed ID: 38818128)

  • 41. Attention and feature transfer based knowledge distillation.
    Yang G; Yu S; Sheng Y; Yang H
    Sci Rep; 2023 Oct; 13(1):18369. PubMed ID: 37884556

  • 42. LAD: Layer-Wise Adaptive Distillation for BERT Model Compression.
    Lin YJ; Chen KY; Kao HY
    Sensors (Basel); 2023 Jan; 23(3):. PubMed ID: 36772523

  • 43. Knowledge Distillation in Histology Landscape by Multi-Layer Features Supervision.
    Javed S; Mahmood A; Qaiser T; Werghi N
    IEEE J Biomed Health Inform; 2023 Jan; PP():. PubMed ID: 37021915

  • 44. Lightweight Knowledge Distillation-Based Transfer Learning Framework for Rolling Bearing Fault Diagnosis.
    Lu R; Liu S; Gong Z; Xu C; Ma Z; Zhong Y; Li B
    Sensors (Basel); 2024 Mar; 24(6):. PubMed ID: 38544021

  • 45. Unsupervised Anomaly Detection with Distillated Teacher-Student Network Ensemble.
    Xiao Q; Wang J; Lin Y; Gongsa W; Hu G; Li M; Wang F
    Entropy (Basel); 2021 Feb; 23(2):. PubMed ID: 33561954

  • 46. Classification of Alzheimer's disease in MRI images using knowledge distillation framework: an investigation.
    Li Y; Luo J; Zhang J
    Int J Comput Assist Radiol Surg; 2022 Jul; 17(7):1235-1243. PubMed ID: 35633492

  • 47. RoS-KD: A Robust Stochastic Knowledge Distillation Approach for Noisy Medical Imaging.
    Jaiswal A; Ashutosh K; Rousseau JF; Peng Y; Wang Z; Ding Y
    Proc IEEE Int Conf Data Min; 2022; 2022():981-986. PubMed ID: 37038389

  • 48. Complementary label learning based on knowledge distillation.
    Ying P; Li Z; Sun R; Xu X
    Math Biosci Eng; 2023 Sep; 20(10):17905-17918. PubMed ID: 38052542

  • 49. LHAR: Lightweight Human Activity Recognition on Knowledge Distillation.
    Deng S; Chen J; Teng D; Yang C; Chen D; Jia T; Wang H
    IEEE J Biomed Health Inform; 2023 Jul; PP():. PubMed ID: 37494155

  • 50. Self-knowledge distillation for surgical phase recognition.
    Zhang J; Barbarisi S; Kadkhodamohammadi A; Stoyanov D; Luengo I
    Int J Comput Assist Radiol Surg; 2024 Jan; 19(1):61-68. PubMed ID: 37340283

  • 51. Self-Parameter Distillation Dehazing.
    Kim G; Kwon J
    IEEE Trans Image Process; 2022 Dec; PP():. PubMed ID: 37015501

  • 52. A General Dynamic Knowledge Distillation Method for Visual Analytics.
    Tu Z; Liu X; Xiao X
    IEEE Trans Image Process; 2022 Oct; PP():. PubMed ID: 36227819

  • 53. Inferior and Coordinate Distillation for Object Detectors.
    Zhang Y; Li Y; Pan Z
    Sensors (Basel); 2022 Jul; 22(15):. PubMed ID: 35957276

  • 54. Self-Distillation for Randomized Neural Networks.
    Hu M; Gao R; Suganthan PN
    IEEE Trans Neural Netw Learn Syst; 2023 Aug; PP():. PubMed ID: 37585327

  • 55. Adversarial learning-based multi-level dense-transmission knowledge distillation for AP-ROP detection.
    Xie H; Liu Y; Lei H; Song T; Yue G; Du Y; Wang T; Zhang G; Lei B
    Med Image Anal; 2023 Feb; 84():102725. PubMed ID: 36527770

  • 56. Efficient Crowd Counting via Dual Knowledge Distillation.
    Wang R; Hao Y; Hu L; Li X; Chen M; Miao Y; Humar I
    IEEE Trans Image Process; 2023 Dec; PP():. PubMed ID: 38127611

  • 57. Learning From Human Educational Wisdom: A Student-Centered Knowledge Distillation Method.
    Yang S; Yang J; Zhou M; Huang Z; Zheng WS; Yang X; Ren J
    IEEE Trans Pattern Anal Mach Intell; 2024 Jun; 46(6):4188-4205. PubMed ID: 38227419

  • 58. CReg-KD: Model refinement via confidence regularized knowledge distillation for brain imaging.
    Yang Y; Guo X; Ye C; Xiang Y; Ma T
    Med Image Anal; 2023 Oct; 89():102916. PubMed ID: 37549611

  • 59. MSKD: Structured knowledge distillation for efficient medical image segmentation.
    Zhao L; Qian X; Guo Y; Song J; Hou J; Gong J
    Comput Biol Med; 2023 Sep; 164():107284. PubMed ID: 37572439

  • 60. Knowledge Distillation for Face Photo-Sketch Synthesis.
    Zhu M; Li J; Wang N; Gao X
    IEEE Trans Neural Netw Learn Syst; 2022 Feb; 33(2):893-906. PubMed ID: 33108298
