BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

139 related articles for the article with PubMed ID 34506549 (entries 21-40 shown below).

  • 21. ResKD: Residual-Guided Knowledge Distillation.
    Li X; Li S; Omar B; Wu F; Li X
    IEEE Trans Image Process; 2021; 30:4735-4746. PubMed ID: 33739924

  • 22. Knowledge Distillation in Histology Landscape by Multi-Layer Features Supervision.
    Javed S; Mahmood A; Qaiser T; Werghi N
    IEEE J Biomed Health Inform; 2023 Jan; PP (early access). PubMed ID: 37021915

  • 23. A General Dynamic Knowledge Distillation Method for Visual Analytics.
    Tu Z; Liu X; Xiao X
    IEEE Trans Image Process; 2022 Oct; PP (early access). PubMed ID: 36227819

  • 24. Learning to Explore Distillability and Sparsability: A Joint Framework for Model Compression.
    Liu Y; Cao J; Li B; Hu W; Maybank S
    IEEE Trans Pattern Anal Mach Intell; 2023 Mar; 45(3):3378-3395. PubMed ID: 35731774

  • 25. Pea-KD: Parameter-efficient and accurate Knowledge Distillation on BERT.
    Cho I; Kang U
    PLoS One; 2022; 17(2):e0263592. PubMed ID: 35180258

  • 26. Adversarial Distillation for Learning with Privileged Provisions.
    Wang X; Zhang R; Sun Y; Qi J
    IEEE Trans Pattern Anal Mach Intell; 2021 Mar; 43(3):786-797. PubMed ID: 31545712

  • 27. Adversarial Knowledge Distillation Based Biomedical Factoid Question Answering.
    Bai J; Yin C; Zhang J; Wang Y; Dong Y; Rong W; Xiong Z
    IEEE/ACM Trans Comput Biol Bioinform; 2023; 20(1):106-118. PubMed ID: 35316189

  • 28. Knowledge distillation under ideal joint classifier assumption.
    Li H; Chen X; Ditzler G; Roveda J; Li A
    Neural Netw; 2024 May; 173:106160. PubMed ID: 38330746

  • 29. Efficient knowledge distillation for liver CT segmentation using growing assistant network.
    Xu P; Kim K; Koh J; Wu D; Lee YR; Park SY; Tak WY; Liu H; Li Q
    Phys Med Biol; 2021 Nov; 66(23). PubMed ID: 34768246

  • 30. Self-Distillation for Randomized Neural Networks.
    Hu M; Gao R; Suganthan PN
    IEEE Trans Neural Netw Learn Syst; 2023 Aug; PP (early access). PubMed ID: 37585327

  • 31. Pre-trained language model augmented adversarial training network for Chinese clinical event detection.
    Zhang ZC; Zhang MY; Zhou T; Qiu YL
    Math Biosci Eng; 2020 Mar; 17(4):2825-2841. PubMed ID: 32987500

  • 32. A non-negative feedback self-distillation method for salient object detection.
    Chen L; Cao T; Zheng Y; Yang J; Wang Y; Wang Y; Zhang B
    PeerJ Comput Sci; 2023; 9:e1435. PubMed ID: 37409081

  • 33. Compressing deep graph convolution network with multi-staged knowledge distillation.
    Kim J; Jung J; Kang U
    PLoS One; 2021; 16(8):e0256187. PubMed ID: 34388224

  • 34. Prototype-based sample-weighted distillation unified framework adapted to missing modality sentiment analysis.
    Zhang Y; Liu F; Zhuang X; Hou Y; Zhang Y
    Neural Netw; 2024 Sep; 177:106397. PubMed ID: 38805799

  • 35. CapsTM: capsule network for Chinese medical text matching.
    Yu X; Shen Y; Ni Y; Huang X; Wang X; Chen Q; Tang B
    BMC Med Inform Decis Mak; 2021 Jul; 21(Suppl 2):94. PubMed ID: 34330253

  • 36. Inferior and Coordinate Distillation for Object Detectors.
    Zhang Y; Li Y; Pan Z
    Sensors (Basel); 2022 Jul; 22(15). PubMed ID: 35957276

  • 37. Lightweight Depth Completion Network with Local Similarity-Preserving Knowledge Distillation.
    Jeong Y; Park J; Cho D; Hwang Y; Choi SB; Kweon IS
    Sensors (Basel); 2022 Sep; 22(19). PubMed ID: 36236485

  • 38. Spot-Adaptive Knowledge Distillation.
    Song J; Chen Y; Ye J; Song M
    IEEE Trans Image Process; 2022; 31:3359-3370. PubMed ID: 35503832

  • 39. Knowledge Distillation via Constrained Variational Inference.
    Saeedi A; Utsumi Y; Sun L; Batmanghelich K; Lehman LH
    Proc AAAI Conf Artif Intell; 2022; 36(7):8132-8140. PubMed ID: 36092768

  • 40. Uncertainty-Aware Knowledge Distillation for Collision Identification of Collaborative Robots.
    Kwon W; Jin Y; Lee SJ
    Sensors (Basel); 2021 Oct; 21(19). PubMed ID: 34640993
