BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

133 related articles for article (PubMed ID: 38569460)

  • 41. Teacher-student complementary sample contrastive distillation.
    Bao Z; Huang Z; Gou J; Du L; Liu K; Zhou J; Chen Y
    Neural Netw; 2024 Feb; 170():176-189. PubMed ID: 37989039

  • 42. RCKD: Response-Based Cross-Task Knowledge Distillation for Pathological Image Analysis.
    Kim H; Kwak TY; Chang H; Kim SW; Kim I
    Bioengineering (Basel); 2023 Nov; 10(11):. PubMed ID: 38002403

  • 43. GSB: Group superposition binarization for vision transformer with limited training samples.
    Gao T; Xu CZ; Zhang L; Kong H
    Neural Netw; 2024 Apr; 172():106133. PubMed ID: 38266471

  • 44. Pea-KD: Parameter-efficient and accurate Knowledge Distillation on BERT.
    Cho I; Kang U
    PLoS One; 2022; 17(2):e0263592. PubMed ID: 35180258

  • 45. Attention and feature transfer based knowledge distillation.
    Yang G; Yu S; Sheng Y; Yang H
    Sci Rep; 2023 Oct; 13(1):18369. PubMed ID: 37884556

  • 46. Knowledge Fusion Distillation: Improving Distillation with Multi-scale Attention Mechanisms.
    Li L; Su W; Liu F; He M; Liang X
    Neural Process Lett; 2023 Jan; ():1-16. PubMed ID: 36619739

  • 47. Structured Knowledge Distillation for Accurate and Efficient Object Detection.
    Zhang L; Ma K
    IEEE Trans Pattern Anal Mach Intell; 2023 Dec; 45(12):15706-15724. PubMed ID: 37527292

  • 48. Integrating Multimodal Information in Large Pretrained Transformers.
    Rahman W; Hasan MK; Lee S; Zadeh A; Mao C; Morency LP; Hoque E
    Proc Conf Assoc Comput Linguist Meet; 2020 Jul; 2020():2359-2369. PubMed ID: 33782629

  • 49. A General Dynamic Knowledge Distillation Method for Visual Analytics.
    Tu Z; Liu X; Xiao X
    IEEE Trans Image Process; 2022 Oct; PP():. PubMed ID: 36227819

  • 50. Efficient knowledge distillation for liver CT segmentation using growing assistant network.
    Xu P; Kim K; Koh J; Wu D; Lee YR; Park SY; Tak WY; Liu H; Li Q
    Phys Med Biol; 2021 Nov; 66(23):. PubMed ID: 34768246

  • 51. A 3DCNN-Based Knowledge Distillation Framework for Human Activity Recognition.
    Ullah H; Munir A
    J Imaging; 2023 Apr; 9(4):. PubMed ID: 37103233

  • 52. Spot-Adaptive Knowledge Distillation.
    Song J; Chen Y; Ye J; Song M
    IEEE Trans Image Process; 2022; 31():3359-3370. PubMed ID: 35503832

  • 53. Adversarial Knowledge Distillation Based Biomedical Factoid Question Answering.
    Bai J; Yin C; Zhang J; Wang Y; Dong Y; Rong W; Xiong Z
    IEEE/ACM Trans Comput Biol Bioinform; 2023; 20(1):106-118. PubMed ID: 35316189

  • 54. Pixel Distillation: Cost-flexible Distillation across Image Sizes and Heterogeneous Networks.
    Guo G; Zhang D; Han L; Liu N; Cheng MM; Han J
    IEEE Trans Pattern Anal Mach Intell; 2024 Jul; PP():. PubMed ID: 38949946

  • 55. Adversarial learning-based multi-level dense-transmission knowledge distillation for AP-ROP detection.
    Xie H; Liu Y; Lei H; Song T; Yue G; Du Y; Wang T; Zhang G; Lei B
    Med Image Anal; 2023 Feb; 84():102725. PubMed ID: 36527770

  • 56. Resolution-based distillation for efficient histology image classification.
    DiPalma J; Suriawinata AA; Tafe LJ; Torresani L; Hassanpour S
    Artif Intell Med; 2021 Sep; 119():102136. PubMed ID: 34531005

  • 57. Collaborative Knowledge Distillation via Multiknowledge Transfer.
    Gou J; Sun L; Yu B; Du L; Ramamohanarao K; Tao D
    IEEE Trans Neural Netw Learn Syst; 2024 May; 35(5):6718-6730. PubMed ID: 36264723

  • 58. Knowledge Distillation for Face Photo-Sketch Synthesis.
    Zhu M; Li J; Wang N; Gao X
    IEEE Trans Neural Netw Learn Syst; 2022 Feb; 33(2):893-906. PubMed ID: 33108298

  • 59. Knowledge Distillation in Histology Landscape by Multi-Layer Features Supervision.
    Javed S; Mahmood A; Qaiser T; Werghi N
    IEEE J Biomed Health Inform; 2023 Jan; PP():. PubMed ID: 37021915

  • 60. An ultra-fast deep-learning-based dose engine for prostate VMAT via knowledge distillation framework with limited patient data.
    Tseng W; Liu H; Yang Y; Liu C; Lu B
    Phys Med Biol; 2022 Dec; 68(1):. PubMed ID: 36533689
