142 related articles for PubMed ID 37688070 (first 20 shown below)

  • 1. Multilayer Semantic Features Adaptive Distillation for Object Detectors.
    Zhang Z; Liu J; Chen Y; Mei W; Huang F; Chen L
    Sensors (Basel); 2023 Sep; 23(17). PubMed ID: 37688070

  • 2. FCKDNet: A Feature Condensation Knowledge Distillation Network for Semantic Segmentation.
    Yuan W; Lu X; Zhang R; Liu Y
    Entropy (Basel); 2023 Jan; 25(1). PubMed ID: 36673266

  • 3. Inferior and Coordinate Distillation for Object Detectors.
    Zhang Y; Li Y; Pan Z
    Sensors (Basel); 2022 Jul; 22(15). PubMed ID: 35957276

  • 4. Adaptive Perspective Distillation for Semantic Segmentation.
    Tian Z; Chen P; Lai X; Jiang L; Liu S; Zhao H; Yu B; Yang MC; Jia J
    IEEE Trans Pattern Anal Mach Intell; 2023 Feb; 45(2):1372-1387. PubMed ID: 35294341

  • 5. Pixel Distillation: Cost-flexible Distillation across Image Sizes and Heterogeneous Networks.
    Guo G; Zhang D; Han L; Liu N; Cheng MM; Han J
    IEEE Trans Pattern Anal Mach Intell; 2024 Jul; PP (epub ahead of print). PubMed ID: 38949946

  • 6. Spot-Adaptive Knowledge Distillation.
    Song J; Chen Y; Ye J; Song M
    IEEE Trans Image Process; 2022; 31:3359-3370. PubMed ID: 35503832

  • 7. Structured Knowledge Distillation for Accurate and Efficient Object Detection.
    Zhang L; Ma K
    IEEE Trans Pattern Anal Mach Intell; 2023 Dec; 45(12):15706-15724. PubMed ID: 37527292

  • 8. MSKD: Structured knowledge distillation for efficient medical image segmentation.
    Zhao L; Qian X; Guo Y; Song J; Hou J; Gong J
    Comput Biol Med; 2023 Sep; 164:107284. PubMed ID: 37572439

  • 9. A General Dynamic Knowledge Distillation Method for Visual Analytics.
    Tu Z; Liu X; Xiao X
    IEEE Trans Image Process; 2022 Oct; PP (epub ahead of print). PubMed ID: 36227819

  • 10. DCCD: Reducing Neural Network Redundancy via Distillation.
    Liu Y; Chen J; Liu Y
    IEEE Trans Neural Netw Learn Syst; 2023 Jan; PP (epub ahead of print). PubMed ID: 37022254

  • 11. Feature Map Distillation of Thin Nets for Low-Resolution Object Recognition.
    Huang Z; Yang S; Zhou M; Li Z; Gong Z; Chen Y
    IEEE Trans Image Process; 2022; 31:1364-1379. PubMed ID: 35025743

  • 12. Attention and feature transfer based knowledge distillation.
    Yang G; Yu S; Sheng Y; Yang H
    Sci Rep; 2023 Oct; 13(1):18369. PubMed ID: 37884556

  • 13. Knowledge Distillation for Semantic Segmentation Using Channel and Spatial Correlations and Adaptive Cross Entropy.
    Park S; Heo YS
    Sensors (Basel); 2020 Aug; 20(16). PubMed ID: 32824456

  • 14. Localization Distillation for Object Detection.
    Zheng Z; Ye R; Hou Q; Ren D; Wang P; Zuo W; Cheng MM
    IEEE Trans Pattern Anal Mach Intell; 2023 Aug; 45(8):10070-10083. PubMed ID: 37027640

  • 15. Knowledge distillation based on multi-layer fusion features.
    Tan S; Guo R; Tang J; Jiang N; Zou J
    PLoS One; 2023; 18(8):e0285901. PubMed ID: 37639443

  • 16. Multi-teacher knowledge distillation based on joint Guidance of Probe and Adaptive Corrector.
    Shang R; Li W; Zhu S; Jiao L; Li Y
    Neural Netw; 2023 Jul; 164:345-356. PubMed ID: 37163850

  • 17. EPANet-KD: Efficient progressive attention network for fine-grained provincial village classification via knowledge distillation.
    Zhang C; Liu C; Gong H; Teng J
    PLoS One; 2024; 19(2):e0298452. PubMed ID: 38359020

  • 18. Self-Distillation for Randomized Neural Networks.
    Hu M; Gao R; Suganthan PN
    IEEE Trans Neural Netw Learn Syst; 2023 Aug; PP (epub ahead of print). PubMed ID: 37585327

  • 19. Multi-view Teacher-Student Network.
    Tian Y; Sun S; Tang J
    Neural Netw; 2022 Feb; 146:69-84. PubMed ID: 34839092

  • 20. Knowledge Fusion Distillation: Improving Distillation with Multi-scale Attention Mechanisms.
    Li L; Su W; Liu F; He M; Liang X
    Neural Process Lett; 2023 Jan; 1-16. PubMed ID: 36619739
