BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

134 related articles for PubMed ID 39186414

  • 1. Relation Knowledge Distillation by Auxiliary Learning for Object Detection.
    Wang H; Jia T; Wang Q; Zuo W
    IEEE Trans Image Process; 2024; 33:4796-4810. PubMed ID: 39186414

  • 2. Inferior and Coordinate Distillation for Object Detectors.
    Zhang Y; Li Y; Pan Z
    Sensors (Basel); 2022 Jul; 22(15). PubMed ID: 35957276

  • 3. Structured Knowledge Distillation for Accurate and Efficient Object Detection.
    Zhang L; Ma K
    IEEE Trans Pattern Anal Mach Intell; 2023 Dec; 45(12):15706-15724. PubMed ID: 37527292

  • 4. A General Dynamic Knowledge Distillation Method for Visual Analytics.
    Tu Z; Liu X; Xiao X
    IEEE Trans Image Process; 2022 Oct; PP (ahead of print). PubMed ID: 36227819

  • 5. Unsupervised Anomaly Detection with Distillated Teacher-Student Network Ensemble.
    Xiao Q; Wang J; Lin Y; Gongsa W; Hu G; Li M; Wang F
    Entropy (Basel); 2021 Feb; 23(2). PubMed ID: 33561954

  • 6. CrabNet: Fully Task-Specific Feature Learning for One-Stage Object Detection.
    Wang H; Wang Q; Zhang H; Hu Q; Zuo W
    IEEE Trans Image Process; 2022; 31:2962-2974. PubMed ID: 35353700

  • 7. Multilayer Semantic Features Adaptive Distillation for Object Detectors.
    Zhang Z; Liu J; Chen Y; Mei W; Huang F; Chen L
    Sensors (Basel); 2023 Sep; 23(17). PubMed ID: 37688070

  • 8. Adversarial Distillation for Learning with Privileged Provisions.
    Wang X; Zhang R; Sun Y; Qi J
    IEEE Trans Pattern Anal Mach Intell; 2021 Mar; 43(3):786-797. PubMed ID: 31545712

  • 9. Teacher-student complementary sample contrastive distillation.
    Bao Z; Huang Z; Gou J; Du L; Liu K; Zhou J; Chen Y
    Neural Netw; 2024 Feb; 170:176-189. PubMed ID: 37989039

  • 10. Leveraging different learning styles for improved knowledge distillation in biomedical imaging.
    Niyaz U; Sambyal AS; Bathula DR
    Comput Biol Med; 2024 Jan; 168:107764. PubMed ID: 38056210

  • 11. Learning lightweight tea detector with reconstructed feature and dual distillation.
    Zheng Z; Zuo G; Zhang W; Zhang C; Zhang J; Rao Y; Jiang Z
    Sci Rep; 2024 Oct; 14(1):23669. PubMed ID: 39390063

  • 12. Cosine similarity-guided knowledge distillation for robust object detectors.
    Park S; Kang D; Paik J
    Sci Rep; 2024 Aug; 14(1):18888. PubMed ID: 39143179

  • 13. DCCD: Reducing Neural Network Redundancy via Distillation.
    Liu Y; Chen J; Liu Y
    IEEE Trans Neural Netw Learn Syst; 2024 Jul; 35(7):10006-10017. PubMed ID: 37022254

  • 14. A 3DCNN-Based Knowledge Distillation Framework for Human Activity Recognition.
    Ullah H; Munir A
    J Imaging; 2023 Apr; 9(4). PubMed ID: 37103233

  • 15. Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution.
    Yang C; An Z; Cai L; Xu Y
    IEEE Trans Neural Netw Learn Syst; 2024 Feb; 35(2):2094-2108. PubMed ID: 35820013

  • 16. Memory-Replay Knowledge Distillation.
    Wang J; Zhang P; Li Y
    Sensors (Basel); 2021 Apr; 21(8). PubMed ID: 33921068

  • 17. Localization Distillation for Object Detection.
    Zheng Z; Ye R; Hou Q; Ren D; Wang P; Zuo W; Cheng MM
    IEEE Trans Pattern Anal Mach Intell; 2023 Aug; 45(8):10070-10083. PubMed ID: 37027640

  • 18. Resolution-Aware Knowledge Distillation for Efficient Inference.
    Feng Z; Lai J; Xie X
    IEEE Trans Image Process; 2021; 30:6985-6996. PubMed ID: 34347598

  • 19. Adversarial learning-based multi-level dense-transmission knowledge distillation for AP-ROP detection.
    Xie H; Liu Y; Lei H; Song T; Yue G; Du Y; Wang T; Zhang G; Lei B
    Med Image Anal; 2023 Feb; 84:102725. PubMed ID: 36527770

  • 20. Pixel Distillation: Cost-Flexible Distillation Across Image Sizes and Heterogeneous Networks.
    Guo G; Zhang D; Han L; Liu N; Cheng MM; Han J
    IEEE Trans Pattern Anal Mach Intell; 2024 Dec; 46(12):9536-9550. PubMed ID: 38949946
