These tools are no longer maintained as of December 31, 2024. The archived website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors: a resource for Precision Medicine

193 related articles for article (PubMed ID: 35957276)

  • 21. Attention and feature transfer based knowledge distillation.
    Yang G; Yu S; Sheng Y; Yang H
    Sci Rep; 2023 Oct; 13(1):18369. PubMed ID: 37884556

  • 22. Multiscale knowledge distillation with attention based fusion for robust human activity recognition.
    Yuan Z; Yang Z; Ning H; Tang X
    Sci Rep; 2024 May; 14(1):12411. PubMed ID: 38816446

  • 23. Single-Shot Object Detection via Feature Enhancement and Channel Attention.
    Li Y; Wang L; Wang Z
    Sensors (Basel); 2022 Sep; 22(18):. PubMed ID: 36146207

  • 24. Complementary label learning based on knowledge distillation.
    Ying P; Li Z; Sun R; Xu X
    Math Biosci Eng; 2023 Sep; 20(10):17905-17918. PubMed ID: 38052542

  • 25. Adversarial learning-based multi-level dense-transmission knowledge distillation for AP-ROP detection.
    Xie H; Liu Y; Lei H; Song T; Yue G; Du Y; Wang T; Zhang G; Lei B
    Med Image Anal; 2023 Feb; 84():102725. PubMed ID: 36527770

  • 26. FCOS: A Simple and Strong Anchor-Free Object Detector.
    Tian Z; Shen C; Chen H; He T
    IEEE Trans Pattern Anal Mach Intell; 2022 Apr; 44(4):1922-1933. PubMed ID: 33074804

  • 27. Knowledge Fusion Distillation: Improving Distillation with Multi-scale Attention Mechanisms.
    Li L; Su W; Liu F; He M; Liang X
    Neural Process Lett; 2023 Jan; ():1-16. PubMed ID: 36619739

  • 28. Optimisation of Deep Learning Small-Object Detectors with Novel Explainable Verification.
    Mohamed E; Sirlantzis K; Howells G; Hoque S
    Sensors (Basel); 2022 Jul; 22(15):. PubMed ID: 35898097

  • 29. DCCD: Reducing Neural Network Redundancy via Distillation.
    Liu Y; Chen J; Liu Y
    IEEE Trans Neural Netw Learn Syst; 2024 Jul; 35(7):10006-10017. PubMed ID: 37022254

  • 30. Multiple-in-Single-Out Object Detector Leveraging Spiking Neural Membrane Systems and Multiple Transformers.
    Jiang Z; Sun S; Peng H; Liu Z; Wang J
    Int J Neural Syst; 2024 Jul; 34(7):2450035. PubMed ID: 38616293

  • 31. Interactive Regression and Classification for Dense Object Detector.
    Zhou L; Chang H; Ma B; Shan S
    IEEE Trans Image Process; 2022; 31():3684-3696. PubMed ID: 35580106

  • 32. A General Dynamic Knowledge Distillation Method for Visual Analytics.
    Tu Z; Liu X; Xiao X
    IEEE Trans Image Process; 2022 Oct; PP():. PubMed ID: 36227819

  • 33. Knowledge Distillation in Histology Landscape by Multi-Layer Features Supervision.
    Javed S; Mahmood A; Qaiser T; Werghi N
    IEEE J Biomed Health Inform; 2023 Jan; PP():. PubMed ID: 37021915

  • 34. Research on Object Detection of PCB Assembly Scene Based on Effective Receptive Field Anchor Allocation.
    Li J; Li W; Chen Y; Gu J
    Comput Intell Neurosci; 2022; 2022():7536711. PubMed ID: 35198023

  • 35. Classification of diabetic retinopathy using unlabeled data and knowledge distillation.
    Abbasi S; Hajabdollahi M; Khadivi P; Karimi N; Roshandel R; Shirani S; Samavi S
    Artif Intell Med; 2021 Nov; 121():102176. PubMed ID: 34763798

  • 36. Mask-Refined R-CNN: A Network for Refining Object Details in Instance Segmentation.
    Zhang Y; Chu J; Leng L; Miao J
    Sensors (Basel); 2020 Feb; 20(4):. PubMed ID: 32069927

  • 37. Joint learning method with teacher-student knowledge distillation for on-device breast cancer image classification.
    Sepahvand M; Abdali-Mohammadi F
    Comput Biol Med; 2023 Mar; 155():106476. PubMed ID: 36841060

  • 38. A 3DCNN-Based Knowledge Distillation Framework for Human Activity Recognition.
    Ullah H; Munir A
    J Imaging; 2023 Apr; 9(4):. PubMed ID: 37103233

  • 39. Knowledge Transfer via Decomposing Essential Information in Convolutional Neural Networks.
    Lee S; Song BC
    IEEE Trans Neural Netw Learn Syst; 2022 Jan; 33(1):366-377. PubMed ID: 33048771

  • 40. Multi-teacher knowledge distillation based on joint Guidance of Probe and Adaptive Corrector.
    Shang R; Li W; Zhu S; Jiao L; Li Y
    Neural Netw; 2023 Jul; 164():345-356. PubMed ID: 37163850
