

BIOMARKERS

Molecular Biopsy of Human Tumors: a resource for Precision Medicine

107 related articles for the article with PubMed ID 35998170

  • 1. Slimming Neural Networks Using Adaptive Connectivity Scores.
    Ravi Ganesh M; Blanchard D; Corso JJ; Sekeh SY
    IEEE Trans Neural Netw Learn Syst; 2024 Mar; 35(3):3794-3808. PubMed ID: 35998170

  • 2. Adding Before Pruning: Sparse Filter Fusion for Deep Convolutional Neural Networks via Auxiliary Attention.
    Tian G; Sun Y; Liu Y; Zeng X; Wang M; Liu Y; Zhang J; Chen J
    IEEE Trans Neural Netw Learn Syst; 2021 Sep; PP():. PubMed ID: 34487502

  • 3. Filter Pruning via Measuring Feature Map Information.
    Shao L; Zuo H; Zhang J; Xu Z; Yao J; Wang Z; Li H
    Sensors (Basel); 2021 Oct; 21(19):. PubMed ID: 34640921

  • 4. Hierarchical Threshold Pruning Based on Uniform Response Criterion.
    Qian Y; He Z; Wang Y; Wang B; Ling X; Gu Z; Wang H; Zeng S; Swaileh W
    IEEE Trans Neural Netw Learn Syst; 2024 Aug; 35(8):10869-10881. PubMed ID: 37071515

  • 5. Exploiting Sparse Self-Representation and Particle Swarm Optimization for CNN Compression.
    Niu S; Gao K; Ma P; Gao X; Zhao H; Dong J; Chen Y; Shen D
    IEEE Trans Neural Netw Learn Syst; 2023 Dec; 34(12):10266-10278. PubMed ID: 35439146

  • 6. Asymptotic Soft Filter Pruning for Deep Convolutional Neural Networks.
    He Y; Dong X; Kang G; Fu Y; Yan C; Yang Y
    IEEE Trans Cybern; 2020 Aug; 50(8):3594-3604. PubMed ID: 31478883

  • 7. Jump-GRS: a multi-phase approach to structured pruning of neural networks for neural decoding.
    Wu X; Lin DT; Chen R; Bhattacharyya SS
    J Neural Eng; 2023 Jul; 20(4):. PubMed ID: 37429288

  • 8. Model pruning based on filter similarity for edge device deployment.
    Wu T; Song C; Zeng P
    Front Neurorobot; 2023; 17():1132679. PubMed ID: 36937554

  • 9. Weak sub-network pruning for strong and efficient neural networks.
    Guo Q; Wu XJ; Kittler J; Feng Z
    Neural Netw; 2021 Dec; 144():614-626. PubMed ID: 34653719

  • 10. Optimizing the Deep Neural Networks by Layer-Wise Refined Pruning and the Acceleration on FPGA.
    Li H; Yue X; Wang Z; Chai Z; Wang W; Tomiyama H; Meng L
    Comput Intell Neurosci; 2022; 2022():8039281. PubMed ID: 35694575

  • 11. Evolutionary Multi-Objective One-Shot Filter Pruning for Designing Lightweight Convolutional Neural Network.
    Wu T; Shi J; Zhou D; Zheng X; Li N
    Sensors (Basel); 2021 Sep; 21(17):. PubMed ID: 34502792

  • 12. Discrimination-Aware Network Pruning for Deep Model Compression.
    Liu J; Zhuang B; Zhuang Z; Guo Y; Huang J; Zhu J; Tan M
    IEEE Trans Pattern Anal Mach Intell; 2022 Aug; 44(8):4035-4051. PubMed ID: 33755553

  • 13. Dynamic Image Difficulty-Aware DNN Pruning.
    Pentsos V; Spantidi O; Anagnostopoulos I
    Micromachines (Basel); 2023 Apr; 14(5):. PubMed ID: 37241531

  • 14. HRel: Filter pruning based on High Relevance between activation maps and class labels.
    Sarvani CH; Ghorai M; Dubey SR; Basha SHS
    Neural Netw; 2022 Mar; 147():186-197. PubMed ID: 35042156

  • 15. RED++ : Data-Free Pruning of Deep Neural Networks via Input Splitting and Output Merging.
    Yvinec E; Dapogny A; Cord M; Bailly K
    IEEE Trans Pattern Anal Mach Intell; 2023 Mar; 45(3):3664-3676. PubMed ID: 35653454

  • 16. Differential Evolution Based Layer-Wise Weight Pruning for Compressing Deep Neural Networks.
    Wu T; Li X; Zhou D; Li N; Shi J
    Sensors (Basel); 2021 Jan; 21(3):. PubMed ID: 33525527

  • 17. Where to Prune: Using LSTM to Guide Data-Dependent Soft Pruning.
    Ding G; Zhang S; Jia Z; Zhong J; Han J
    IEEE Trans Image Process; 2021; 30():293-304. PubMed ID: 33186105

  • 18. Adaptive Search-and-Training for Robust and Efficient Network Pruning.
    Lu X; Dong W; Li X; Wu J; Li L; Shi G
    IEEE Trans Pattern Anal Mach Intell; 2023 Aug; 45(8):9325-9338. PubMed ID: 37027639

  • 19. A Novel Deep-Learning Model Compression Based on Filter-Stripe Group Pruning and Its IoT Application.
    Zhao M; Tong X; Wu W; Wang Z; Zhou B; Huang X
    Sensors (Basel); 2022 Jul; 22(15):. PubMed ID: 35957176

  • 20. CRESPR: Modular sparsification of DNNs to improve pruning performance and model interpretability.
    Kang T; Ding W; Chen P
    Neural Netw; 2024 Apr; 172():106067. PubMed ID: 38199151