


126 related articles for article (PubMed ID: 34280607)

  • 1. Recurrent neural network pruning using dynamical systems and iterative fine-tuning.
    Chatzikonstantinou C; Konstantinidis D; Dimitropoulos K; Daras P
    Neural Netw; 2021 Nov; 143():475-488. PubMed ID: 34280607

  • 2. Stage-Wise Magnitude-Based Pruning for Recurrent Neural Networks.
    Li G; Yang P; Qian C; Hong R; Tang K
    IEEE Trans Neural Netw Learn Syst; 2024 Feb; 35(2):1666-1680. PubMed ID: 35759588

  • 3. Evolutionary Multi-Objective One-Shot Filter Pruning for Designing Lightweight Convolutional Neural Network.
    Wu T; Shi J; Zhou D; Zheng X; Li N
    Sensors (Basel); 2021 Sep; 21(17):. PubMed ID: 34502792

  • 4. Redundancy-Aware Pruning of Convolutional Neural Networks.
    Xie G
    Neural Comput; 2020 Dec; 32(12):2532-2556. PubMed ID: 33080161

  • 5. Weak sub-network pruning for strong and efficient neural networks.
    Guo Q; Wu XJ; Kittler J; Feng Z
    Neural Netw; 2021 Dec; 144():614-626. PubMed ID: 34653719

  • 6. Fast Filter Pruning via Coarse-to-Fine Neural Architecture Search and Contrastive Knowledge Transfer.
    Lee S; Song BC
    IEEE Trans Neural Netw Learn Syst; 2024 Jul; 35(7):9674-9685. PubMed ID: 37021856

  • 7. Small Network for Lightweight Task in Computer Vision: A Pruning Method Based on Feature Representation.
    Ge Y; Lu S; Gao F
    Comput Intell Neurosci; 2021; 2021():5531023. PubMed ID: 33959156

  • 8. Dynamical Channel Pruning by Conditional Accuracy Change for Deep Neural Networks.
    Chen Z; Xu TB; Du C; Liu CL; He H
    IEEE Trans Neural Netw Learn Syst; 2021 Feb; 32(2):799-813. PubMed ID: 32275616

  • 9. CNNPruner: Pruning Convolutional Neural Networks with Visual Analytics.
    Li G; Wang J; Shen HW; Chen K; Shan G; Lu Z
    IEEE Trans Vis Comput Graph; 2021 Feb; 27(2):1364-1373. PubMed ID: 33048744

  • 10. Unsupervised Adaptive Weight Pruning for Energy-Efficient Neuromorphic Systems.
    Guo W; Fouda ME; Yantir HE; Eltawil AM; Salama KN
    Front Neurosci; 2020; 14():598876. PubMed ID: 33281549

  • 11. Coarse-Grained Pruning of Neural Network Models Based on Blocky Sparse Structure.
    Huang L; Zeng J; Sun S; Wang W; Wang Y; Wang K
    Entropy (Basel); 2021 Aug; 23(8):. PubMed ID: 34441182

  • 12. A transfer learning with structured filter pruning approach for improved breast cancer classification on point-of-care devices.
    Choudhary T; Mishra V; Goswami A; Sarangapani J
    Comput Biol Med; 2021 Jul; 134():104432. PubMed ID: 33964737

  • 13. Pruning artificial neural networks using neural complexity measures.
    Jorgensen TD; Haynes BP; Norlund CC
    Int J Neural Syst; 2008 Oct; 18(5):389-403. PubMed ID: 18991362

  • 14. Pruning recurrent neural networks replicates adolescent changes in working memory and reinforcement learning.
    Averbeck BB
    Proc Natl Acad Sci U S A; 2022 May; 119(22):e2121331119. PubMed ID: 35622896

  • 15. Memory-Based Pruning of Deep Neural Networks for IoT Devices Applied to Flood Detection.
    Fernandes Junior FE; Nonato LG; Ranieri CM; Ueyama J
    Sensors (Basel); 2021 Nov; 21(22):. PubMed ID: 34833583

  • 16. DMPP: Differentiable multi-pruner and predictor for neural network pruning.
    Li J; Zhao B; Liu D
    Neural Netw; 2022 Mar; 147():103-112. PubMed ID: 34998270

  • 17. LOss-Based SensiTivity rEgulaRization: Towards deep sparse neural networks.
    Tartaglione E; Bragagnolo A; Fiandrotti A; Grangetto M
    Neural Netw; 2022 Feb; 146():230-237. PubMed ID: 34906759

  • 18. Asymptotic Soft Filter Pruning for Deep Convolutional Neural Networks.
    He Y; Dong X; Kang G; Fu Y; Yan C; Yang Y
    IEEE Trans Cybern; 2020 Aug; 50(8):3594-3604. PubMed ID: 31478883

  • 19. Differential Evolution Based Layer-Wise Weight Pruning for Compressing Deep Neural Networks.
    Wu T; Li X; Zhou D; Li N; Shi J
    Sensors (Basel); 2021 Jan; 21(3):. PubMed ID: 33525527

  • 20. Pruning deep neural networks generates a sparse, bio-inspired nonlinear controller for insect flight.
    Zahn O; Bustamante J; Switzer C; Daniel TL; Kutz JN
    PLoS Comput Biol; 2022 Sep; 18(9):e1010512. PubMed ID: 36166481
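Several entries above (notably 2 and 19) concern magnitude-based weight pruning, which removes the smallest-magnitude weights from a trained network. A minimal, generic NumPy sketch of that idea follows; it is an illustration of the general technique, not the specific algorithm of any listed paper, and the function name and threshold choice are this sketch's own.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the smallest-magnitude fraction `sparsity` of `weights`.

    Generic magnitude-based pruning sketch: rank weights by absolute
    value and drop the bottom `sparsity` fraction.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)              # number of weights to remove
    if k == 0:
        return weights.copy()
    # k-th smallest magnitude serves as the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold         # keep strictly larger magnitudes
    return weights * mask

w = np.array([[0.5, -0.1], [0.02, -0.8]])
pruned = magnitude_prune(w, 0.5)               # zeros the two smallest: -0.1 and 0.02
```

In practice (as the stage-wise and iterative fine-tuning papers above suggest), such pruning is typically applied gradually and interleaved with retraining rather than in a single shot.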
