These tools are no longer maintained as of December 31, 2024. An archived version of the website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service with any questions.


BIOMARKERS

Molecular Biopsy of Human Tumors: a resource for Precision Medicine

111 related articles for article (PubMed ID: 39059046)

  • 1. Reweighted Alternating Direction Method of Multipliers for DNN weight pruning.
    Yuan M; Du L; Jiang F; Bai J; Chen G
    Neural Netw; 2024 Nov; 179():106534. PubMed ID: 39059046

  • 2. StructADMM: Achieving Ultrahigh Efficiency in Structured Pruning for DNNs.
    Zhang T; Ye S; Feng X; Ma X; Zhang K; Li Z; Tang J; Liu S; Lin X; Liu Y; Fardad M; Wang Y
    IEEE Trans Neural Netw Learn Syst; 2022 May; 33(5):2259-2273. PubMed ID: 33587706

  • 3. HRel: Filter pruning based on High Relevance between activation maps and class labels.
    Sarvani CH; Ghorai M; Dubey SR; Basha SHS
    Neural Netw; 2022 Mar; 147():186-197. PubMed ID: 35042156

  • 4. Toward Compact ConvNets via Structure-Sparsity Regularized Filter Pruning.
    Lin S; Ji R; Li Y; Deng C; Li X
    IEEE Trans Neural Netw Learn Syst; 2020 Feb; 31(2):574-588. PubMed ID: 30990448

  • 5. Feature flow regularization: Improving structured sparsity in deep neural networks.
    Wu Y; Lan Y; Zhang L; Xiang Y
    Neural Netw; 2023 Apr; 161():598-613. PubMed ID: 36822145

  • 6. Non-Structured DNN Weight Pruning-Is It Beneficial in Any Platform?
    Ma X; Lin S; Ye S; He Z; Zhang L; Yuan G; Tan SH; Li Z; Fan D; Qian X; Lin X; Ma K; Wang Y
    IEEE Trans Neural Netw Learn Syst; 2022 Sep; 33(9):4930-4944. PubMed ID: 33735086

  • 7. Redundant feature pruning for accelerated inference in deep neural networks.
    Ayinde BO; Inanc T; Zurada JM
    Neural Netw; 2019 Oct; 118():148-158. PubMed ID: 31279285

  • 8. Weak sub-network pruning for strong and efficient neural networks.
    Guo Q; Wu XJ; Kittler J; Feng Z
    Neural Netw; 2021 Dec; 144():614-626. PubMed ID: 34653719

  • 9. Dynamical Channel Pruning by Conditional Accuracy Change for Deep Neural Networks.
    Chen Z; Xu TB; Du C; Liu CL; He H
    IEEE Trans Neural Netw Learn Syst; 2021 Feb; 32(2):799-813. PubMed ID: 32275616

  • 10. CRESPR: Modular sparsification of DNNs to improve pruning performance and model interpretability.
    Kang T; Ding W; Chen P
    Neural Netw; 2024 Apr; 172():106067. PubMed ID: 38199151

  • 11. Jump-GRS: a multi-phase approach to structured pruning of neural networks for neural decoding.
    Wu X; Lin DT; Chen R; Bhattacharyya SS
    J Neural Eng; 2023 Jul; 20(4):. PubMed ID: 37429288

  • 12. DeepCompNet: A Novel Neural Net Model Compression Architecture.
    Mary Shanthi Rani M; Chitra P; Lakshmanan S; Kalpana Devi M; Sangeetha R; Nithya S
    Comput Intell Neurosci; 2022; 2022():2213273. PubMed ID: 35242176

  • 13. LAP: Latency-aware automated pruning with dynamic-based filter selection.
    Chen Z; Liu C; Yang W; Li K; Li K
    Neural Netw; 2022 Aug; 152():407-418. PubMed ID: 35609502

  • 14. Dynamical Conventional Neural Network Channel Pruning by Genetic Wavelet Channel Search for Image Classification.
    Chen L; Gong S; Shi X; Shang M
    Front Comput Neurosci; 2021; 15():760554. PubMed ID: 34776916

  • 15. Perturbation of deep autoencoder weights for model compression and classification of tabular data.
    Abrar S; Samad MD
    Neural Netw; 2022 Dec; 156():160-169. PubMed ID: 36270199

  • 16. GRIM: A General, Real-Time Deep Learning Inference Framework for Mobile Devices Based on Fine-Grained Structured Weight Sparsity.
    Niu W; Li Z; Ma X; Dong P; Zhou G; Qian X; Lin X; Wang Y; Ren B
    IEEE Trans Pattern Anal Mach Intell; 2022 Oct; 44(10):6224-6239. PubMed ID: 34133272

  • 17. SSGCNet: A Sparse Spectra Graph Convolutional Network for Epileptic EEG Signal Classification.
    Wang J; Gao R; Zheng H; Zhu H; Shi CR
    IEEE Trans Neural Netw Learn Syst; 2024 Sep; 35(9):12157-12171. PubMed ID: 37030729

  • 18. SSGD: Sparsity-Promoting Stochastic Gradient Descent Algorithm for Unbiased DNN Pruning.
    Lee CH; Fedorov I; Rao BD; Garudadri H
    Proc IEEE Int Conf Acoust Speech Signal Process; 2020 May; 2020():5410-5414. PubMed ID: 33162834

  • 19. Heuristic Method for Minimizing Model Size of CNN by Combining Multiple Pruning Techniques.
    Tian D; Yamagiwa S; Wada K
    Sensors (Basel); 2022 Aug; 22(15):. PubMed ID: 35957431

  • 20. On the compression of neural networks using ℓ0-norm regularization and weight pruning.
    de Resende Oliveira FD; Batista ELO; Seara R
    Neural Netw; 2024 Mar; 171():343-352. PubMed ID: 38113719
