BIOMARKERS

Molecular Biopsy of Human Tumors

- a resource for Precision Medicine

135 related articles for the article with PubMed ID 35452384

  • 1. Exploring Structural Sparsity of Deep Networks Via Inverse Scale Spaces.
    Fu Y; Liu C; Li D; Zhong Z; Sun X; Zeng J; Yao Y
    IEEE Trans Pattern Anal Mach Intell; 2023 Feb; 45(2):1749-1765. PubMed ID: 35452384

  • 2. Weak sub-network pruning for strong and efficient neural networks.
    Guo Q; Wu XJ; Kittler J; Feng Z
    Neural Netw; 2021 Dec; 144():614-626. PubMed ID: 34653719

  • 3. Deep Sparse Learning for Automatic Modulation Classification Using Recurrent Neural Networks.
    Zang K; Wu W; Luo W
    Sensors (Basel); 2021 Sep; 21(19):. PubMed ID: 34640730

  • 4. Coarse-Grained Pruning of Neural Network Models Based on Blocky Sparse Structure.
    Huang L; Zeng J; Sun S; Wang W; Wang Y; Wang K
    Entropy (Basel); 2021 Aug; 23(8):. PubMed ID: 34441182

  • 5. When Sparse Neural Network Meets Label Noise Learning: A Multistage Learning Framework.
    Jiang R; Yan Y; Xue JH; Wang B; Wang H
    IEEE Trans Neural Netw Learn Syst; 2024 Feb; 35(2):2208-2222. PubMed ID: 35834450

  • 7. SSGD: Sparsity-Promoting Stochastic Gradient Descent Algorithm for Unbiased DNN Pruning.
    Lee CH; Fedorov I; Rao BD; Garudadri H
    Proc IEEE Int Conf Acoust Speech Signal Process; 2020 May; 2020():5410-5414. PubMed ID: 33162834

  • 7. Supervised Deep Sparse Coding Networks for Image Classification.
    Sun X; Nasrabadi NM; Tran TD
    IEEE Trans Image Process; 2019 Jul; ():. PubMed ID: 31331886

  • 8. EvoPruneDeepTL: An evolutionary pruning model for transfer learning based deep neural networks.
    Poyatos J; Molina D; Martinez AD; Del Ser J; Herrera F
    Neural Netw; 2023 Jan; 158():59-82. PubMed ID: 36442374

  • 9. Toward Compact ConvNets via Structure-Sparsity Regularized Filter Pruning.
    Lin S; Ji R; Li Y; Deng C; Li X
    IEEE Trans Neural Netw Learn Syst; 2020 Feb; 31(2):574-588. PubMed ID: 30990448

  • 10. A novel adaptive cubic quasi-Newton optimizer for deep learning based medical image analysis tasks, validated on detection of COVID-19 and segmentation for COVID-19 lung infection, liver tumor, and optic disc/cup.
    Liu Y; Zhang M; Zhong Z; Zeng X
    Med Phys; 2023 Mar; 50(3):1528-1538. PubMed ID: 36057788

  • 11. MLR-SNet: Transferable LR Schedules for Heterogeneous Tasks.
    Shu J; Zhu Y; Zhao Q; Meng D; Xu Z
    IEEE Trans Pattern Anal Mach Intell; 2023 Mar; 45(3):3505-3521. PubMed ID: 35724299

  • 12. Discrimination-Aware Network Pruning for Deep Model Compression.
    Liu J; Zhuang B; Zhuang Z; Guo Y; Huang J; Zhu J; Tan M
    IEEE Trans Pattern Anal Mach Intell; 2022 Aug; 44(8):4035-4051. PubMed ID: 33755553

  • 13. Building a Compact Convolutional Neural Network for Embedded Intelligent Sensor Systems Using Group Sparsity and Knowledge Distillation.
    Cho J; Lee M
    Sensors (Basel); 2019 Oct; 19(19):. PubMed ID: 31590266

  • 14. Scalable training of artificial neural networks with adaptive sparse connectivity inspired by network science.
    Mocanu DC; Mocanu E; Stone P; Nguyen PH; Gibescu M; Liotta A
    Nat Commun; 2018 Jun; 9(1):2383. PubMed ID: 29921910

  • 15. Learning Sparse Deep Neural Networks with a Spike-and-Slab Prior.
    Sun Y; Song Q; Liang F
    Stat Probab Lett; 2022 Jan; 180():. PubMed ID: 34744226

  • 16. GRIM: A General, Real-Time Deep Learning Inference Framework for Mobile Devices Based on Fine-Grained Structured Weight Sparsity.
    Niu W; Li Z; Ma X; Dong P; Zhou G; Qian X; Lin X; Wang Y; Ren B
    IEEE Trans Pattern Anal Mach Intell; 2022 Oct; 44(10):6224-6239. PubMed ID: 34133272

  • 17. Structure Learning for Deep Neural Networks Based on Multiobjective Optimization.
    Liu J; Gong M; Miao Q; Wang X; Li H
    IEEE Trans Neural Netw Learn Syst; 2018 Jun; 29(6):2450-2463. PubMed ID: 28489552

  • 18. Feature flow regularization: Improving structured sparsity in deep neural networks.
    Wu Y; Lan Y; Zhang L; Xiang Y
    Neural Netw; 2023 Apr; 161():598-613. PubMed ID: 36822145

  • 19. Random pruning: channel sparsity by expectation scaling factor.
    Sun C; Chen J; Li Y; Wang W; Ma T
    PeerJ Comput Sci; 2023; 9():e1564. PubMed ID: 37705629

  • 20. DNNGP, a deep neural network-based method for genomic prediction using multi-omics data in plants.
    Wang K; Abid MA; Rasheed A; Crossa J; Hearne S; Li H
    Mol Plant; 2023 Jan; 16(1):279-293. PubMed ID: 36366781
