These tools will no longer be maintained as of December 31, 2024. The archived website can be found here. The PubMed4Hh GitHub repository can be found here. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors: a resource for Precision Medicine

273 related articles for article (PubMed ID: 31855748)

  • 21. Small Network for Lightweight Task in Computer Vision: A Pruning Method Based on Feature Representation.
    Ge Y; Lu S; Gao F
    Comput Intell Neurosci; 2021; 2021():5531023. PubMed ID: 33959156

  • 22. Chinese Clinical Named Entity Recognition Using Residual Dilated Convolutional Neural Network With Conditional Random Field.
    Qiu J; Zhou Y; Wang Q; Ruan T; Gao J
    IEEE Trans Nanobioscience; 2019 Jul; 18(3):306-315. PubMed ID: 30946674

  • 23. SeReNe: Sensitivity-Based Regularization of Neurons for Structured Sparsity in Neural Networks.
    Tartaglione E; Bragagnolo A; Odierna F; Fiandrotti A; Grangetto M
    IEEE Trans Neural Netw Learn Syst; 2022 Dec; 33(12):7237-7250. PubMed ID: 34129503

  • 24. A local training-pruning approach for recurrent neural networks.
    Leung CS; Lam PM
    Int J Neural Syst; 2003 Feb; 13(1):25-38. PubMed ID: 12638121

  • 25. Towards performance-maximizing neural network pruning via global channel attention.
    Wang Y; Guo S; Guo J; Zhang J; Zhang W; Yan C; Zhang Y
    Neural Netw; 2024 Mar; 171():104-113. PubMed ID: 38091754

  • 26. Sunflower seeds classification based on sparse convolutional neural networks in multi-objective scene.
    Jin X; Zhao Y; Wu H; Sun T
    Sci Rep; 2022 Nov; 12(1):19890. PubMed ID: 36400872

  • 27. CNNPruner: Pruning Convolutional Neural Networks with Visual Analytics.
    Li G; Wang J; Shen HW; Chen K; Shan G; Lu Z
    IEEE Trans Vis Comput Graph; 2021 Feb; 27(2):1364-1373. PubMed ID: 33048744

  • 28. Applications of Recurrent Neural Networks in Environmental Factor Forecasting: A Review.
    Chen Y; Cheng Q; Cheng Y; Yang H; Yu H
    Neural Comput; 2018 Nov; 30(11):2855-2881. PubMed ID: 30216144

  • 29. A Hardware-Friendly High-Precision CNN Pruning Method and Its FPGA Implementation.
    Sui X; Lv Q; Zhi L; Zhu B; Yang Y; Zhang Y; Tan Z
    Sensors (Basel); 2023 Jan; 23(2):. PubMed ID: 36679624

  • 30. ACSL: Adaptive correlation-driven sparsity learning for deep neural network compression.
    He W; Wu M; Lam SK
    Neural Netw; 2021 Dec; 144():465-477. PubMed ID: 34600219

  • 31. Data-Independent Structured Pruning of Neural Networks via Coresets.
    Mussay B; Feldman D; Zhou S; Braverman V; Osadchy M
    IEEE Trans Neural Netw Learn Syst; 2022 Dec; 33(12):7829-7841. PubMed ID: 34166205

  • 32. Dynamical Channel Pruning by Conditional Accuracy Change for Deep Neural Networks.
    Chen Z; Xu TB; Du C; Liu CL; He H
    IEEE Trans Neural Netw Learn Syst; 2021 Feb; 32(2):799-813. PubMed ID: 32275616

  • 33. Adding Before Pruning: Sparse Filter Fusion for Deep Convolutional Neural Networks via Auxiliary Attention.
    Tian G; Sun Y; Liu Y; Zeng X; Wang M; Liu Y; Zhang J; Chen J
    IEEE Trans Neural Netw Learn Syst; 2021 Sep; PP():. PubMed ID: 34487502

  • 34. LAP: Latency-aware automated pruning with dynamic-based filter selection.
    Chen Z; Liu C; Yang W; Li K; Li K
    Neural Netw; 2022 Aug; 152():407-418. PubMed ID: 35609502

  • 35. Layer adaptive node selection in Bayesian neural networks: Statistical guarantees and implementation details.
    Jantre S; Bhattacharya S; Maiti T
    Neural Netw; 2023 Oct; 167():309-330. PubMed ID: 37666188

  • 36. A pruning feedforward small-world neural network based on Katz centrality for nonlinear system modeling.
    Li W; Chu M; Qiao J
    Neural Netw; 2020 Oct; 130():269-285. PubMed ID: 32711349

  • 37. Toward Compact ConvNets via Structure-Sparsity Regularized Filter Pruning.
    Lin S; Ji R; Li Y; Deng C; Li X
    IEEE Trans Neural Netw Learn Syst; 2020 Feb; 31(2):574-588. PubMed ID: 30990448

  • 38. Neural Tree Indexers for Text Understanding.
    Munkhdalai T; Yu H
    Proc Conf Assoc Comput Linguist Meet; 2017 Apr; 1():11-21. PubMed ID: 29081577

  • 39. Using Artificial Neural Network Condensation to Facilitate Adaptation of Machine Learning in Medical Settings by Reducing Computational Burden: Model Design and Evaluation Study.
    Liu D; Zheng M; Sepulveda NA
    JMIR Form Res; 2021 Dec; 5(12):e20767. PubMed ID: 34889747

  • 40. EDropout: Energy-Based Dropout and Pruning of Deep Neural Networks.
    Salehinejad H; Valaee S
    IEEE Trans Neural Netw Learn Syst; 2022 Oct; 33(10):5279-5292. PubMed ID: 33830931
