These tools will no longer be maintained as of December 31, 2024. Archived website can be found here. PubMed4Hh GitHub repository can be found here. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors

- a resource for Precision Medicine

184 related articles for article (PubMed ID: 18255656)

  • 1. An iterative pruning algorithm for feedforward neural networks.
    Castellano G; Fanelli AM; Pelillo M
    IEEE Trans Neural Netw; 1997; 8(3):519-31. PubMed ID: 18255656

  • 2. A formal selection and pruning algorithm for feedforward artificial neural network optimization.
    Ponnapalli PS; Ho KC; Thomson M
    IEEE Trans Neural Netw; 1999; 10(4):964-8. PubMed ID: 18252597

  • 3. Inverting feedforward neural networks using linear and nonlinear programming.
    Lu BL; Kita H; Nishikawa Y
    IEEE Trans Neural Netw; 1999; 10(6):1271-90. PubMed ID: 18252630

  • 4. A local linearized least squares algorithm for training feedforward neural networks.
    Stan O; Kamen E
    IEEE Trans Neural Netw; 2000; 11(2):487-95. PubMed ID: 18249777

  • 5. A pruning feedforward small-world neural network based on Katz centrality for nonlinear system modeling.
    Li W; Chu M; Qiao J
    Neural Netw; 2020 Oct; 130():269-285. PubMed ID: 32711349

  • 6. A local training and pruning approach for neural networks.
    Chang SJ; Leung CS; Wong KW; Sum J
    Int J Neural Syst; 2000 Dec; 10(6):425-38. PubMed ID: 11307857

  • 7. An improvement of extreme learning machine for compact single-hidden-layer feedforward neural networks.
    Huynh HT; Won Y; Kim JJ
    Int J Neural Syst; 2008 Oct; 18(5):433-41. PubMed ID: 18991365

  • 8. A penalty-function approach for pruning feedforward neural networks.
    Setiono R
    Neural Comput; 1997 Jan; 9(1):185-204. PubMed ID: 9117898

  • 9. A Novel Pruning Algorithm for Smoothing Feedforward Neural Networks Based on Group Lasso Method.
    Wang J; Xu C; Yang X; Zurada JM
    IEEE Trans Neural Netw Learn Syst; 2018 May; 29(5):2012-2024. PubMed ID: 28961129

  • 10. Pruning recurrent neural networks for improved generalization performance.
    Giles CL; Omlin CW
    IEEE Trans Neural Netw; 1994; 5(5):848-51. PubMed ID: 18267860

  • 11. Simplified neural networks for solving linear least squares and total least squares problems in real time.
    Cichocki A; Unbehauen R
    IEEE Trans Neural Netw; 1994; 5(6):910-23. PubMed ID: 18267865

  • 12. Radical pruning: a method to construct skeleton radial basis function networks.
    Augusteijn MF; Shaw KA
    Int J Neural Syst; 2000 Apr; 10(2):143-54. PubMed ID: 10939346

  • 13. A local training-pruning approach for recurrent neural networks.
    Leung CS; Lam PM
    Int J Neural Syst; 2003 Feb; 13(1):25-38. PubMed ID: 12638121

  • 14. A new formulation for feedforward neural networks.
    Razavi S; Tolson BA
    IEEE Trans Neural Netw; 2011 Oct; 22(10):1588-98. PubMed ID: 21859600

  • 15. Pruning artificial neural networks using neural complexity measures.
    Jorgensen TD; Haynes BP; Norlund CC
    Int J Neural Syst; 2008 Oct; 18(5):389-403. PubMed ID: 18991362

  • 16. A growing and pruning sequential learning algorithm of hyper basis function neural network for function approximation.
    Vuković N; Miljković Z
    Neural Netw; 2013 Oct; 46():210-26. PubMed ID: 23811384

  • 17. Advanced neural-network training algorithm with reduced complexity based on Jacobian deficiency.
    Zhou G; Si J
    IEEE Trans Neural Netw; 1998; 9(3):448-53. PubMed ID: 18252468

  • 18. A pruning method for the recursive least squared algorithm.
    Leung CS; Wong KW; Sum PF; Chan LW
    Neural Netw; 2001 Mar; 14(2):147-74. PubMed ID: 11316231

  • 19. Training feedforward networks with the Marquardt algorithm.
    Hagan MT; Menhaj MB
    IEEE Trans Neural Netw; 1994; 5(6):989-93. PubMed ID: 18267874

  • 20. Extended Kalman Filter-Based Pruning Method for Recurrent Neural Networks.
    Sum J; Chan LW; Leung CS; Young GH
    Neural Comput; 1998 Jul; 10(6):1481-505. PubMed ID: 9698354
