BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

163 related articles for article (PubMed ID: 12638121)

  • 1. A local training-pruning approach for recurrent neural networks.
    Leung CS; Lam PM
    Int J Neural Syst; 2003 Feb; 13(1):25-38. PubMed ID: 12638121

  • 2. A local training and pruning approach for neural networks.
    Chang SJ; Leung CS; Wong KW; Sum J
    Int J Neural Syst; 2000 Dec; 10(6):425-38. PubMed ID: 11307857

  • 3. Dual extended Kalman filtering in recurrent neural networks.
    Leung CS; Chan LW
    Neural Netw; 2003 Mar; 16(2):223-39. PubMed ID: 12628608

  • 4. Multiobjective hybrid optimization and training of recurrent neural networks.
    Delgado M; Cuéllar MP; Pegalajar MC
    IEEE Trans Syst Man Cybern B Cybern; 2008 Apr; 38(2):381-403. PubMed ID: 18348922

  • 5. Recursive Bayesian recurrent neural networks for time-series modeling.
    Mirikitani DT; Nikolaev N
    IEEE Trans Neural Netw; 2010 Feb; 21(2):262-74. PubMed ID: 20040415

  • 6. Decision feedback recurrent neural equalization with fast convergence rate.
    Choi J; Bouchard M; Yeap TH
    IEEE Trans Neural Netw; 2005 May; 16(3):699-708. PubMed ID: 15940997

  • 7. Memory-efficient fully coupled filtering approach for observational model building.
    Mussa HY; Glen RC
    IEEE Trans Neural Netw; 2010 Apr; 21(4):680-6. PubMed ID: 20194056

  • 8. An adaptive Bayesian pruning for neural networks in a non-stationary environment.
    Sum J; Leung CS; Young GH; Chan LW; Kan WK
    Neural Comput; 1999 May; 11(4):965-76. PubMed ID: 10226192

  • 9. Extended Kalman Filter-Based Pruning Method for Recurrent Neural Networks.
    Sum J; Chan LW; Leung CS; Young GH
    Neural Comput; 1998 Jul; 10(6):1481-505. PubMed ID: 9698354

  • 10. Robust initialization of a Jordan network with recurrent constrained learning.
    Song Q
    IEEE Trans Neural Netw; 2011 Dec; 22(12):2460-73. PubMed ID: 21965202

  • 11. A new approach to knowledge-based design of recurrent neural networks.
    Kolman E; Margaliot M
    IEEE Trans Neural Netw; 2008 Aug; 19(8):1389-401. PubMed ID: 18701369

  • 12. Training winner-take-all simultaneous recurrent neural networks.
    Cai X; Prokhorov DV; Wunsch DC
    IEEE Trans Neural Netw; 2007 May; 18(3):674-84. PubMed ID: 17526335

  • 13. A recurrent neural network based on projection operator for extended general variational inequalities.
    Liu Q; Cao J
    IEEE Trans Syst Man Cybern B Cybern; 2010 Jun; 40(3):928-38. PubMed ID: 19933009

  • 14. Improved delay-dependent robust stability criteria for recurrent neural networks with time-varying delays.
    Liu PL
    ISA Trans; 2013 Jan; 52(1):30-5. PubMed ID: 22959741

  • 15. Pruning artificial neural networks using neural complexity measures.
    Jorgensen TD; Haynes BP; Norlund CC
    Int J Neural Syst; 2008 Oct; 18(5):389-403. PubMed ID: 18991362

  • 16. Training simultaneous recurrent neural network with resilient propagation for static optimization.
    Serpen G; Corra J
    Int J Neural Syst; 2002; 12(3-4):203-18. PubMed ID: 12370962

  • 17. An alternative recurrent neural network for solving variational inequalities and related optimization problems.
    Hu X; Zhang B
    IEEE Trans Syst Man Cybern B Cybern; 2009 Dec; 39(6):1640-5. PubMed ID: 19661003

  • 18. Beyond feedforward models trained by backpropagation: a practical training tool for a more efficient universal approximator.
    Ilin R; Kozma R; Werbos PJ
    IEEE Trans Neural Netw; 2008 Jun; 19(6):929-37. PubMed ID: 18541494

  • 19. Discrete-time reduced order neural observers for uncertain nonlinear systems.
    Alanis AY; Sanchez EN; Ricalde LJ
    Int J Neural Syst; 2010 Feb; 20(1):29-38. PubMed ID: 20180251

  • 20. [Extended Kalman filtering trained neural networks and multicomponent analysis of amino acids].
    Li Z; Matsumoto S; Yu B; Sakai M; Li ML
    Guang Pu Xue Yu Guang Pu Fen Xi; 1997 Jun; 17(3):123-6. PubMed ID: 15810234
