BIOMARKERS

Molecular Biopsy of Human Tumors

- a resource for Precision Medicine

91 related articles for article (PubMed ID: 18282861)

  • 21. Analysis of boundedness and convergence of online gradient method for two-layer feedforward neural networks.
    Xu L; Chen J; Huang D; Lu J; Fang L
    IEEE Trans Neural Netw Learn Syst; 2013 Aug; 24(8):1327-38. PubMed ID: 24808571

  • 22. New nonleast-squares neural network learning algorithms for hypothesis testing.
    Pados DA; Papantoni-Kazakos P
    IEEE Trans Neural Netw; 1995; 6(3):596-609. PubMed ID: 18263346

  • 23. Global convergence of online BP training with dynamic learning rate.
    Zhang R; Xu ZB; Huang GB; Wang D
    IEEE Trans Neural Netw Learn Syst; 2012 Feb; 23(2):330-41. PubMed ID: 24808511

  • 24. A fast feedforward training algorithm using a modified form of the standard backpropagation algorithm.
    Abid S; Fnaiech F; Najim M
    IEEE Trans Neural Netw; 2001; 12(2):424-30. PubMed ID: 18244397

  • 25. Stability analysis of a three-term backpropagation algorithm.
    Zweiri YH; Seneviratne LD; Althoefer K
    Neural Netw; 2005 Dec; 18(10):1341-7. PubMed ID: 16135404

  • 26. A generalized learning paradigm exploiting the structure of feedforward neural networks.
    Parisi R; Di Claudio ED; Orlandi G; Rao BD
    IEEE Trans Neural Netw; 1996; 7(6):1450-60. PubMed ID: 18263538

  • 27. Recruitment learning of Boolean functions in sparse random networks.
    Hogan JM; Diederich J
    Int J Neural Syst; 2001 Dec; 11(6):537-59. PubMed ID: 11852438

  • 28. Hierarchically clustered adaptive quantization CMAC and its learning convergence.
    Teddy SD; Lai EM; Quek C
    IEEE Trans Neural Netw; 2007 Nov; 18(6):1658-82. PubMed ID: 18051184

  • 29. Robust adaptive gradient-descent training algorithm for recurrent neural networks in discrete time domain.
    Song Q; Wu Y; Soh YC
    IEEE Trans Neural Netw; 2008 Nov; 19(11):1841-53. PubMed ID: 18990640

  • 30. An improvement of extreme learning machine for compact single-hidden-layer feedforward neural networks.
    Huynh HT; Won Y; Kim JJ
    Int J Neural Syst; 2008 Oct; 18(5):433-41. PubMed ID: 18991365

  • 31. The annealing robust backpropagation (ARBP) learning algorithm.
    Chuang CC; Su SF; Hsiao CC
    IEEE Trans Neural Netw; 2000; 11(5):1067-77. PubMed ID: 18249835

  • 32. When does online BP training converge?
    Xu ZB; Zhang R; Jing WF
    IEEE Trans Neural Netw; 2009 Oct; 20(10):1529-39. PubMed ID: 19695997

  • 33. Dynamical behavior of autoassociative memory performing novelty filtering for signal enhancement.
    Ko H; Jacyna GM
    IEEE Trans Neural Netw; 2000; 11(5):1152-61. PubMed ID: 18249841

  • 34. Error minimized extreme learning machine with growth of hidden nodes and incremental learning.
    Feng G; Huang GB; Lin Q; Gay R
    IEEE Trans Neural Netw; 2009 Aug; 20(8):1352-7. PubMed ID: 19596632

  • 35. Convergence and objective functions of some fault/noise-injection-based online learning algorithms for RBF networks.
    Ho KI; Leung CS; Sum J
    IEEE Trans Neural Netw; 2010 Jun; 21(6):938-47. PubMed ID: 20388593

  • 36. Stable dynamic backpropagation learning in recurrent neural networks.
    Jin L; Gupta MM
    IEEE Trans Neural Netw; 1999; 10(6):1321-34. PubMed ID: 18252634

  • 37. Symbolic representation of recurrent neural network dynamics.
    Huynh TQ; Reggia JA
    IEEE Trans Neural Netw Learn Syst; 2012 Oct; 23(10):1649-58. PubMed ID: 24808009

  • 38. Equivalence of backpropagation and contrastive Hebbian learning in a layered network.
    Xie X; Seung HS
    Neural Comput; 2003 Feb; 15(2):441-54. PubMed ID: 12590814

  • 39. A new adaptive backpropagation algorithm based on Lyapunov stability theory for neural networks.
    Man Z; Wu HR; Liu S; Yu X
    IEEE Trans Neural Netw; 2006 Nov; 17(6):1580-91. PubMed ID: 17131670

  • 40. A selective learning method to improve the generalization of multilayer feedforward neural networks.
    Galván IM; Isasi P; Aler R; Valls JM
    Int J Neural Syst; 2001 Apr; 11(2):167-77. PubMed ID: 14632169
