These tools will no longer be maintained as of December 31, 2024. An archived version of the website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service if you have questions.



116 related articles for article (PubMed ID: 12662805)

  • 1. XOR has no local minima: A case study in neural network error surface analysis.
    Hamey LG
    Neural Netw; 1998 Jun; 11(4):669-681. PubMed ID: 12662805

  • 2. A local minimum for the 2-3-1 XOR network.
    Sprinkhuizen-Kuyper IG; Boers EW
    IEEE Trans Neural Netw; 1999; 10(4):968-71. PubMed ID: 18252598

  • 3. The error surface of the simplest XOR network has only global minima.
    Sprinkhuizen-Kuyper IG; Boers EJ
    Neural Comput; 1996 Aug; 8(6):1301-20. PubMed ID: 8768396

  • 4. The local minima-free condition of feedforward neural networks for outer-supervised learning.
    Huang DS
    IEEE Trans Syst Man Cybern B Cybern; 1998; 28(3):477-80. PubMed ID: 18255966

  • 5. On the problem of local minima in recurrent neural networks.
    Bianchini M; Gori M; Maggini M
    IEEE Trans Neural Netw; 1994; 5(2):167-77. PubMed ID: 18267788

  • 6. Simulated annealing and weight decay in adaptive learning: the SARPROP algorithm.
    Treadgold NK; Gedeon TD
    IEEE Trans Neural Netw; 1998; 9(4):662-8. PubMed ID: 18252489

  • 7. Universal Approximation Using Feedforward Neural Networks: A Survey of Some Existing Methods, and Some New Results.
    Chung Tsoi A; Scarselli F
    Neural Netw; 1998 Jan; 11(1):15-37. PubMed ID: 12662846

  • 8. Loss surface of XOR artificial neural networks.
    Mehta D; Zhao X; Bernal EA; Wales DJ
    Phys Rev E; 2018 May; 97(5-1):052307. PubMed ID: 29906831

  • 9. On adaptive learning rate that guarantees convergence in feedforward networks.
    Behera L; Kumar S; Patnaik A
    IEEE Trans Neural Netw; 2006 Sep; 17(5):1116-25. PubMed ID: 17001974

  • 10. An annealed chaotic maximum neural network for bipartite subgraph problem.
    Wang J; Tang Z; Wang R
    Int J Neural Syst; 2004 Apr; 14(2):107-16. PubMed ID: 15112368

  • 11. New learning automata based algorithms for adaptation of backpropagation algorithm parameters.
    Meybodi MR; Beigy H
    Int J Neural Syst; 2002 Feb; 12(1):45-67. PubMed ID: 11852444

  • 12. An Investigation into the Improvement of Local Minima of the Hopfield Network.
    Armitage AF; Gupta NK; Peng M
    Neural Netw; 1996 Oct; 9(7):1241-1253. PubMed ID: 12662596

  • 13. The loading problem for recursive neural networks.
    Gori M; Sperduti A
    Neural Netw; 2005 Oct; 18(8):1064-79. PubMed ID: 16198537

  • 14. Deterministic global optimization for FNN training.
    Toh KA
    IEEE Trans Syst Man Cybern B Cybern; 2003; 33(6):977-83. PubMed ID: 18238248

  • 15. Optimal competitive Hopfield network with stochastic dynamics for maximum cut problem.
    Wang J; Tang Z; Cao Q; Wang R
    Int J Neural Syst; 2004 Aug; 14(4):257-65. PubMed ID: 15372703

  • 16. The error surface of the 2-2-1 XOR network: The finite stationary points.
    Sprinkhuizen-Kuyper IG; Boers EJ
    Neural Netw; 1998 Jun; 11(4):683-690. PubMed ID: 12662806

  • 17. Backpropagation algorithm adaptation parameters using learning automata.
    Beigy H; Meybodi MR
    Int J Neural Syst; 2001 Jun; 11(3):219-28. PubMed ID: 11574959

  • 18. Leap-frog is a robust algorithm for training neural networks.
    Holm JE; Botha EC
    Network; 1999 Feb; 10(1):1-13. PubMed ID: 10372759

  • 19. Flat minima.
    Hochreiter S; Schmidhuber J
    Neural Comput; 1997 Jan; 9(1):1-42. PubMed ID: 9117894

  • 20. Evolutionary product unit based neural networks for regression.
    Martínez-Estudillo A; Martínez-Estudillo F; Hervás-Martínez C; García-Pedrajas N
    Neural Netw; 2006 May; 19(4):477-86. PubMed ID: 16481148
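Several of the articles above (notably entries 1, 2, 3, 8, and 16) study the error surface of small XOR networks trained with squared-error loss. As a point of reference, the 2-2-1 architecture they analyze can be sketched as follows; the architecture and loss follow the papers' setting, while the random seed, learning rate, and iteration count here are illustrative assumptions, not values taken from any of the cited works:

```python
import numpy as np

# 2-2-1 XOR network: 2 inputs, 2 sigmoid hidden units, 1 sigmoid output,
# squared-error loss, batch gradient descent. Hyperparameters are arbitrary.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0.0], [1.0], [1.0], [0.0]])  # XOR targets

W1 = rng.normal(size=(2, 2))   # input -> hidden weights
b1 = np.zeros((1, 2))
W2 = rng.normal(size=(2, 1))   # hidden -> output weights
b2 = np.zeros((1, 1))

lr = 2.0
losses = []
for _ in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    err = out - y
    losses.append(float(0.5 * np.sum(err ** 2)))
    # backward pass: gradients of the squared error through the sigmoids
    d_out = err * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(losses[0], losses[-1])
```

Whether plain gradient descent on this surface can stall is exactly the point of contention in the cited literature: Hamey (entry 1) argues the surface has no strict local minima, while Sprinkhuizen-Kuyper and Boers (entry 2) exhibit a local minimum for the related 2-3-1 network.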
