2. Orthogonality is not a panacea: backpropagation and "catastrophic interference". Yamaguchi M. Scand J Psychol. 2006 Oct;47(5):339-44. PubMed ID: 16987202
3. Reassessment of catastrophic interference. Yamaguchi M. Neuroreport. 2004 Oct;15(15):2423-6. PubMed ID: 15640768
4. Size invariance does not hold for connectionist models: dangers of using a toy model. Yamaguchi M. Neuroreport. 2004 Mar;15(3):565-7. PubMed ID: 15094524
5. A learning rule for very simple universal approximators consisting of a single layer of perceptrons. Auer P; Burgsteiner H; Maass W. Neural Netw. 2008 Jun;21(5):786-95. PubMed ID: 18249524
6. Catastrophic forgetting in simple networks: an analysis of the pseudorehearsal solution. Frean M; Robins A. Network. 1999 Aug;10(3):227-36. PubMed ID: 10496474
7. Methods for reducing interference in the Complementary Learning Systems model: oscillating inhibition and autonomous memory rehearsal. Norman KA; Newman EL; Perotte AJ. Neural Netw. 2005 Nov;18(9):1212-28. PubMed ID: 16260116
8. A robust method for distinguishing between learned and spurious attractors. Robins AV; McCallum SJ. Neural Netw. 2004 Apr;17(3):313-26. PubMed ID: 15037350
9. The loading problem for recursive neural networks. Gori M; Sperduti A. Neural Netw. 2005 Oct;18(8):1064-79. PubMed ID: 16198537
10. A new backpropagation learning algorithm for layered neural networks with nondifferentiable units. Oohori T; Naganuma H; Watanabe K. Neural Comput. 2007 May;19(5):1422-35. PubMed ID: 17381272
11. Elements for a general memory structure: properties of recurrent neural networks used to form situation models. Makarov VA; Song Y; Velarde MG; Hübner D; Cruse H. Biol Cybern. 2008 May;98(5):371-95. PubMed ID: 18350312
12. Training recurrent networks by Evolino. Schmidhuber J; Wierstra D; Gagliolo M; Gomez F. Neural Comput. 2007 Mar;19(3):757-79. PubMed ID: 17298232
13. Dynamic and interactive generation of object handling behaviors by a small humanoid robot using a dynamic neural network model. Ito M; Noda K; Hoshino Y; Tani J. Neural Netw. 2006 Apr;19(3):323-37. PubMed ID: 16618536
15. Self-consistent signal-to-noise analysis of Hopfield model with unit replacement. Aonishi T; Komatsu Y; Kurata K. Neural Netw. 2010 Dec;23(10):1180-6. PubMed ID: 20621446
16. Using noise to compute error surfaces in connectionist networks: a novel means of reducing catastrophic forgetting. French RM; Chater N. Neural Comput. 2002 Jul;14(7):1755-69. PubMed ID: 12079555
17. Elman backpropagation as reinforcement for simple recurrent networks. Grüning A. Neural Comput. 2007 Nov;19(11):3108-31. PubMed ID: 17883351
18. Incremental learning of feature space and classifier for face recognition. Ozawa S; Toh SL; Abe S; Pang S; Kasabov N. Neural Netw. 2005;18(5-6):575-84. PubMed ID: 16102940
19. Short-term memory for serial order: a recurrent neural network model. Botvinick MM; Plaut DC. Psychol Rev. 2006 Apr;113(2):201-33. PubMed ID: 16637760
20. A modified error backpropagation algorithm for complex-valued neural networks. Chen X; Tang Z; Variappan C; Li S; Okada T. Int J Neural Syst. 2005 Dec;15(6):435-43. PubMed ID: 16385633