These tools will no longer be maintained as of December 31, 2024.
4. Hedrick K, Zhang K. Analysis of an Attractor Neural Network's Response to Conflicting External Inputs. J Math Neurosci. 2018 May;8(1):6. PubMed ID: 29767380.
5. Mozer M, Das S. Dynamic On-line Clustering and State Extraction: An Approach to Symbolic Learning. Neural Netw. 1998 Jan;11(1):53-64. PubMed ID: 12662848.
6. Nikiforou K, Mediano PAM, Shanahan M. An Investigation of the Dynamical Transitions in Harmonically Driven Random Networks of Firing-Rate Neurons. Cognit Comput. 2017;9(3):351-363. PubMed ID: 28680506.
7. Huynh TQ, Reggia JA. Symbolic representation of recurrent neural network dynamics. IEEE Trans Neural Netw Learn Syst. 2012 Oct;23(10):1649-58. PubMed ID: 24808009.
9. Frady EP, Sommer FT. Robust computation with rhythmic spike patterns. Proc Natl Acad Sci U S A. 2019 Sep;116(36):18050-18059. PubMed ID: 31431524.
10. Ashwin P, Postlethwaite C. Excitable networks for finite state computation with continuous time recurrent neural networks. Biol Cybern. 2021 Oct;115(5):519-538. PubMed ID: 34608540.
11. Jones A, Jha R. Exploring the associative learning capabilities of the segmented attractor network for lifelong learning. Front Artif Intell. 2022;5:910407. PubMed ID: 35978653.
12. Tang H, Li H, Yan R. Memory dynamics in attractor networks with saliency weights. Neural Comput. 2010 Jul;22(7):1899-926. PubMed ID: 20235821.
13. Solovyeva KP, Karandashev IM, Zhavoronkov A, Dunin-Barkowski WL. Models of Innate Neural Attractors and Their Applications for Neural Information Processing. Front Syst Neurosci. 2015;9:178. PubMed ID: 26778977.
14. Davis GP, Katz GE, Gentili RJ, Reggia JA. NeuroLISP: High-level symbolic programming with attractor neural networks. Neural Netw. 2022 Feb;146:200-219. PubMed ID: 34894482.
15. Miller P, Katz DB. Accuracy and response-time distributions for decision-making: linear perfect integrators versus nonlinear attractor-based neural circuits. J Comput Neurosci. 2013 Dec;35(3):261-94. PubMed ID: 23608921.
16. Emina F, Kropff E. Selective connectivity enhances storage capacity in attractor models of memory function. Front Syst Neurosci. 2022;16:983147. PubMed ID: 36185821.
17. Lim S, Goldman MS. Noise tolerance of attractor and feedforward memory models. Neural Comput. 2012 Feb;24(2):332-90. PubMed ID: 22091664.
18. Akrami A, Russo E, Treves A. Lateral thinking, from the Hopfield model to cortical dynamics. Brain Res. 2012 Jan;1434:4-16. PubMed ID: 21839426.
19. Colgin LL, Leutgeb S, Jezek K, Leutgeb JK, Moser EI, McNaughton BL, Moser MB. Attractor-map versus autoassociation based attractor dynamics in the hippocampal network. J Neurophysiol. 2010 Jul;104(1):35-50. PubMed ID: 20445029.
20. Linkerhand M, Gros C. Generating functionals for autonomous latching dynamics in attractor relict networks. Sci Rep. 2013;3:2042. PubMed ID: 23784373.