3. Enhanced storage capacity with errors in scale-free Hopfield neural networks: An analytical study. Kim DH; Park J; Kahng B. PLoS One; 2017; 12(10):e0184683. PubMed ID: 29077721
4. Long-term attraction in higher order neural networks. Burshtein D. IEEE Trans Neural Netw; 1998; 9(1):42-50. PubMed ID: 18252428
5. Analysis and design of multivalued high-capacity associative memories based on delayed recurrent neural networks. Zhang J; Zhu S; Bao G; Liu X; Wen S. IEEE Trans Cybern; 2022 Dec; 52(12):12989-13000. PubMed ID: 34347620
6. On the maximum storage capacity of the Hopfield model. Folli V; Leonetti M; Ruocco G. Front Comput Neurosci; 2016; 10:144. PubMed ID: 28119595
8. Information theoretical performance measure for associative memories and its application to neural networks. Schlüter M; Kerschhaggl O; Wagner F. Phys Rev E Stat Phys Plasmas Fluids Relat Interdiscip Topics; 1999 Aug; 60(2 Pt B):2141-7. PubMed ID: 11970006
9. Beyond the maximum storage capacity limit in Hopfield recurrent neural networks. Gosti G; Folli V; Leonetti M; Ruocco G. Entropy (Basel); 2019 Jul; 21(8). PubMed ID: 33267440
11. Storage capacity and retrieval time of small-world neural networks. Oshima H; Odagaki T. Phys Rev E Stat Nonlin Soft Matter Phys; 2007 Sep; 76(3 Pt 2):036114. PubMed ID: 17930313
12. Associative memories via predictive coding. Salvatori T; Song Y; Hong Y; Sha L; Frieder S; Xu Z; Bogacz R; Lukasiewicz T. Adv Neural Inf Process Syst; 2021 Dec; 34:3874-3886. PubMed ID: 35664437
13. Learning associative memories by error backpropagation. Zheng P; Zhang J; Tang W. IEEE Trans Neural Netw; 2011 Mar; 22(3):347-55. PubMed ID: 21189234
14. Memory dynamics in attractor networks with saliency weights. Tang H; Li H; Yan R. Neural Comput; 2010 Jul; 22(7):1899-926. PubMed ID: 20235821
15. Multistability of delayed hybrid impulsive neural networks with application to associative memories. Hu B; Guan ZH; Chen G; Lewis FL. IEEE Trans Neural Netw Learn Syst; 2019 May; 30(5):1537-1551. PubMed ID: 30296243
16. Optimal and robust design of brain-state-in-a-box neural associative memories. Park Y. Neural Netw; 2010 Mar; 23(2):210-8. PubMed ID: 19914797
17. Increase of storage capacity of neural networks by preprocessing using convergence and divergence. Orzó L. Acta Biochim Biophys Hung; 1991-1992; 26(1-4):127-30. PubMed ID: 1844796
18. Neural networks with chaotic recursive nodes: techniques for the design of associative memories, contrast with Hopfield architectures, and extensions for time-dependent inputs. Del-Moral-Hernandez E. Neural Netw; 2003; 16(5-6):675-82. PubMed ID: 12850022
19. Analysis and optimal design of continuous neural networks with applications to associative memory. Zhenjiang M; Baozong Y. Neural Netw; 1999 Mar; 12(2):259-271. PubMed ID: 12662702
20. A modular attractor associative memory with patchy connectivity and weight pruning. Meli C; Lansner A. Network; 2013; 24(4):129-50. PubMed ID: 24251411