170 related articles for the article with PubMed ID 36545030
21. Effects of cellular homeostatic intrinsic plasticity on dynamical and computational properties of biological recurrent neural networks. Naudé J, Cessac B, Berry H, Delord B. J Neurosci. 2013 Sep;33(38):15032-43. PubMed ID: 24048833
22. Subtraction Gates: Another Way to Learn Long-Term Dependencies in Recurrent Neural Networks. He T, Mao H, Yi Z. IEEE Trans Neural Netw Learn Syst. 2022 Apr;33(4):1740-1751. PubMed ID: 33373305
23. Working Memory Connections for LSTM. Landi F, Baraldi L, Cornia M, Cucchiara R. Neural Netw. 2021 Dec;144:334-341. PubMed ID: 34547671
24. Real-time computation at the edge of chaos in recurrent neural networks. Bertschinger N, Natschläger T. Neural Comput. 2004 Jul;16(7):1413-36. PubMed ID: 15165396
25. Markovian architectural bias of recurrent neural networks. Tino P, Cernanský M, Benusková L. IEEE Trans Neural Netw. 2004 Jan;15(1):6-15. PubMed ID: 15387243
26. Learning continuous chaotic attractors with a reservoir computer. Smith LM, Kim JZ, Lu Z, Bassett DS. Chaos. 2022 Jan;32(1):011101. PubMed ID: 35105129
27. Robust initialization of a Jordan network with recurrent constrained learning. Song Q. IEEE Trans Neural Netw. 2011 Dec;22(12):2460-73. PubMed ID: 21965202
28. Learning to forget: continual prediction with LSTM. Gers FA, Schmidhuber J, Cummins F. Neural Comput. 2000 Oct;12(10):2451-71. PubMed ID: 11032042
29. Itinerant memory dynamics and global bifurcations in chaotic neural networks. Kitajima H, Yoshinaga T, Aihara K, Kawakami H. Chaos. 2003 Sep;13(3):1122-32. PubMed ID: 12946205
30. Recurrent Neural Networks With Auxiliary Memory Units. Wang J, Zhang L, Guo Q, Yi Z. IEEE Trans Neural Netw Learn Syst. 2018 May;29(5):1652-1661. PubMed ID: 28333646
31. A hybrid model based on neural networks for biomedical relation extraction. Zhang Y, Lin H, Yang Z, Wang J, Zhang S, Sun Y, Yang L. J Biomed Inform. 2018 May;81:83-92. PubMed ID: 29601989
32. Novel tracking function of moving target using chaotic dynamics in a recurrent neural network model. Li Y, Nara S. Cogn Neurodyn. 2008 Mar;2(1):39-48. PubMed ID: 19003472
33. Opening the black box: low-dimensional dynamics in high-dimensional recurrent neural networks. Sussillo D, Barak O. Neural Comput. 2013 Mar;25(3):626-49. PubMed ID: 23272922
34. Synchronization transition in neuronal networks composed of chaotic or non-chaotic oscillators. Xu K, Maidana JP, Castro S, Orio P. Sci Rep. 2018 May;8(1):8370. PubMed ID: 29849108
35. State-Regularized Recurrent Neural Networks to Extract Automata and Explain Predictions. Wang C, Lawrence C, Niepert M. IEEE Trans Pattern Anal Mach Intell. 2023 Jun;45(6):7739-7750. PubMed ID: 36445990
36. Winning the Lottery With Neural Connectivity Constraints: Faster Learning Across Cognitive Tasks With Spatially Constrained Sparse RNNs. Khona M, Chandra S, Ma JJ, Fiete IR. Neural Comput. 2023 Oct;35(11):1850-1869. PubMed ID: 37725708