These tools will no longer be maintained as of December 31, 2024.
245 related articles for article (PubMed ID: 15387243)
1. Markovian architectural bias of recurrent neural networks. Tino P, Cernanský M, Benusková L. IEEE Trans Neural Netw. 2004 Jan;15(1):6-15. PubMed ID: 15387243.

2. Organization of the state space of a simple recurrent network before and after training on recursive linguistic structures. Cernanský M, Makula M, Benusková L. Neural Netw. 2007 Mar;20(2):236-44. PubMed ID: 16687236.

3. Architectural and Markovian factors of echo state networks. Gallicchio C, Micheli A. Neural Netw. 2011 Jun;24(5):440-56. PubMed ID: 21376531.

4. Training recurrent networks by Evolino. Schmidhuber J, Wierstra D, Gagliolo M, Gomez F. Neural Comput. 2007 Mar;19(3):757-79. PubMed ID: 17298232.

8. A new approach to knowledge-based design of recurrent neural networks. Kolman E, Margaliot M. IEEE Trans Neural Netw. 2008 Aug;19(8):1389-401. PubMed ID: 18701369.

9. Foundations of implementing the competitive layer model by Lotka-Volterra recurrent neural networks. Yi Z. IEEE Trans Neural Netw. 2010 Mar;21(3):494-507. PubMed ID: 20142165.

10. Sequence-specific bias correction for RNA-seq data using recurrent neural networks. Zhang YZ, Yamaguchi R, Imoto S, Miyano S. BMC Genomics. 2017 Jan;18(Suppl 1):1044. PubMed ID: 28198674.

11. Multifeedback-layer neural network. Savran A. IEEE Trans Neural Netw. 2007 Mar;18(2):373-84. PubMed ID: 17385626.

12. Energy-to-peak state estimation for Markov jump RNNs with time-varying delays via nonsynchronous filter with nonstationary mode transitions. Zhang L, Zhu Y, Zheng WX. IEEE Trans Neural Netw Learn Syst. 2015 Oct;26(10):2346-56. PubMed ID: 25576580.

13. Extracting finite-state representations from recurrent neural networks trained on chaotic symbolic sequences. Tino P, Köteles M. IEEE Trans Neural Netw. 1999;10(2):284-302. PubMed ID: 18252527.

14. Absolute exponential stability of recurrent neural networks with generalized activation function. Xu J, Cao YY, Sun Y, Tang J. IEEE Trans Neural Netw. 2008 Jun;19(6):1075-89. PubMed ID: 18541505.

15. Recursive Bayesian recurrent neural networks for time-series modeling. Mirikitani DT, Nikolaev N. IEEE Trans Neural Netw. 2010 Feb;21(2):262-74. PubMed ID: 20040415.

16. Simple recurrent networks learn context-free and context-sensitive languages by counting. Rodriguez P. Neural Comput. 2001 Sep;13(9):2093-118. PubMed ID: 11516359.

18. Robustness analysis of global exponential stability of recurrent neural networks in the presence of time delays and random disturbances. Shen Y, Wang J. IEEE Trans Neural Netw Learn Syst. 2012 Jan;23(1):87-96. PubMed ID: 24808458.