These tools will no longer be maintained as of December 31, 2024. The archived website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service if you have questions.
5. Murray JM. Local online learning in recurrent networks with random feedback. Elife. 2019 May;8. PubMed ID: 31124785
6. Rivkind A, Barak O. Local dynamics in trained recurrent neural networks. Phys Rev Lett. 2017 Jun;118(25):258101. PubMed ID: 28696758
7. Illing B, Gerstner W, Brea J. Biologically plausible deep learning - but how far can we go with shallow networks? Neural Netw. 2019 Oct;118:90-101. PubMed ID: 31254771
8. Kim EH, Oh SK, Pedrycz W. Design of double fuzzy clustering-driven context neural networks. Neural Netw. 2018 Aug;104:1-14. PubMed ID: 29689457
9. DePasquale B, Cueva CJ, Rajan K, Escola GS, Abbott LF. full-FORCE: a target-based method for training recurrent networks. PLoS One. 2018;13(2):e0191527. PubMed ID: 29415041
10. Detorakis G, Bartley T, Neftci E. Contrastive Hebbian learning with random feedback weights. Neural Netw. 2019 Jun;114:1-14. PubMed ID: 30831378
11. Shao Y, Ostojic S. Relating local connectivity and global dynamics in recurrent excitatory-inhibitory networks. PLoS Comput Biol. 2023 Jan;19(1):e1010855. PubMed ID: 36689488
12. Becker S, Zhang Y, Lee AA. Geometry of energy landscapes and the optimizability of deep neural networks. Phys Rev Lett. 2020 Mar;124(10):108301. PubMed ID: 32216422
13. Vahed A, Omlin CW. A machine learning method for extracting symbolic knowledge from recurrent neural networks. Neural Comput. 2004 Jan;16(1):59-71. PubMed ID: 15006023
14. Schwenker F, Kestler HA, Palm G. Three learning phases for radial-basis-function networks. Neural Netw. 2001 May;14(4-5):439-58. PubMed ID: 11411631
15. Herbert E, Ostojic S. The impact of sparsity in low-rank recurrent neural networks. PLoS Comput Biol. 2022 Aug;18(8):e1010426. PubMed ID: 35944030