171 related articles for article (PubMed ID: 37873445)
1. Training biologically plausible recurrent neural networks on cognitive tasks with long-term dependencies. Soo WWM; Goudar V; Wang XJ. bioRxiv; 2023 Oct. PubMed ID: 37873445
2. PsychRNN: An Accessible and Flexible Python Package for Training Recurrent Neural Network Models on Cognitive Tasks. Ehrlich DB; Stone JT; Brandfonbrener D; Atanasov A; Murray JD. eNeuro; 2021; 8(1). PubMed ID: 33328247
3. A critical review of RNN and LSTM variants in hydrological time series predictions. Waqas M; Humphries UW. MethodsX; 2024 Dec; 13:102946. PubMed ID: 39324077
4. Achieving Online Regression Performance of LSTMs With Simple RNNs. Vural NM; Ilhan F; Yilmaz SF; Ergut S; Kozat SS. IEEE Trans Neural Netw Learn Syst; 2022 Dec; 33(12):7632-7643. PubMed ID: 34138720
5. Training Excitatory-Inhibitory Recurrent Neural Networks for Cognitive Tasks: A Simple and Flexible Framework. Song HF; Yang GR; Wang XJ. PLoS Comput Biol; 2016 Feb; 12(2):e1004792. PubMed ID: 26928718
6. Gated Orthogonal Recurrent Units: On Learning to Forget. Jing L; Gulcehre C; Peurifoy J; Shen Y; Tegmark M; Soljacic M; Bengio Y. Neural Comput; 2019 Apr; 31(4):765-783. PubMed ID: 30764742
8. Subtraction Gates: Another Way to Learn Long-Term Dependencies in Recurrent Neural Networks. He T; Mao H; Yi Z. IEEE Trans Neural Netw Learn Syst; 2022 Apr; 33(4):1740-1751. PubMed ID: 33373305
9. Winning the Lottery With Neural Connectivity Constraints: Faster Learning Across Cognitive Tasks With Spatially Constrained Sparse RNNs. Khona M; Chandra S; Ma JJ; Fiete IR. Neural Comput; 2023 Oct; 35(11):1850-1869. PubMed ID: 37725708
10. Biologically plausible learning in recurrent neural networks reproduces neural dynamics observed during cognitive tasks. Miconi T. Elife; 2017 Feb; 6. PubMed ID: 28230528
11. Interpretable, highly accurate brain decoding of subtly distinct brain states from functional MRI using intrinsic functional networks and long short-term memory recurrent neural networks. Li H; Fan Y. Neuroimage; 2019 Nov; 202:116059. PubMed ID: 31362049