5. A New Delay Connection for Long Short-Term Memory Networks. Wang J; Zhang L; Chen Y; Yi Z. Int J Neural Syst; 2018 Aug; 28(6):1750061. PubMed ID: 29382286
6. A maze learning comparison of Elman, long short-term memory, and Mona neural networks. Portegys TE. Neural Netw; 2010 Mar; 23(2):306-13. PubMed ID: 19945822
7. A generalized LSTM-like training algorithm for second-order recurrent neural networks. Monner D; Reggia JA. Neural Netw; 2012 Jan; 25(1):70-83. PubMed ID: 21803542
8. A Review of Recurrent Neural Networks: LSTM Cells and Network Architectures. Yu Y; Si X; Hu C; Zhang J. Neural Comput; 2019 Jul; 31(7):1235-1270. PubMed ID: 31113301
9. Dependency-based Siamese long short-term memory network for learning sentence representations. Zhu W; Yao T; Ni J; Wei B; Lu Z. PLoS One; 2018; 13(3):e0193919. PubMed ID: 29513748
10. Estimating Brain Connectivity With Varying-Length Time Lags Using a Recurrent Neural Network. Wang Y; Lin K; Qi Y; Lian Q; Feng S; Wu Z; Pan G. IEEE Trans Biomed Eng; 2018 Sep; 65(9):1953-1963. PubMed ID: 29993397
11. Language Identification in Short Utterances Using Long Short-Term Memory (LSTM) Recurrent Neural Networks. Zazo R; Lozano-Diez A; Gonzalez-Dominguez J; Toledano DT; Gonzalez-Rodriguez J. PLoS One; 2016; 11(1):e0146917. PubMed ID: 26824467
12. Robust and Adaptive Online Time Series Prediction with Long Short-Term Memory. Yang H; Pan Z; Tao Q. Comput Intell Neurosci; 2017; 2017:9478952. PubMed ID: 29391864
13. Subtraction Gates: Another Way to Learn Long-Term Dependencies in Recurrent Neural Networks. He T; Mao H; Yi Z. IEEE Trans Neural Netw Learn Syst; 2022 Apr; 33(4):1740-1751. PubMed ID: 33373305
14. Training recurrent networks by Evolino. Schmidhuber J; Wierstra D; Gagliolo M; Gomez F. Neural Comput; 2007 Mar; 19(3):757-79. PubMed ID: 17298232
15. Efficient Online Learning Algorithms Based on LSTM Neural Networks. Ergen T; Kozat SS. IEEE Trans Neural Netw Learn Syst; 2018 Aug; 29(8):3772-3783. PubMed ID: 28920911
16. Skeleton-Based Human Action Recognition With Global Context-Aware Attention LSTM Networks. Liu J; Wang G; Duan LY; Abdiyeva K; Kot AC. IEEE Trans Image Process; 2018 Apr; 27(4):1586-1599. PubMed ID: 29324413
18. Partially pre-calculated weights for the backpropagation learning regime and high accuracy function mapping using continuous input RAM-based sigma-pi nets. Neville RS; Stonham TJ; Glover RJ. Neural Netw; 2000 Jan; 13(1):91-110. PubMed ID: 10935462
19. Using Long Short-Term Memory (LSTM) Neural Networks to Predict Emergency Department Wait Time. Cheng N; Kuo A. Stud Health Technol Inform; 2020 Jun; 272:199-202. PubMed ID: 32604635
20. Improving protein disorder prediction by deep bidirectional long short-term memory recurrent neural networks. Hanson J; Yang Y; Paliwal K; Zhou Y. Bioinformatics; 2017 Mar; 33(5):685-692. PubMed ID: 28011771