209 related articles for article (PubMed ID: 30764742)
1. Gated Orthogonal Recurrent Units: On Learning to Forget.
Jing L; Gulcehre C; Peurifoy J; Shen Y; Tegmark M; Soljacic M; Bengio Y
Neural Comput; 2019 Apr; 31(4):765-783. PubMed ID: 30764742
2. Learning to forget: continual prediction with LSTM.
Gers FA; Schmidhuber J; Cummins F
Neural Comput; 2000 Oct; 12(10):2451-2471. PubMed ID: 11032042
3. SGORNN: Combining scalar gates and orthogonal constraints in recurrent networks.
Taylor-Melanson W; Ferreira MD; Matwin S
Neural Netw; 2023 Feb; 159():25-33. PubMed ID: 36525915
4. Recurrent Neural Networks With Auxiliary Memory Units.
Wang J; Zhang L; Guo Q; Yi Z
IEEE Trans Neural Netw Learn Syst; 2018 May; 29(5):1652-1661. PubMed ID: 28333646
5. Temporal-kernel recurrent neural networks.
Sutskever I; Hinton G
Neural Netw; 2010 Mar; 23(2):239-243. PubMed ID: 19932002
6. Subtraction Gates: Another Way to Learn Long-Term Dependencies in Recurrent Neural Networks.
He T; Mao H; Yi Z
IEEE Trans Neural Netw Learn Syst; 2022 Apr; 33(4):1740-1751. PubMed ID: 33373305
7. Working Memory Connections for LSTM.
Landi F; Baraldi L; Cornia M; Cucchiara R
Neural Netw; 2021 Dec; 144():334-341. PubMed ID: 34547671
8. Learning With Interpretable Structure From Gated RNN.
Hou BJ; Zhou ZH
IEEE Trans Neural Netw Learn Syst; 2020 Jul; 31(7):2267-2279. PubMed ID: 32071002
9. Considerations in using recurrent neural networks to probe neural dynamics.
Kao JC
J Neurophysiol; 2019 Dec; 122(6):2504-2521. PubMed ID: 31619125
10. Training biologically plausible recurrent neural networks on cognitive tasks with long-term dependencies.
Soo WWM; Goudar V; Wang XJ
bioRxiv; 2023 Oct; ():. PubMed ID: 37873445
11. Sequence Classification Restricted Boltzmann Machines With Gated Units.
Tran SN; Garcez AD; Weyde T; Yin J; Zhang Q; Karunanithi M
IEEE Trans Neural Netw Learn Syst; 2020 Nov; 31(11):4806-4815. PubMed ID: 31940559
12. Training recurrent networks by Evolino.
Schmidhuber J; Wierstra D; Gagliolo M; Gomez F
Neural Comput; 2007 Mar; 19(3):757-779. PubMed ID: 17298232
13. Recurrent Neural Networks With External Addressable Long-Term and Working Memory for Learning Long-Term Dependences.
Quan Z; Zeng W; Li X; Liu Y; Yu Y; Yang W
IEEE Trans Neural Netw Learn Syst; 2020 Mar; 31(3):813-826. PubMed ID: 31059455
14. EleAtt-RNN: Adding Attentiveness to Neurons in Recurrent Neural Networks.
Zhang P; Xue J; Lan C; Zeng W; Gao Z; Zheng N
IEEE Trans Image Process; 2019 Sep; ():. PubMed ID: 31484119
15. Segmented-memory recurrent neural networks.
Chen J; Chaudhari NS
IEEE Trans Neural Netw; 2009 Aug; 20(8):1267-1280. PubMed ID: 19605323
16. Learning Contextual Dependence With Convolutional Hierarchical Recurrent Neural Networks.
Zuo Z; Shuai B; Wang G; Liu X; Wang X; Wang B; Chen Y
IEEE Trans Image Process; 2016 Jul; 25(7):2983-2996. PubMed ID: 28113173
17. A hybrid model based on neural networks for biomedical relation extraction.
Zhang Y; Lin H; Yang Z; Wang J; Zhang S; Sun Y; Yang L
J Biomed Inform; 2018 May; 81():83-92. PubMed ID: 29601989
18. Explicit Duration Recurrent Networks.
Yu SZ
IEEE Trans Neural Netw Learn Syst; 2022 Jul; 33(7):3120-3130. PubMed ID: 33497341
19. Language Identification in Short Utterances Using Long Short-Term Memory (LSTM) Recurrent Neural Networks.
Zazo R; Lozano-Diez A; Gonzalez-Dominguez J; Toledano DT; Gonzalez-Rodriguez J
PLoS One; 2016; 11(1):e0146917. PubMed ID: 26824467
20. Markovian architectural bias of recurrent neural networks.
Tino P; Cernanský M; Benusková L
IEEE Trans Neural Netw; 2004 Jan; 15(1):6-15. PubMed ID: 15387243