These tools are no longer maintained as of December 31, 2024. An archived version of the website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors: a resource for Precision Medicine

213 related articles for article (PubMed ID: 12662788)

  • 1. How embedded memory in recurrent neural network architectures helps learning long-term temporal dependencies.
    Lin T; Horne BG; Giles CL
    Neural Netw; 1998 Jul; 11(5):861-868. PubMed ID: 12662788

  • 2. Learning long-term dependencies in NARX recurrent neural networks.
    Lin T; Horne BG; Tino P; Giles CL
    IEEE Trans Neural Netw; 1996; 7(6):1329-1338. PubMed ID: 18263528

  • 3. Segmented-memory recurrent neural networks.
    Chen J; Chaudhari NS
    IEEE Trans Neural Netw; 2009 Aug; 20(8):1267-1280. PubMed ID: 19605323

  • 4. Warming up recurrent neural networks to maximise reachable multistability greatly improves learning.
    Lambrechts G; De Geeter F; Vecoven N; Ernst D; Drion G
    Neural Netw; 2023 Sep; 166():645-669. PubMed ID: 37604075

  • 5. Computational capabilities of recurrent NARX neural networks.
    Siegelmann HT; Horne BG; Giles CL
    IEEE Trans Syst Man Cybern B Cybern; 1997; 27(2):208-215. PubMed ID: 18255858

  • 6. A generalized LSTM-like training algorithm for second-order recurrent neural networks.
    Monner D; Reggia JA
    Neural Netw; 2012 Jan; 25(1):70-83. PubMed ID: 21803542

  • 7. Learning long-term dependencies with gradient descent is difficult.
    Bengio Y; Simard P; Frasconi P
    IEEE Trans Neural Netw; 1994; 5(2):157-166. PubMed ID: 18267787

  • 8. A Review of Recurrent Neural Networks: LSTM Cells and Network Architectures.
    Yu Y; Si X; Hu C; Zhang J
    Neural Comput; 2019 Jul; 31(7):1235-1270. PubMed ID: 31113301

  • 9. Deep Recurrent Neural Networks for Human Activity Recognition.
    Murad A; Pyun JY
    Sensors (Basel); 2017 Nov; 17(11):. PubMed ID: 29113103

  • 10. Subtraction Gates: Another Way to Learn Long-Term Dependencies in Recurrent Neural Networks.
    He T; Mao H; Yi Z
    IEEE Trans Neural Netw Learn Syst; 2022 Apr; 33(4):1740-1751. PubMed ID: 33373305

  • 11. A maze learning comparison of Elman, long short-term memory, and Mona neural networks.
    Portegys TE
    Neural Netw; 2010 Mar; 23(2):306-313. PubMed ID: 19945822

  • 12. Recurrent Neural Networks With External Addressable Long-Term and Working Memory for Learning Long-Term Dependences.
    Quan Z; Zeng W; Li X; Liu Y; Yu Y; Yang W
    IEEE Trans Neural Netw Learn Syst; 2020 Mar; 31(3):813-826. PubMed ID: 31059455

  • 13. Fixed-weight on-line learning.
    Younger AS; Conwell PR; Cotter NE
    IEEE Trans Neural Netw; 1999; 10(2):272-283. PubMed ID: 18252526

  • 14. Comparing feedforward and recurrent neural network architectures with human behavior in artificial grammar learning.
    Alamia A; Gauducheau V; Paisios D; VanRullen R
    Sci Rep; 2020 Dec; 10(1):22172. PubMed ID: 33335190

  • 15. Gated spiking neural network using Iterative Free-Energy Optimization and rank-order coding for structure learning in memory sequences (INFERNO GATE).
    Pitti A; Quoy M; Lavandier C; Boucenna S
    Neural Netw; 2020 Jan; 121():242-258. PubMed ID: 31581065

  • 16. Temporal-kernel recurrent neural networks.
    Sutskever I; Hinton G
    Neural Netw; 2010 Mar; 23(2):239-243. PubMed ID: 19932002

  • 17. Neuroevolution of a Modular Memory-Augmented Neural Network for Deep Memory Problems.
    Khadka S; Chung JJ; Tumer K
    Evol Comput; 2019; 27(4):639-664. PubMed ID: 30407876

  • 18. A Taxonomy for Neural Memory Networks.
    Ma Y; Principe JC
    IEEE Trans Neural Netw Learn Syst; 2020 Jun; 31(6):1780-1793. PubMed ID: 31443054

  • 19. Bidirectional deep recurrent neural networks for process fault classification.
    Chadha GS; Panambilly A; Schwung A; Ding SX
    ISA Trans; 2020 Nov; 106():330-342. PubMed ID: 32684422

  • 20. Spatiotemporal Recurrent Convolutional Networks for Traffic Prediction in Transportation Networks.
    Yu H; Wu Z; Wang S; Wang Y; Ma X
    Sensors (Basel); 2017 Jun; 17(7):. PubMed ID: 28672867
