These tools will no longer be maintained as of December 31, 2024. The archived website can be found here. The PubMed4Hh GitHub repository can be found here. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

296 related articles for article (PubMed ID: 31124785)

  • 1. Local online learning in recurrent networks with random feedback.
    Murray JM
    Elife; 2019 May; 8():. PubMed ID: 31124785

  • 2. Recurrent Neural Networks With External Addressable Long-Term and Working Memory for Learning Long-Term Dependences.
    Quan Z; Zeng W; Li X; Liu Y; Yu Y; Yang W
    IEEE Trans Neural Netw Learn Syst; 2020 Mar; 31(3):813-826. PubMed ID: 31059455

  • 3. A normalized adaptive training of recurrent neural networks with augmented error gradient.
    Yilei W; Qing S; Sheng L
    IEEE Trans Neural Netw; 2008 Feb; 19(2):351-6. PubMed ID: 18269965

  • 4. Recognizing recurrent neural networks (rRNN): Bayesian inference for recurrent neural networks.
    Bitzer S; Kiebel SJ
    Biol Cybern; 2012 Jul; 106(4-5):201-17. PubMed ID: 22581026

  • 5. Generalized Recurrent Neural Network accommodating Dynamic Causal Modeling for functional MRI analysis.
    Wang Y; Wang Y; Lui YW
    Neuroimage; 2018 Sep; 178():385-402. PubMed ID: 29782993

  • 6. PsychRNN: An Accessible and Flexible Python Package for Training Recurrent Neural Network Models on Cognitive Tasks.
    Ehrlich DB; Stone JT; Brandfonbrener D; Atanasov A; Murray JD
    eNeuro; 2021; 8(1):. PubMed ID: 33328247

  • 7. Reconstructing Genetic Regulatory Networks Using Two-Step Algorithms with the Differential Equation Models of Neural Networks.
    Chen CK
    Interdiscip Sci; 2018 Dec; 10(4):823-835. PubMed ID: 28748400

  • 8. Training Excitatory-Inhibitory Recurrent Neural Networks for Cognitive Tasks: A Simple and Flexible Framework.
    Song HF; Yang GR; Wang XJ
    PLoS Comput Biol; 2016 Feb; 12(2):e1004792. PubMed ID: 26928718

  • 9. Robust adaptive gradient-descent training algorithm for recurrent neural networks in discrete time domain.
    Song Q; Wu Y; Soh YC
    IEEE Trans Neural Netw; 2008 Nov; 19(11):1841-53. PubMed ID: 18990640

  • 10. Temporal-kernel recurrent neural networks.
    Sutskever I; Hinton G
    Neural Netw; 2010 Mar; 23(2):239-43. PubMed ID: 19932002

  • 11. Segmenting and classifying activities in robot-assisted surgery with recurrent neural networks.
    DiPietro R; Ahmidi N; Malpani A; Waldram M; Lee GI; Lee MR; Vedula SS; Hager GD
    Int J Comput Assist Radiol Surg; 2019 Nov; 14(11):2005-2020. PubMed ID: 31037493

  • 12. Task representations in neural networks trained to perform many cognitive tasks.
    Yang GR; Joglekar MR; Song HF; Newsome WT; Wang XJ
    Nat Neurosci; 2019 Feb; 22(2):297-306. PubMed ID: 30643294

  • 13. Neuroevolution of a Modular Memory-Augmented Neural Network for Deep Memory Problems.
    Khadka S; Chung JJ; Tumer K
    Evol Comput; 2019; 27(4):639-664. PubMed ID: 30407876

  • 14. A generalized LSTM-like training algorithm for second-order recurrent neural networks.
    Monner D; Reggia JA
    Neural Netw; 2012 Jan; 25(1):70-83. PubMed ID: 21803542

  • 15. Training recurrent neural networks robust to incomplete data: Application to Alzheimer's disease progression modeling.
Mehdipour Ghazi M; Nielsen M; Pai A; Cardoso MJ; Modat M; Ourselin S; Sørensen L
    Med Image Anal; 2019 Apr; 53():39-46. PubMed ID: 30682584

  • 16. A Review of Recurrent Neural Networks: LSTM Cells and Network Architectures.
    Yu Y; Si X; Hu C; Zhang J
    Neural Comput; 2019 Jul; 31(7):1235-1270. PubMed ID: 31113301

  • 17. Recurrent Neural Networks With Auxiliary Memory Units.
    Wang J; Zhang L; Guo Q; Yi Z
    IEEE Trans Neural Netw Learn Syst; 2018 May; 29(5):1652-1661. PubMed ID: 28333646

  • 18. Flexible Working Memory Through Selective Gating and Attentional Tagging.
    Kruijne W; Bohte SM; Roelfsema PR; Olivers CNL
    Neural Comput; 2021 Jan; 33(1):1-40. PubMed ID: 33080159

  • 19. Recurrent neural networks that learn multi-step visual routines with reinforcement learning.
    Mollard S; Wacongne C; Bohte SM; Roelfsema PR
    PLoS Comput Biol; 2024 Apr; 20(4):e1012030. PubMed ID: 38683837

  • 20. A Three-Threshold Learning Rule Approaches the Maximal Capacity of Recurrent Neural Networks.
    Alemi A; Baldassi C; Brunel N; Zecchina R
    PLoS Comput Biol; 2015 Aug; 11(8):e1004439. PubMed ID: 26291608
