These tools are no longer maintained as of December 31, 2024. An archived copy of the website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors: a resource for Precision Medicine

213 related articles for article (PubMed ID: 30979353)

  • 1. A Geometrical Analysis of Global Stability in Trained Feedback Networks.
    Mastrogiuseppe F; Ostojic S
    Neural Comput; 2019 Jun; 31(6):1139-1182. PubMed ID: 30979353

  • 2. Transferring learning from external to internal weights in echo-state networks with sparse connectivity.
    Sussillo D; Abbott LF
    PLoS One; 2012; 7(5):e37372. PubMed ID: 22655041

  • 3. Linking Connectivity, Dynamics, and Computations in Low-Rank Recurrent Neural Networks.
    Mastrogiuseppe F; Ostojic S
    Neuron; 2018 Aug; 99(3):609-623.e29. PubMed ID: 30057201

  • 4. Differential Geometry Methods for Constructing Manifold-Targeted Recurrent Neural Networks.
    Claudi F; Branco T
    Neural Comput; 2022 Jul; 34(8):1790-1811. PubMed ID: 35798324

  • 5. Local online learning in recurrent networks with random feedback.
    Murray JM
Elife; 2019 May; 8. PubMed ID: 31124785

  • 6. Local Dynamics in Trained Recurrent Neural Networks.
    Rivkind A; Barak O
    Phys Rev Lett; 2017 Jun; 118(25):258101. PubMed ID: 28696758

  • 7. Biologically plausible deep learning - But how far can we go with shallow networks?
    Illing B; Gerstner W; Brea J
    Neural Netw; 2019 Oct; 118():90-101. PubMed ID: 31254771

  • 8. Design of double fuzzy clustering-driven context neural networks.
    Kim EH; Oh SK; Pedrycz W
    Neural Netw; 2018 Aug; 104():1-14. PubMed ID: 29689457

  • 9. full-FORCE: A target-based method for training recurrent networks.
    DePasquale B; Cueva CJ; Rajan K; Escola GS; Abbott LF
    PLoS One; 2018; 13(2):e0191527. PubMed ID: 29415041

  • 10. Contrastive Hebbian learning with random feedback weights.
    Detorakis G; Bartley T; Neftci E
    Neural Netw; 2019 Jun; 114():1-14. PubMed ID: 30831378

  • 11. Relating local connectivity and global dynamics in recurrent excitatory-inhibitory networks.
    Shao Y; Ostojic S
    PLoS Comput Biol; 2023 Jan; 19(1):e1010855. PubMed ID: 36689488

  • 12. Geometry of Energy Landscapes and the Optimizability of Deep Neural Networks.
    Becker S; Zhang Y; Lee AA
    Phys Rev Lett; 2020 Mar; 124(10):108301. PubMed ID: 32216422

  • 13. A machine learning method for extracting symbolic knowledge from recurrent neural networks.
    Vahed A; Omlin CW
    Neural Comput; 2004 Jan; 16(1):59-71. PubMed ID: 15006023

  • 14. Three learning phases for radial-basis-function networks.
    Schwenker F; Kestler HA; Palm G
    Neural Netw; 2001 May; 14(4-5):439-58. PubMed ID: 11411631

  • 15. The impact of sparsity in low-rank recurrent neural networks.
    Herbert E; Ostojic S
    PLoS Comput Biol; 2022 Aug; 18(8):e1010426. PubMed ID: 35944030

  • 16. Modular representation of layered neural networks.
    Watanabe C; Hiramatsu K; Kashino K
    Neural Netw; 2018 Jan; 97():62-73. PubMed ID: 29096203

  • 17. Reverse engineering recurrent networks for sentiment classification reveals line attractor dynamics.
    Maheswaranathan N; Williams AH; Golub MD; Ganguli S; Sussillo D
    Adv Neural Inf Process Syst; 2019 Dec; 32():15696-15705. PubMed ID: 32782423

  • 18. Backpropagation algorithms and Reservoir Computing in Recurrent Neural Networks for the forecasting of complex spatiotemporal dynamics.
    Vlachas PR; Pathak J; Hunt BR; Sapsis TP; Girvan M; Ott E; Koumoutsakos P
    Neural Netw; 2020 Jun; 126():191-217. PubMed ID: 32248008

  • 19. Learning Fixed Points of Recurrent Neural Networks by Reparameterizing the Network Model.
    Zhu V; Rosenbaum R
    Neural Comput; 2024 Jul; 36(8):1568-1600. PubMed ID: 39028956

  • 20. Training Excitatory-Inhibitory Recurrent Neural Networks for Cognitive Tasks: A Simple and Flexible Framework.
    Song HF; Yang GR; Wang XJ
    PLoS Comput Biol; 2016 Feb; 12(2):e1004792. PubMed ID: 26928718
