BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

364 related articles for article (PubMed ID: 31619125)

  • 1. Considerations in using recurrent neural networks to probe neural dynamics.
    Kao JC
    J Neurophysiol; 2019 Dec; 122(6):2504-2521. PubMed ID: 31619125

  • 2. Stimulus-Driven and Spontaneous Dynamics in Excitatory-Inhibitory Recurrent Neural Networks for Sequence Representation.
    Rajakumar A; Rinzel J; Chen ZS
    Neural Comput; 2021 Sep; 33(10):2603-2645. PubMed ID: 34530451

  • 3. Training Excitatory-Inhibitory Recurrent Neural Networks for Cognitive Tasks: A Simple and Flexible Framework.
    Song HF; Yang GR; Wang XJ
    PLoS Comput Biol; 2016 Feb; 12(2):e1004792. PubMed ID: 26928718

  • 4. Recognizing recurrent neural networks (rRNN): Bayesian inference for recurrent neural networks.
    Bitzer S; Kiebel SJ
Biol Cybern; 2012 Jul; 106(4-5):201-217. PubMed ID: 22581026

  • 5. Opening the black box: low-dimensional dynamics in high-dimensional recurrent neural networks.
    Sussillo D; Barak O
Neural Comput; 2013 Mar; 25(3):626-649. PubMed ID: 23272922

  • 6. Structured flexibility in recurrent neural networks via neuromodulation.
    Costacurta JC; Bhandarkar S; Zoltowski DM; Linderman SW
    bioRxiv; 2024 Jul; ():. PubMed ID: 39091788

  • 7. Inferring population dynamics in macaque cortex.
    Meghanath G; Jimenez B; Makin JG
    J Neural Eng; 2023 Nov; 20(5):. PubMed ID: 37875104

  • 8. PsychRNN: An Accessible and Flexible Python Package for Training Recurrent Neural Network Models on Cognitive Tasks.
    Ehrlich DB; Stone JT; Brandfonbrener D; Atanasov A; Murray JD
    eNeuro; 2021; 8(1):. PubMed ID: 33328247

  • 9. Reconstructing computational system dynamics from neural data with recurrent neural networks.
    Durstewitz D; Koppe G; Thurm MI
    Nat Rev Neurosci; 2023 Nov; 24(11):693-710. PubMed ID: 37794121

  • 10. Learning dynamical systems by recurrent neural networks from orbits.
    Kimura M; Nakano R
    Neural Netw; 1998 Dec; 11(9):1589-1599. PubMed ID: 12662730

  • 11. Recurrent neural network from adder's perspective: Carry-lookahead RNN.
    Jiang H; Qin F; Cao J; Peng Y; Shao Y
    Neural Netw; 2021 Dec; 144():297-306. PubMed ID: 34543855

  • 12. Neural circuits as computational dynamical systems.
    Sussillo D
Curr Opin Neurobiol; 2014 Apr; 25():156-163. PubMed ID: 24509098

  • 13. Generalized Recurrent Neural Network accommodating Dynamic Causal Modeling for functional MRI analysis.
    Wang Y; Wang Y; Lui YW
    Neuroimage; 2018 Sep; 178():385-402. PubMed ID: 29782993

  • 14. Probing the Relationship Between Latent Linear Dynamical Systems and Low-Rank Recurrent Neural Network Models.
    Valente A; Ostojic S; Pillow JW
    Neural Comput; 2022 Aug; 34(9):1871-1892. PubMed ID: 35896161

  • 15. Behavioral Classification of Sequential Neural Activity Using Time Varying Recurrent Neural Networks.
    Zhang Y; Mitelut C; Arpin DJ; Vaillancourt D; Murphy T; Saxena S
    bioRxiv; 2023 May; ():. PubMed ID: 37214954

  • 16. Sparse RNNs can support high-capacity classification.
    Turcu D; Abbott LF
    PLoS Comput Biol; 2022 Dec; 18(12):e1010759. PubMed ID: 36516226

  • 17. Reverse engineering recurrent networks for sentiment classification reveals line attractor dynamics.
    Maheswaranathan N; Williams AH; Golub MD; Ganguli S; Sussillo D
    Adv Neural Inf Process Syst; 2019 Dec; 32():15696-15705. PubMed ID: 32782423

  • 18. Interpreting a recurrent neural network's predictions of ICU mortality risk.
    Ho LV; Aczon M; Ledbetter D; Wetzel R
    J Biomed Inform; 2021 Feb; 114():103672. PubMed ID: 33422663

  • 19. RNNCon: Contribution Coverage Testing for Stacked Recurrent Neural Networks.
    Du X; Zeng H; Chen S; Lei Z
    Entropy (Basel); 2023 Mar; 25(3):. PubMed ID: 36981408

  • 20. Learning With Interpretable Structure From Gated RNN.
    Hou BJ; Zhou ZH
    IEEE Trans Neural Netw Learn Syst; 2020 Jul; 31(7):2267-2279. PubMed ID: 32071002
