326 related articles for the article with PubMed ID 33328247 (entry 1 below).

  • 1. PsychRNN: An Accessible and Flexible Python Package for Training Recurrent Neural Network Models on Cognitive Tasks.
    Ehrlich DB; Stone JT; Brandfonbrener D; Atanasov A; Murray JD
    eNeuro; 2021; 8(1). PubMed ID: 33328247

  • 2. tension: A Python package for FORCE learning.
    Liu LB; Losonczy A; Liao Z
    PLoS Comput Biol; 2022 Dec; 18(12):e1010722. PubMed ID: 36534709

  • 3. Training Excitatory-Inhibitory Recurrent Neural Networks for Cognitive Tasks: A Simple and Flexible Framework.
    Song HF; Yang GR; Wang XJ
    PLoS Comput Biol; 2016 Feb; 12(2):e1004792. PubMed ID: 26928718

  • 4. Training biologically plausible recurrent neural networks on cognitive tasks with long-term dependencies.
    Soo WWM; Goudar V; Wang XJ
    bioRxiv; 2023 Oct. PubMed ID: 37873445

  • 5. Considerations in using recurrent neural networks to probe neural dynamics.
    Kao JC
    J Neurophysiol; 2019 Dec; 122(6):2504-2521. PubMed ID: 31619125

  • 6. Winning the Lottery With Neural Connectivity Constraints: Faster Learning Across Cognitive Tasks With Spatially Constrained Sparse RNNs.
    Khona M; Chandra S; Ma JJ; Fiete IR
    Neural Comput; 2023 Oct; 35(11):1850-1869. PubMed ID: 37725708

  • 7. Task representations in neural networks trained to perform many cognitive tasks.
    Yang GR; Joglekar MR; Song HF; Newsome WT; Wang XJ
    Nat Neurosci; 2019 Feb; 22(2):297-306. PubMed ID: 30643294

  • 8. Exploring weight initialization, diversity of solutions, and degradation in recurrent neural networks trained for temporal and decision-making tasks.
    Jarne C; Laje R
    J Comput Neurosci; 2023 Nov; 51(4):407-431. PubMed ID: 37561278

  • 9. Biologically plausible learning in recurrent neural networks reproduces neural dynamics observed during cognitive tasks.
    Miconi T
    Elife; 2017 Feb; 6. PubMed ID: 28230528

  • 10. Reward-based training of recurrent neural networks for cognitive and value-based tasks.
    Song HF; Yang GR; Wang XJ
    Elife; 2017 Jan; 6. PubMed ID: 28084991

  • 11. Compositional pretraining improves computational efficiency and matches animal behavior on complex tasks.
    Hocker D; Constantinople CM; Savin C
    bioRxiv; 2024 Nov. PubMed ID: 38318205

  • 12. Towards the next generation of recurrent network models for cognitive neuroscience.
    Yang GR; Molano-Mazón M
    Curr Opin Neurobiol; 2021 Oct; 70:182-192. PubMed ID: 34844122

  • 13. Local online learning in recurrent networks with random feedback.
    Murray JM
    Elife; 2019 May; 8. PubMed ID: 31124785

  • 14. ChampKit: A framework for rapid evaluation of deep neural networks for patch-based histopathology classification.
    Kaczmarzyk JR; Gupta R; Kurc TM; Abousamra S; Saltz JH; Koo PK
    Comput Methods Programs Biomed; 2023 Sep; 239:107631. PubMed ID: 37271050

  • 15. Generalized Recurrent Neural Network accommodating Dynamic Causal Modeling for functional MRI analysis.
    Wang Y; Wang Y; Lui YW
    Neuroimage; 2018 Sep; 178:385-402. PubMed ID: 29782993

  • 16. Learning the Synaptic and Intrinsic Membrane Dynamics Underlying Working Memory in Spiking Neural Network Models.
    Li Y; Kim R; Sejnowski TJ
    Neural Comput; 2021 Nov; 33(12):3264-3287. PubMed ID: 34710902

  • 17. Structured flexibility in recurrent neural networks via neuromodulation.
    Costacurta JC; Bhandarkar S; Zoltowski DM; Linderman SW
    bioRxiv; 2024 Jul. PubMed ID: 39091788

  • 18. Stimulus-Driven and Spontaneous Dynamics in Excitatory-Inhibitory Recurrent Neural Networks for Sequence Representation.
    Rajakumar A; Rinzel J; Chen ZS
    Neural Comput; 2021 Sep; 33(10):2603-2645. PubMed ID: 34530451

  • 19. Artificial intelligence for skin permeability prediction: deep learning.
    Ita K; Roshanaei S
    J Drug Target; 2024 Dec; 32(3):334-346. PubMed ID: 38258521

  • 20. MotorNet, a Python toolbox for controlling differentiable biomechanical effectors with artificial neural networks.
    Codol O; Michaels JA; Kashefi M; Pruszynski JA; Gribble PL
    Elife; 2024 Jul; 12. PubMed ID: 39078880

    Page 1 of 17.