303 related articles for the article with PubMed ID 33328247 (listed as entry 1 below)

  • 1. PsychRNN: An Accessible and Flexible Python Package for Training Recurrent Neural Network Models on Cognitive Tasks.
    Ehrlich DB; Stone JT; Brandfonbrener D; Atanasov A; Murray JD
    eNeuro; 2021; 8(1). PubMed ID: 33328247

  • 2. tension: A Python package for FORCE learning.
    Liu LB; Losonczy A; Liao Z
    PLoS Comput Biol; 2022 Dec; 18(12):e1010722. PubMed ID: 36534709

  • 3. Training Excitatory-Inhibitory Recurrent Neural Networks for Cognitive Tasks: A Simple and Flexible Framework.
    Song HF; Yang GR; Wang XJ
    PLoS Comput Biol; 2016 Feb; 12(2):e1004792. PubMed ID: 26928718

  • 4. Training biologically plausible recurrent neural networks on cognitive tasks with long-term dependencies.
    Soo WWM; Goudar V; Wang XJ
    bioRxiv; 2023 Oct. PubMed ID: 37873445

  • 5. Considerations in using recurrent neural networks to probe neural dynamics.
    Kao JC
    J Neurophysiol; 2019 Dec; 122(6):2504-2521. PubMed ID: 31619125

  • 6. Winning the Lottery With Neural Connectivity Constraints: Faster Learning Across Cognitive Tasks With Spatially Constrained Sparse RNNs.
    Khona M; Chandra S; Ma JJ; Fiete IR
    Neural Comput; 2023 Oct; 35(11):1850-1869. PubMed ID: 37725708

  • 7. Task representations in neural networks trained to perform many cognitive tasks.
    Yang GR; Joglekar MR; Song HF; Newsome WT; Wang XJ
    Nat Neurosci; 2019 Feb; 22(2):297-306. PubMed ID: 30643294

  • 8. Exploring weight initialization, diversity of solutions, and degradation in recurrent neural networks trained for temporal and decision-making tasks.
    Jarne C; Laje R
    J Comput Neurosci; 2023 Nov; 51(4):407-431. PubMed ID: 37561278

  • 9. Reward-based training of recurrent neural networks for cognitive and value-based tasks.
    Song HF; Yang GR; Wang XJ
    Elife; 2017 Jan; 6. PubMed ID: 28084991

  • 10. Biologically plausible learning in recurrent neural networks reproduces neural dynamics observed during cognitive tasks.
    Miconi T
    Elife; 2017 Feb; 6. PubMed ID: 28230528

  • 11. Curriculum learning inspired by behavioral shaping trains neural networks to adopt animal-like decision making strategies.
    Hocker D; Constantinople CM; Savin C
    bioRxiv; 2024 Feb. PubMed ID: 38318205

  • 12. Towards the next generation of recurrent network models for cognitive neuroscience.
    Yang GR; Molano-Mazón M
    Curr Opin Neurobiol; 2021 Oct; 70:182-192. PubMed ID: 34844122

  • 13. Local online learning in recurrent networks with random feedback.
    Murray JM
    Elife; 2019 May; 8. PubMed ID: 31124785

  • 14. ChampKit: A framework for rapid evaluation of deep neural networks for patch-based histopathology classification.
    Kaczmarzyk JR; Gupta R; Kurc TM; Abousamra S; Saltz JH; Koo PK
    Comput Methods Programs Biomed; 2023 Sep; 239:107631. PubMed ID: 37271050

  • 15. Generalized Recurrent Neural Network accommodating Dynamic Causal Modeling for functional MRI analysis.
    Wang Y; Wang Y; Lui YW
    Neuroimage; 2018 Sep; 178:385-402. PubMed ID: 29782993

  • 16. Learning the Synaptic and Intrinsic Membrane Dynamics Underlying Working Memory in Spiking Neural Network Models.
    Li Y; Kim R; Sejnowski TJ
    Neural Comput; 2021 Nov; 33(12):3264-3287. PubMed ID: 34710902

  • 17. Stimulus-Driven and Spontaneous Dynamics in Excitatory-Inhibitory Recurrent Neural Networks for Sequence Representation.
    Rajakumar A; Rinzel J; Chen ZS
    Neural Comput; 2021 Sep; 33(10):2603-2645. PubMed ID: 34530451

  • 18. Artificial intelligence for skin permeability prediction: deep learning.
    Ita K; Roshanaei S
    J Drug Target; 2024 Dec; 32(3):334-346. PubMed ID: 38258521

  • 19. BRAND: a platform for closed-loop experiments with deep network models.
    Ali YH; Bodkin K; Rigotti-Thompson M; Patel K; Card NS; Bhaduri B; Nason-Tomaszewski SR; Mifsud DM; Hou X; Nicolas C; Allcroft S; Hochberg LR; Au Yong N; Stavisky SD; Miller LE; Brandman DM; Pandarinath C
    J Neural Eng; 2024 Apr; 21(2). PubMed ID: 38579696
    (No abstract available.)

  • 20. Reconstructing computational system dynamics from neural data with recurrent neural networks.
    Durstewitz D; Koppe G; Thurm MI
    Nat Rev Neurosci; 2023 Nov; 24(11):693-710. PubMed ID: 37794121

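The listing above is an export of a PubMed-style "related articles" page for PMID 33328247. As a hedged illustration only (the retrieval pipeline behind this particular listing is not described here), the sketch below shows one way to pull a comparable related-article set for the same PMID directly from NCBI's public E-utilities `elink` endpoint. The `pubmed_pubmed` link name is the standard "similar articles" link set; the JSON layout assumed in the parser is the main assumption, so the code reads it defensively.

```python
# Sketch: fetch PubMed's "similar articles" (pubmed_pubmed links) for a PMID
# via the public NCBI E-utilities elink endpoint. This is an assumption about
# how a comparable list could be obtained, not the method used by the site above.
import json
import urllib.parse
import urllib.request

EUTILS_ELINK = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/elink.fcgi"


def related_pmids(pmid: str) -> list[str]:
    """Return PMIDs that PubMed reports as related to `pmid`."""
    query = urllib.parse.urlencode({
        "dbfrom": "pubmed",           # source database
        "db": "pubmed",               # target database
        "linkname": "pubmed_pubmed",  # the "similar articles" link set
        "id": pmid,
        "retmode": "json",
    })
    with urllib.request.urlopen(f"{EUTILS_ELINK}?{query}", timeout=30) as resp:
        payload = json.load(resp)

    # Assumed JSON layout: linksets -> linksetdbs -> links; parse defensively
    # so a missing key yields an empty result instead of an exception.
    pmids: list[str] = []
    for linkset in payload.get("linksets", []):
        for linksetdb in linkset.get("linksetdbs", []):
            if linksetdb.get("linkname") == "pubmed_pubmed":
                pmids.extend(str(x) for x in linksetdb.get("links", []))
    return pmids


if __name__ == "__main__":
    ids = related_pmids("33328247")
    print(f"{len(ids)} related PMIDs; first 20: {ids[:20]}")
```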