175 related articles for the article with PubMed ID 37873445 (page 1 of 9)

  • 1. Training biologically plausible recurrent neural networks on cognitive tasks with long-term dependencies.
    Soo WWM; Goudar V; Wang XJ
    bioRxiv; 2023 Oct; ():. PubMed ID: 37873445

  • 2. PsychRNN: An Accessible and Flexible Python Package for Training Recurrent Neural Network Models on Cognitive Tasks.
    Ehrlich DB; Stone JT; Brandfonbrener D; Atanasov A; Murray JD
    eNeuro; 2021; 8(1):. PubMed ID: 33328247
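
    For orientation, a minimal usage sketch of the PsychRNN package cited in entry 2, closely following its documented quickstart; the task choice, model name, and parameter values here are illustrative assumptions, not taken from the paper:

        # Train a vanilla continuous-time RNN on a two-alternative
        # perceptual discrimination task (PsychRNN quickstart pattern).
        from psychrnn.tasks.perceptual_discrimination import PerceptualDiscrimination
        from psychrnn.backend.models.basic import Basic

        # dt and tau in ms; T is trial length; N_batch is trials per batch.
        task = PerceptualDiscrimination(dt=10, tau=100, T=2000, N_batch=128)

        params = task.get_task_params()
        params['name'] = 'demo_model'   # illustrative model name
        params['N_rec'] = 50            # number of recurrent units (assumed)

        model = Basic(params)           # the package's basic RNN backend
        model.train(task)               # train with the package defaults

        x, y_target, mask, _ = task.get_trial_batch()   # fresh test trials
        y_model, state = model.test(x)                  # outputs and hidden states
        model.destruct()                                # release the underlying graph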

  • 3. Achieving Online Regression Performance of LSTMs With Simple RNNs.
    Vural NM; Ilhan F; Yilmaz SF; Ergut S; Kozat SS
    IEEE Trans Neural Netw Learn Syst; 2022 Dec; 33(12):7632-7643. PubMed ID: 34138720

  • 4. Training Excitatory-Inhibitory Recurrent Neural Networks for Cognitive Tasks: A Simple and Flexible Framework.
    Song HF; Yang GR; Wang XJ
    PLoS Comput Biol; 2016 Feb; 12(2):e1004792. PubMed ID: 26928718
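
    To make entry 4's title concrete: excitatory-inhibitory RNN frameworks of this kind typically enforce Dale's principle by training an unconstrained weight matrix and passing it through a rectification and a fixed sign matrix. The NumPy sketch below illustrates that parameterization under an assumed 80/20 E/I split; it is an illustration, not the authors' released code:

        import numpy as np

        N = 100                        # recurrent units
        n_exc = int(0.8 * N)           # assumed 80% excitatory, 20% inhibitory
        signs = np.ones(N)
        signs[n_exc:] = -1.0
        D = np.diag(signs)             # fixed diagonal sign matrix

        W_free = np.random.randn(N, N) * 0.1   # trainable, unconstrained

        def recurrent_weights(w):
            # Rectify, then apply the sign matrix: all outgoing weights of
            # unit j share the sign D[j, j], so excitatory units only excite
            # and inhibitory units only inhibit (Dale's principle).
            return np.maximum(w, 0.0) @ D

        W_rec = recurrent_weights(W_free)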

  • 5. Gated Orthogonal Recurrent Units: On Learning to Forget.
    Jing L; Gulcehre C; Peurifoy J; Shen Y; Tegmark M; Soljacic M; Bengio Y
    Neural Comput; 2019 Apr; 31(4):765-783. PubMed ID: 30764742

  • 6. Temporal-kernel recurrent neural networks.
    Sutskever I; Hinton G
    Neural Netw; 2010 Mar; 23(2):239-43. PubMed ID: 19932002

  • 7. Subtraction Gates: Another Way to Learn Long-Term Dependencies in Recurrent Neural Networks.
    He T; Mao H; Yi Z
    IEEE Trans Neural Netw Learn Syst; 2022 Apr; 33(4):1740-1751. PubMed ID: 33373305

  • 8. Winning the Lottery With Neural Connectivity Constraints: Faster Learning Across Cognitive Tasks With Spatially Constrained Sparse RNNs.
    Khona M; Chandra S; Ma JJ; Fiete IR
    Neural Comput; 2023 Oct; 35(11):1850-1869. PubMed ID: 37725708

  • 9. Biologically plausible learning in recurrent neural networks reproduces neural dynamics observed during cognitive tasks.
    Miconi T
    Elife; 2017 Feb; 6():. PubMed ID: 28230528

  • 10. Considerations in using recurrent neural networks to probe neural dynamics.
    Kao JC
    J Neurophysiol; 2019 Dec; 122(6):2504-2521. PubMed ID: 31619125

  • 11. Curriculum learning inspired by behavioral shaping trains neural networks to adopt animal-like decision making strategies.
    Hocker D; Constantinople CM; Savin C
    bioRxiv; 2024 Feb; ():. PubMed ID: 38318205

  • 12. RNNCon: Contribution Coverage Testing for Stacked Recurrent Neural Networks.
    Du X; Zeng H; Chen S; Lei Z
    Entropy (Basel); 2023 Mar; 25(3):. PubMed ID: 36981408

  • 13. Explicit Duration Recurrent Networks.
    Yu SZ
    IEEE Trans Neural Netw Learn Syst; 2022 Jul; 33(7):3120-3130. PubMed ID: 33497341

  • 14. Working Memory Connections for LSTM.
    Landi F; Baraldi L; Cornia M; Cucchiara R
    Neural Netw; 2021 Dec; 144():334-341. PubMed ID: 34547671

  • 15. Generalized Recurrent Neural Network accommodating Dynamic Causal Modeling for functional MRI analysis.
    Wang Y; Wang Y; Lui YW
    Neuroimage; 2018 Sep; 178():385-402. PubMed ID: 29782993

  • 16. Investigating the temporal dynamics of electroencephalogram (EEG) microstates using recurrent neural networks.
    Sikka A; Jamalabadi H; Krylova M; Alizadeh S; van der Meer JN; Danyeli L; Deliano M; Vicheva P; Hahn T; Koenig T; Bathula DR; Walter M
    Hum Brain Mapp; 2020 Jun; 41(9):2334-2346. PubMed ID: 32090423

  • 17. Reverse engineering recurrent networks for sentiment classification reveals line attractor dynamics.
    Maheswaranathan N; Williams AH; Golub MD; Ganguli S; Sussillo D
    Adv Neural Inf Process Syst; 2019 Dec; 32():15696-15705. PubMed ID: 32782423

  • 18. Neural learning rules for generating flexible predictions and computing the successor representation.
    Fang C; Aronov D; Abbott LF; Mackevicius EL
    Elife; 2023 Mar; 12():. PubMed ID: 36928104

  • 19. Recurrent Neural Networks With Auxiliary Memory Units.
    Wang J; Zhang L; Guo Q; Yi Z
    IEEE Trans Neural Netw Learn Syst; 2018 May; 29(5):1652-1661. PubMed ID: 28333646

  • 20. A bio-inspired bistable recurrent cell allows for long-lasting memory.
    Vecoven N; Ernst D; Drion G
    PLoS One; 2021; 16(6):e0252676. PubMed ID: 34101750
