These tools are no longer maintained as of December 31, 2024. An archived copy of the website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service with any questions.



171 related articles for article (PubMed ID: 37873445)

  • 1. Training biologically plausible recurrent neural networks on cognitive tasks with long-term dependencies.
    Soo WWM; Goudar V; Wang XJ
    bioRxiv; 2023 Oct; ():. PubMed ID: 37873445

  • 2. PsychRNN: An Accessible and Flexible Python Package for Training Recurrent Neural Network Models on Cognitive Tasks.
    Ehrlich DB; Stone JT; Brandfonbrener D; Atanasov A; Murray JD
    eNeuro; 2021; 8(1):. PubMed ID: 33328247

  • 3. A critical review of RNN and LSTM variants in hydrological time series predictions.
    Waqas M; Humphries UW
    MethodsX; 2024 Dec; 13():102946. PubMed ID: 39324077

  • 4. Achieving Online Regression Performance of LSTMs With Simple RNNs.
    Vural NM; Ilhan F; Yilmaz SF; Ergut S; Kozat SS
    IEEE Trans Neural Netw Learn Syst; 2022 Dec; 33(12):7632-7643. PubMed ID: 34138720

  • 5. Training Excitatory-Inhibitory Recurrent Neural Networks for Cognitive Tasks: A Simple and Flexible Framework.
    Song HF; Yang GR; Wang XJ
    PLoS Comput Biol; 2016 Feb; 12(2):e1004792. PubMed ID: 26928718

  • 6. Gated Orthogonal Recurrent Units: On Learning to Forget.
    Jing L; Gulcehre C; Peurifoy J; Shen Y; Tegmark M; Soljacic M; Bengio Y
    Neural Comput; 2019 Apr; 31(4):765-783. PubMed ID: 30764742

  • 7. Temporal-kernel recurrent neural networks.
    Sutskever I; Hinton G
    Neural Netw; 2010 Mar; 23(2):239-243. PubMed ID: 19932002

  • 8. Subtraction Gates: Another Way to Learn Long-Term Dependencies in Recurrent Neural Networks.
    He T; Mao H; Yi Z
    IEEE Trans Neural Netw Learn Syst; 2022 Apr; 33(4):1740-1751. PubMed ID: 33373305

  • 9. Winning the Lottery With Neural Connectivity Constraints: Faster Learning Across Cognitive Tasks With Spatially Constrained Sparse RNNs.
    Khona M; Chandra S; Ma JJ; Fiete IR
    Neural Comput; 2023 Oct; 35(11):1850-1869. PubMed ID: 37725708

  • 10. Biologically plausible learning in recurrent neural networks reproduces neural dynamics observed during cognitive tasks.
    Miconi T
    eLife; 2017 Feb; 6():. PubMed ID: 28230528

  • 11. Interpretable, highly accurate brain decoding of subtly distinct brain states from functional MRI using intrinsic functional networks and long short-term memory recurrent neural networks.
    Li H; Fan Y
    Neuroimage; 2019 Nov; 202():116059. PubMed ID: 31362049

  • 12. Structured flexibility in recurrent neural networks via neuromodulation.
    Costacurta JC; Bhandarkar S; Zoltowski DM; Linderman SW
    bioRxiv; 2024 Jul; ():. PubMed ID: 39091788

  • 13. Considerations in using recurrent neural networks to probe neural dynamics.
    Kao JC
    J Neurophysiol; 2019 Dec; 122(6):2504-2521. PubMed ID: 31619125

  • 14. Compositional pretraining improves computational efficiency and matches animal behavior on complex tasks.
    Hocker D; Constantinople CM; Savin C
    bioRxiv; 2024 Nov; ():. PubMed ID: 38318205

  • 15. Unconditional stability of a recurrent neural circuit implementing divisive normalization.
    Rawat S; Heeger DJ; Martiniani S
    arXiv; 2024 Oct; ():. PubMed ID: 39398197

  • 16. RNNCon: Contribution Coverage Testing for Stacked Recurrent Neural Networks.
    Du X; Zeng H; Chen S; Lei Z
    Entropy (Basel); 2023 Mar; 25(3):. PubMed ID: 36981408

  • 17. Explicit Duration Recurrent Networks.
    Yu SZ
    IEEE Trans Neural Netw Learn Syst; 2022 Jul; 33(7):3120-3130. PubMed ID: 33497341

  • 18. Working Memory Connections for LSTM.
    Landi F; Baraldi L; Cornia M; Cucchiara R
    Neural Netw; 2021 Dec; 144():334-341. PubMed ID: 34547671

  • 19. Generalized Recurrent Neural Network accommodating Dynamic Causal Modeling for functional MRI analysis.
    Wang Y; Wang Y; Lui YW
    Neuroimage; 2018 Sep; 178():385-402. PubMed ID: 29782993

  • 20. Investigating the temporal dynamics of electroencephalogram (EEG) microstates using recurrent neural networks.
    Sikka A; Jamalabadi H; Krylova M; Alizadeh S; van der Meer JN; Danyeli L; Deliano M; Vicheva P; Hahn T; Koenig T; Bathula DR; Walter M
    Hum Brain Mapp; 2020 Jun; 41(9):2334-2346. PubMed ID: 32090423
