

BIOMARKERS

Molecular Biopsy of Human Tumors: a resource for Precision Medicine

161 related articles for the article with PubMed ID 31059455

  • 1. Recurrent Neural Networks With External Addressable Long-Term and Working Memory for Learning Long-Term Dependences.
    Quan Z; Zeng W; Li X; Liu Y; Yu Y; Yang W
    IEEE Trans Neural Netw Learn Syst; 2020 Mar; 31(3):813-826. PubMed ID: 31059455

  • 2. Interpretable, highly accurate brain decoding of subtly distinct brain states from functional MRI using intrinsic functional networks and long short-term memory recurrent neural networks.
    Li H; Fan Y
    Neuroimage; 2019 Nov; 202():116059. PubMed ID: 31362049

  • 3. A Review of Recurrent Neural Networks: LSTM Cells and Network Architectures.
    Yu Y; Si X; Hu C; Zhang J
    Neural Comput; 2019 Jul; 31(7):1235-1270. PubMed ID: 31113301

  • 4. Local online learning in recurrent networks with random feedback.
    Murray JM
    Elife; 2019 May; 8():. PubMed ID: 31124785

  • 5. Recurrent Neural Networks With Auxiliary Memory Units.
    Wang J; Zhang L; Guo Q; Yi Z
    IEEE Trans Neural Netw Learn Syst; 2018 May; 29(5):1652-1661. PubMed ID: 28333646

  • 6. Working Memory Connections for LSTM.
    Landi F; Baraldi L; Cornia M; Cucchiara R
    Neural Netw; 2021 Dec; 144():334-341. PubMed ID: 34547671

  • 7. Subtraction Gates: Another Way to Learn Long-Term Dependencies in Recurrent Neural Networks.
    He T; Mao H; Yi Z
    IEEE Trans Neural Netw Learn Syst; 2022 Apr; 33(4):1740-1751. PubMed ID: 33373305

  • 8. A generalized LSTM-like training algorithm for second-order recurrent neural networks.
    Monner D; Reggia JA
    Neural Netw; 2012 Jan; 25(1):70-83. PubMed ID: 21803542

  • 9. Language Identification in Short Utterances Using Long Short-Term Memory (LSTM) Recurrent Neural Networks.
    Zazo R; Lozano-Diez A; Gonzalez-Dominguez J; Toledano DT; Gonzalez-Rodriguez J
    PLoS One; 2016; 11(1):e0146917. PubMed ID: 26824467

  • 10. Effect of recurrent infomax on the information processing capability of input-driven recurrent neural networks.
    Tanaka T; Nakajima K; Aoyagi T
    Neurosci Res; 2020 Jul; 156():225-233. PubMed ID: 32068068

  • 11. Gated Orthogonal Recurrent Units: On Learning to Forget.
    Jing L; Gulcehre C; Peurifoy J; Shen Y; Tegmark M; Soljacic M; Bengio Y
    Neural Comput; 2019 Apr; 31(4):765-783. PubMed ID: 30764742

  • 12. Design and Implementation of Fast Spoken Foul Language Recognition with Different End-to-End Deep Neural Network Architectures.
    Ba Wazir AS; Karim HA; Abdullah MHL; AlDahoul N; Mansor S; Fauzi MFA; See J; Naim AS
    Sensors (Basel); 2021 Jan; 21(3):. PubMed ID: 33494254

  • 13. Training recurrent neural networks robust to incomplete data: Application to Alzheimer's disease progression modeling.
    Mehdipour Ghazi M; Nielsen M; Pai A; Cardoso MJ; Modat M; Ourselin S; Sørensen L
    Med Image Anal; 2019 Apr; 53():39-46. PubMed ID: 30682584

  • 14. Explicit Duration Recurrent Networks.
    Yu SZ
    IEEE Trans Neural Netw Learn Syst; 2022 Jul; 33(7):3120-3130. PubMed ID: 33497341

  • 15. Automatic detection and classification of marmoset vocalizations using deep and recurrent neural networks.
    Zhang YJ; Huang JF; Gong N; Ling ZH; Hu Y
    J Acoust Soc Am; 2018 Jul; 144(1):478. PubMed ID: 30075670

  • 16. Achieving Online Regression Performance of LSTMs With Simple RNNs.
    Vural NM; Ilhan F; Yilmaz SF; Ergut S; Kozat SS
    IEEE Trans Neural Netw Learn Syst; 2022 Dec; 33(12):7632-7643. PubMed ID: 34138720

  • 17. Robust and brain-like working memory through short-term synaptic plasticity.
    Kozachkov L; Tauber J; Lundqvist M; Brincat SL; Slotine JJ; Miller EK
    PLoS Comput Biol; 2022 Dec; 18(12):e1010776. PubMed ID: 36574424

  • 18. Comparing SNNs and RNNs on neuromorphic vision datasets: Similarities and differences.
    He W; Wu Y; Deng L; Li G; Wang H; Tian Y; Ding W; Wang W; Xie Y
    Neural Netw; 2020 Dec; 132():108-120. PubMed ID: 32866745

  • 19. Extended-Range Prediction Model Using NSGA-III Optimized RNN-GRU-LSTM for Driver Stress and Drowsiness.
    Chui KT; Gupta BB; Liu RW; Zhang X; Vasant P; Thomas JJ
    Sensors (Basel); 2021 Sep; 21(19):. PubMed ID: 34640732

  • 20. A bio-inspired bistable recurrent cell allows for long-lasting memory.
    Vecoven N; Ernst D; Drion G
    PLoS One; 2021; 16(6):e0252676. PubMed ID: 34101750
