These tools will no longer be maintained as of December 31, 2024.


BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

179 related articles for article (PubMed ID: 34547671)

  • 1. Working Memory Connections for LSTM.
    Landi F; Baraldi L; Cornia M; Cucchiara R
    Neural Netw; 2021 Dec; 144():334-341. PubMed ID: 34547671

  • 2. Subtraction Gates: Another Way to Learn Long-Term Dependencies in Recurrent Neural Networks.
    He T; Mao H; Yi Z
    IEEE Trans Neural Netw Learn Syst; 2022 Apr; 33(4):1740-1751. PubMed ID: 33373305

  • 3. A Review of Recurrent Neural Networks: LSTM Cells and Network Architectures.
    Yu Y; Si X; Hu C; Zhang J
    Neural Comput; 2019 Jul; 31(7):1235-1270. PubMed ID: 31113301

  • 4. Explicit Duration Recurrent Networks.
    Yu SZ
    IEEE Trans Neural Netw Learn Syst; 2022 Jul; 33(7):3120-3130. PubMed ID: 33497341

  • 5. A New Delay Connection for Long Short-Term Memory Networks.
    Wang J; Zhang L; Chen Y; Yi Z
    Int J Neural Syst; 2018 Aug; 28(6):1750061. PubMed ID: 29382286

  • 6. Learning to forget: continual prediction with LSTM.
    Gers FA; Schmidhuber J; Cummins F
    Neural Comput; 2000 Oct; 12(10):2451-71. PubMed ID: 11032042

  • 7. Interpretable, highly accurate brain decoding of subtly distinct brain states from functional MRI using intrinsic functional networks and long short-term memory recurrent neural networks.
    Li H; Fan Y
    Neuroimage; 2019 Nov; 202():116059. PubMed ID: 31362049

  • 8. A critical review of RNN and LSTM variants in hydrological time series predictions.
    Waqas M; Humphries UW
    MethodsX; 2024 Dec; 13():102946. PubMed ID: 39324077

  • 9. Language Identification in Short Utterances Using Long Short-Term Memory (LSTM) Recurrent Neural Networks.
    Zazo R; Lozano-Diez A; Gonzalez-Dominguez J; Toledano DT; Gonzalez-Rodriguez J
    PLoS One; 2016; 11(1):e0146917. PubMed ID: 26824467

  • 10. Recurrent Neural Networks With External Addressable Long-Term and Working Memory for Learning Long-Term Dependences.
    Quan Z; Zeng W; Li X; Liu Y; Yu Y; Yang W
    IEEE Trans Neural Netw Learn Syst; 2020 Mar; 31(3):813-826. PubMed ID: 31059455

  • 11. Evolving Long Short-Term Memory Network-Based Text Classification.
    Singh A; Dargar SK; Gupta A; Kumar A; Srivastava AK; Srivastava M; Kumar Tiwari P; Ullah MA
    Comput Intell Neurosci; 2022; 2022():4725639. PubMed ID: 35237308

  • 12. Using long short term memory and convolutional neural networks for driver drowsiness detection.
    Quddus A; Shahidi Zandi A; Prest L; Comeau FJE
    Accid Anal Prev; 2021 Jun; 156():106107. PubMed ID: 33848710

  • 13. A generalized LSTM-like training algorithm for second-order recurrent neural networks.
    Monner D; Reggia JA
    Neural Netw; 2012 Jan; 25(1):70-83. PubMed ID: 21803542

  • 14. Achieving Online Regression Performance of LSTMs With Simple RNNs.
    Vural NM; Ilhan F; Yilmaz SF; Ergut S; Kozat SS
    IEEE Trans Neural Netw Learn Syst; 2022 Dec; 33(12):7632-7643. PubMed ID: 34138720

  • 15. Gated Orthogonal Recurrent Units: On Learning to Forget.
    Jing L; Gulcehre C; Peurifoy J; Shen Y; Tegmark M; Soljacic M; Bengio Y
    Neural Comput; 2019 Apr; 31(4):765-783. PubMed ID: 30764742

  • 16. A bio-inspired bistable recurrent cell allows for long-lasting memory.
    Vecoven N; Ernst D; Drion G
    PLoS One; 2021; 16(6):e0252676. PubMed ID: 34101750

  • 17. Character gated recurrent neural networks for Arabic sentiment analysis.
    Omara E; Mousa M; Ismail N
    Sci Rep; 2022 Jun; 12(1):9779. PubMed ID: 35697814

  • 18. A Modified Long Short-Term Memory Cell.
    Haralabopoulos G; Razis G; Anagnostopoulos I
    Int J Neural Syst; 2023 Jul; 33(7):2350039. PubMed ID: 37300815

  • 19. Natural Language Statistical Features of LSTM-Generated Texts.
    Lippi M; Montemurro MA; Degli Esposti M; Cristadoro G
    IEEE Trans Neural Netw Learn Syst; 2019 Nov; 30(11):3326-3337. PubMed ID: 30951479

  • 20. Gating Revisited: Deep Multi-Layer RNNs That Can Be Trained.
    Turkoglu MO; D'Aronco S; Wegner JD; Schindler K
    IEEE Trans Pattern Anal Mach Intell; 2022 Aug; 44(8):4081-4092. PubMed ID: 33687837
