These tools will no longer be maintained as of December 31, 2024. An archived website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors

- a resource for Precision Medicine

161 related articles for article (PubMed ID: 36525915)

  • 1. SGORNN: Combining scalar gates and orthogonal constraints in recurrent networks.
    Taylor-Melanson W; Ferreira MD; Matwin S
    Neural Netw; 2023 Feb; 159():25-33. PubMed ID: 36525915

  • 2. Orthogonal Gated Recurrent Unit With Neumann-Cayley Transformation.
    Zadorozhnyy V; Mucllari E; Pospisil C; Nguyen D; Ye Q
    Neural Comput; 2024 Nov; 36(12):2651-2676. PubMed ID: 39312497

  • 3. Gating Revisited: Deep Multi-Layer RNNs That can be Trained.
    Turkoglu MO; D'Aronco S; Wegner JD; Schindler K
    IEEE Trans Pattern Anal Mach Intell; 2022 Aug; 44(8):4081-4092. PubMed ID: 33687837

  • 4. Gated Orthogonal Recurrent Units: On Learning to Forget.
    Jing L; Gulcehre C; Peurifoy J; Shen Y; Tegmark M; Soljacic M; Bengio Y
    Neural Comput; 2019 Apr; 31(4):765-783. PubMed ID: 30764742

  • 5. Subtraction Gates: Another Way to Learn Long-Term Dependencies in Recurrent Neural Networks.
    He T; Mao H; Yi Z
    IEEE Trans Neural Netw Learn Syst; 2022 Apr; 33(4):1740-1751. PubMed ID: 33373305

  • 6. Character gated recurrent neural networks for Arabic sentiment analysis.
    Omara E; Mousa M; Ismail N
    Sci Rep; 2022 Jun; 12(1):9779. PubMed ID: 35697814

  • 7. Explicit Duration Recurrent Networks.
    Yu SZ
    IEEE Trans Neural Netw Learn Syst; 2022 Jul; 33(7):3120-3130. PubMed ID: 33497341

  • 8. RNNCon: Contribution Coverage Testing for Stacked Recurrent Neural Networks.
    Du X; Zeng H; Chen S; Lei Z
    Entropy (Basel); 2023 Mar; 25(3):. PubMed ID: 36981408

  • 9. SS-RNN: A Strengthened Skip Algorithm for Data Classification Based on Recurrent Neural Networks.
    Cao W; Shi YZ; Qiu H; Zhang B
    Front Genet; 2021; 12():746181. PubMed ID: 34721533

  • 10. Structured pruning of recurrent neural networks through neuron selection.
    Wen L; Zhang X; Bai H; Xu Z
    Neural Netw; 2020 Mar; 123():134-141. PubMed ID: 31855748

  • 11. Working Memory Connections for LSTM.
    Landi F; Baraldi L; Cornia M; Cucchiara R
    Neural Netw; 2021 Dec; 144():334-341. PubMed ID: 34547671

  • 12. A Post-training Quantization Method for the Design of Fixed-Point-Based FPGA/ASIC Hardware Accelerators for LSTM/GRU Algorithms.
    Rapuano E; Pacini T; Fanucci L
    Comput Intell Neurosci; 2022; 2022():9485933. PubMed ID: 35602644

  • 13. A critical review of RNN and LSTM variants in hydrological time series predictions.
    Waqas M; Humphries UW
    MethodsX; 2024 Dec; 13():102946. PubMed ID: 39324077

  • 14. Considerations in using recurrent neural networks to probe neural dynamics.
    Kao JC
    J Neurophysiol; 2019 Dec; 122(6):2504-2521. PubMed ID: 31619125

  • 15. G2Basy: A framework to improve the RNN language model and ease overfitting problem.
    Yuwen L; Chen S; Yuan X
    PLoS One; 2021; 16(4):e0249820. PubMed ID: 33852595

  • 16. Fading memory as inductive bias in residual recurrent networks.
    Dubinin I; Effenberger F
    Neural Netw; 2024 May; 173():106179. PubMed ID: 38387205

  • 17. Optimizing RNNs for EMG Signal Classification: A Novel Strategy Using Grey Wolf Optimization.
    Aviles M; Alvarez-Alvarado JM; Robles-Ocampo JB; Sevilla-Camacho PY; Rodríguez-Reséndiz J
    Bioengineering (Basel); 2024 Jan; 11(1):. PubMed ID: 38247954

  • 18. Training recurrent neural networks robust to incomplete data: Application to Alzheimer's disease progression modeling.
    Mehdipour Ghazi M; Nielsen M; Pai A; Cardoso MJ; Modat M; Ourselin S; Sørensen L
    Med Image Anal; 2019 Apr; 53():39-46. PubMed ID: 30682584

  • 19. A Review of Recurrent Neural Networks: LSTM Cells and Network Architectures.
    Yu Y; Si X; Hu C; Zhang J
    Neural Comput; 2019 Jul; 31(7):1235-1270. PubMed ID: 31113301
