151 related articles for article (PubMed ID: 34828189)
1. Entanglement-Structured LSTM Boosts Chaotic Time Series Forecasting. Meng X; Yang T. Entropy (Basel); 2021 Nov; 23(11). PubMed ID: 34828189.
2. Data-driven forecasting of high-dimensional chaotic systems with long short-term memory networks. Vlachas PR; Byeon W; Wan ZY; Sapsis TP; Koumoutsakos P. Proc Math Phys Eng Sci; 2018 May; 474(2213):20170844. PubMed ID: 29887750.
3. Prediction of chaotic time series using recurrent neural networks and reservoir computing techniques: A comparative study. Shahi S; Fenton FH; Cherry EM. Mach Learn Appl; 2022 Jun; 8. PubMed ID: 35755176.
5. Entanglement and the generation of random states in the quantum chaotic dynamics of kicked coupled tops. Trail CM; Madhok V; Deutsch IH. Phys Rev E Stat Nonlin Soft Matter Phys; 2008 Oct; 78(4 Pt 2):046211. PubMed ID: 18999512.
11. Fast Quantum State Transfer and Entanglement Renormalization Using Long-Range Interactions. Eldredge Z; Gong ZX; Young JT; Moosavian AH; Foss-Feig M; Gorshkov AV. Phys Rev Lett; 2017 Oct; 119(17):170503. PubMed ID: 29219445.
12. Taming the Chaos in Neural Network Time Series Predictions. Raubitzek S; Neubauer T. Entropy (Basel); 2021 Oct; 23(11). PubMed ID: 34828122.
19. Unsupervised Pre-training of a Deep LSTM-based Stacked Autoencoder for Multivariate Time Series Forecasting Problems. Sagheer A; Kotb M. Sci Rep; 2019 Dec; 9(1):19038. PubMed ID: 31836728.
20. Entanglement Renormalization of Thermofield Double States. Lin CJ; Li Z; Hsieh TH. Phys Rev Lett; 2021 Aug; 127(8):080602. PubMed ID: 34477410.