173 related articles for article (PubMed ID: 33658520)
1. Predictive learning as a network mechanism for extracting low-dimensional latent space representations. Recanatesi S; Farrell M; Lajoie G; Deneve S; Rigotti M; Shea-Brown E. Nat Commun. 2021 Mar;12(1):1417. PubMed ID: 33658520
2. Structured Semantic Knowledge Can Emerge Automatically from Predicting Word Sequences in Child-Directed Speech. Huebner PA; Willits JA. Front Psychol. 2018;9:133. PubMed ID: 29520243
3. Latent representations in hippocampal network model co-evolve with behavioral exploration of task structure. Cone I; Clopath C. Nat Commun. 2024 Jan;15(1):687. PubMed ID: 38263408
5. Dynamic network modeling and dimensionality reduction for human ECoG activity. Yang Y; Sani OG; Chang EF; Shanechi MM. J Neural Eng. 2019 Aug;16(5):056014. PubMed ID: 31096206
6. Linking Connectivity, Dynamics, and Computations in Low-Rank Recurrent Neural Networks. Mastrogiuseppe F; Ostojic S. Neuron. 2018 Aug;99(3):609-623.e29. PubMed ID: 30057201
7. Goal-Directed Planning for Habituated Agents by Active Inference Using a Variational Recurrent Neural Network. Matsumoto T; Tani J. Entropy (Basel). 2020 May;22(5). PubMed ID: 33286336
8. Representation in natural and artificial agents: an embodied cognitive science perspective. Pfeifer R; Scheier C. Z Naturforsch C J Biosci. 1998;53(7-8):480-503. PubMed ID: 9755508
9. Gaussian process based nonlinear latent structure discovery in multivariate spike train data. Wu A; Roy NA; Keeley S; Pillow JW. Adv Neural Inf Process Syst. 2017 Dec;30:3496-3505. PubMed ID: 31244512
10. Expressive architectures enhance interpretability of dynamics-based neural population models. Sedler AR; Versteeg C; Pandarinath C. Neuron Behav Data Anal Theory. 2023. PubMed ID: 38699512
11. Neural Foundations of Mental Simulation: Future Prediction of Latent Representations on Dynamic Scenes. Nayebi A; Rajalingham R; Jazayeri M; Yang GR. ArXiv. 2023 Oct. PubMed ID: 37292459
12. From deep learning to mechanistic understanding in neuroscience: the structure of retinal prediction. Tanaka H; Nayebi A; Maheswaranathan N; McIntosh L; Baccus SA; Ganguli S. Adv Neural Inf Process Syst. 2019 Dec;32:8537-8547. PubMed ID: 35283616
13. Adaptive tracking of human ECoG network dynamics. Ahmadipour P; Yang Y; Chang EF; Shanechi MM. J Neural Eng. 2021 Feb;18(1):016011. PubMed ID: 33624610
14. Emergence of number sense through the integration of multimodal information: developmental learning insights from neural network models. Noda K; Soda T; Yamashita Y. Front Neurosci. 2024;18:1330512. PubMed ID: 38298912
15. Organization of the state space of a simple recurrent network before and after training on recursive linguistic structures. Cernanský M; Makula M; Benusková L. Neural Netw. 2007 Mar;20(2):236-44. PubMed ID: 16687236
16. Human-level control through deep reinforcement learning. Mnih V; Kavukcuoglu K; Silver D; Rusu AA; Veness J; Bellemare MG; Graves A; Riedmiller M; Fidjeland AK; Ostrovski G; Petersen S; Beattie C; Sadik A; Antonoglou I; King H; Kumaran D; Wierstra D; Legg S; Hassabis D. Nature. 2015 Feb;518(7540):529-33. PubMed ID: 25719670
17. Covariate dimension reduction for survival data via the Gaussian process latent variable model. Barrett JE; Coolen AC. Stat Med. 2016 Apr;35(8):1340-53. PubMed ID: 26526057
18. Learning Low-Dimensional Temporal Representations with Latent Alignments. Su B; Wu Y. IEEE Trans Pattern Anal Mach Intell. 2020 Nov;42(11):2842-2857. PubMed ID: 31144626
19. Autoencoder networks extract latent variables and encode these variables in their connectomes. Farrell M; Recanatesi S; Reid RC; Mihalas S; Shea-Brown E. Neural Netw. 2021 Sep;141:330-343. PubMed ID: 33957382