

BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

115 related articles for the article with PubMed ID 38968778

  • 1. Learning sequence attractors in recurrent networks with hidden neurons.
    Lu Y; Wu S
    Neural Netw; 2024 Oct; 178():106466. PubMed ID: 38968778

  • 2. Continuous attractors for dynamic memories.
    Spalla D; Cornacchia IM; Treves A
    Elife; 2021 Sep; 10():. PubMed ID: 34520345

  • 3. Memory States and Transitions between Them in Attractor Neural Networks.
    Recanatesi S; Katkov M; Tsodyks M
    Neural Comput; 2017 Oct; 29(10):2684-2711. PubMed ID: 28777725

  • 4. Neural learning rules for generating flexible predictions and computing the successor representation.
    Fang C; Aronov D; Abbott LF; Mackevicius EL
    Elife; 2023 Mar; 12():. PubMed ID: 36928104

  • 5. A Gaussian attractor network for memory and recognition with experience-dependent learning.
    Hu X; Zhang B
    Neural Comput; 2010 May; 22(5):1333-57. PubMed ID: 20100070

  • 6. Computational study on the neural mechanism of sequential pattern memory.
    Morita M
    Brain Res Cogn Brain Res; 1996 Dec; 5(1-2):137-46. PubMed ID: 9049080

  • 7. High-Dimensional Brain: A Tool for Encoding and Rapid Learning of Memories by Single Neurons.
    Tyukin I; Gorban AN; Calvo C; Makarova J; Makarov VA
    Bull Math Biol; 2019 Nov; 81(11):4856-4888. PubMed ID: 29556797

  • 8. Slow manifolds within network dynamics encode working memory efficiently and robustly.
    Ghazizadeh E; Ching S
    PLoS Comput Biol; 2021 Sep; 17(9):e1009366. PubMed ID: 34525089

  • 9. Network capacity analysis for latent attractor computation.
    Doboli S; Minai AA
    Network; 2003 May; 14(2):273-302. PubMed ID: 12790185

  • 10. Robust Associative Learning Is Sufficient to Explain the Structural and Dynamical Properties of Local Cortical Circuits.
    Zhang D; Zhang C; Stepanyants A
    J Neurosci; 2019 Aug; 39(35):6888-6904. PubMed ID: 31270161

  • 11. Collective computational intelligence in biology - Emergence of memory in somatic tissues.
    Samarasinghe S
    Biosystems; 2023 Jan; 223():104816. PubMed ID: 36436698

  • 12. Associative memory of structured knowledge.
    Steinberg J; Sompolinsky H
    Sci Rep; 2022 Dec; 12(1):21808. PubMed ID: 36528630

  • 13. Hot coffee: associative memory with bump attractor cell assemblies of spiking neurons.
    Huyck CR; Vergani AA
    J Comput Neurosci; 2020 Aug; 48(3):299-316. PubMed ID: 32715350

  • 14. Localist attractor networks.
    Zemel RS; Mozer MC
    Neural Comput; 2001 May; 13(5):1045-64. PubMed ID: 11359644

  • 15. SSTE: Syllable-Specific Temporal Encoding to FORCE-learn audio sequences with an associative memory approach.
    Jannesar N; Akbarzadeh-Sherbaf K; Safari S; Vahabie AH
    Neural Netw; 2024 Sep; 177():106368. PubMed ID: 38761415

  • 16. Coexistence of Cyclic Sequential Pattern Recognition and Associative Memory in Neural Networks by Attractor Mechanisms.
    Huo J; Yu J; Wang M; Yi Z; Leng J; Liao Y
    IEEE Trans Neural Netw Learn Syst; 2024 Mar; PP():. PubMed ID: 38442060

  • 17. Memory dynamics in attractor networks with saliency weights.
    Tang H; Li H; Yan R
    Neural Comput; 2010 Jul; 22(7):1899-926. PubMed ID: 20235821

  • 18. Learning continuous chaotic attractors with a reservoir computer.
    Smith LM; Kim JZ; Lu Z; Bassett DS
    Chaos; 2022 Jan; 32(1):011101. PubMed ID: 35105129

  • 19. Learning precise spatiotemporal sequences via biophysically realistic learning rules in a modular, spiking network.
    Cone I; Shouval HZ
    Elife; 2021 Mar; 10():. PubMed ID: 33734085

  • 20. A unified approach to building and controlling spiking attractor networks.
    Eliasmith C
    Neural Comput; 2005 Jun; 17(6):1276-314. PubMed ID: 15901399
