These tools are no longer maintained as of December 31, 2024. The archived website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

140 related articles for article (PubMed ID: 36436698)

  • 21. NeuroLISP: High-level symbolic programming with attractor neural networks.
    Davis GP; Katz GE; Gentili RJ; Reggia JA
    Neural Netw; 2022 Feb; 146():200-219. PubMed ID: 34894482

  • 22. Global adaptation in networks of selfish components: emergent associative memory at the system scale.
    Watson RA; Mills R; Buckley CL
    Artif Life; 2011; 17(3):147-66. PubMed ID: 21554114

  • 23. Self-Optimization in Continuous-Time Recurrent Neural Networks.
    Zarco M; Froese T
    Front Robot AI; 2018; 5():96. PubMed ID: 33500975

  • 24. Associative memory in networks of spiking neurons.
    Sommer FT; Wennekers T
    Neural Netw; 2001; 14(6-7):825-34. PubMed ID: 11665774

  • 25. Learning associative memories by error backpropagation.
    Zheng P; Zhang J; Tang W
    IEEE Trans Neural Netw; 2011 Mar; 22(3):347-55. PubMed ID: 21189234

  • 26. Dynamic synchronization and chaos in an associative neural network with multiple active memories.
    Raffone A; van Leeuwen C
    Chaos; 2003 Sep; 13(3):1090-104. PubMed ID: 12946202

  • 27. Continuous attractors for dynamic memories.
    Spalla D; Cornacchia IM; Treves A
    Elife; 2021 Sep; 10():. PubMed ID: 34520345

  • 28. Memory States and Transitions between Them in Attractor Neural Networks.
    Recanatesi S; Katkov M; Tsodyks M
    Neural Comput; 2017 Oct; 29(10):2684-2711. PubMed ID: 28777725

  • 29. Bayesian inference in ring attractor networks.
    Kutschireiter A; Basnak MA; Wilson RI; Drugowitsch J
    Proc Natl Acad Sci U S A; 2023 Feb; 120(9):e2210622120. PubMed ID: 36812206

  • 30. Compositional memory in attractor neural networks with one-step learning.
    Davis GP; Katz GE; Gentili RJ; Reggia JA
    Neural Netw; 2021 Jun; 138():78-97. PubMed ID: 33631609

  • 31. A discrete fully recurrent network of max product units for associative memory and classification.
    Brouwer RK
    Int J Neural Syst; 2002; 12(3-4):247-62. PubMed ID: 12370953

  • 32. A Three-Threshold Learning Rule Approaches the Maximal Capacity of Recurrent Neural Networks.
    Alemi A; Baldassi C; Brunel N; Zecchina R
    PLoS Comput Biol; 2015 Aug; 11(8):e1004439. PubMed ID: 26291608

  • 33. Storing structured sparse memories in a multi-modular cortical network model.
    Dubreuil AM; Brunel N
    J Comput Neurosci; 2016 Apr; 40(2):157-75. PubMed ID: 26852335

  • 34. Multiscale and Extended Retrieval of Associative Memory Structures in a Cortical Model of Local-Global Inhibition Balance.
Burns TF; Haga T; Fukai T
    eNeuro; 2022; 9(3):. PubMed ID: 35606151

  • 35. Effect of dilution in asymmetric recurrent neural networks.
    Folli V; Gosti G; Leonetti M; Ruocco G
    Neural Netw; 2018 Aug; 104():50-59. PubMed ID: 29705670

  • 36. AHaH computing-from metastable switches to attractors to machine learning.
    Nugent MA; Molter TW
    PLoS One; 2014; 9(2):e85175. PubMed ID: 24520315

  • 37. Associative memory design using overlapping decomposition and generalized brain-state-in-a-box neural networks.
    Oh C; Zak SH
    Int J Neural Syst; 2003 Jun; 13(3):139-53. PubMed ID: 12884448

  • 38. Learning attractors in an asynchronous, stochastic electronic neural network.
    Del Giudice P; Fusi S; Badoni D; Dante V; Amit DJ
    Network; 1998 May; 9(2):183-205. PubMed ID: 9861985

  • 39. Storage capacity of networks with discrete synapses and sparsely encoded memories.
    Feng Y; Brunel N
    Phys Rev E; 2022 May; 105(5-1):054408. PubMed ID: 35706193

  • 40. Distinguishing spurious and nominal attractors applying unlearning to an asymmetric neural network.
    Horas JA; Bea EA
    Int J Neural Syst; 2002 Apr; 12(2):109-16. PubMed ID: 12035125
