120 related articles for article (PubMed ID: 38083492)

  • 21. Evaluation of the computational capabilities of a memristive random network (MN3) under different training regimes.
    Suarez LE; Kendall JD; Nino JC
    Neural Netw; 2018 Oct; 106():223-236. PubMed ID: 30077960

  • 22. All-optical spiking neurosynaptic networks with self-learning capabilities.
    Feldmann J; Youngblood N; Wright CD; Bhaskaran H; Pernice WHP
    Nature; 2019 May; 569(7755):208-214. PubMed ID: 31068721

  • 23. A Reservoir Computing Model of Reward-Modulated Motor Learning and Automaticity.
    Pyle R; Rosenbaum R
    Neural Comput; 2019 Jul; 31(7):1430-1461. PubMed ID: 31113300

  • 24. Optimal modularity and memory capacity of neural reservoirs.
    Rodriguez N; Izquierdo E; Ahn YY
    Netw Neurosci; 2019; 3(2):551-566. PubMed ID: 31089484

  • 25. Functional differentiations in evolutionary reservoir computing networks.
    Yamaguti Y; Tsuda I
    Chaos; 2021 Jan; 31(1):013137. PubMed ID: 33754767

  • 26. Computational capabilities of random automata networks for reservoir computing.
    Snyder D; Goudarzi A; Teuscher C
    Phys Rev E Stat Nonlin Soft Matter Phys; 2013 Apr; 87(4):042808. PubMed ID: 23679474

  • 27. SpaRCe: Improved Learning of Reservoir Computing Systems Through Sparse Representations.
    Manneschi L; Lin AC; Vasilaki E
    IEEE Trans Neural Netw Learn Syst; 2023 Feb; 34(2):824-838. PubMed ID: 34398765

  • 28. Biological neurons act as generalization filters in reservoir computing.
    Sumi T; Yamamoto H; Katori Y; Ito K; Moriya S; Konno T; Sato S; Hirano-Iwata A
    Proc Natl Acad Sci U S A; 2023 Jun; 120(25):e2217008120. PubMed ID: 37307467

  • 29. Rotating neurons for all-analog implementation of cyclic reservoir computing.
    Liang X; Zhong Y; Tang J; Liu Z; Yao P; Sun K; Zhang Q; Gao B; Heidari H; Qian H; Wu H
    Nat Commun; 2022 Mar; 13(1):1549. PubMed ID: 35322037

  • 30. Constructing Precisely Computing Networks with Biophysical Spiking Neurons.
    Schwemmer MA; Fairhall AL; Denève S; Shea-Brown ET
    J Neurosci; 2015 Jul; 35(28):10112-34. PubMed ID: 26180189

  • 31. Rethinking the performance comparison between SNNs and ANNs.
    Deng L; Wu Y; Hu X; Liang L; Ding Y; Li G; Zhao G; Li P; Xie Y
    Neural Netw; 2020 Jan; 121():294-307. PubMed ID: 31586857

  • 32. Comparing SNNs and RNNs on neuromorphic vision datasets: Similarities and differences.
    He W; Wu Y; Deng L; Li G; Wang H; Tian Y; Ding W; Wang W; Xie Y
    Neural Netw; 2020 Dec; 132():108-120. PubMed ID: 32866745

  • 33. The covariance perceptron: A new paradigm for classification and processing of time series in recurrent neuronal networks.
    Gilson M; Dahmen D; Moreno-Bote R; Insabato A; Helias M
    PLoS Comput Biol; 2020 Oct; 16(10):e1008127. PubMed ID: 33044953

  • 34. Model-size reduction for reservoir computing by concatenating internal states through time.
    Sakemi Y; Morino K; Leleu T; Aihara K
    Sci Rep; 2020 Dec; 10(1):21794. PubMed ID: 33311595

  • 35. Passing the Message: Representation Transfer in Modular Balanced Networks.
    Zajzon B; Mahmoudian S; Morrison A; Duarte R
    Front Comput Neurosci; 2019; 13():79. PubMed ID: 31920605

  • 36. Multifractal analysis of information processing in hippocampal neural ensembles during working memory under Δ⁹-tetrahydrocannabinol administration.
    Fetterhoff D; Opris I; Simpson SL; Deadwyler SA; Hampson RE; Kraft RA
    J Neurosci Methods; 2015 Apr; 244():136-53. PubMed ID: 25086297

  • 37. The deep arbitrary polynomial chaos neural network or how Deep Artificial Neural Networks could benefit from data-driven homogeneous chaos theory.
    Oladyshkin S; Praditia T; Kroeker I; Mohammadi F; Nowak W; Otte S
    Neural Netw; 2023 Sep; 166():85-104. PubMed ID: 37480771

  • 38. A neuro-inspired general framework for the evolution of stochastic dynamical systems: Cellular automata, random Boolean networks and echo state networks towards criticality.
    Pontes-Filho S; Lind P; Yazidi A; Zhang J; Hammer H; Mello GBM; Sandvig I; Tufte G; Nichele S
    Cogn Neurodyn; 2020 Oct; 14(5):657-674. PubMed ID: 33014179

  • 39. Fundamental short-term memory of semi-artificial neuronal network.
    Ito H; Kudoh SN
    Annu Int Conf IEEE Eng Med Biol Soc; 2013; 2013():811-4. PubMed ID: 24109811

  • 40. Simulation platform for pattern recognition based on reservoir computing with memristor networks.
    Tanaka G; Nakane R
    Sci Rep; 2022 Jun; 12(1):9868. PubMed ID: 35701445
