These tools are no longer maintained as of December 31, 2024. The archived website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

107 related articles for article (PubMed ID: 37796669)

  • 21. Computation with spikes in a winner-take-all network.
    Oster M; Douglas R; Liu SC
    Neural Comput; 2009 Sep; 21(9):2437-65. PubMed ID: 19548795

  • 22. A 4K-Input High-Speed Winner-Take-All (WTA) Circuit with Single-Winner Selection for Change-Driven Vision Sensors.
    Pardo F; Reig C; Boluda JA; Vegara F
    Sensors (Basel); 2019 Jan; 19(2):. PubMed ID: 30669700

  • 23. Dynamic analysis of a general class of winner-take-all competitive neural networks.
    Fang Y; Cohen MA; Kincaid TG
    IEEE Trans Neural Netw; 2010 May; 21(5):771-83. PubMed ID: 20215068

  • 24. An improved dual neural network for solving a class of quadratic programming problems and its k-winners-take-all application.
    Hu X; Wang J
    IEEE Trans Neural Netw; 2008 Dec; 19(12):2022-31. PubMed ID: 19054727

  • 25. Distributed k-winners-take-all via multiple neural networks with inertia.
    Wang X; Yang S; Guo Z; Huang T
    Neural Netw; 2022 Jul; 151():385-397. PubMed ID: 35483307

  • 26. Prototype-Based Interpretation of the Functionality of Neurons in Winner-Take-All Neural Networks.
    Zarei-Sabzevar R; Ghiasi-Shirazi K; Harati A
    IEEE Trans Neural Netw Learn Syst; 2023 Nov; 34(11):9016-9028. PubMed ID: 35275829

  • 27. Designing spiking neural networks for robust and reconfigurable computation.
    Börner G; Schittler Neves F; Timme M
    Chaos; 2023 Aug; 33(8):. PubMed ID: 38060785

  • 28. Fast convergence rates of deep neural networks for classification.
    Kim Y; Ohn I; Kim D
    Neural Netw; 2021 Jun; 138():179-197. PubMed ID: 33676328

  • 29. A New Discrete-Time Multi-Constrained K-Winner-Take-All Recurrent Network and Its Application to Prioritized Scheduling.
    Tien PL
    IEEE Trans Neural Netw Learn Syst; 2017 Nov; 28(11):2674-2685. PubMed ID: 28113608

  • 30. Brain rhythm bursts are enhanced by multiplicative noise.
    Powanwe AS; Longtin A
    Chaos; 2021 Jan; 31(1):013117. PubMed ID: 33754759

  • 31. k-winners-take-all neural net with Θ(1) time complexity.
    Hsu TC; Wang SD
    IEEE Trans Neural Netw; 1997; 8(6):1557-61. PubMed ID: 18255756

  • 32. A robust deep neural network for denoising task-based fMRI data: An application to working memory and episodic memory.
    Yang Z; Zhuang X; Sreenivasan K; Mishra V; Curran T; Cordes D
    Med Image Anal; 2020 Feb; 60():101622. PubMed ID: 31811979

  • 33. Multiplicative-noise-induced coherence resonance via two different mechanisms in bistable neural models.
    Tang J; Jia Y; Yi M; Ma J; Li J
    Phys Rev E Stat Nonlin Soft Matter Phys; 2008 Jun; 77(6 Pt 1):061905. PubMed ID: 18643298

  • 34. The scaling of winner-takes-all accuracy with population size.
    Shamir M
    Neural Comput; 2006 Nov; 18(11):2719-29. PubMed ID: 16999576

  • 35. MR-self Noise2Noise: self-supervised deep learning-based image quality improvement of submillimeter resolution 3D MR images.
    Jung W; Lee HS; Seo M; Nam Y; Choi Y; Shin NY; Ahn KJ; Kim BS; Jang J
    Eur Radiol; 2023 Apr; 33(4):2686-2698. PubMed ID: 36378250

  • 36. Method to calculate the moments of the membrane voltage in a model neuron driven by multiplicative filtered shot noise.
    Wolff L; Lindner B
    Phys Rev E Stat Nonlin Soft Matter Phys; 2008 Apr; 77(4 Pt 1):041913. PubMed ID: 18517662

  • 37. Selective positive-negative feedback produces the winner-take-all competition in recurrent neural networks.
    Li S; Liu B; Li Y
    IEEE Trans Neural Netw Learn Syst; 2013 Feb; 24(2):301-9. PubMed ID: 24808283

  • 38. Convergence and objective functions of some fault/noise-injection-based online learning algorithms for RBF networks.
    Ho KI; Leung CS; Sum J
    IEEE Trans Neural Netw; 2010 Jun; 21(6):938-47. PubMed ID: 20388593

  • 39. Collective stability of networks of winner-take-all circuits.
    Rutishauser U; Douglas RJ; Slotine JJ
    Neural Comput; 2011 Mar; 23(3):735-73. PubMed ID: 21162667

  • 40. When response variability increases neural network robustness to synaptic noise.
    Basalyga G; Salinas E
    Neural Comput; 2006 Jun; 18(6):1349-79. PubMed ID: 16764507
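Several of the listed entries (e.g. 24, 25, and 31) study k-winners-take-all (k-WTA) selection. As a minimal illustrative sketch of the generic operation these papers analyze (not the model of any specific paper above), k-WTA keeps the k largest activations and suppresses all the rest; with k = 1 it reduces to the standard winner-take-all competition of entries 21-23:

```python
def k_winners_take_all(activations, k):
    """Illustrative k-WTA: the k units with the largest activations
    stay on; every other unit is suppressed to zero."""
    # Indices of the k largest activations (ties broken by position).
    winners = set(sorted(range(len(activations)),
                         key=lambda i: activations[i], reverse=True)[:k])
    return [a if i in winners else 0.0 for i, a in enumerate(activations)]

print(k_winners_take_all([0.2, 0.9, 0.1, 0.7], k=2))  # -> [0.0, 0.9, 0.0, 0.7]
```

The cited works replace this direct top-k sort with recurrent network dynamics (e.g. dual neural networks or distributed networks with inertia) whose equilibria realize the same selection.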
