BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

71 related articles for article (PubMed ID: 18255891). Entries 1-20 are listed below; a brief illustrative sketch of the k-winners-take-all (kWTA) operation appears after the list.

  • 1. A dynamic K-winners-take-all neural network.
    Yang JF; Chen CM
    IEEE Trans Syst Man Cybern B Cybern; 1997; 27(3):523-6. PubMed ID: 18255891

  • 2. A model of analogue K-winners-take-all neural circuit.
    Tymoshchuk PV
    Neural Netw; 2013 Jun; 42():44-61. PubMed ID: 23501169

  • 3. Analysis and design of a k-winners-take-all model with a single state variable and the Heaviside step activation function.
    Wang J
    IEEE Trans Neural Netw; 2010 Sep; 21(9):1496-506. PubMed ID: 20709640

  • 4. Analysis on the convergence time of dual neural network-based kWTA.
    Xiao Y; Liu Y; Leung CS; Sum JP; Ho K
    IEEE Trans Neural Netw Learn Syst; 2012 Apr; 23(4):676-82. PubMed ID: 24805051

  • 5. A simplified dual neural network for quadratic programming with its KWTA application.
    Liu S; Wang J
    IEEE Trans Neural Netw; 2006 Nov; 17(6):1500-10. PubMed ID: 17131664

  • 6. A general mean-based iterative winner-take-all neural network.
    Yang JF; Chen CM; Wang WC; Lee JY
    IEEE Trans Neural Netw; 1995; 6(1):14-24. PubMed ID: 18263281

  • 7. Design of a K-Winners-Take-All Model With a Binary Spike Train.
    Tymoshchuk PV; Wunsch DC
    IEEE Trans Cybern; 2019 Aug; 49(8):3131-3140. PubMed ID: 30040665

  • 8. A novel recurrent neural network with one neuron and finite-time convergence for k-winners-take-all operation.
    Liu Q; Dang C; Cao J
    IEEE Trans Neural Netw; 2010 Jul; 21(7):1140-8. PubMed ID: 20659863

  • 9. A new recurrent neural network for solving convex quadratic programming problems with an application to the k-winners-take-all problem.
    Hu X; Zhang B
    IEEE Trans Neural Netw; 2009 Apr; 20(4):654-64. PubMed ID: 19228555

  • 10. A new k-winners-take-all neural network and its array architecture.
    Yen JC; Guo JI; Chen HC
    IEEE Trans Neural Netw; 1998; 9(5):901-12. PubMed ID: 18255775

  • 11. Effect of input noise and output node stochastic on Wang's kWTA.
    Sum J; Leung CS; Ho K
    IEEE Trans Neural Netw Learn Syst; 2013 Sep; 24(9):1472-8. PubMed ID: 24808584

  • 12. A stochastic population approach to the problem of stable recruitment hierarchies in spiking neural networks.
    Günay C; Maida AS
    Biol Cybern; 2006 Jan; 94(1):33-45. PubMed ID: 16283375

  • 13. Layer Winner-Take-All neural networks based on existing competitive structures.
    Chen CM; Yang JF
    IEEE Trans Syst Man Cybern B Cybern; 2000; 30(1):25-30. PubMed ID: 18244726

  • 14. Properties and Performance of Imperfect Dual Neural Network-Based kWTA Networks.
    Feng R; Leung CS; Sum J; Xiao Y
    IEEE Trans Neural Netw Learn Syst; 2015 Sep; 26(9):2188-93. PubMed ID: 25376043

  • 15. Distributed k-winners-take-all via multiple neural networks with inertia.
    Wang X; Yang S; Guo Z; Huang T
    Neural Netw; 2022 Jul; 151():385-397. PubMed ID: 35483307

  • 16. Initialization-Based k-Winners-Take-All Neural Network Model Using Modified Gradient Descent.
    Zhang Y; Li S; Geng G
    IEEE Trans Neural Netw Learn Syst; 2023 Aug; 34(8):4130-4138. PubMed ID: 34752408

  • 17. Network capacity analysis for latent attractor computation.
    Doboli S; Minai AA
    Network; 2003 May; 14(2):273-302. PubMed ID: 12790185

  • 18. Local coupled feedforward neural network.
    Sun J
    Neural Netw; 2010 Jan; 23(1):108-13. PubMed ID: 19596550

  • 19. Distributed and Time-Delayed k-Winner-Take-All Network for Competitive Coordination of Multiple Robots.
    Jin L; Liang S; Luo X; Zhou M
    IEEE Trans Cybern; 2022 May (Epub ahead of print). PubMed ID: 35533157

  • 20. New accurate and flexible design procedure for a stable KWTA continuous time network.
    Costea RL; Marinov CA
    IEEE Trans Neural Netw; 2011 Sep; 22(9):1357-67. PubMed ID: 21768047

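Most of the articles listed above concern the k-winners-take-all (kWTA) operation: given n inputs, the k units receiving the largest inputs are switched on and the remaining n - k are switched off. The short Python sketch below is a minimal, static illustration of that operation only; it assumes a plain top-k selection by sorting and does not reproduce any of the recurrent, analogue, dual-network, or distributed models cited above. The function name kwta, the NumPy dependency, and the tie-breaking rule are choices made here purely for illustration.

    import numpy as np

    def kwta(x, k):
        # k-winners-take-all: return a binary vector with ones at the
        # positions of the k largest entries of x and zeros elsewhere.
        # Ties at the selection threshold are resolved by argsort's ordering.
        x = np.asarray(x, dtype=float)
        if not 0 < k <= x.size:
            raise ValueError("k must satisfy 0 < k <= len(x)")
        winners = np.argsort(x)[-k:]      # indices of the k largest inputs
        out = np.zeros_like(x)
        out[winners] = 1.0
        return out

    # Example: with k = 2, the two largest inputs win.
    print(kwta([0.3, 1.2, -0.5, 0.9, 0.1], 2))   # -> [0. 1. 0. 1. 0.]

The models in the cited papers typically realize this same input-output selection through continuous-time, spiking, or distributed network dynamics rather than an explicit sort.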