These tools will no longer be maintained as of December 31, 2024. An archived version of the website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service if you have questions.
88 related articles for article (PubMed ID: 28186910)
21. A 4K-Input High-Speed Winner-Take-All (WTA) Circuit with Single-Winner Selection for Change-Driven Vision Sensors. Pardo F; Reig C; Boluda JA; Vegara F. Sensors (Basel). 2019 Jan;19(2). PubMed ID: 30669700
22. Learning neural networks with noisy inputs using the errors-in-variables approach. Van Gorp J; Schoukens J; Pintelon R. IEEE Trans Neural Netw. 2000;11(2):402-14. PubMed ID: 18249770
23. Multiplicative neural noise can favor an independent components representation of sensory input. Gottschalk A; Sexton MG; Roschke G. Network. 2004 Nov;15(4):291-311. PubMed ID: 15600235
24. Computer simulations of the effects of different synaptic input systems on the steady-state input-output structure of the motoneuron pool. Heckman CJ. J Neurophysiol. 1994 May;71(5):1727-39. PubMed ID: 7914915
25. Probabilistic design of layered neural networks based on their unified framework. Watanabe S; Fukumizu K. IEEE Trans Neural Netw. 1995;6(3):691-702. PubMed ID: 18263354
26. All fiber-optic neural network using coupled SOA based ring lasers. Hill MT; Frietman EE; de Waardt H; Khoe GD; Dorren HS. IEEE Trans Neural Netw. 2002;13(6):1504-13. PubMed ID: 18244545
27. Dynamical behavior of autoassociative memory performing novelty filtering for signal enhancement. Ko H; Jacyna GM. IEEE Trans Neural Netw. 2000;11(5):1152-61. PubMed ID: 18249841
28. Dynamic analysis of a general class of winner-take-all competitive neural networks. Fang Y; Cohen MA; Kincaid TG. IEEE Trans Neural Netw. 2010 May;21(5):771-83. PubMed ID: 20215068
29. Layer Winner-Take-All neural networks based on existing competitive structures. Chen CM; Yang JF. IEEE Trans Syst Man Cybern B Cybern. 2000;30(1):25-30. PubMed ID: 18244726
31. A New Discrete-Time Multi-Constrained K-Winner-Take-All Recurrent Network and Its Application to Prioritized Scheduling. Tien PL. IEEE Trans Neural Netw Learn Syst. 2017 Nov;28(11):2674-2685. PubMed ID: 28113608
32. Performance Bounds for Single Layer Threshold Networks when Tracking a Drifting Adversary. Kuh A; Tian X. Neural Netw. 1997 Jul;10(5):897-906. PubMed ID: 12662878
33. Principal component extraction using recursive least squares learning. Bannour S; Azimi-Sadjadi MR. IEEE Trans Neural Netw. 1995;6(2):457-69. PubMed ID: 18263327
34. Extracting the principal behavior of a probabilistic supervisor through neural networks ensemble. Hartono P; Hashimoto S. Int J Neural Syst. 2002;12(3-4):291-301. PubMed ID: 12370956
35. Deep CNNs with Robust LBP Guiding Pooling for Face Recognition. Ma Z; Ding Y; Li B; Yuan X. Sensors (Basel). 2018 Nov;18(11). PubMed ID: 30423850
37. A global gradient-noise covariance expression for stationary real Gaussian inputs. An PE; Brown M; Harris CJ. IEEE Trans Neural Netw. 1995;6(6):1549-51. PubMed ID: 18263449
38. Echo state property linked to an input: exploring a fundamental characteristic of recurrent neural networks. Manjunath G; Jaeger H. Neural Comput. 2013 Mar;25(3):671-96. PubMed ID: 23272918
39. Computer simulation of the steady-state input-output function of the cat medial gastrocnemius motoneuron pool. Heckman CJ; Binder MD. J Neurophysiol. 1991 Apr;65(4):952-67. PubMed ID: 2051212
40. Relation between weight size and degree of over-fitting in neural network regression. Hagiwara K; Fukumizu K. Neural Netw. 2008 Jan;21(1):48-58. PubMed ID: 18206348