27. Cai QB. The Bézier variant of Kantorovich type. J Inequal Appl. 2018;2018(1):90. PubMed ID: 29681722
28. Liu Y, Zhu Q. Adaptive neural network asymptotic tracking control for nonstrict feedback stochastic nonlinear systems. Neural Netw. 2021 Nov;143:283-290. PubMed ID: 34166891
29. Mao T, Shi Z, Zhou DX. Theory of deep convolutional neural networks III: Approximating radial functions. Neural Netw. 2021 Dec;144:778-790. PubMed ID: 34688019
30. Maiorov V, Meir RS. Approximation bounds for smooth functions in C(ℝ^d) by neural and mixture networks. IEEE Trans Neural Netw. 1998;9(5):969-978. PubMed ID: 18255780
31. Gu L, Yang L, Zhou F. Approximation properties of Gaussian-binary restricted Boltzmann machines and Gaussian-binary deep belief networks. Neural Netw. 2022 Sep;153:49-63. PubMed ID: 35700559
32. Hua N, Lu W. Basis operator network: A neural network-based model for learning nonlinear operators via neural basis. Neural Netw. 2023 Jul;164:21-37. PubMed ID: 37146447
33. Opriş AA. Approximation by modified Kantorovich-Stancu operators. J Inequal Appl. 2018;2018(1):346. PubMed ID: 30839826
34. Chung Tsoi A, Scarselli F. Universal approximation using feedforward neural networks: A survey of some existing methods, and some new results. Neural Netw. 1998 Jan;11(1):15-37. PubMed ID: 12662846
35. Shen Z, Yang H, Zhang S. Neural network approximation: Three hidden layers are enough. Neural Netw. 2021 Sep;141:160-173. PubMed ID: 33906082
36. Liu Y, Wang Z, Ma Q, Shen H. Multistability analysis of delayed recurrent neural networks with a class of piecewise nonlinear activation functions. Neural Netw. 2022 Aug;152:80-89. PubMed ID: 35526368
37. Ma L, Khorasani K. Constructive feedforward neural networks using Hermite polynomial activation functions. IEEE Trans Neural Netw. 2005 Jul;16(4):821-833. PubMed ID: 16121724
38. Zjavka L, Pedrycz W. Constructing general partial differential equations using polynomial and neural networks. Neural Netw. 2016 Jan;73:58-69. PubMed ID: 26547244
39. Morala P, Cifuentes JA, Lillo RE, Ucar I. Towards a mathematical framework to inform neural network modelling via polynomial regression. Neural Netw. 2021 Oct;142:57-72. PubMed ID: 33984736
40. Xue T, Xue Z, Cheng H, Liu J, Zhu T. A novel method of the generalized interval-valued fuzzy rough approximation operators. ScientificWorldJournal. 2014;2014:783940. PubMed ID: 25162065