These tools are no longer maintained as of December 31, 2024. An archived copy of the website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service if you have questions.


PUBMED FOR HANDHELDS



  • Title: When is approximation by Gaussian networks necessarily a linear process?
    Author: Mhaskar HN.
    Journal: Neural Netw; 2004 Sep; 17(7):989-1001. PubMed ID: 15312841.
    Abstract:
    Let s ≥ 1 be an integer. A Gaussian network is a function on R^s of the form [Formula: see text]. The minimal separation among the centers, defined by (1/2) min_{1 ≤ j ≠ k ≤ N} [Formula: see text], is an important characteristic of the network that determines the stability of interpolation by Gaussian networks, the degree of approximation by such networks, etc. Let (within this abstract only) the set of all Gaussian networks with minimal separation exceeding 1/m be denoted by G_m. We prove that for functions [Formula: see text] such that [Formula: see text], if the degree of L^2 (nonlinear) approximation of [Formula: see text] from G_m is [Formula: see text], then necessarily the degree of approximation of [Formula: see text] by (rectangular) partial sums of degree m² of the Hermite expansion of [Formula: see text] is also [Formula: see text]. Moreover, Gaussian networks in G_m having fixed centers in a ball of radius [Formula: see text], with coefficients that are linear functionals of [Formula: see text], can be constructed to yield the same degree of approximation. Similar results are proved for the L^p norms, 1 ≤ p ≤ [Formula: see text], but with the condition that the number of neurons N satisfy log N = [Formula: see text](m²).
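The two objects the abstract defines — the Gaussian network and the minimal separation of its centers — can be illustrated with a short sketch. The exact network formula in the abstract was lost to text extraction ([Formula: see text] placeholders), so the standard form x ↦ Σ_k a_k exp(−‖x − x_k‖²), common in this literature, is assumed here:

```python
import numpy as np

def gaussian_network(x, centers, coeffs):
    """Evaluate x -> sum_k a_k * exp(-||x - x_k||^2).

    Assumed standard form; the abstract's own formula appears only as a
    [Formula: see text] placeholder in the extracted text."""
    diffs = centers - x                       # shape (N, s)
    return float(np.sum(coeffs * np.exp(-np.sum(diffs**2, axis=1))))

def minimal_separation(centers):
    """(1/2) * min over 1 <= j != k <= N of ||x_j - x_k||, per the abstract."""
    n = len(centers)
    pairwise = [np.linalg.norm(centers[j] - centers[k])
                for j in range(n) for k in range(j + 1, n)]
    return min(pairwise) / 2.0

# Three centers on the real line (s = 1), pairwise distances 1, 2, and 3:
centers = np.array([[0.0], [1.0], [3.0]])
coeffs = np.array([1.0, 1.0, 1.0])
print(minimal_separation(centers))            # 0.5
print(gaussian_network(np.array([0.0]), centers, coeffs))
```

With minimal separation 0.5, this network belongs to the class G_m for any m > 2, since 0.5 then exceeds 1/m.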