125 related articles for PubMed ID 38159509 (showing 1-20)
1. Neural networks with ReLU powers need less depth.
Cabanilla KIM; Mohammad RZ; Lope JEC
Neural Netw; 2024 Apr; 172():106073. PubMed ID: 38159509
2. Optimal approximation of piecewise smooth functions using deep ReLU neural networks.
Petersen P; Voigtlaender F
Neural Netw; 2018 Dec; 108():296-330. PubMed ID: 30245431
3. Approximation in shift-invariant spaces with deep ReLU neural networks.
Yang Y; Li Z; Wang Y
Neural Netw; 2022 Sep; 153():269-281. PubMed ID: 35763879
4. Approximation of smooth functionals using deep ReLU networks.
Song L; Liu Y; Fan J; Zhou DX
Neural Netw; 2023 Sep; 166():424-436. PubMed ID: 37549610
5. Deep ReLU neural networks in high-dimensional approximation.
Dũng D; Nguyen VK
Neural Netw; 2021 Oct; 142():619-635. PubMed ID: 34392126
6. Approximation rates for neural networks with encodable weights in smoothness spaces.
Gühring I; Raslan M
Neural Netw; 2021 Feb; 134():107-130. PubMed ID: 33310376
7. Simultaneous approximation of a smooth function and its derivatives by deep neural networks with piecewise-polynomial activations.
Belomestny D; Naumov A; Puchkin N; Samsonov S
Neural Netw; 2023 Apr; 161():242-253. PubMed ID: 36774863
8. Error bounds for approximations with deep ReLU networks.
Yarotsky D
Neural Netw; 2017 Oct; 94():103-114. PubMed ID: 28756334
9. Smooth Function Approximation by Deep Neural Networks with General Activation Functions.
Ohn I; Kim Y
Entropy (Basel); 2019 Jun; 21(7):. PubMed ID: 33267341
10. Convergence of deep convolutional neural networks.
Xu Y; Zhang H
Neural Netw; 2022 Sep; 153():553-563. PubMed ID: 35839599
11. Random Sketching for Neural Networks With ReLU.
Wang D; Zeng J; Lin SB
IEEE Trans Neural Netw Learn Syst; 2021 Feb; 32(2):748-762. PubMed ID: 32275612
12. An exact mapping from ReLU networks to spiking neural networks.
Stanojevic A; Woźniak S; Bellec G; Cherubini G; Pantazi A; Gerstner W
Neural Netw; 2023 Nov; 168():74-88. PubMed ID: 37742533
13. Simultaneous neural network approximation for smooth functions.
Hon S; Yang H
Neural Netw; 2022 Oct; 154():152-164. PubMed ID: 35882083
14. Low dimensional approximation and generalization of multivariate functions on smooth manifolds using deep ReLU neural networks.
Labate D; Shi J
Neural Netw; 2024 Jun; 174():106223. PubMed ID: 38458005
15. Error bounds for deep ReLU networks using the Kolmogorov-Arnold superposition theorem.
Montanelli H; Yang H
Neural Netw; 2020 Sep; 129():1-6. PubMed ID: 32473577
16. On the capacity of deep generative networks for approximating distributions.
Yang Y; Li Z; Wang Y
Neural Netw; 2022 Jan; 145():144-154. PubMed ID: 34749027
17. ReLU Networks Are Universal Approximators via Piecewise Linear or Constant Functions.
Huang C
Neural Comput; 2020 Nov; 32(11):2249-2278. PubMed ID: 32946706
18. Theory of deep convolutional neural networks III: Approximating radial functions.
Mao T; Shi Z; Zhou DX
Neural Netw; 2021 Dec; 144():778-790. PubMed ID: 34688019
19. Fast generalization error bound of deep learning without scale invariance of activation functions.
Terada Y; Hirose R
Neural Netw; 2020 Sep; 129():344-358. PubMed ID: 32593931
20. On the approximation of functions by tanh neural networks.
De Ryck T; Lanthaler S; Mishra S
Neural Netw; 2021 Nov; 143():732-750. PubMed ID: 34482172