295 related articles for article (PubMed ID: 32650153)
1. Quantifying the generalization error in deep learning in terms of data distribution and neural network smoothness.
Jin P; Lu L; Tang Y; Karniadakis GE
Neural Netw; 2020 Oct; 130():85-99. PubMed ID: 32650153
2. High-dimensional dynamics of generalization error in neural networks.
Advani MS; Saxe AM; Sompolinsky H
Neural Netw; 2020 Dec; 132():428-446. PubMed ID: 33022471
3. Generalization Analysis of Pairwise Learning for Ranking With Deep Neural Networks.
Huang S; Zhou J; Feng H; Zhou DX
Neural Comput; 2023 May; 35(6):1135-1158. PubMed ID: 37037041
4. Upper bound of the expected training error of neural network regression for a Gaussian noise sequence.
Hagiwara K; Hayasaka T; Toda N; Usui S; Kuno K
Neural Netw; 2001 Dec; 14(10):1419-29. PubMed ID: 11771721
5. Approximation rates for neural networks with encodable weights in smoothness spaces.
Gühring I; Raslan M
Neural Netw; 2021 Feb; 134():107-130. PubMed ID: 33310376
6. Why ResNet Works? Residuals Generalize.
He F; Liu T; Tao D
IEEE Trans Neural Netw Learn Syst; 2020 Dec; 31(12):5349-5362. PubMed ID: 32031953
7. An analysis of training and generalization errors in shallow and deep networks.
Mhaskar HN; Poggio T
Neural Netw; 2020 Jan; 121():229-241. PubMed ID: 31574413
8. Approximation of smooth functionals using deep ReLU networks.
Song L; Liu Y; Fan J; Zhou DX
Neural Netw; 2023 Sep; 166():424-436. PubMed ID: 37549610
9. Generalization Analysis of CNNs for Classification on Spheres.
Feng H; Huang S; Zhou DX
IEEE Trans Neural Netw Learn Syst; 2023 Sep; 34(9):6200-6213. PubMed ID: 34941530
10. Going Deeper, Generalizing Better: An Information-Theoretic View for Deep Learning.
Zhang J; Liu T; Tao D
IEEE Trans Neural Netw Learn Syst; 2023 Aug; PP():. PubMed ID: 37585328
11. Improving generalization of deep neural networks by leveraging margin distribution.
Lyu SH; Wang L; Zhou ZH
Neural Netw; 2022 Jul; 151():48-60. PubMed ID: 35395512
12. Optimizing neural networks for medical data sets: A case study on neonatal apnea prediction.
Shirwaikar RD; Acharya U D; Makkithaya K; M S; Srivastava S; Lewis U LES
Artif Intell Med; 2019 Jul; 98():59-76. PubMed ID: 31521253
13. Fast generalization error bound of deep learning without scale invariance of activation functions.
Terada Y; Hirose R
Neural Netw; 2020 Sep; 129():344-358. PubMed ID: 32593931
14. Deep ReLU neural networks in high-dimensional approximation.
Dũng D; Nguyen VK
Neural Netw; 2021 Oct; 142():619-635. PubMed ID: 34392126
15. Deep learning for patient-specific quality assurance: Identifying errors in radiotherapy delivery by radiomic analysis of gamma images with convolutional neural networks.
Nyflot MJ; Thammasorn P; Wootton LS; Ford EC; Chaovalitwongse WA
Med Phys; 2019 Feb; 46(2):456-464. PubMed ID: 30548601
16. On the minimax optimality and superiority of deep neural network learning over sparse parameter spaces.
Hayakawa S; Suzuki T
Neural Netw; 2020 Mar; 123():343-361. PubMed ID: 31901565
17. Evolving artificial neural networks with feedback.
Herzog S; Tetzlaff C; Wörgötter F
Neural Netw; 2020 Mar; 123():153-162. PubMed ID: 31874331
18. On the problem in model selection of neural network regression in overrealizable scenario.
Hagiwara K
Neural Comput; 2002 Aug; 14(8):1979-2002. PubMed ID: 12180410
19. Low dimensional approximation and generalization of multivariate functions on smooth manifolds using deep ReLU neural networks.
Labate D; Shi J
Neural Netw; 2024 Jun; 174():106223. PubMed ID: 38458005
20. Theory of deep convolutional neural networks: Downsampling.
Zhou DX
Neural Netw; 2020 Apr; 124():319-327. PubMed ID: 32036229