121 related articles for article (PubMed ID: 10953247)
1. Measuring the VC-dimension using optimized experimental design.
Shao X; Cherkassky V; Li W
Neural Comput; 2000 Aug; 12(8):1969-86. PubMed ID: 10953247
2. Model complexity control for regression using VC generalization bounds.
Cherkassky V; Shao X; Mulier FM; Vapnik VN
IEEE Trans Neural Netw; 1999; 10(5):1075-89. PubMed ID: 18252610
3. VC-dimension of univariate decision trees.
Yildiz OT
IEEE Trans Neural Netw Learn Syst; 2015 Feb; 26(2):378-87. PubMed ID: 25594983
4. On the practical applicability of VC dimension bounds.
Holden SB; Niranjan M
Neural Comput; 1995 Nov; 7(6):1265-88. PubMed ID: 7584902
5. The Vapnik-Chervonenkis dimension of graph and recursive neural networks.
Scarselli F; Tsoi AC; Hagenbuchner M
Neural Netw; 2018 Dec; 108():248-259. PubMed ID: 30219742
6. VC-dimension of exterior visibility.
Isler V; Kannan S; Daniilidis K; Valtr P
IEEE Trans Pattern Anal Mach Intell; 2004 May; 26(5):667-71. PubMed ID: 15460289
7. Neurodynamical classifiers with low model complexity.
Pant H; Soman S; Jayadeva; Bhaya A
Neural Netw; 2020 Dec; 132():405-415. PubMed ID: 33011671
8. Improving the generalization capacity of cascade classifiers.
Ludwig O; Nunes U; Ribeiro B; Premebida C
IEEE Trans Cybern; 2013 Dec; 43(6):2135-46. PubMed ID: 23757522
9. Discretized-Vapnik-Chervonenkis dimension for analyzing complexity of real function classes.
Zhang C; Bian W; Tao D; Lin W
IEEE Trans Neural Netw Learn Syst; 2012 Sep; 23(9):1461-72. PubMed ID: 24807929
10. Motion estimation using Statistical Learning Theory.
Wechsler H; Duric Z; Li F; Cherkassky V
IEEE Trans Pattern Anal Mach Intell; 2004 Apr; 26(4):466-78. PubMed ID: 15382651
11. A local Vapnik-Chervonenkis complexity.
Oneto L; Anguita D; Ridella S
Neural Netw; 2016 Oct; 82():62-75. PubMed ID: 27474843
12. Signal estimation and denoising using VC-theory.
Cherkassky V; Shao X
Neural Netw; 2001 Jan; 14(1):37-52. PubMed ID: 11213212
13. A comparative analysis of support vector machines and extreme learning machines.
Liu X; Gao C; Li P
Neural Netw; 2012 Sep; 33():58-66. PubMed ID: 22572469
14. Structure of indicator function classes with finite Vapnik-Chervonenkis dimensions.
Zhang C; Tao D
IEEE Trans Neural Netw Learn Syst; 2013 Jul; 24(7):1156-60. PubMed ID: 24808529
15. To understand double descent, we need to understand VC theory.
Cherkassky V; Lee EH
Neural Netw; 2024 Jan; 169():242-256. PubMed ID: 37913656
16. Neural networks with local receptive fields and superlinear VC dimension.
Schmitt M
Neural Comput; 2002 Apr; 14(4):919-56. PubMed ID: 11936967
17. Linear constraints on weight representation for generalized learning of multilayer networks.
Ishii M; Kumazawa I
Neural Comput; 2001 Dec; 13(12):2851-63. PubMed ID: 11705413
18. Comparison of model selection for regression.
Cherkassky V; Ma Y
Neural Comput; 2003 Jul; 15(7):1691-714. PubMed ID: 12816572
19. Assessing Generalization Ability of Majority Vote Point Classifiers.
Sevakula RK; Verma NK
IEEE Trans Neural Netw Learn Syst; 2017 Dec; 28(12):2985-2997. PubMed ID: 28113524