These tools will no longer be maintained as of December 31, 2024. An archived version of the website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors: a resource for Precision Medicine

107 related articles for article (PubMed ID: 37262931)

  • 1. Approximate spectral decomposition of Fisher information matrix for simple ReLU networks.
    Takeishi Y; Iida M; Takeuchi J
    Neural Netw; 2023 Jul; 164():691-706. PubMed ID: 37262931

  • 2. Pathological Spectra of the Fisher Information Metric and Its Variants in Deep Neural Networks.
    Karakida R; Akaho S; Amari SI
    Neural Comput; 2021 Jul; 33(8):2274-2307. PubMed ID: 34310678

  • 3. On minimal representations of shallow ReLU networks.
    Dereich S; Kassing S
    Neural Netw; 2022 Apr; 148():121-128. PubMed ID: 35123261

  • 4. A Sequential Learning Approach for Single Hidden Layer Neural Networks.
    Morris AJ; Zhang J
    Neural Netw; 1998 Jan; 11(1):65-80. PubMed ID: 12662849

  • 5. Design of double fuzzy clustering-driven context neural networks.
    Kim EH; Oh SK; Pedrycz W
    Neural Netw; 2018 Aug; 104():1-14. PubMed ID: 29689457

  • 6. Spectral properties of the hierarchical product of graphs.
    Skardal PS; Wash K
    Phys Rev E; 2016 Nov; 94(5-1):052311. PubMed ID: 27967095

  • 7. Convergence of deep convolutional neural networks.
    Xu Y; Zhang H
    Neural Netw; 2022 Sep; 153():553-563. PubMed ID: 35839599

  • 8. Deep ReLU neural networks in high-dimensional approximation.
    Dũng D; Nguyen VK
    Neural Netw; 2021 Oct; 142():619-635. PubMed ID: 34392126

  • 9. Optimal approximation of piecewise smooth functions using deep ReLU neural networks.
    Petersen P; Voigtlaender F
    Neural Netw; 2018 Dec; 108():296-330. PubMed ID: 30245431

  • 10. Novel maximum-margin training algorithms for supervised neural networks.
    Ludwig O; Nunes U
    IEEE Trans Neural Netw; 2010 Jun; 21(6):972-84. PubMed ID: 20409990

  • 11. A Hybrid Method Based on Extreme Learning Machine and Self Organizing Map for Pattern Classification.
    Jammoussi I; Ben Nasr M
    Comput Intell Neurosci; 2020; 2020():2918276. PubMed ID: 32908471

  • 12. Storing cycles in Hopfield-type networks with pseudoinverse learning rule: admissibility and network topology.
    Zhang C; Dangelmayr G; Oprea I
    Neural Netw; 2013 Oct; 46():283-98. PubMed ID: 23872430

  • 13. Approximation properties of Gaussian-binary restricted Boltzmann machines and Gaussian-binary deep belief networks.
    Gu L; Yang L; Zhou F
    Neural Netw; 2022 Sep; 153():49-63. PubMed ID: 35700559

  • 14. The Kolmogorov-Arnold representation theorem revisited.
    Schmidt-Hieber J
    Neural Netw; 2021 May; 137():119-126. PubMed ID: 33592434

  • 15. Spectral pruning of fully connected layers.
    Buffoni L; Civitelli E; Giambagli L; Chicchi L; Fanelli D
    Sci Rep; 2022 Jul; 12(1):11201. PubMed ID: 35778586

  • 16. A functional neural network computing some eigenvalues and eigenvectors of a special real matrix.
    Liu Y; You Z; Cao L
    Neural Netw; 2005 Dec; 18(10):1293-300. PubMed ID: 16153802

  • 17. A dynamical model for the analysis and acceleration of learning in feedforward networks.
    Ampazis N; Perantonis SJ; Taylor JG
    Neural Netw; 2001 Oct; 14(8):1075-88. PubMed ID: 11681752

  • 18. On the approximation of functions by tanh neural networks.
    De Ryck T; Lanthaler S; Mishra S
    Neural Netw; 2021 Nov; 143():732-750. PubMed ID: 34482172

  • 19. Approximation capabilities of neural networks on unbounded domains.
    Wang MX; Qu Y
    Neural Netw; 2022 Jan; 145():56-67. PubMed ID: 34717234

  • 20. Random Sketching for Neural Networks With ReLU.
    Wang D; Zeng J; Lin SB
    IEEE Trans Neural Netw Learn Syst; 2021 Feb; 32(2):748-762. PubMed ID: 32275612
