BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

137 related articles for article (PubMed ID: 32946706)

  • 41. Multimodal transistors as ReLU activation functions in physical neural network classifiers.
    Surekcigil Pesch I; Bestelink E; de Sagazan O; Mehonic A; Sporea RA
    Sci Rep; 2022 Jan; 12(1):670. PubMed ID: 35027631

  • 42. Necessary conditions on minimal system configuration for general MISO Mamdani fuzzy systems as universal approximators.
    Ding Y; Ying H; Shao S
    IEEE Trans Syst Man Cybern B Cybern; 2000; 30(6):857-64. PubMed ID: 18252416

  • 43. Constructive function-approximation by three-layer artificial neural networks.
    Suzuki S
    Neural Netw; 1998 Aug; 11(6):1049-1058. PubMed ID: 12662774

  • 44. Optimization of Microchannels and Application of Basic Activation Functions of Deep Neural Network for Accuracy Analysis of Microfluidic Parameter Data.
    Ahmed F; Shimizu M; Wang J; Sakai K; Kiwa T
    Micromachines (Basel); 2022 Aug; 13(8):. PubMed ID: 36014274

  • 45. Neural networks with a continuous squashing function in the output are universal approximators.
    Castro JL; Mantas CJ; Benítez JM
    Neural Netw; 2000 Jul; 13(6):561-3. PubMed ID: 10987509

  • 46. Parameterized Convex Universal Approximators for Decision-Making Problems.
    Kim J; Kim Y
    IEEE Trans Neural Netw Learn Syst; 2024 Feb; 35(2):2448-2459. PubMed ID: 35857729

  • 47. Basis operator network: A neural network-based model for learning nonlinear operators via neural basis.
    Hua N; Lu W
    Neural Netw; 2023 Jul; 164():21-37. PubMed ID: 37146447

  • 48. Approximation capabilities of neural networks on unbounded domains.
    Wang MX; Qu Y
    Neural Netw; 2022 Jan; 145():56-67. PubMed ID: 34717234

  • 49. Studying the Evolution of Neural Activation Patterns During Training of Feed-Forward ReLU Networks.
    Hartmann D; Franzen D; Brodehl S
    Front Artif Intell; 2021; 4():642374. PubMed ID: 35005614

  • 50. Dimension independent bounds for general shallow networks.
    Mhaskar HN
    Neural Netw; 2020 Mar; 123():142-152. PubMed ID: 31869651

  • 51. Non-differentiable saddle points and sub-optimal local minima exist for deep ReLU networks.
    Liu B; Liu Z; Zhang T; Yuan T
    Neural Netw; 2021 Dec; 144():75-89. PubMed ID: 34454244

  • 52. Relaxed conditions for radial-basis function networks to be universal approximators.
    Liao Y; Fang SC; Nuttle HL
    Neural Netw; 2003 Sep; 16(7):1019-28. PubMed ID: 14692636

  • 53. Universal approximation to nonlinear operators by neural networks with arbitrary activation functions and its application to dynamical systems.
    Chen T; Chen H
    IEEE Trans Neural Netw; 1995; 6(4):911-7. PubMed ID: 18263379

  • 54. Representation of nonlinear random transformations by non-Gaussian stochastic neural networks.
    Turchetti C; Crippa P; Pirani M; Biagetti G
    IEEE Trans Neural Netw; 2008 Jun; 19(6):1033-60. PubMed ID: 18541503

  • 55. A digital hardware pulse-mode neuron with piecewise linear activation function.
    Hikawa H
    IEEE Trans Neural Netw; 2003; 14(5):1028-37. PubMed ID: 18244557

  • 56. A Neurodynamic Approach for Real-Time Scheduling via Maximizing Piecewise Linear Utility.
    Guo Z; Baruah SK
    IEEE Trans Neural Netw Learn Syst; 2016 Feb; 27(2):238-48. PubMed ID: 26336153

  • 57. Self-organizing radial basis function network for real-time approximation of continuous-time dynamical systems.
    Lian J; Lee Y; Sudhoff SD; Zak SH
    IEEE Trans Neural Netw; 2008 Mar; 19(3):460-74. PubMed ID: 18334365

  • 58. Local Linearity Analysis of Deep Learning CT Denoising Algorithms.
    Li J; Wang W; Tivnan M; Sulam J; Prince JL; McNitt-Gray M; Stayman JW; Gang GJ
    Proc SPIE Int Soc Opt Eng; 2022 Jun; 12304():. PubMed ID: 36320561

  • 59. A Single Hidden Layer Feedforward Network with Only One Neuron in the Hidden Layer Can Approximate Any Univariate Function.
    Guliyev NJ; Ismailov VE
    Neural Comput; 2016 Jul; 28(7):1289-304. PubMed ID: 27171269

  • 60. Fuzzy jump wavelet neural network based on rule induction for dynamic nonlinear system identification with real data applications.
    Kharazihai Isfahani M; Zekri M; Marateb HR; Mañanas MA
    PLoS One; 2019; 14(12):e0224075. PubMed ID: 31816627
