180 related articles for PubMed ID 28129193

  • 1. High-Dimensional Function Approximation With Neural Networks for Large Volumes of Data.
    Andras P
    IEEE Trans Neural Netw Learn Syst; 2018 Feb; 29(2):500-508. PubMed ID: 28129193

  • 2. Function approximation using combined unsupervised and supervised learning.
    Andras P
    IEEE Trans Neural Netw Learn Syst; 2014 Mar; 25(3):495-505. PubMed ID: 24807446

  • 3. Low dimensional approximation and generalization of multivariate functions on smooth manifolds using deep ReLU neural networks.
    Labate D; Shi J
    Neural Netw; 2024 Jun; 174():106223. PubMed ID: 38458005

  • 4. Optimal approximation of piecewise smooth functions using deep ReLU neural networks.
    Petersen P; Voigtlaender F
    Neural Netw; 2018 Dec; 108():296-330. PubMed ID: 30245431

  • 5. Dimension independent bounds for general shallow networks.
    Mhaskar HN
    Neural Netw; 2020 Mar; 123():142-152. PubMed ID: 31869651

  • 6. Deep ReLU neural networks in high-dimensional approximation.
    Dũng D; Nguyen VK
    Neural Netw; 2021 Oct; 142():619-635. PubMed ID: 34392126

  • 7. Efficient Approximation of High-Dimensional Functions With Neural Networks.
    Cheridito P; Jentzen A; Rossmannek F
    IEEE Trans Neural Netw Learn Syst; 2022 Jul; 33(7):3079-3093. PubMed ID: 33513112

  • 8. A deep network construction that adapts to intrinsic dimensionality beyond the domain.
    Cloninger A; Klock T
    Neural Netw; 2021 Sep; 141():404-419. PubMed ID: 34146968

  • 9. Classification logit two-sample testing by neural networks for differentiating near manifold densities.
    Cheng X; Cloninger A
    IEEE Trans Inf Theory; 2022 Oct; 68(10):6631-6662. PubMed ID: 37810208

  • 10. Embedding Functional Brain Networks in Low Dimensional Spaces Using Manifold Learning Techniques.
    Casanova R; Lyday RG; Bahrami M; Burdette JH; Simpson SL; Laurienti PJ
    Front Neuroinform; 2021; 15():740143. PubMed ID: 35002665

  • 11. Manifold fitting with CycleGAN.
    Yao Z; Su J; Yau ST
    Proc Natl Acad Sci U S A; 2024 Jan; 121(5):e2311436121. PubMed ID: 38266050

  • 12. A direct approach for function approximation on data defined manifolds.
    Mhaskar HN
    Neural Netw; 2020 Dec; 132():253-268. PubMed ID: 32927428

  • 13. Panoramic Manifold Projection (Panoramap) for Single-Cell Data Dimensionality Reduction and Visualization.
    Wang Y; Xu Y; Zang Z; Wu L; Li Z
    Int J Mol Sci; 2022 Jul; 23(14). PubMed ID: 35887125

  • 14. Neural manifold analysis of brain circuit dynamics in health and disease.
    Mitchell-Heggs R; Prado S; Gava GP; Go MA; Schultz SR
    J Comput Neurosci; 2023 Feb; 51(1):1-21. PubMed ID: 36522604

  • 15. Deep Recursive Embedding for High-Dimensional Data.
    Zhou Z; Zu X; Wang Y; Lelieveldt BPF; Tao Q
    IEEE Trans Vis Comput Graph; 2022 Feb; 28(2):1237-1248. PubMed ID: 34699363

  • 16. Neural network approximation: Three hidden layers are enough.
    Shen Z; Yang H; Zhang S
    Neural Netw; 2021 Sep; 141():160-173. PubMed ID: 33906082

  • 17. Guaranteed approximation error estimation of neural networks and model modification.
    Yang Y; Wang T; Woolard JP; Xiang W
    Neural Netw; 2022 Jul; 151():61-69. PubMed ID: 35395513

  • 18. Doing the Impossible: Why Neural Networks Can Be Trained at All.
    Hodas NO; Stinis P
    Front Psychol; 2018; 9():1185. PubMed ID: 30050485

  • 19. Basis operator network: A neural network-based model for learning nonlinear operators via neural basis.
    Hua N; Lu W
    Neural Netw; 2023 Jul; 164():21-37. PubMed ID: 37146447

  • 20. Smart sampling and incremental function learning for very large high dimensional data.
    Loyola R DG; Pedergnana M; Gimeno García S
    Neural Netw; 2016 Jun; 78():75-87. PubMed ID: 26476936
