

BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

123 related articles for article (PubMed ID: 38442488)

  • 1. On the approximation of bi-Lipschitz maps by invertible neural networks.
    Jin B; Zhou Z; Zou J
    Neural Netw; 2024 Jun; 174():106214. PubMed ID: 38442488

  • 2. Basis operator network: A neural network-based model for learning nonlinear operators via neural basis.
    Hua N; Lu W
    Neural Netw; 2023 Jul; 164():21-37. PubMed ID: 37146447

  • 3. Guaranteed approximation error estimation of neural networks and model modification.
    Yang Y; Wang T; Woolard JP; Xiang W
    Neural Netw; 2022 Jul; 151():61-69. PubMed ID: 35395513

  • 4. iFlowGAN: An Invertible Flow-Based Generative Adversarial Network for Unsupervised Image-to-Image Translation.
    Dai L; Tang J
    IEEE Trans Pattern Anal Mach Intell; 2022 Aug; 44(8):4151-4162. PubMed ID: 33651682

  • 5. Approximation bounds for convolutional neural networks in operator learning.
    Franco NR; Fresca S; Manzoni A; Zunino P
    Neural Netw; 2023 Apr; 161():129-141. PubMed ID: 36745938

  • 6. Approximation in shift-invariant spaces with deep ReLU neural networks.
    Yang Y; Li Z; Wang Y
    Neural Netw; 2022 Sep; 153():269-281. PubMed ID: 35763879

  • 7. Optimal approximation of piecewise smooth functions using deep ReLU neural networks.
    Petersen P; Voigtlaender F
    Neural Netw; 2018 Dec; 108():296-330. PubMed ID: 30245431

  • 8. Robust Reconstruction of the Void Fraction from Noisy Magnetic Flux Density Using Invertible Neural Networks.
    Kumar N; Krause L; Wondrak T; Eckert S; Eckert K; Gumhold S
    Sensors (Basel); 2024 Feb; 24(4):. PubMed ID: 38400371

  • 9. Error bounds for approximations with deep ReLU networks.
    Yarotsky D
    Neural Netw; 2017 Oct; 94():103-114. PubMed ID: 28756334

  • 10. Designing Universally-Approximating Deep Neural Networks: A First-Order Optimization Approach.
    Wu Z; Xiao M; Fang C; Lin Z
    IEEE Trans Pattern Anal Mach Intell; 2024 Mar; PP():. PubMed ID: 38526901

  • 11. Elliptic differential operators on Lipschitz domains and abstract boundary value problems.
    Behrndt J; Micheler T
    J Funct Anal; 2014 Nov; 267(10):3657-3709. PubMed ID: 27570299

  • 12. Operator compression with deep neural networks.
    Kröpfl F; Maier R; Peterseim D
    Adv Contin Discret Model; 2022; 2022(1):29. PubMed ID: 35531267

  • 13. On the capacity of deep generative networks for approximating distributions.
    Yang Y; Li Z; Wang Y
    Neural Netw; 2022 Jan; 145():144-154. PubMed ID: 34749027

  • 14. Approximation rates for neural networks with encodable weights in smoothness spaces.
    Gühring I; Raslan M
    Neural Netw; 2021 Feb; 134():107-130. PubMed ID: 33310376

  • 15. Sample-Based Continuous Approximate Method for Constructing Interval Neural Network.
    Shen X; Ouyang T; Hashimoto K; Wu Y
    IEEE Trans Neural Netw Learn Syst; 2024 Jun; PP():. PubMed ID: 38870003

  • 16. Neural networks with ReLU powers need less depth.
    Cabanilla KIM; Mohammad RZ; Lope JEC
    Neural Netw; 2024 Apr; 172():106073. PubMed ID: 38159509

  • 17. Approximation capabilities of measure-preserving neural networks.
    Zhu A; Jin P; Tang Y
    Neural Netw; 2022 Mar; 147():72-80. PubMed ID: 34995951

  • 18. Approximation of state-space trajectories by locally recurrent globally feed-forward neural networks.
    Patan K
    Neural Netw; 2008 Jan; 21(1):59-64. PubMed ID: 18158233

  • 19. Invertible Neural BRDF for Object Inverse Rendering.
    Chen Z; Nobuhara S; Nishino K
    IEEE Trans Pattern Anal Mach Intell; 2022 Dec; 44(12):9380-9395. PubMed ID: 34807819

  • 20. Extended Dynamic Mode Decomposition with Invertible Dictionary Learning.
    Jin Y; Hou L; Zhong S
    Neural Netw; 2024 May; 173():106177. PubMed ID: 38382398
