

BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

150 related articles for PubMed ID 30921562

  • 1. Gradient based hyperparameter optimization in Echo State Networks.
    Thiede LA; Parlitz U
    Neural Netw; 2019 Jul; 115():23-29. PubMed ID: 30921562

  • 2. Research on a learning rate with energy index in deep learning.
    Zhao H; Liu F; Zhang H; Liang Z
    Neural Netw; 2019 Feb; 110():225-231. PubMed ID: 30599419

  • 3. Recent advances in physical reservoir computing: A review.
    Tanaka G; Yamane T; Héroux JB; Nakane R; Kanazawa N; Takeda S; Numata H; Nakano D; Hirose A
    Neural Netw; 2019 Jul; 115():100-123. PubMed ID: 30981085

  • 4. Piecewise convexity of artificial neural networks.
    Rister B; Rubin DL
    Neural Netw; 2017 Oct; 94():34-45. PubMed ID: 28732233

  • 5. Computational analysis of memory capacity in echo state networks.
    Farkaš I; Bosák R; Gergeľ P
    Neural Netw; 2016 Nov; 83():109-120. PubMed ID: 27599031

  • 6. Optimization and applications of echo state networks with leaky-integrator neurons.
    Jaeger H; Lukosevicius M; Popovici D; Siewert U
    Neural Netw; 2007 Apr; 20(3):335-52. PubMed ID: 17517495

  • 7. A framework for parameter estimation and model selection in kernel deep stacking networks.
    Welchowski T; Schmid M
    Artif Intell Med; 2016 Jun; 70():31-40. PubMed ID: 27431035

  • 8. Tuning hyperparameters of machine learning algorithms and deep neural networks using metaheuristics: A bioinformatics study on biomedical and biological cases.
    Nematzadeh S; Kiani F; Torkamanian-Afshar M; Aydin N
    Comput Biol Chem; 2022 Apr; 97():107619. PubMed ID: 35033837

  • 9. Gradient-based optimization of hyperparameters.
    Bengio Y
    Neural Comput; 2000 Aug; 12(8):1889-900. PubMed ID: 10953243

  • 10. Ensemble effort estimation with metaheuristic hyperparameters and weight optimization for achieving accuracy.
    Yasmin A; Haider Butt W; Daud A
    PLoS One; 2024; 19(4):e0300296. PubMed ID: 38573895

  • 11. An internet traffic classification method based on echo state network and improved salp swarm algorithm.
    Zhang M; Sun W; Tian J; Zheng X; Guan S
    PeerJ Comput Sci; 2022; 8():e860. PubMed ID: 35494824

  • 12. Boundedness and convergence analysis of weight elimination for cyclic training of neural networks.
    Wang J; Ye Z; Gao W; Zurada JM
    Neural Netw; 2016 Oct; 82():49-61. PubMed ID: 27472447

  • 13. Machine Learning-Enabled NIR Spectroscopy. Part 3: Hyperparameter by Design (HyD) Based ANN-MLP Optimization, Model Generalizability, and Model Transferability.
    Ali H; Muthudoss P; Chauhan C; Kaliappan I; Kumar D; Paudel A; Ramasamy G
    AAPS PharmSciTech; 2023 Dec; 24(8):254. PubMed ID: 38062329

  • 14. [Sparse Denoising Autoencoder Application in Identification of Counterfeit Pharmaceutical].
    Yang HH; Luo ZC; Jiang SJ; Zhang XB; Yin LH
    Guang Pu Xue Yu Guang Pu Fen Xi; 2016 Sep; 36(9):2774-9. PubMed ID: 30084593

  • 15. A local Echo State Property through the largest Lyapunov exponent.
    Wainrib G; Galtier MN
    Neural Netw; 2016 Apr; 76():39-45. PubMed ID: 26849424

  • 16. Hyperparameter Optimization Techniques for Designing Software Sensors Based on Artificial Neural Networks.
    Blume S; Benedens T; Schramm D
    Sensors (Basel); 2021 Dec; 21(24):. PubMed ID: 34960528

  • 17. On the Post Hoc Explainability of Optimized Self-Organizing Reservoir Network for Action Recognition.
    Lee GC; Loo CK
    Sensors (Basel); 2022 Mar; 22(5):. PubMed ID: 35271052

  • 18. An unsupervised parameter learning model for RVFL neural network.
    Zhang Y; Wu J; Cai Z; Du B; Yu PS
    Neural Netw; 2019 Apr; 112():85-97. PubMed ID: 30771727

  • 19. Bayesian Multi-objective Hyperparameter Optimization for Accurate, Fast, and Efficient Neural Network Accelerator Design.
    Parsa M; Mitchell JP; Schuman CD; Patton RM; Potok TE; Roy K
    Front Neurosci; 2020; 14():667. PubMed ID: 32848531

  • 20. ASD+M: Automatic parameter tuning in stochastic optimization and on-line learning.
    Wawrzyński P
    Neural Netw; 2017 Dec; 96():1-10. PubMed ID: 28950104
