These tools will no longer be maintained as of December 31, 2024.


BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

290 related articles for article (PubMed ID: 18249761)

  • 41. TAO-robust backpropagation learning algorithm.
    Pernía-Espinoza AV; Ordieres-Meré JB; Martínez-de-Pisón FJ; González-Marcos A
    Neural Netw; 2005 Mar; 18(2):191-204. PubMed ID: 15795116

  • 42. A hybrid linear/nonlinear training algorithm for feedforward neural networks.
    McLoone S; Brown MD; Irwin G; Lightbody A
    IEEE Trans Neural Netw; 1998; 9(4):669-84. PubMed ID: 18252490

  • 43. Learning without local minima in radial basis function networks.
    Bianchini M; Frasconi P; Gori M
    IEEE Trans Neural Netw; 1995; 6(3):749-56. PubMed ID: 18263359

  • 44. An improved algorithm for learning long-term dependency problems in adaptive processing of data structures.
    Cho SY; Chi Z; Siu WC; Tsoi AC
    IEEE Trans Neural Netw; 2003; 14(4):781-93. PubMed ID: 18238059

  • 45. Convergence of cyclic and almost-cyclic learning with momentum for feedforward neural networks.
    Wang J; Yang J; Wu W
    IEEE Trans Neural Netw; 2011 Aug; 22(8):1297-306. PubMed ID: 21813357

  • 46. Support vector machine based training of multilayer feedforward neural networks as optimized by particle swarm algorithm: application in QSAR studies of bioactivity of organic compounds.
    Lin WQ; Jiang JH; Zhou YP; Wu HL; Shen GL; Yu RQ
    J Comput Chem; 2007 Jan; 28(2):519-27. PubMed ID: 17186488

  • 47. A hybrid neural learning algorithm using evolutionary learning and derivative free local search method.
    Ghosh R; Yearwood J; Ghosh M; Bagirov A
    Int J Neural Syst; 2006 Jun; 16(3):201-13. PubMed ID: 17044241

  • 48. Robust adaptive learning of feedforward neural networks via LMI optimizations.
    Jing X
    Neural Netw; 2012 Jul; 31():33-45. PubMed ID: 22459273

  • 49. Adding learning to cellular genetic algorithms for training recurrent neural networks.
    Ku KW; Mak MW; Siu WC
    IEEE Trans Neural Netw; 1999; 10(2):239-52. PubMed ID: 18252524

  • 50. A Hybrid Constructive Algorithm for Single-Layer Feedforward Networks Learning.
    Wu X; Rózycki P; Wilamowski BM
    IEEE Trans Neural Netw Learn Syst; 2015 Aug; 26(8):1659-68. PubMed ID: 25216485

  • 51. A Novel Pruning Algorithm for Smoothing Feedforward Neural Networks Based on Group Lasso Method.
    Wang J; Xu C; Yang X; Zurada JM
    IEEE Trans Neural Netw Learn Syst; 2018 May; 29(5):2012-2024. PubMed ID: 28961129

  • 52. Training feedforward networks with the Marquardt algorithm.
    Hagan MT; Menhaj MB
    IEEE Trans Neural Netw; 1994; 5(6):989-93. PubMed ID: 18267874

  • 53. Hierarchical genetic algorithm for near optimal feedforward neural network design.
    Yen G; Lu H
    Int J Neural Syst; 2002 Feb; 12(1):31-43. PubMed ID: 11852443

  • 54. Training Feedforward Neural Network Using Enhanced Black Hole Algorithm: A Case Study on COVID-19 Related ACE2 Gene Expression Classification.
    Pashaei E; Pashaei E
    Arab J Sci Eng; 2021; 46(4):3807-3828. PubMed ID: 33520590

  • 55. Error minimized extreme learning machine with growth of hidden nodes and incremental learning.
    Feng G; Huang GB; Lin Q; Gay R
    IEEE Trans Neural Netw; 2009 Aug; 20(8):1352-7. PubMed ID: 19596632

  • 56. Dynamic learning rate optimization of the backpropagation algorithm.
    Yu XH; Chen GA; Cheng SX
    IEEE Trans Neural Netw; 1995; 6(3):669-77. PubMed ID: 18263352

  • 57. Dynamic optimal learning rates of a certain class of fuzzy neural networks and its applications with genetic algorithm.
    Wang CH; Liu HL; Lin CT
    IEEE Trans Syst Man Cybern B Cybern; 2001; 31(3):467-75. PubMed ID: 18244813

  • 58. The convergence of backpropagation trained neural networks for various weight update frequencies.
    Torresen J
    Int J Neural Syst; 1997 Jun; 8(3):263-77. PubMed ID: 9427101

  • 59. Universal Approximation Using Feedforward Neural Networks: A Survey of Some Existing Methods, and Some New Results.
    Chung Tsoi A; Scarselli F
    Neural Netw; 1998 Jan; 11(1):15-37. PubMed ID: 12662846

  • 60. A growing and pruning sequential learning algorithm of hyper basis function neural network for function approximation.
    Vuković N; Miljković Z
    Neural Netw; 2013 Oct; 46():210-26. PubMed ID: 23811384
