These tools will no longer be maintained as of December 31, 2024. An archived copy of the website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors: a resource for Precision Medicine

137 related articles for the article with PubMed ID 12662858

  • 1. Optimal Linear Combinations of Neural Networks.
    Hashem S
    Neural Netw; 1997 Jun; 10(4):599-614. PubMed ID: 12662858

  • 2. Improving model accuracy using optimal linear combinations of trained neural networks.
    Hashem S; Schmeiser B
    IEEE Trans Neural Netw; 1995; 6(3):792-4. PubMed ID: 18263368

  • 3. An information theoretic approach for combining neural network process models.
    Sridhar DV; Bartlett EB; Seagrave RC
    Neural Netw; 1999 Jul; 12(6):915-926. PubMed ID: 12662666

  • 4. MABAL: a Novel Deep-Learning Architecture for Machine-Assisted Bone Age Labeling.
    Mutasa S; Chang PD; Ruzal-Shapiro C; Ayyala R
    J Digit Imaging; 2018 Aug; 31(4):513-519. PubMed ID: 29404850

  • 5. Neural Networks for Predicting Conditional Probability Densities: Improved Training Scheme Combining EM and RVFL.
    Taylor JG; Husmeier D
    Neural Netw; 1998 Jan; 11(1):89-116. PubMed ID: 12662851

  • 6. A new formulation for feedforward neural networks.
    Razavi S; Tolson BA
    IEEE Trans Neural Netw; 2011 Oct; 22(10):1588-98. PubMed ID: 21859600

  • 7. On Temporal Generalization of Simple Recurrent Networks.
    Ahalt SC; Liu X; Wang D
    Neural Netw; 1996 Oct; 9(7):1099-1118. PubMed ID: 12662586

  • 8. A growing and pruning sequential learning algorithm of hyper basis function neural network for function approximation.
    Vuković N; Miljković Z
    Neural Netw; 2013 Oct; 46():210-26. PubMed ID: 23811384

  • 9. Fast Second Order Learning Algorithm for Feedforward Multilayer Neural Networks and its Applications.
    Stodolski M; Bojarczak P; Osowski S
    Neural Netw; 1996 Dec; 9(9):1583-1596. PubMed ID: 12662555

  • 10. COVNET: a cooperative coevolutionary model for evolving artificial neural networks.
    García-Pedrajas N; Hervás-Martínez C; Muñoz-Pérez J
    IEEE Trans Neural Netw; 2003; 14(3):575-96. PubMed ID: 18238040

  • 11. Effective neural network ensemble approach for improving generalization performance.
    Yang J; Zeng X; Zhong S; Wu S
    IEEE Trans Neural Netw Learn Syst; 2013 Jun; 24(6):878-87. PubMed ID: 24808470

  • 12. Basic concepts of artificial neural network (ANN) modeling and its application in pharmaceutical research.
    Agatonovic-Kustrin S; Beresford R
    J Pharm Biomed Anal; 2000 Jun; 22(5):717-27. PubMed ID: 10815714

  • 13. Dynamical Neural Networks that Ensure Exponential Identification Error Convergence.
    Ioannou PA; Christodoulou MA; Kosmatopoulos EB
    Neural Netw; 1997 Mar; 10(2):299-314. PubMed ID: 12662528

  • 14. Quantum perceptron over a field and neural network architecture selection in a quantum computer.
    da Silva AJ; Ludermir TB; de Oliveira WR
    Neural Netw; 2016 Apr; 76():55-64. PubMed ID: 26878722

  • 15. Making use of population information in evolutionary artificial neural networks.
    Yao X; Liu Y
    IEEE Trans Syst Man Cybern B Cybern; 1998; 28(3):417-25. PubMed ID: 18255957

  • 16. Prediction of air pollutant concentration based on sparse response back-propagation training feedforward neural networks.
    Ding W; Zhang J; Leung Y
    Environ Sci Pollut Res Int; 2016 Oct; 23(19):19481-94. PubMed ID: 27384165

  • 17. A new constructive algorithm for architectural and functional adaptation of artificial neural networks.
    Islam MM; Sattar MA; Amin MF; Yao X; Murase K
    IEEE Trans Syst Man Cybern B Cybern; 2009 Dec; 39(6):1590-605. PubMed ID: 19502131

  • 18. How can a massive training artificial neural network (MTANN) be trained with a small number of cases in the distinction between nodules and vessels in thoracic CT?
    Suzuki K; Doi K
    Acad Radiol; 2005 Oct; 12(10):1333-41. PubMed ID: 16179210

  • 19. A novel type of activation function in artificial neural networks: Trained activation function.
    Ertuğrul ÖF
    Neural Netw; 2018 Mar; 99():148-157. PubMed ID: 29427841

  • 20. An Efficient EM-based Training Algorithm for Feedforward Neural Networks.
    Farmer J; Ji C; Ma S
    Neural Netw; 1997 Mar; 10(2):243-256. PubMed ID: 12662523

    Page 1 of 7.