

BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

74 related articles for article (PubMed ID: 12850011)

  • 61. A method to determine the required number of neural-network training repetitions.
    Iyer MS; Rhinehart RR
    IEEE Trans Neural Netw; 1999; 10(2):427-32. PubMed ID: 18252540
    [TBL] [Abstract][Full Text] [Related]  

  • 62. Two artificial synapses are better than one.
    Adam GC
    Nature; 2018 Jun; 558(7708):39-40. PubMed ID: 29872204
    [No Abstract]   [Full Text] [Related]  

  • 63. Extreme Trust Region Policy Optimization for Active Object Recognition.
    Liu H; Wu Y; Sun F
    IEEE Trans Neural Netw Learn Syst; 2018 Jun; 29(6):2253-2258. PubMed ID: 29771676
    [TBL] [Abstract][Full Text] [Related]  

  • 64. On the Trustworthiness of Soft Computing in Medicine.
    Wolff D; Marschollek M; Kupka T
    Stud Health Technol Inform; 2019; 258():51-52. PubMed ID: 30942712
    [No Abstract]   [Full Text] [Related]  

  • 65. Optimal field-scale groundwater remediation using neural networks and the genetic algorithm.
    Rogers LL; Dowla FU; Johnson VM
    Environ Sci Technol; 1995 May; 29(5):1145-55. PubMed ID: 22192005
    [No Abstract]   [Full Text] [Related]  

  • 66. Neural networks with nonlinear synapses and a static noise.
    Sompolinsky H
    Phys Rev A Gen Phys; 1986 Sep; 34(3):2571-2574. PubMed ID: 9897569
    [No Abstract]   [Full Text] [Related]  

  • 67. Theory of synapse distribution on dendrites in neural networks.
    Matsuba I
    Phys Rev A Gen Phys; 1989 Oct; 40(7):4045-4049. PubMed ID: 9902625
    [No Abstract]   [Full Text] [Related]  

  • 68. On structure-exploiting trust-region regularized nonlinear least squares algorithms for neural-network learning.
    Mizutani E; Demmel JW
    Neural Netw; 2003; 16(5-6):745-53. PubMed ID: 12850030
    [TBL] [Abstract][Full Text] [Related]  

  • 69. Speeding up backpropagation using multiobjective evolutionary algorithms.
    Abbass HA
    Neural Comput; 2003 Nov; 15(11):2705-26. PubMed ID: 14577859
    [TBL] [Abstract][Full Text] [Related]  

  • 70. Leap-frog is a robust algorithm for training neural networks.
    Holm JE; Botha EC
    Network; 1999 Feb; 10(1):1-13. PubMed ID: 10372759
    [TBL] [Abstract][Full Text] [Related]  

  • 71. Generation of optimal artificial neural networks using a pattern search algorithm: application to approximation of chemical systems.
    Ihme M; Marsden AL; Pitsch H
    Neural Comput; 2008 Feb; 20(2):573-601. PubMed ID: 18045024
    [TBL] [Abstract][Full Text] [Related]  

  • 72. An efficient training algorithm for dynamic synapse neural networks using trust region methods.
    Namarvar HH; Berger TW
    Neural Netw; 2003; 16(5-6):585-91. PubMed ID: 12850011
    [TBL] [Abstract][Full Text] [Related]  

