

BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

228 related articles for article (PubMed ID: 24808599)

  • 1. Ensemble learning in fixed expansion layer networks for mitigating catastrophic forgetting.
    Coop R; Mishtal A; Arel I
    IEEE Trans Neural Netw Learn Syst; 2013 Oct; 24(10):1623-34. PubMed ID: 24808599

  • 2. Diffusion-based neuromodulation can eliminate catastrophic forgetting in simple neural networks.
    Velez R; Clune J
    PLoS One; 2017; 12(11):e0187736. PubMed ID: 29145413

  • 3. Rapid feedforward computation by temporal encoding and learning with spiking neurons.
    Yu Q; Tang H; Tan KC; Li H
    IEEE Trans Neural Netw Learn Syst; 2013 Oct; 24(10):1539-52. PubMed ID: 24808592

  • 4. New training strategies for constructive neural networks with application to regression problems.
    Ma L; Khorasani K
    Neural Netw; 2004 May; 17(4):589-609. PubMed ID: 15109686

  • 5. Are multi-layer backpropagation networks catastrophically amnesic?
    Yamaguchi M
    Scand J Psychol; 2004 Nov; 45(5):357-61. PubMed ID: 15535804

  • 6. Avoiding Catastrophic Forgetting.
    Hasselmo ME
    Trends Cogn Sci; 2017 Jun; 21(6):407-408. PubMed ID: 28442279

  • 7. Alleviating catastrophic forgetting using context-dependent gating and synaptic stabilization.
    Masse NY; Grant GD; Freedman DJ
    Proc Natl Acad Sci U S A; 2018 Oct; 115(44):E10467-E10475. PubMed ID: 30315147

  • 8. An H(∞) control approach to robust learning of feedforward neural networks.
    Jing X
    Neural Netw; 2011 Sep; 24(7):759-66. PubMed ID: 21458228

  • 9. Effective neural network ensemble approach for improving generalization performance.
    Yang J; Zeng X; Zhong S; Wu S
    IEEE Trans Neural Netw Learn Syst; 2013 Jun; 24(6):878-87. PubMed ID: 24808470

  • 10. Using noise to compute error surfaces in connectionist networks: a novel means of reducing catastrophic forgetting.
    French RM; Chater N
    Neural Comput; 2002 Jul; 14(7):1755-69. PubMed ID: 12079555

  • 11. Neural modularity helps organisms evolve to learn new skills without forgetting old skills.
    Ellefsen KO; Mouret JB; Clune J
    PLoS Comput Biol; 2015 Apr; 11(4):e1004128. PubMed ID: 25837826

  • 12. The loading problem for recursive neural networks.
    Gori M; Sperduti A
    Neural Netw; 2005 Oct; 18(8):1064-79. PubMed ID: 16198537

  • 13. Universal approximation of extreme learning machine with adaptive growth of hidden nodes.
    Zhang R; Lan Y; Huang GB; Xu ZB
    IEEE Trans Neural Netw Learn Syst; 2012 Feb; 23(2):365-71. PubMed ID: 24808516

  • 14. Reducing Catastrophic Forgetting With Associative Learning: A Lesson From Fruit Flies.
    Shen Y; Dasgupta S; Navlakha S
    Neural Comput; 2023 Oct; 35(11):1797-1819. PubMed ID: 37725710

  • 15. Catastrophic forgetting in simple networks: an analysis of the pseudorehearsal solution.
    Frean M; Robins A
    Network; 1999 Aug; 10(3):227-36. PubMed ID: 10496474

  • 16. Adaptive categorization of ART networks in robot behavior learning using game-theoretic formulation.
    Fung WK; Liu YH
    Neural Netw; 2003 Dec; 16(10):1403-20. PubMed ID: 14622873

  • 17. Artificial neural network learning of nonstationary behavior in time series.
    Széliga MI; Verdes PF; Granitto PM; Ceccatto HA
    Int J Neural Syst; 2003 Apr; 13(2):103-9. PubMed ID: 12923923

  • 18. Toward Training Recurrent Neural Networks for Lifelong Learning.
    Sodhani S; Chandar S; Bengio Y
    Neural Comput; 2020 Jan; 32(1):1-35. PubMed ID: 31703175

  • 19. Recruitment learning of boolean functions in sparse random networks.
    Hogan JM; Diederich J
    Int J Neural Syst; 2001 Dec; 11(6):537-59. PubMed ID: 11852438

  • 20. Orthogonality is not a panacea: backpropagation and "catastrophic interference".
    Yamaguchi M
    Scand J Psychol; 2006 Oct; 47(5):339-44. PubMed ID: 16987202
