3. Ellefsen KO, Mouret JB, Clune J. Neural modularity helps organisms evolve to learn new skills without forgetting old skills. PLoS Comput Biol. 2015 Apr;11(4):e1004128. PubMed ID: 25837826.
4. Coop R, Mishtal A, Arel I. Ensemble learning in fixed expansion layer networks for mitigating catastrophic forgetting. IEEE Trans Neural Netw Learn Syst. 2013 Oct;24(10):1623-34. PubMed ID: 24808599.
5. Sandberg A, Lansner A, Petersson KM, Ekeberg O. A Bayesian attractor network with incremental learning. Network. 2002 May;13(2):179-94. PubMed ID: 12061419.
6. Norman KA, Newman EL, Perotte AJ. Methods for reducing interference in the Complementary Learning Systems model: oscillating inhibition and autonomous memory rehearsal. Neural Netw. 2005 Nov;18(9):1212-28. PubMed ID: 16260116.
7. Golden R, Delanois JE, Sanda P, Bazhenov M. Sleep prevents catastrophic forgetting in spiking neural networks by forming a joint synaptic weight representation. PLoS Comput Biol. 2022 Nov;18(11):e1010628. PubMed ID: 36399437.
9. Shen Y, Dasgupta S, Navlakha S. Reducing Catastrophic Forgetting With Associative Learning: A Lesson From Fruit Flies. Neural Comput. 2023 Oct;35(11):1797-1819. PubMed ID: 37725710.
10. French RM, Chater N. Using noise to compute error surfaces in connectionist networks: a novel means of reducing catastrophic forgetting. Neural Comput. 2002 Jul;14(7):1755-69. PubMed ID: 12079555.
11. Ruiz-Garcia M. Model architecture can transform catastrophic forgetting into positive transfer. Sci Rep. 2022 Jun;12(1):10736. PubMed ID: 35750768.
13. Yao X, Huang T, Wu C, Zhang RX, Sun L. Adversarial Feature Alignment: Avoid Catastrophic Forgetting in Incremental Task Lifelong Learning. Neural Comput. 2019 Nov;31(11):2266-2291. PubMed ID: 31525313.
14. Frean M, Robins A. Catastrophic forgetting in simple networks: an analysis of the pseudorehearsal solution. Network. 1999 Aug;10(3):227-36. PubMed ID: 10496474.
15. Zhang B, Guo Y, Li Y, He Y, Wang H, Dai Q. Memory Recall: A Simple Neural Network Training Framework Against Catastrophic Forgetting. IEEE Trans Neural Netw Learn Syst. 2022 May;33(5):2010-2022. PubMed ID: 34339377.
16. Masse NY, Grant GD, Freedman DJ. Alleviating catastrophic forgetting using context-dependent gating and synaptic stabilization. Proc Natl Acad Sci U S A. 2018 Oct;115(44):E10467-E10475. PubMed ID: 30315147.
17. Norman KA, Newman EL, Detre G. A neural network model of retrieval-induced forgetting. Psychol Rev. 2007 Oct;114(4):887-953. PubMed ID: 17907868.
18. Li H, Dong W, Hu BG. Incremental Concept Learning via Online Generative Memory Recall. IEEE Trans Neural Netw Learn Syst. 2021 Jul;32(7):3206-3216. PubMed ID: 32759086.
19. Parisi GI, Kemker R, Part JL, Kanan C, Wermter S. Continual lifelong learning with neural networks: A review. Neural Netw. 2019 May;113:54-71. PubMed ID: 30780045.