7. Memory Recall: A Simple Neural Network Training Framework Against Catastrophic Forgetting. Zhang B; Guo Y; Li Y; He Y; Wang H; Dai Q. IEEE Trans Neural Netw Learn Syst; 2022 May; 33(5):2010-2022. PubMed ID: 34339377
8. Artificial Neural Variability for Deep Learning: On Overfitting, Noise Memorization, and Catastrophic Forgetting. Xie Z; He F; Fu S; Sato I; Tao D; Sugiyama M. Neural Comput; 2021 Jul; 33(8):2163-2192. PubMed ID: 34310675
9. Continual medical image denoising based on triplet neural networks collaboration. Zeng X; Guo Y; Li L; Liu Y. Comput Biol Med; 2024 Sep; 179:108914. PubMed ID: 39053331
10. Continual learning with attentive recurrent neural networks for temporal data classification. Yin SY; Huang Y; Chang TY; Chang SF; Tseng VS. Neural Netw; 2023 Jan; 158:171-187. PubMed ID: 36459884
12. Adversarial Feature Alignment: Avoid Catastrophic Forgetting in Incremental Task Lifelong Learning. Yao X; Huang T; Wu C; Zhang RX; Sun L. Neural Comput; 2019 Nov; 31(11):2266-2291. PubMed ID: 31525313
13. Neural modularity helps organisms evolve to learn new skills without forgetting old skills. Ellefsen KO; Mouret JB; Clune J. PLoS Comput Biol; 2015 Apr; 11(4):e1004128. PubMed ID: 25837826
14. Continual Learning for Activity Recognition. Kumar Sah R; Mirzadeh SI; Ghasemzadeh H. Annu Int Conf IEEE Eng Med Biol Soc; 2022 Jul; 2022:2416-2420. PubMed ID: 36085745
15. Ensemble learning in fixed expansion layer networks for mitigating catastrophic forgetting. Coop R; Mishtal A; Arel I. IEEE Trans Neural Netw Learn Syst; 2013 Oct; 24(10):1623-1634. PubMed ID: 24808599
16. Continuous learning of spiking networks trained with local rules. Antonov DI; Sviatov KV; Sukhov S. Neural Netw; 2022 Nov; 155:512-522. PubMed ID: 36166978
17. Comparing continual task learning in minds and machines. Flesch T; Balaguer J; Dekker R; Nili H; Summerfield C. Proc Natl Acad Sci U S A; 2018 Oct; 115(44):E10313-E10322. PubMed ID: 30322916
18. Reducing Catastrophic Forgetting With Associative Learning: A Lesson From Fruit Flies. Shen Y; Dasgupta S; Navlakha S. Neural Comput; 2023 Oct; 35(11):1797-1819. PubMed ID: 37725710
19. Class-Incremental Learning on Video-Based Action Recognition by Distillation of Various Knowledge. Maraghi VO; Faez K. Comput Intell Neurosci; 2022; 2022:4879942. PubMed ID: 35371208
20. Triple-Memory Networks: A Brain-Inspired Method for Continual Learning. Wang L; Lei B; Li Q; Su H; Zhu J; Zhong Y. IEEE Trans Neural Netw Learn Syst; 2022 May; 33(5):1925-1934. PubMed ID: 34529579