277 related articles for article (PubMed ID: 34339377)
1. Memory Recall: A Simple Neural Network Training Framework Against Catastrophic Forgetting. Zhang B, Guo Y, Li Y, He Y, Wang H, Dai Q. IEEE Trans Neural Netw Learn Syst. 2022 May;33(5):2010-2022. PubMed ID: 34339377.
2. Triple-Memory Networks: A Brain-Inspired Method for Continual Learning. Wang L, Lei B, Li Q, Su H, Zhu J, Zhong Y. IEEE Trans Neural Netw Learn Syst. 2022 May;33(5):1925-1934. PubMed ID: 34529579.
3. Overcoming catastrophic forgetting in neural networks. Kirkpatrick J, Pascanu R, Rabinowitz N, Veness J, Desjardins G, Rusu AA, Milan K, Quan J, Ramalho T, Grabska-Barwinska A, Hassabis D, Clopath C, Kumaran D, Hadsell R. Proc Natl Acad Sci U S A. 2017 Mar;114(13):3521-3526. PubMed ID: 28292907.
4. Adversarial Feature Alignment: Avoid Catastrophic Forgetting in Incremental Task Lifelong Learning. Yao X, Huang T, Wu C, Zhang RX, Sun L. Neural Comput. 2019 Nov;31(11):2266-2291. PubMed ID: 31525313.
5. Brain-inspired replay for continual learning with artificial neural networks. van de Ven GM, Siegelmann HT, Tolias AS. Nat Commun. 2020 Aug;11(1):4069. PubMed ID: 32792531.
6. Overcoming Long-Term Catastrophic Forgetting Through Adversarial Neural Pruning and Synaptic Consolidation. Peng J, Tang B, Jiang H, Li Z, Lei Y, Lin T, Li H. IEEE Trans Neural Netw Learn Syst. 2022 Sep;33(9):4243-4256. PubMed ID: 33577459.
7. LwF-ECG: Learning-without-forgetting approach for electrocardiogram heartbeat classification based on memory with task selector. Ammour N, Alhichri H, Bazi Y, Alajlan N. Comput Biol Med. 2021 Oct;137:104807. PubMed ID: 34496312.
8. Alleviating catastrophic forgetting using context-dependent gating and synaptic stabilization. Masse NY, Grant GD, Freedman DJ. Proc Natl Acad Sci U S A. 2018 Oct;115(44):E10467-E10475. PubMed ID: 30315147.
9. Schematic memory persistence and transience for efficient and robust continual learning. Gao Y, Ascoli GA, Zhao L. Neural Netw. 2021 Dec;144:49-60. PubMed ID: 34450446.
10. Sleep prevents catastrophic forgetting in spiking neural networks by forming a joint synaptic weight representation. Golden R, Delanois JE, Sanda P, Bazhenov M. PLoS Comput Biol. 2022 Nov;18(11):e1010628. PubMed ID: 36399437.
11. Continual learning with attentive recurrent neural networks for temporal data classification. Yin SY, Huang Y, Chang TY, Chang SF, Tseng VS. Neural Netw. 2023 Jan;158:171-187. PubMed ID: 36459884.