227 related articles for article (PubMed ID: 36805261)
1. Improving transparency and representational generalizability through parallel continual learning. Paknezhad M; Rengarajan H; Yuan C; Suresh S; Gupta M; Ramasamy S; Lee HK. Neural Netw. 2023 Apr;161:449-465. PubMed ID: 36805261

2. Continual learning with attentive recurrent neural networks for temporal data classification. Yin SY; Huang Y; Chang TY; Chang SF; Tseng VS. Neural Netw. 2023 Jan;158:171-187. PubMed ID: 36459884

3. Subspace distillation for continual learning. Roy K; Simon C; Moghadam P; Harandi M. Neural Netw. 2023 Oct;167:65-79. PubMed ID: 37625243

4. GC. Bayasi N; Hamarneh G; Garbi R. IEEE Trans Med Imaging. 2024 Nov;43(11):3767-3779. PubMed ID: 38717881

5. Triple-Memory Networks: A Brain-Inspired Method for Continual Learning. Wang L; Lei B; Li Q; Su H; Zhu J; Zhong Y. IEEE Trans Neural Netw Learn Syst. 2022 May;33(5):1925-1934. PubMed ID: 34529579

6. Adaptive Progressive Continual Learning. Xu J; Ma J; Gao X; Zhu Z. IEEE Trans Pattern Anal Mach Intell. 2022 Oct;44(10):6715-6728. PubMed ID: 34232867

7. GCReID: Generalized continual person re-identification via meta learning and knowledge accumulation. Liu Z; Feng C; Yu K; Hu J; Yang J. Neural Netw. 2024 Nov;179:106561. PubMed ID: 39084171

8. Beneficial Perturbation Network for Designing General Adaptive Artificial Intelligence Systems. Wen S; Rios A; Ge Y; Itti L. IEEE Trans Neural Netw Learn Syst. 2022 Aug;33(8):3778-3791. PubMed ID: 33596177

9. Self-Net: Lifelong Learning via Continual Self-Modeling. Mandivarapu JK; Camp B; Estrada R. Front Artif Intell. 2020;3:19. PubMed ID: 33733138

10. Continual medical image denoising based on triplet neural networks collaboration. Zeng X; Guo Y; Li L; Liu Y. Comput Biol Med. 2024 Sep;179:108914. PubMed ID: 39053331

11. Online continual learning with declarative memory. Xiao Z; Du Z; Wang R; Gan R; Li J. Neural Netw. 2023 Jun;163:146-155. PubMed ID: 37054513

12. Continual pre-training mitigates forgetting in language and vision. Cossu A; Carta A; Passaro L; Lomonaco V; Tuytelaars T; Bacciu D. Neural Netw. 2024 Nov;179:106492. PubMed ID: 38986187

13. Encoding primitives generation policy learning for robotic arm to overcome catastrophic forgetting in sequential multi-tasks learning. Xiong F; Liu Z; Huang K; Yang X; Qiao H; Hussain A. Neural Netw. 2020 Sep;129:163-173. PubMed ID: 32535306

14. Overcoming Long-Term Catastrophic Forgetting Through Adversarial Neural Pruning and Synaptic Consolidation. Peng J; Tang B; Jiang H; Li Z; Lei Y; Lin T; Li H. IEEE Trans Neural Netw Learn Syst. 2022 Sep;33(9):4243-4256. PubMed ID: 33577459

15. A Continual Learning Survey: Defying Forgetting in Classification Tasks. De Lange M; Aljundi R; Masana M; Parisot S; Jia X; Leonardis A; Slabaugh G; Tuytelaars T. IEEE Trans Pattern Anal Mach Intell. 2022 Jul;44(7):3366-3385. PubMed ID: 33544669

16. Comparing continual task learning in minds and machines. Flesch T; Balaguer J; Dekker R; Nili H; Summerfield C. Proc Natl Acad Sci U S A. 2018 Oct;115(44):E10313-E10322. PubMed ID: 30322916

17. Adversarial Feature Alignment: Avoid Catastrophic Forgetting in Incremental Task Lifelong Learning. Yao X; Huang T; Wu C; Zhang RX; Sun L. Neural Comput. 2019 Nov;31(11):2266-2291. PubMed ID: 31525313

18. Alleviating catastrophic forgetting using context-dependent gating and synaptic stabilization. Masse NY; Grant GD; Freedman DJ. Proc Natl Acad Sci U S A. 2018 Oct;115(44):E10467-E10475. PubMed ID: 30315147

19. Memory Recall: A Simple Neural Network Training Framework Against Catastrophic Forgetting. Zhang B; Guo Y; Li Y; He Y; Wang H; Dai Q. IEEE Trans Neural Netw Learn Syst. 2022 May;33(5):2010-2022. PubMed ID: 34339377