These tools will no longer be maintained as of December 31, 2024. An archived version of the website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

228 related articles for article (PubMed ID: 36399437)

  • 1. Sleep prevents catastrophic forgetting in spiking neural networks by forming a joint synaptic weight representation.
    Golden R; Delanois JE; Sanda P; Bazhenov M
    PLoS Comput Biol; 2022 Nov; 18(11):e1010628. PubMed ID: 36399437

  • 2. Overcoming Long-Term Catastrophic Forgetting Through Adversarial Neural Pruning and Synaptic Consolidation.
    Peng J; Tang B; Jiang H; Li Z; Lei Y; Lin T; Li H
    IEEE Trans Neural Netw Learn Syst; 2022 Sep; 33(9):4243-4256. PubMed ID: 33577459

  • 3. Sleep-like unsupervised replay reduces catastrophic forgetting in artificial neural networks.
    Tadros T; Krishnan GP; Ramyaa R; Bazhenov M
    Nat Commun; 2022 Dec; 13(1):7742. PubMed ID: 36522325

  • 4. Alleviating catastrophic forgetting using context-dependent gating and synaptic stabilization.
    Masse NY; Grant GD; Freedman DJ
    Proc Natl Acad Sci U S A; 2018 Oct; 115(44):E10467-E10475. PubMed ID: 30315147

  • 5. Can sleep protect memories from catastrophic forgetting?
    González OC; Sokolov Y; Krishnan GP; Delanois JE; Bazhenov M
eLife; 2020 Aug; 9():. PubMed ID: 32748786

  • 6. Continuous learning of spiking networks trained with local rules.
    Antonov DI; Sviatov KV; Sukhov S
    Neural Netw; 2022 Nov; 155():512-522. PubMed ID: 36166978

  • 7. Memory Recall: A Simple Neural Network Training Framework Against Catastrophic Forgetting.
    Zhang B; Guo Y; Li Y; He Y; Wang H; Dai Q
    IEEE Trans Neural Netw Learn Syst; 2022 May; 33(5):2010-2022. PubMed ID: 34339377

  • 8. LwF-ECG: Learning-without-forgetting approach for electrocardiogram heartbeat classification based on memory with task selector.
    Ammour N; Alhichri H; Bazi Y; Alajlan N
    Comput Biol Med; 2021 Oct; 137():104807. PubMed ID: 34496312

  • 9. Overcoming catastrophic forgetting in neural networks.
    Kirkpatrick J; Pascanu R; Rabinowitz N; Veness J; Desjardins G; Rusu AA; Milan K; Quan J; Ramalho T; Grabska-Barwinska A; Hassabis D; Clopath C; Kumaran D; Hadsell R
    Proc Natl Acad Sci U S A; 2017 Mar; 114(13):3521-3526. PubMed ID: 28292907

  • 10. Comparing continual task learning in minds and machines.
    Flesch T; Balaguer J; Dekker R; Nili H; Summerfield C
    Proc Natl Acad Sci U S A; 2018 Oct; 115(44):E10313-E10322. PubMed ID: 30322916

  • 11. Encoding primitives generation policy learning for robotic arm to overcome catastrophic forgetting in sequential multi-tasks learning.
    Xiong F; Liu Z; Huang K; Yang X; Qiao H; Hussain A
    Neural Netw; 2020 Sep; 129():163-173. PubMed ID: 32535306

  • 12. Model architecture can transform catastrophic forgetting into positive transfer.
    Ruiz-Garcia M
    Sci Rep; 2022 Jun; 12(1):10736. PubMed ID: 35750768

  • 13. Artificial Neural Variability for Deep Learning: On Overfitting, Noise Memorization, and Catastrophic Forgetting.
    Xie Z; He F; Fu S; Sato I; Tao D; Sugiyama M
    Neural Comput; 2021 Jul; 33(8):2163-2192. PubMed ID: 34310675

  • 14. Neural modularity helps organisms evolve to learn new skills without forgetting old skills.
    Ellefsen KO; Mouret JB; Clune J
    PLoS Comput Biol; 2015 Apr; 11(4):e1004128. PubMed ID: 25837826

  • 15. Triple-Memory Networks: A Brain-Inspired Method for Continual Learning.
    Wang L; Lei B; Li Q; Su H; Zhu J; Zhong Y
    IEEE Trans Neural Netw Learn Syst; 2022 May; 33(5):1925-1934. PubMed ID: 34529579

  • 16. Adversarial Feature Alignment: Avoid Catastrophic Forgetting in Incremental Task Lifelong Learning.
    Yao X; Huang T; Wu C; Zhang RX; Sun L
    Neural Comput; 2019 Nov; 31(11):2266-2291. PubMed ID: 31525313

  • 17. Diffusion-based neuromodulation can eliminate catastrophic forgetting in simple neural networks.
    Velez R; Clune J
    PLoS One; 2017; 12(11):e0187736. PubMed ID: 29145413

  • 18. Introducing principles of synaptic integration in the optimization of deep neural networks.
    Dellaferrera G; Woźniak S; Indiveri G; Pantazi A; Eleftheriou E
    Nat Commun; 2022 Apr; 13(1):1885. PubMed ID: 35393422

  • 19. A brain-inspired algorithm that mitigates catastrophic forgetting of artificial and spiking neural networks with low computational cost.
    Zhang T; Cheng X; Jia S; Li CT; Poo MM; Xu B
    Sci Adv; 2023 Aug; 9(34):eadi2947. PubMed ID: 37624895

  • 20. Beneficial Perturbation Network for Designing General Adaptive Artificial Intelligence Systems.
    Wen S; Rios A; Ge Y; Itti L
    IEEE Trans Neural Netw Learn Syst; 2022 Aug; 33(8):3778-3791. PubMed ID: 33596177
