These tools will no longer be maintained as of December 31, 2024. An archived copy of the website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors

- a resource for Precision Medicine

292 related articles for article (PubMed ID: 32535306)

  • 1. Encoding primitives generation policy learning for robotic arm to overcome catastrophic forgetting in sequential multi-tasks learning.
    Xiong F; Liu Z; Huang K; Yang X; Qiao H; Hussain A
    Neural Netw; 2020 Sep; 129():163-173. PubMed ID: 32535306

  • 2. Self-Net: Lifelong Learning via Continual Self-Modeling.
    Mandivarapu JK; Camp B; Estrada R
    Front Artif Intell; 2020; 3():19. PubMed ID: 33733138

  • 3. Overcoming Long-Term Catastrophic Forgetting Through Adversarial Neural Pruning and Synaptic Consolidation.
    Peng J; Tang B; Jiang H; Li Z; Lei Y; Lin T; Li H
    IEEE Trans Neural Netw Learn Syst; 2022 Sep; 33(9):4243-4256. PubMed ID: 33577459

  • 4. Continual Learning Using Bayesian Neural Networks.
    Li H; Barnaghi P; Enshaeifar S; Ganz F
    IEEE Trans Neural Netw Learn Syst; 2021 Sep; 32(9):4243-4252. PubMed ID: 32866104

  • 5. Alleviating catastrophic forgetting using context-dependent gating and synaptic stabilization.
    Masse NY; Grant GD; Freedman DJ
    Proc Natl Acad Sci U S A; 2018 Oct; 115(44):E10467-E10475. PubMed ID: 30315147

  • 6. Variational Data-Free Knowledge Distillation for Continual Learning.
    Li X; Wang S; Sun J; Xu Z
    IEEE Trans Pattern Anal Mach Intell; 2023 Oct; 45(10):12618-12634. PubMed ID: 37126627

  • 7. Memory Recall: A Simple Neural Network Training Framework Against Catastrophic Forgetting.
    Zhang B; Guo Y; Li Y; He Y; Wang H; Dai Q
    IEEE Trans Neural Netw Learn Syst; 2022 May; 33(5):2010-2022. PubMed ID: 34339377

  • 8. Adversarial Feature Alignment: Avoid Catastrophic Forgetting in Incremental Task Lifelong Learning.
    Yao X; Huang T; Wu C; Zhang RX; Sun L
    Neural Comput; 2019 Nov; 31(11):2266-2291. PubMed ID: 31525313

  • 9. Continual learning with attentive recurrent neural networks for temporal data classification.
    Yin SY; Huang Y; Chang TY; Chang SF; Tseng VS
    Neural Netw; 2023 Jan; 158():171-187. PubMed ID: 36459884

  • 10. Online continual learning with declarative memory.
    Xiao Z; Du Z; Wang R; Gan R; Li J
    Neural Netw; 2023 Jun; 163():146-155. PubMed ID: 37054513

  • 11. LwF-ECG: Learning-without-forgetting approach for electrocardiogram heartbeat classification based on memory with task selector.
    Ammour N; Alhichri H; Bazi Y; Alajlan N
    Comput Biol Med; 2021 Oct; 137():104807. PubMed ID: 34496312

  • 12. Continual Learning for Activity Recognition.
    Kumar Sah R; Mirzadeh SI; Ghasemzadeh H
    Annu Int Conf IEEE Eng Med Biol Soc; 2022 Jul; 2022():2416-2420. PubMed ID: 36085745

  • 13. Adaptive Progressive Continual Learning.
    Xu J; Ma J; Gao X; Zhu Z
    IEEE Trans Pattern Anal Mach Intell; 2022 Oct; 44(10):6715-6728. PubMed ID: 34232867

  • 14. Neural modularity helps organisms evolve to learn new skills without forgetting old skills.
    Ellefsen KO; Mouret JB; Clune J
    PLoS Comput Biol; 2015 Apr; 11(4):e1004128. PubMed ID: 25837826

  • 15. Convolutional Neural Network With Developmental Memory for Continual Learning.
    Park GM; Yoo SM; Kim JH
    IEEE Trans Neural Netw Learn Syst; 2021 Jun; 32(6):2691-2705. PubMed ID: 32692685

  • 16. On Sequential Bayesian Inference for Continual Learning.
    Kessler S; Cobb A; Rudner TGJ; Zohren S; Roberts SJ
    Entropy (Basel); 2023 May; 25(6):. PubMed ID: 37372228

  • 17. Efficient Architecture Search for Continual Learning.
    Gao Q; Luo Z; Klabjan D; Zhang F
    IEEE Trans Neural Netw Learn Syst; 2023 Nov; 34(11):8555-8565. PubMed ID: 35235526

  • 18. Sleep prevents catastrophic forgetting in spiking neural networks by forming a joint synaptic weight representation.
    Golden R; Delanois JE; Sanda P; Bazhenov M
    PLoS Comput Biol; 2022 Nov; 18(11):e1010628. PubMed ID: 36399437

  • 19. GopGAN: Gradients Orthogonal Projection Generative Adversarial Network With Continual Learning.
    Li X; Wang W
    IEEE Trans Neural Netw Learn Syst; 2023 Jan; 34(1):215-227. PubMed ID: 34270433

  • 20. Overcoming catastrophic forgetting in neural networks.
    Kirkpatrick J; Pascanu R; Rabinowitz N; Veness J; Desjardins G; Rusu AA; Milan K; Quan J; Ramalho T; Grabska-Barwinska A; Hassabis D; Clopath C; Kumaran D; Hadsell R
    Proc Natl Acad Sci U S A; 2017 Mar; 114(13):3521-3526. PubMed ID: 28292907
