BIOMARKERS

Molecular Biopsy of Human Tumors - a resource for Precision Medicine

140 related articles for article (PubMed ID: 37725710)

  • 1. Reducing Catastrophic Forgetting With Associative Learning: A Lesson From Fruit Flies.
    Shen Y; Dasgupta S; Navlakha S
    Neural Comput; 2023 Oct; 35(11):1797-1819. PubMed ID: 37725710

  • 2. Triple-Memory Networks: A Brain-Inspired Method for Continual Learning.
    Wang L; Lei B; Li Q; Su H; Zhu J; Zhong Y
    IEEE Trans Neural Netw Learn Syst; 2022 May; 33(5):1925-1934. PubMed ID: 34529579

  • 3. Can sleep protect memories from catastrophic forgetting?
    González OC; Sokolov Y; Krishnan GP; Delanois JE; Bazhenov M
    Elife; 2020 Aug; 9():. PubMed ID: 32748786

  • 4. Brain-inspired replay for continual learning with artificial neural networks.
    van de Ven GM; Siegelmann HT; Tolias AS
    Nat Commun; 2020 Aug; 11(1):4069. PubMed ID: 32792531

  • 5. Memory Recall: A Simple Neural Network Training Framework Against Catastrophic Forgetting.
    Zhang B; Guo Y; Li Y; He Y; Wang H; Dai Q
    IEEE Trans Neural Netw Learn Syst; 2022 May; 33(5):2010-2022. PubMed ID: 34339377

  • 6. Alleviating catastrophic forgetting using context-dependent gating and synaptic stabilization.
    Masse NY; Grant GD; Freedman DJ
    Proc Natl Acad Sci U S A; 2018 Oct; 115(44):E10467-E10475. PubMed ID: 30315147

  • 7. Sleep-like unsupervised replay reduces catastrophic forgetting in artificial neural networks.
    Tadros T; Krishnan GP; Ramyaa R; Bazhenov M
    Nat Commun; 2022 Dec; 13(1):7742. PubMed ID: 36522325

  • 8. Bio-inspired, task-free continual learning through activity regularization.
    Lässig F; Aceituno PV; Sorbaro M; Grewe BF
    Biol Cybern; 2023 Oct; 117(4-5):345-361. PubMed ID: 37589728

  • 9. Ensemble learning in fixed expansion layer networks for mitigating catastrophic forgetting.
    Coop R; Mishtal A; Arel I
    IEEE Trans Neural Netw Learn Syst; 2013 Oct; 24(10):1623-34. PubMed ID: 24808599

  • 10. Overcoming Long-Term Catastrophic Forgetting Through Adversarial Neural Pruning and Synaptic Consolidation.
    Peng J; Tang B; Jiang H; Li Z; Lei Y; Lin T; Li H
    IEEE Trans Neural Netw Learn Syst; 2022 Sep; 33(9):4243-4256. PubMed ID: 33577459

  • 11. LwF-ECG: Learning-without-forgetting approach for electrocardiogram heartbeat classification based on memory with task selector.
    Ammour N; Alhichri H; Bazi Y; Alajlan N
    Comput Biol Med; 2021 Oct; 137():104807. PubMed ID: 34496312

  • 12. Schematic memory persistence and transience for efficient and robust continual learning.
    Gao Y; Ascoli GA; Zhao L
    Neural Netw; 2021 Dec; 144():49-60. PubMed ID: 34450446

  • 13. Online continual learning with declarative memory.
    Xiao Z; Du Z; Wang R; Gan R; Li J
    Neural Netw; 2023 Jun; 163():146-155. PubMed ID: 37054513

  • 14. A Multi-functional Memristive Pavlov Associative Memory Circuit Based on Neural Mechanisms.
    Zhang Y; Zeng Z
    IEEE Trans Biomed Circuits Syst; 2021 Oct; 15(5):978-993. PubMed ID: 34460383

  • 15. A sparse quantized hopfield network for online-continual memory.
    Alonso N; Krichmar JL
    Nat Commun; 2024 May; 15(1):3722. PubMed ID: 38697981

  • 16. Improving transparency and representational generalizability through parallel continual learning.
    Paknezhad M; Rengarajan H; Yuan C; Suresh S; Gupta M; Ramasamy S; Lee HK
    Neural Netw; 2023 Apr; 161():449-465. PubMed ID: 36805261

  • 17. A learning rule for very simple universal approximators consisting of a single layer of perceptrons.
    Auer P; Burgsteiner H; Maass W
    Neural Netw; 2008 Jun; 21(5):786-95. PubMed ID: 18249524

  • 18. Neural learning rules for generating flexible predictions and computing the successor representation.
    Fang C; Aronov D; Abbott LF; Mackevicius EL
    Elife; 2023 Mar; 12():. PubMed ID: 36928104

  • 19. Adaptive Progressive Continual Learning.
    Xu J; Ma J; Gao X; Zhu Z
    IEEE Trans Pattern Anal Mach Intell; 2022 Oct; 44(10):6715-6728. PubMed ID: 34232867

  • 20. Continual learning with attentive recurrent neural networks for temporal data classification.
    Yin SY; Huang Y; Chang TY; Chang SF; Tseng VS
    Neural Netw; 2023 Jan; 158():171-187. PubMed ID: 36459884
