These tools will no longer be maintained as of December 31, 2024. The archived website and the PubMed4Hh GitHub repository remain available. Contact NLM Customer Service if you have questions.


BIOMARKERS

Molecular Biopsy of Human Tumors: a resource for Precision Medicine

167 related articles for article (PubMed ID: 36960579)

  • 1. Efficient, continual, and generalized learning in the brain - neural mechanism of Mental Schema 2.0.
    Ohki T; Kunii N; Chao ZC
    Rev Neurosci; 2023 Dec; 34(8):839-868. PubMed ID: 36960579

  • 2. Rethinking the performance comparison between SNNs and ANNs.
    Deng L; Wu Y; Hu X; Liang L; Ding Y; Li G; Zhao G; Li P; Xie Y
    Neural Netw; 2020 Jan; 121():294-307. PubMed ID: 31586857

  • 3. Memristors for Neuromorphic Circuits and Artificial Intelligence Applications.
    Miranda E; Suñé J
    Materials (Basel); 2020 Feb; 13(4):. PubMed ID: 32093164

  • 4. Deep learning-based feature extraction for prediction and interpretation of sharp-wave ripples in the rodent hippocampus.
    Navas-Olive A; Amaducci R; Jurado-Parras MT; Sebastian ER; de la Prida LM
    Elife; 2022 Sep; 11():. PubMed ID: 36062906

  • 5. Synaptic Mechanisms of Memory Consolidation during Sleep Slow Oscillations.
    Wei Y; Krishnan GP; Bazhenov M
    J Neurosci; 2016 Apr; 36(15):4231-47. PubMed ID: 27076422

  • 6. A Unified Dynamic Model for Learning, Replay, and Sharp-Wave/Ripples.
    Jahnke S; Timme M; Memmesheimer RM
    J Neurosci; 2015 Dec; 35(49):16236-58. PubMed ID: 26658873

  • 7. Triple-Memory Networks: A Brain-Inspired Method for Continual Learning.
    Wang L; Lei B; Li Q; Su H; Zhu J; Zhong Y
    IEEE Trans Neural Netw Learn Syst; 2022 May; 33(5):1925-1934. PubMed ID: 34529579

  • 8. Embracing Change: Continual Learning in Deep Neural Networks.
    Hadsell R; Rao D; Rusu AA; Pascanu R
    Trends Cogn Sci; 2020 Dec; 24(12):1028-1040. PubMed ID: 33158755

  • 9. Brain-inspired replay for continual learning with artificial neural networks.
    van de Ven GM; Siegelmann HT; Tolias AS
    Nat Commun; 2020 Aug; 11(1):4069. PubMed ID: 32792531

  • 10. Alleviating catastrophic forgetting using context-dependent gating and synaptic stabilization.
    Masse NY; Grant GD; Freedman DJ
    Proc Natl Acad Sci U S A; 2018 Oct; 115(44):E10467-E10475. PubMed ID: 30315147

  • 11. Computational models of Idling brain activity for memory processing.
    Fukai T
    Neurosci Res; 2023 Apr; 189():75-82. PubMed ID: 36592825

  • 12. Contributions by metaplasticity to solving the Catastrophic Forgetting Problem.
    Jedlicka P; Tomko M; Robins A; Abraham WC
    Trends Neurosci; 2022 Sep; 45(9):656-666. PubMed ID: 35798611

  • 13. Schematic memory persistence and transience for efficient and robust continual learning.
    Gao Y; Ascoli GA; Zhao L
    Neural Netw; 2021 Dec; 144():49-60. PubMed ID: 34450446

  • 14. Overview of Spiking Neural Network Learning Approaches and Their Computational Complexities.
    Pietrzak P; Szczęsny S; Huderek D; Przyborowski Ł
    Sensors (Basel); 2023 Mar; 23(6):. PubMed ID: 36991750

  • 15. Artificial Neural Networks for Neuroscientists: A Primer.
    Yang GR; Wang XJ
    Neuron; 2020 Sep; 107(6):1048-1070. PubMed ID: 32970997

  • 16. Brain connectivity meets reservoir computing.
    Damicelli F; Hilgetag CC; Goulas A
    PLoS Comput Biol; 2022 Nov; 18(11):e1010639. PubMed ID: 36383563

  • 17. Memory Recall: A Simple Neural Network Training Framework Against Catastrophic Forgetting.
    Zhang B; Guo Y; Li Y; He Y; Wang H; Dai Q
    IEEE Trans Neural Netw Learn Syst; 2022 May; 33(5):2010-2022. PubMed ID: 34339377

  • 18. Sleep-like unsupervised replay reduces catastrophic forgetting in artificial neural networks.
    Tadros T; Krishnan GP; Ramyaa R; Bazhenov M
    Nat Commun; 2022 Dec; 13(1):7742. PubMed ID: 36522325

  • 19. Bio-Inspired Techniques in a Fully Digital Approach for Lifelong Learning.
    Bianchi S; Muñoz-Martin I; Ielmini D
    Front Neurosci; 2020; 14():379. PubMed ID: 32425749

  • 20. Learning offline: memory replay in biological and artificial reinforcement learning.
    Roscow EL; Chua R; Costa RP; Jones MW; Lepora N
    Trends Neurosci; 2021 Oct; 44(10):808-821. PubMed ID: 34481635
