138 related articles for article (PubMed ID: 37839332)

  • 1. A biologically inspired architecture with switching units can learn to generalize across backgrounds.
    Voina D; Shea-Brown E; Mihalas S
    Neural Netw; 2023 Nov; 168():615-630. PubMed ID: 37839332

  • 2. Beneficial Perturbation Network for Designing General Adaptive Artificial Intelligence Systems.
    Wen S; Rios A; Ge Y; Itti L
    IEEE Trans Neural Netw Learn Syst; 2022 Aug; 33(8):3778-3791. PubMed ID: 33596177

  • 3. Triple-Memory Networks: A Brain-Inspired Method for Continual Learning.
    Wang L; Lei B; Li Q; Su H; Zhu J; Zhong Y
    IEEE Trans Neural Netw Learn Syst; 2022 May; 33(5):1925-1934. PubMed ID: 34529579

  • 4. Overcoming Long-Term Catastrophic Forgetting Through Adversarial Neural Pruning and Synaptic Consolidation.
    Peng J; Tang B; Jiang H; Li Z; Lei Y; Lin T; Li H
    IEEE Trans Neural Netw Learn Syst; 2022 Sep; 33(9):4243-4256. PubMed ID: 33577459

  • 5. Representations and generalization in artificial and brain neural networks.
    Li Q; Sorscher B; Sompolinsky H
    Proc Natl Acad Sci U S A; 2024 Jul; 121(27):e2311805121. PubMed ID: 38913896

  • 6. Memory Recall: A Simple Neural Network Training Framework Against Catastrophic Forgetting.
    Zhang B; Guo Y; Li Y; He Y; Wang H; Dai Q
    IEEE Trans Neural Netw Learn Syst; 2022 May; 33(5):2010-2022. PubMed ID: 34339377

  • 7. Single Circuit in V1 Capable of Switching Contexts During Movement Using an Inhibitory Population as a Switch.
    Voina D; Recanatesi S; Hu B; Shea-Brown E; Mihalas S
    Neural Comput; 2022 Feb; 34(3):541-594. PubMed ID: 35016220

  • 8. Engineering a Less Artificial Intelligence.
    Sinz FH; Pitkow X; Reimer J; Bethge M; Tolias AS
    Neuron; 2019 Sep; 103(6):967-979. PubMed ID: 31557461

  • 9. Bio-inspired, task-free continual learning through activity regularization.
    Lässig F; Aceituno PV; Sorbaro M; Grewe BF
    Biol Cybern; 2023 Oct; 117(4-5):345-361. PubMed ID: 37589728

  • 10. Avoiding Catastrophe: Active Dendrites Enable Multi-Task Learning in Dynamic Environments.
    Iyer A; Grewal K; Velu A; Souza LO; Forest J; Ahmad S
    Front Neurorobot; 2022; 16():846219. PubMed ID: 35574225

  • 11. A neural circuit model for a contextual association task inspired by recommender systems.
    Zhu H; Paschalidis IC; Chang A; Stern CE; Hasselmo ME
    Hippocampus; 2020 Apr; 30(4):384-395. PubMed ID: 32057161

  • 12. Neural state space alignment for magnitude generalization in humans and recurrent networks.
    Sheahan H; Luyckx F; Nelli S; Teupe C; Summerfield C
    Neuron; 2021 Apr; 109(7):1214-1226.e8. PubMed ID: 33626322

  • 13. LwF-ECG: Learning-without-forgetting approach for electrocardiogram heartbeat classification based on memory with task selector.
    Ammour N; Alhichri H; Bazi Y; Alajlan N
    Comput Biol Med; 2021 Oct; 137():104807. PubMed ID: 34496312

  • 14. The role of capacity constraints in Convolutional Neural Networks for learning random versus natural data.
    Tsvetkov C; Malhotra G; Evans BD; Bowers JS
    Neural Netw; 2023 Apr; 161():515-524. PubMed ID: 36805266

  • 15. Alleviating catastrophic forgetting using context-dependent gating and synaptic stabilization.
    Masse NY; Grant GD; Freedman DJ
    Proc Natl Acad Sci U S A; 2018 Oct; 115(44):E10467-E10475. PubMed ID: 30315147

  • 16. Continual learning with attentive recurrent neural networks for temporal data classification.
    Yin SY; Huang Y; Chang TY; Chang SF; Tseng VS
    Neural Netw; 2023 Jan; 158():171-187. PubMed ID: 36459884

  • 17. Generalizing to generalize: Humans flexibly switch between compositional and conjunctive structures during reinforcement learning.
    Franklin NT; Frank MJ
    PLoS Comput Biol; 2020 Apr; 16(4):e1007720. PubMed ID: 32282795

  • 18. Reducing Catastrophic Forgetting With Associative Learning: A Lesson From Fruit Flies.
    Shen Y; Dasgupta S; Navlakha S
    Neural Comput; 2023 Oct; 35(11):1797-1819. PubMed ID: 37725710

  • 19. Can neural networks benefit from objectives that encourage iterative convergent computations? A case study of ResNets and object classification.
    Lippl S; Peters B; Kriegeskorte N
    PLoS One; 2024; 19(3):e0293440. PubMed ID: 38512838

  • 20. Adaptive Progressive Continual Learning.
    Xu J; Ma J; Gao X; Zhu Z
    IEEE Trans Pattern Anal Mach Intell; 2022 Oct; 44(10):6715-6728. PubMed ID: 34232867
