156 related articles for article (PubMed ID: 36033624)
1. EnforceSNN: Enabling resilient and energy-efficient spiking neural network inference considering approximate DRAMs for embedded systems. Putra RVW; Hanif MA; Shafique M. Front Neurosci; 2022; 16:937782. PubMed ID: 36033624
2. RescueSNN: enabling reliable executions on spiking neural network accelerators under permanent faults. Putra RVW; Hanif MA; Shafique M. Front Neurosci; 2023; 17:1159440. PubMed ID: 37123371
3. SpQuant-SNN: ultra-low precision membrane potential with sparse activations unlock the potential of on-device spiking neural networks applications. Hasssan A; Meng J; Anupreetham A; Seo JS. Front Neurosci; 2024; 18:1440000. PubMed ID: 39296710
4. SNN4Agents: a framework for developing energy-efficient embodied spiking neural networks for autonomous agents. Putra RVW; Marchisio A; Shafique M. Front Robot AI; 2024; 11:1401677. PubMed ID: 39131197
5. A Tandem Learning Rule for Effective Training and Rapid Inference of Deep Spiking Neural Networks. Wu J; Chua Y; Zhang M; Li G; Li H; Tan KC. IEEE Trans Neural Netw Learn Syst; 2023 Jan; 34(1):446-460. PubMed ID: 34288879
6. A Little Energy Goes a Long Way: Build an Energy-Efficient, Accurate Spiking Neural Network From Convolutional Neural Network. Wu D; Yi X; Huang X. Front Neurosci; 2022; 16:759900. PubMed ID: 35692427
7. SSTDP: Supervised Spike Timing Dependent Plasticity for Efficient Spiking Neural Network Training. Liu F; Zhao W; Chen Y; Wang Z; Yang T; Jiang L. Front Neurosci; 2021; 15:756876. PubMed ID: 34803591
8. High-accuracy deep ANN-to-SNN conversion using quantization-aware training framework and calcium-gated bipolar leaky integrate and fire neuron. Gao H; He J; Wang H; Wang T; Zhong Z; Yu J; Wang Y; Tian M; Shi C. Front Neurosci; 2023; 17:1141701. PubMed ID: 36968504
9. SmartDeal: Remodeling Deep Network Weights for Efficient Inference and Training. Chen X; Zhao Y; Wang Y; Xu P; You H; Li C; Fu Y; Lin Y; Wang Z. IEEE Trans Neural Netw Learn Syst; 2023 Oct; 34(10):7099-7113. PubMed ID: 35235521
10. SPIDEN: deep Spiking Neural Networks for efficient image denoising. Castagnetti A; Pegatoquet A; Miramond B. Front Neurosci; 2023; 17:1224457. PubMed ID: 37638316
12. An FPGA implementation of Bayesian inference with spiking neural networks. Li H; Wan B; Fang Y; Li Q; Liu JK; An L. Front Neurosci; 2023; 17:1291051. PubMed ID: 38249589
13. ACE-SNN: Algorithm-Hardware Co-design of Energy-Efficient & Low-Latency Deep Spiking Neural Networks for 3D Image Recognition. Datta G; Kundu S; Jaiswal AR; Beerel PA. Front Neurosci; 2022; 16:815258. PubMed ID: 35464314
14. A Scatter-and-Gather Spiking Convolutional Neural Network on a Reconfigurable Neuromorphic Hardware. Zou C; Cui X; Kuang Y; Liu K; Wang Y; Wang X; Huang R. Front Neurosci; 2021; 15:694170. PubMed ID: 34867142
15. Fast-SNN: Fast Spiking Neural Network by Converting Quantized ANN. Hu Y; Zheng Q; Jiang X; Pan G. IEEE Trans Pattern Anal Mach Intell; 2023 Dec; 45(12):14546-14562. PubMed ID: 37721891
16. Digital design of a spatial-pow-STDP learning block with high accuracy utilizing pow CORDIC for large-scale image classifier spatiotemporal SNN. Bahrami MK; Nazari S. Sci Rep; 2024 Feb; 14(1):3388. PubMed ID: 38337032
17. On-Chip Training Spiking Neural Networks Using Approximated Backpropagation With Analog Synaptic Devices. Kwon D; Lim S; Bae JH; Lee ST; Kim H; Seo YT; Oh S; Kim J; Yeom K; Park BG; Lee JH. Front Neurosci; 2020; 14:423. PubMed ID: 32733180
18. A Soft-Pruning Method Applied During Training of Spiking Neural Networks for In-memory Computing Applications. Shi Y; Nguyen L; Oh S; Liu X; Kuzum D. Front Neurosci; 2019; 13:405. PubMed ID: 31080402
19. Toward High-Accuracy and Low-Latency Spiking Neural Networks With Two-Stage Optimization. Wang Z; Zhang Y; Lian S; Cui X; Yan R; Tang H. IEEE Trans Neural Netw Learn Syst; 2023 Dec (epub ahead of print). PubMed ID: 38100345
20. A Noise-Based Novel Strategy for Faster SNN Training. Jiang C; Zhang Y. Neural Comput; 2023 Aug; 35(9):1593-1608. PubMed ID: 37437192