125 related articles for article (PubMed ID: 38941738)

  • 1. Self-architectural knowledge distillation for spiking neural networks.
    Qiu H; Ning M; Song Z; Fang W; Chen Y; Sun T; Ma Z; Yuan L; Tian Y
    Neural Netw; 2024 Jun; 178():106475. PubMed ID: 38941738

  • 2. A universal ANN-to-SNN framework for achieving high accuracy and low latency deep Spiking Neural Networks.
    Wang Y; Liu H; Zhang M; Luo X; Qu H
    Neural Netw; 2024 Jun; 174():106244. PubMed ID: 38508047

  • 3. Rethinking Pretraining as a Bridge From ANNs to SNNs.
    Lin Y; Hu Y; Ma S; Yu D; Li G
    IEEE Trans Neural Netw Learn Syst; 2024 Jul; 35(7):9054-9067. PubMed ID: 36374892

  • 4. Training much deeper spiking neural networks with a small number of time-steps.
    Meng Q; Yan S; Xiao M; Wang Y; Lin Z; Luo ZQ
    Neural Netw; 2022 Sep; 153():254-268. PubMed ID: 35759953

  • 5. Quantization Framework for Fast Spiking Neural Networks.
    Li C; Ma L; Furber S
    Front Neurosci; 2022; 16():918793. PubMed ID: 35928011

  • 6. IDSNN: Towards High-Performance and Low-Latency SNN Training via Initialization and Distillation.
    Fan X; Zhang H; Zhang Y
    Biomimetics (Basel); 2023 Aug; 8(4):. PubMed ID: 37622980

  • 7. Rethinking the performance comparison between SNNs and ANNs.
    Deng L; Wu Y; Hu X; Liang L; Ding Y; Li G; Zhao G; Li P; Xie Y
    Neural Netw; 2020 Jan; 121():294-307. PubMed ID: 31586857

  • 8. SSTDP: Supervised Spike Timing Dependent Plasticity for Efficient Spiking Neural Network Training.
    Liu F; Zhao W; Chen Y; Wang Z; Yang T; Jiang L
    Front Neurosci; 2021; 15():756876. PubMed ID: 34803591

  • 9. Low-Latency Spiking Neural Networks Using Pre-Charged Membrane Potential and Delayed Evaluation.
    Hwang S; Chang J; Oh MH; Min KK; Jang T; Park K; Yu J; Lee JH; Park BG
    Front Neurosci; 2021; 15():629000. PubMed ID: 33679308

  • 10. Fast-SNN: Fast Spiking Neural Network by Converting Quantized ANN.
    Hu Y; Zheng Q; Jiang X; Pan G
    IEEE Trans Pattern Anal Mach Intell; 2023 Dec; 45(12):14546-14562. PubMed ID: 37721891

  • 11. High-accuracy deep ANN-to-SNN conversion using quantization-aware training framework and calcium-gated bipolar leaky integrate and fire neuron.
    Gao H; He J; Wang H; Wang T; Zhong Z; Yu J; Wang Y; Tian M; Shi C
    Front Neurosci; 2023; 17():1141701. PubMed ID: 36968504

  • 12. Spiking Deep Residual Networks.
    Hu Y; Tang H; Pan G
    IEEE Trans Neural Netw Learn Syst; 2023 Aug; 34(8):5200-5205. PubMed ID: 34723807

  • 13. Efficient Processing of Spatio-Temporal Data Streams With Spiking Neural Networks.
    Kugele A; Pfeil T; Pfeiffer M; Chicca E
    Front Neurosci; 2020; 14():439. PubMed ID: 32431592

  • 14. Attention Spiking Neural Networks.
    Yao M; Zhao G; Zhang H; Hu Y; Deng L; Tian Y; Xu B; Li G
    IEEE Trans Pattern Anal Mach Intell; 2023 Aug; 45(8):9393-9410. PubMed ID: 37022261

  • 15. Spiking neural networks fine-tuning for brain image segmentation.
    Yue Y; Baltes M; Abuhajar N; Sun T; Karanth A; Smith CD; Bihl T; Liu J
    Front Neurosci; 2023; 17():1267639. PubMed ID: 38027484

  • 16. Toward High-Accuracy and Low-Latency Spiking Neural Networks With Two-Stage Optimization.
    Wang Z; Zhang Y; Lian S; Cui X; Yan R; Tang H
    IEEE Trans Neural Netw Learn Syst; 2023 Dec; PP():. PubMed ID: 38100345

  • 17. Direct training high-performance spiking neural networks for object recognition and detection.
    Zhang H; Li Y; He B; Fan X; Wang Y; Zhang Y
    Front Neurosci; 2023; 17():1229951. PubMed ID: 37614339

  • 18. LDD: High-Precision Training of Deep Spiking Neural Network Transformers Guided by an Artificial Neural Network.
    Liu Y; Zhao C; Jiang Y; Fang Y; Chen F
    Biomimetics (Basel); 2024 Jul; 9(7):. PubMed ID: 39056854

  • 19. STCA-SNN: self-attention-based temporal-channel joint attention for spiking neural networks.
    Wu X; Song Y; Zhou Y; Jiang Y; Bai Y; Li X; Yang X
    Front Neurosci; 2023; 17():1261543. PubMed ID: 38027490

  • 20. Electrocardiography Classification with Leaky Integrate-and-Fire Neurons in an Artificial Neural Network-Inspired Spiking Neural Network Framework.
    Rana A; Kim KK
    Sensors (Basel); 2024 May; 24(11):. PubMed ID: 38894215
