Wei Fang
Peking University | PKU · Department of Computer Science and Technology

Bachelor of Technology

About

13 Publications
4,546 Reads
125 Citations
Education
September 2015 - June 2019
Tsinghua University
Field of study: Department of Automation

Publications (13)
Preprint
Full-text available
In this paper, we propose a new training method, Gradual Replacement, for deep spiking neural networks.
Conference Paper
Full-text available
Deep Spiking Neural Networks (SNNs) present optimization difficulties for gradient-based approaches due to discrete binary activation and complex spatial-temporal dynamics. Considering the huge success of ResNet in deep learning, it would be natural to train deep SNNs with residual learning. Previous Spiking ResNet mimics the standard residual bloc...
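To make the setting concrete, here is a minimal PyTorch sketch of the kind of spiking residual block this line of work studies: a conv-BN-spiking-neuron stack with an identity shortcut. The LIFNeuron and SpikingResidualBlock names, the hard-reset dynamics, and the placement of the shortcut are illustrative assumptions, not the architecture proposed in the paper.

import torch
import torch.nn as nn

class LIFNeuron(nn.Module):
    # Single-step Leaky Integrate-and-Fire neuron with hard reset (illustrative).
    # The hard threshold has zero gradient; real training would substitute a
    # surrogate gradient for the Heaviside step.
    def __init__(self, tau=2.0, v_threshold=1.0):
        super().__init__()
        self.tau = tau
        self.v_threshold = v_threshold
        self.v = 0.0  # membrane potential; reset between input sequences

    def forward(self, x):
        self.v = self.v + (x - self.v) / self.tau      # charge
        spike = (self.v >= self.v_threshold).float()   # fire
        self.v = self.v * (1.0 - spike)                # hard reset
        return spike

class SpikingResidualBlock(nn.Module):
    # conv-BN-neuron stack with an identity shortcut (illustrative layout).
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.sn1 = LIFNeuron()
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.sn2 = LIFNeuron()

    def forward(self, x):
        out = self.sn1(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return self.sn2(out + x)  # shortcut joins before the final neuron

x = torch.rand(1, 16, 8, 8)
print(SpikingResidualBlock(16)(x).shape)  # torch.Size([1, 16, 8, 8])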
Conference Paper
Full-text available
Spiking Neural Networks (SNNs) have attracted enormous research interest due to their temporal information processing capability, low power consumption, and high biological plausibility. However, the formulation of efficient and high-performance learning algorithms for SNNs is still challenging. Most existing learning methods learn weights only, and requ...
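The "weights only" limitation this abstract points at can be lifted by making membrane-related parameters trainable as well. Below is a minimal sketch, assuming the common trick of reparameterizing the inverse time constant through a sigmoid so the decay factor stays in (0, 1); the ParametricLIF name and the hard-reset convention are assumptions for illustration, not the paper's exact formulation.

import torch
import torch.nn as nn

class ParametricLIF(nn.Module):
    # LIF neuron whose membrane time constant is learned jointly with the
    # weights. Illustrative sketch: the inverse time constant 1/tau is
    # parameterized as sigmoid(w), keeping the decay factor in (0, 1).
    def __init__(self, init_w=0.0, v_threshold=1.0):
        super().__init__()
        self.w = nn.Parameter(torch.tensor(init_w))  # unconstrained, trainable
        self.v_threshold = v_threshold

    def forward(self, x_seq):
        # x_seq: (T, ...) input current over T time-steps
        v = torch.zeros_like(x_seq[0])
        spikes = []
        for x in x_seq:
            v = v + torch.sigmoid(self.w) * (x - v)   # charge with learned 1/tau
            spike = (v >= self.v_threshold).float()   # surrogate gradient in practice
            v = v * (1.0 - spike)                     # hard reset
            spikes.append(spike)
        return torch.stack(spikes)

neuron = ParametricLIF()
out = neuron(torch.rand(4, 2, 8))  # 4 time-steps of random input
print(out.shape)                   # torch.Size([4, 2, 8])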
Conference Paper
Full-text available
Spiking Neural Networks (SNNs) have received great attention due to their biological plausibility and high energy efficiency on neuromorphic chips. As these chips are usually resource-constrained, the compression of SNNs is thus crucial for the practical use of SNNs. Most existing methods directly apply pruning approaches in artific...
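For context, this is what "directly applying pruning approaches from artificial neural networks" typically looks like: a magnitude criterion that zeroes the smallest weights, oblivious to any spiking dynamics. A minimal sketch of that generic baseline (not the compression method proposed in the paper):

import torch

def magnitude_prune_mask(weight, sparsity):
    # Generic ANN-style magnitude pruning: zero out the smallest-magnitude
    # fraction of weights. The kind of baseline the abstract refers to.
    k = int(weight.numel() * sparsity)
    if k == 0:
        return torch.ones_like(weight)
    threshold = weight.abs().flatten().kthvalue(k).values
    return (weight.abs() > threshold).float()

w = torch.randn(128, 64)          # e.g. a fully connected SNN layer's weights
mask = magnitude_prune_mask(w, sparsity=0.9)
w_pruned = w * mask               # roughly 10% of connections survive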
Preprint
Full-text available
Converting Artificial Neural Networks to Spiking Neural Networks (ANN2SNN) is a popular method to obtain deep SNNs. Most previous ANN2SNN methods are based on rate coding, which needs many time-steps to establish stable firing rates. In this paper, we propose a novel $\tau$-radix codec method, which uses the Leaky-Integrate-and-Fire spiking neuron...
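The cost of rate coding mentioned here is easy to see numerically: an Integrate-and-Fire neuron driven by a constant input approximates its target activation only as the number of time-steps T grows, with error on the order of 1/T. The snippet below illustrates generic rate coding, not the $\tau$-radix codec proposed in the paper.

import torch

def if_firing_rate(x, T):
    # Firing rate of an Integrate-and-Fire neuron driven by constant input x
    # for T time-steps (threshold 1, soft reset). Generic rate-coding sketch.
    v = torch.zeros_like(x)
    spikes = torch.zeros_like(x)
    for _ in range(T):
        v = v + x
        s = (v >= 1.0).float()
        v = v - s          # soft reset keeps the residual charge
        spikes = spikes + s
    return spikes / T      # approaches clamp(x, 0, 1) as T grows

x = torch.tensor([0.3, 0.7])
for T in (4, 16, 64, 256):
    print(T, if_firing_rate(x, T))  # approximation error shrinks roughly as 1/T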
Preprint
Full-text available
This is just a manuscript.
Preprint
Full-text available
Deep Spiking Neural Networks (SNNs) present optimization difficulties for gradient-based approaches due to discrete binary activation and complex spatial-temporal dynamics. Considering the huge success of ResNet in deep learning, it would be natural to train deep SNNs with residual learning. Previous Spiking ResNet mimics the standard residual bloc...
Preprint
Full-text available
Spiking Neural Networks (SNNs) have received great attention due to their biological plausibility and high energy efficiency on neuromorphic chips. As these chips are usually resource-constrained, the compression of SNNs is thus crucial for the practical use of SNNs. Most existing methods directly apply pruning approaches in artific...
Preprint
Full-text available
Deep Spiking Neural Networks (SNNs) are harder to train than ANNs because of their discrete binary activation and error back-propagation through the spatio-temporal domain. Considering the huge success of ResNet in ANNs' deep learning, it is natural to attempt to use residual learning to train deep SNNs. Previous Spiking ResNet used a similar residual block t...
Preprint
Full-text available
The final version of this paper is Optimal ANN-SNN Conversion for High-accuracy and Ultra-low-latency Spiking Neural Networks (https://openreview.net/forum?id=7B3IJMM1k_M).
Code
2021.01.20 update: Error correction in readme.md: T is 10 for N-MNIST, but it was wrongly written as 20 in the running-code examples in readme.md. You can also refer to: https://github.com/fangwei123456/Parametric-Leaky-Integrate-and-Fire-Spiking-Neuron
Preprint
Spiking Neural Networks (SNNs) have attracted enormous research interest due to their temporal information processing capability, low power consumption, and high biological plausibility. However, the formulation of efficient and high-performance learning algorithms for SNNs is still challenging. Most existing learning methods learn weights only, and requ...
Preprint
Spiking Neural Networks (SNNs) have attracted research interest due to their temporal information processing capability, low power consumption, and high biological plausibility. The Leaky Integrate-and-Fire (LIF) neuron model is one of the most popular spiking neuron models used in SNNs because it achieves a balance between computing cost and biologi...
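For reference, a common discrete-time form of the LIF dynamics (one convention; the preprint's exact formulation may differ) is:

$H[t] = V[t-1] + \frac{1}{\tau}\big(X[t] - (V[t-1] - V_{reset})\big)$
$S[t] = \Theta(H[t] - V_{th})$
$V[t] = H[t](1 - S[t]) + V_{reset}\,S[t]$

where $X[t]$ is the input current, $H[t]$ the membrane potential after charging, $\Theta$ the Heaviside step function, $\tau$ the membrane time constant, $V_{th}$ the firing threshold, $V_{reset}$ the reset potential, and $S[t]$ the output spike at time-step $t$.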
