Paper Details

Abstract

Spiking Neural Networks (SNNs) offer energy-efficient and biologically plausible alternatives to conventional Artificial Neural Networks (ANNs). The Spike-driven Transformer architecture integrates SNN efficiency with Transformer-based feature extraction, achieving competitive results in image classification. However, its reliance on the Leaky Integrate-and-Fire (LIF) neuron introduces computational overhead due to the leak mechanism. This work investigates alternative neuron models—IF Hard Reset and IF Soft Reset—which remove the leak dynamics to improve efficiency. We conduct systematic evaluations on the CIFAR-10 dataset, analyzing accuracy, inference speed, and spike activity patterns across the different neuron models. Experimental results show that IF Soft Reset achieves the highest accuracy (94.53%) and fastest inference speed (1323.4 FPS, 12.09 ms latency), outperforming IF Hard Reset (94.44%, 1244.2 FPS) and LIF (94.34%, 1161.2 FPS). The improvement is attributed to its gradual reset behavior, which preserves residual excitation and enhances temporal processing. These findings offer practical design guidelines for deploying efficient spike-based Transformers in resource-constrained environments.
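The contrast between the two reset modes can be sketched as follows. This is a minimal illustration of generic IF neuron dynamics, not the paper's implementation; the function name, threshold value, and variable names are assumptions:

```python
import numpy as np

def if_step(v, x, v_th=1.0, soft_reset=True):
    """One timestep of an Integrate-and-Fire neuron (no leak term).

    v: membrane potential, x: input current, v_th: firing threshold.
    Hard reset clamps the potential to zero after a spike; soft reset
    subtracts v_th, preserving any residual excitation above threshold.
    """
    v = v + x                           # integrate input (no leak)
    spike = (v >= v_th).astype(v.dtype) # emit spike where threshold crossed
    if soft_reset:
        v = v - spike * v_th            # subtract threshold: residual kept
    else:
        v = v * (1.0 - spike)           # hard reset: potential set to zero
    return v, spike
```

For an input of 1.5 with threshold 1.0, both modes fire, but soft reset leaves a residual potential of 0.5 for the next timestep, whereas hard reset discards it; this residual is what the abstract refers to as preserved excitation.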

Keywords
Spiking Neural Networks · Spiking Neural Models · Spike-driven Transformer · Integrate-and-Fire (IF) Model.
Contact Information
Ngu Cong Viet Huynh (Corresponding Author)
FPT University, Vietnam
0968683264

All Authors (3)

Tri Minh Nguyen

Affiliation: FPT University

Country: Vietnam

Email: chinjsu130205@gmail.com

Phone: 0785339955

Khang Vuong Huynh

Affiliation: FPT University

Country: Vietnam

Email: khanghvse184160@fpt.edu.vn

Phone: 0989427452

Ngu Cong Viet Huynh (Corresponding Author)

Affiliation: FPT University

Country: Vietnam

Email: nguhcv@fe.edu.vn

Phone: 0968683264