LAI Jianxiang1, HUANG Fangwan1,2,3, WU Yuezhong1,2,3, YU Zhiyong1,2,3
1. College of Computer and Data Science, Fuzhou University, Fuzhou 350108; 2. Engineering Research Center of Big Data Intelligence of Ministry of Education, Fuzhou University, Fuzhou 350108; 3. Fujian Key Laboratory of Network Computing and Intelligent Information Processing, Fuzhou University, Fuzhou 350108
Abstract: Spiking neural networks face several challenges in time series prediction: the difficulty of balancing computational efficiency and information-processing capability in temporal mapping, the insufficient scalability of parallel spiking neurons at high time steps, and a tendency to fall into local optima. To address these issues, a high-time-step-friendly slice parallel spiking neuron (HSPSN) is proposed in this paper. First, a direct temporal mapping method is employed to achieve one-to-one matching of time steps. Then, a slice parallel mechanism is designed to integrate local and global temporal patterns at the neuronal level through the synergy of local and global slices. Finally, a constriction-matrix random dropout strategy is adopted to effectively guide the neurons toward superior convergence. Experiments on seven real-world time series prediction datasets demonstrate that HSPSN significantly outperforms existing spiking neural networks in prediction accuracy, energy efficiency and convergence stability, and that it effectively captures complex spatiotemporal dependencies in multivariate time series and covariate-based time series.
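To make the slice parallel idea concrete, the sketch below shows a minimal parallel spiking neuron that computes all time steps' membrane potentials at once through a masked temporal mixing matrix, combining a local slice (a causal band over recent steps) with a global slice (sparse long-range taps). This is an illustrative assumption of how local/global slices might be composed, not the paper's actual HSPSN formulation; the function name, the choice of `local_window`, and the sparse-tap pattern for the global slice are all hypothetical.

```python
import numpy as np

def slice_parallel_spiking_neuron(x, W, threshold=1.0, local_window=4):
    """Hypothetical sketch of a slice-parallel spiking neuron.

    x: input currents, shape (T, N)  (T time steps, N features)
    W: learnable temporal mixing matrix, shape (T, T)
    The local slice is a causal band of width `local_window`;
    the global slice adds sparse long-range taps (assumption).
    """
    T = x.shape[0]
    idx = np.arange(T)
    # Causal mask: step t may only depend on steps <= t.
    causal = np.tril(np.ones((T, T), dtype=bool))
    # Local slice: restrict to the most recent `local_window` steps.
    local = causal & (idx[None, :] > idx[:, None] - local_window)
    # Global slice: sparse taps every `local_window` steps (illustrative choice).
    global_taps = causal & (idx[None, :] % local_window == 0)
    mask = local | global_taps
    # All time steps are computed in parallel: no sequential membrane update.
    H = (W * mask) @ x
    # Heaviside firing: emit a spike wherever the potential crosses threshold.
    return (H >= threshold).astype(np.float32)

rng = np.random.default_rng(0)
x = rng.normal(size=(8, 3))          # 8 time steps, 3 features
W = rng.normal(size=(8, 8))
spikes = slice_parallel_spiking_neuron(x, W)
print(spikes.shape)                  # (8, 3), binary spike train
```

Because the masked matrix product replaces the step-by-step membrane recursion of a sequential LIF neuron, the whole spike train is produced in one parallel operation, which is what makes such neurons scale to high time steps.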