
Time Series Forecasting Based on Convolution Transformer

Na WANG, Xianglian ZHAO


Summary:

Time series forecasting is essential in many real-world fields. Recent studies have shown that the Transformer offers advantages for such problems, particularly for long-sequence inputs and long-horizon forecasting. To improve the Transformer's efficiency and local stability, these studies combine the Transformer with CNNs of various structures. However, previous Transformer-based time series forecasting models do not fully exploit CNNs, and the two have not been combined effectively. To address this problem, we propose a time series forecasting algorithm based on a convolution Transformer. (1) ES attention mechanism: external attention is combined with the traditional self-attention mechanism through a two-branch network, which reduces the computational cost of self-attention and yields higher forecasting accuracy. (2) Frequency enhanced block: a frequency enhanced block is added before the ES attention module to capture important structures in the time series through frequency-domain mapping. (3) Causal dilated convolution: the self-attention modules are connected by replacing the standard convolution layer with a causal dilated convolution layer, which provides an exponentially growing receptive field without additional computational cost. (4) Multi-layer feature fusion: the outputs of different self-attention modules are extracted, and convolutional layers adjust the feature-map sizes for fusion, yielding finer-grained feature information at negligible computational cost.
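The paper's network itself is not reproduced here, but the causal dilated convolution in contribution (3) can be illustrated with a minimal NumPy sketch (function names, the two-tap kernels, and the dilation schedule below are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def causal_dilated_conv1d(x, weights, dilation):
    """1-D causal dilated convolution: the output at time t depends only on
    inputs at t, t-d, t-2d, ... (no future leakage).
    x: (T,) input sequence; weights: (k,) kernel, weights[0] on the oldest tap;
    dilation: tap spacing d."""
    k = len(weights)
    T = len(x)
    pad = dilation * (k - 1)          # left-pad so output keeps length T
    xp = np.concatenate([np.zeros(pad), x])
    out = np.empty(T)
    for t in range(T):
        # taps at t-(k-1)d, ..., t-d, t in the original time axis
        taps = xp[t + pad - dilation * np.arange(k - 1, -1, -1)]
        out[t] = taps @ weights
    return out

def receptive_field(kernel_size, num_layers):
    """Stacking layers with dilations 1, 2, 4, ..., 2**(L-1) grows the
    receptive field exponentially: r = 1 + (k - 1) * (2**L - 1)."""
    return 1 + (kernel_size - 1) * (2**num_layers - 1)
```

With kernel size 2, three stacked layers already see 8 time steps, while per-layer cost stays the same as an undilated convolution, which is the "exponential receptive field without extra computation" property the paper exploits.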
Experiments on real-world datasets show that the proposed forecasting network greatly improves the real-time forecasting performance of the current state-of-the-art Transformer models while incurring significantly lower computation and memory costs. Compared with previous algorithms, the proposed algorithm achieves greater improvements in both effectiveness and forecasting accuracy.
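The lower computation and memory cost comes largely from the external-attention branch of the ES attention mechanism: instead of tokens attending to each other (cost quadratic in sequence length T), tokens attend to a small learnable memory of S slots, giving O(T·S) cost. A minimal NumPy sketch of that idea (simplified to a single softmax normalization; the original external-attention formulation uses a double normalization, and all names here are illustrative):

```python
import numpy as np

def softmax(a, axis):
    a = a - a.max(axis=axis, keepdims=True)
    e = np.exp(a)
    return e / e.sum(axis=axis, keepdims=True)

def external_attention(x, Mk, Mv):
    """Simplified external attention.
    x:  (T, d) input sequence of T tokens.
    Mk, Mv: (S, d) learnable external key/value memories, shared across inputs.
    Cost is O(T*S*d), linear in T, versus O(T*T*d) for self-attention."""
    attn = x @ Mk.T                # (T, S): similarity of each token to each slot
    attn = softmax(attn, axis=1)   # normalize over the S memory slots
    return attn @ Mv               # (T, d): mix memory values per token
```

Because S is a fixed small constant, doubling the input length only doubles the attention cost, which matches the long-sequence setting the paper targets.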

Publication
IEICE TRANSACTIONS on Information Vol.E106-D No.5 pp.976-985
Publication Date
2023/05/01
Publicized
2023/02/15
Online ISSN
1745-1361
DOI
10.1587/transinf.2022EDP7136
Type of Manuscript
PAPER
Category
Fundamentals of Information Systems

Authors

Na WANG
  Nanjing University of Aeronautics and Astronautics; Nanjing Audit University Jinshen College
Xianglian ZHAO
  Nanjing University of Aeronautics and Astronautics

Keyword