Revolutionizing Time Series Data Preprocessing with a Novel Cycling Layer in Self-Attention Mechanisms

Bibliographic Details
Published in: Applied Sciences 2024-10, Vol. 14 (19), p. 8922
Main Authors: Chen, Jiyan; Yang, Zijiang
Format: Article
Language:English
Description
Summary: This paper introduces an innovative method for enhancing time series data preprocessing by integrating a cycling layer into a self-attention mechanism. Traditional approaches often fail to capture the cyclical patterns inherent in time series data, which limits predictive model accuracy. The proposed method aims to improve a model's ability to identify and leverage these cyclical patterns, as demonstrated using the Jena Climate dataset from the Max Planck Institute for Biogeochemistry. Empirical results show that the proposed method improves forecast accuracy and speeds up model fitting compared to conventional techniques. This paper contributes to the field of time series analysis by providing a more effective preprocessing approach.
ISSN: 2076-3417
DOI: 10.3390/app14198922
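
The summary above describes a cycling layer placed ahead of self-attention so the model can exploit known periodicities (e.g. the daily and yearly cycles in the Jena Climate data). The paper's exact layer definition is not given in this record, so the following is only a minimal PyTorch sketch of one plausible reading: sin/cos phase features for assumed cycle lengths are appended to each time step before a standard multi-head self-attention block. All class names, cycle lengths, and dimensions here are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a "cycling layer" feeding self-attention.
# Cycle lengths, names, and dimensions are assumptions for illustration only.
import torch
import torch.nn as nn


class CyclingLayer(nn.Module):
    """Appends sin/cos phase features for each configured cycle length."""

    def __init__(self, cycle_lengths=(144, 52560)):
        # Assumed 10-minute sampling: 144 steps per day, ~52560 per year.
        super().__init__()
        self.cycle_lengths = cycle_lengths

    def forward(self, x):  # x: (batch, seq_len, features)
        batch, seq_len, _ = x.shape
        t = torch.arange(seq_len, device=x.device, dtype=x.dtype)
        phases = []
        for period in self.cycle_lengths:
            angle = 2 * torch.pi * t / period
            phases += [torch.sin(angle), torch.cos(angle)]
        # (seq_len, 2 * n_cycles) broadcast across the batch dimension.
        cyc = torch.stack(phases, dim=-1).expand(batch, -1, -1)
        return torch.cat([x, cyc], dim=-1)


class CyclicSelfAttention(nn.Module):
    """Cycling layer -> linear projection -> multi-head self-attention."""

    def __init__(self, n_features, d_model=64, n_heads=4,
                 cycle_lengths=(144, 52560)):
        super().__init__()
        self.cycling = CyclingLayer(cycle_lengths)
        self.proj = nn.Linear(n_features + 2 * len(cycle_lengths), d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, x):
        h = self.proj(self.cycling(x))
        out, _ = self.attn(h, h, h)  # self-attention over augmented series
        return out


# Example: a batch of Jena-Climate-like windows
# (14 weather variables, one-day window of 144 ten-minute steps).
x = torch.randn(8, 144, 14)
print(CyclicSelfAttention(n_features=14)(x).shape)  # torch.Size([8, 144, 64])
```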