
Preserving Seasonal and Trend Information: A Variational Autoencoder-Latent Space Arithmetic Based Approach for Non-Stationary Learning



Bibliographic Details
Main Authors: Wasswa, Hassan; Nanyonga, Aziida; Lynar, Timothy
Format: Conference Proceeding
Language: English
Online Access: Request full text
Description
Summary: AI models have garnered significant research attention for predictive task automation. However, most models assume a stationary training environment, and such models fail on non-stationary data because they learn a stationary relationship. Existing solutions propose making the data stationary prior to model training and evaluation, which discards trend and seasonal patterns, components that are vital for learning the temporal dependencies of the system under study. This research addresses that limitation by proposing a method that enforces stationary behavior within the latent space while preserving trend and seasonal information. The method deploys differencing, time-series decomposition, and Latent Space Arithmetic (LSA) to learn information vital for efficient approximation of the trend and seasonal components, which is then stored as embeddings within the latent space of a Variational Autoencoder (VAE). The approach's ability to preserve trend and seasonal information was evaluated on two non-stationary time-series datasets. For predictive performance evaluation, four deep learning models were trained on the latent vector representations of the datasets after applying the proposed method, and all models produced results competitive with state-of-the-art techniques using RMSE as the performance metric.
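
The abstract names differencing, time-series decomposition, and latent space arithmetic over VAE embeddings as the main building blocks. The sketch below is not the authors' implementation: the decomposition period, the VAE architecture, and the specific arithmetic (adding mean trend/seasonal embeddings onto the embedding of the differenced, stationary series) are all assumptions made purely to illustrate how such a pipeline could fit together in Python with PyTorch and statsmodels.

# Minimal sketch (assumed design, not the paper's code): differencing,
# seasonal-trend decomposition, a small VAE, and latent space arithmetic.
import numpy as np
import torch
import torch.nn as nn
from statsmodels.tsa.seasonal import seasonal_decompose

# Synthetic non-stationary series: trend + seasonality + noise.
t = np.arange(400, dtype=np.float32)
series = 0.05 * t + 2.0 * np.sin(2 * np.pi * t / 24) + 0.3 * np.random.randn(400).astype(np.float32)

# Differencing removes the trend; decomposition isolates trend/seasonal parts.
diff = np.diff(series)
decomp = seasonal_decompose(series, period=24, model="additive", extrapolate_trend="freq")
trend = decomp.trend.astype(np.float32)
seasonal = decomp.seasonal.astype(np.float32)

WINDOW = 24

class VAE(nn.Module):
    """Small fully connected VAE over fixed-length windows (assumed architecture)."""
    def __init__(self, window=WINDOW, latent=8):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(window, 32), nn.ReLU())
        self.mu = nn.Linear(32, latent)
        self.logvar = nn.Linear(32, latent)
        self.dec = nn.Sequential(nn.Linear(latent, 32), nn.ReLU(), nn.Linear(32, window))

    def encode(self, x):
        h = self.enc(x)
        return self.mu(h), self.logvar(h)

    def forward(self, x):
        mu, logvar = self.encode(x)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        return self.dec(z), mu, logvar

def windows(x, w=WINDOW):
    # Slice the series into overlapping fixed-length windows.
    return torch.tensor(np.stack([x[i:i + w] for i in range(len(x) - w)]))

vae = VAE()
opt = torch.optim.Adam(vae.parameters(), lr=1e-3)
x_diff = windows(diff)  # stationary (differenced) windows
for _ in range(200):    # brief training loop
    recon, mu, logvar = vae(x_diff)
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    loss = nn.functional.mse_loss(recon, x_diff) + 1e-3 * kl
    opt.zero_grad()
    loss.backward()
    opt.step()

# Latent space arithmetic (assumed form): add the mean embeddings of the trend
# and seasonal components onto the stationary embeddings, so downstream models
# receive latent vectors that still carry trend and seasonal information.
with torch.no_grad():
    z_stationary, _ = vae.encode(x_diff)
    z_trend, _ = vae.encode(windows(trend))
    z_seasonal, _ = vae.encode(windows(seasonal))
    z_combined = z_stationary + z_trend.mean(0) + z_seasonal.mean(0)
print(z_combined.shape)  # latent features for the downstream predictive models

In the paper, the resulting latent vectors feed four deep learning predictors evaluated with RMSE; the combination rule shown here is only one plausible reading of "latent space arithmetic".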
ISSN: 2642-6102
DOI: 10.1109/TENSYMP61132.2024.10752272