
Multi-step ahead time series forecasting via sparse coding and dictionary based techniques

Bibliographic Details
Published in: Applied Soft Computing, 2018-08, Vol. 69, pp. 464-474
Main Authors: Helmi, Ahmed; Fakhr, Mohamed W.; Atiya, Amir F.
Format: Article
Language: English
Description
Summary: Block diagram for the proposed local learning framework for forecasting using sparse coding. [Display omitted]

Highlights:
• Time series prediction approaches are either global or local. Local techniques are flexible in the face of data characteristics that change over time.
• KNN is the most popular and successful local forecasting technique; however, K has to be fixed, and the local prediction is estimated using a heuristic that combines the targets of the K nearest neighbors.
• On the other hand, machine learning and statistical techniques require long training and are harder to adapt to changing data characteristics.
• This paper proposes the use of sparse coding to perform local predictions, where for each test vector a variable number of training atoms is used and their combining weights are estimated optimally.
• Many sparse coding formulations are tested, and the application of sparse coding to multi-step-ahead forecasting is presented.
• Several experiments are performed using the M3 data set, an extensive data set consisting of 1045 time series, commonly used as a standard benchmark in the multi-step-ahead time series forecasting literature.
• Comparisons are made between the proposed approach and many state-of-the-art techniques such as neural networks, support vector regression, decision trees, and Gaussian processes, as well as statistical techniques.
• The comparisons show that sparse coding with the Lasso and Elastic Net formulations gives better results than most of the other techniques on the M3 competition data sets, particularly for longer horizons.

Abstract: Sparse coding is based on the concept of having a large dictionary of candidate basis vectors. Any given vector is expressed as a sparse linear combination of the dictionary vectors. The approach was developed in the signal processing field and has many applications in data compression and image processing. In this paper we propose applying sparse coding to time series forecasting. Specifically, the paper investigates different dictionary-based local learning techniques for building predictive models for the time series forecasting problem. The proposed methodology is based on a local learning framework whereby the query point is embedded and coded as a sparse combination of the training dictionary atoms (vectors). This embedding is then used to estimate the target value of the query point by applying the same embedding to the target vectors of the dictionary training atoms. We present an expe…
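As a rough illustration of the local learning scheme described in the abstract, the following is a minimal sketch in Python, assuming scikit-learn's Lasso as the sparse coder; the window length, forecast horizon, penalty value, and function names are illustrative choices, not settings taken from the paper. Each dictionary atom is a lag window drawn from the training series and its target vector holds the subsequent values; the query (the most recent lag window) is coded as a sparse combination of atoms, and the same weights are applied to the atoms' target vectors to produce the multi-step forecast.

```python
import numpy as np
from sklearn.linear_model import Lasso

def build_dictionary(series, window, horizon):
    """Slide over the series: each atom is a lag window, its target the next `horizon` values."""
    atoms, targets = [], []
    for t in range(len(series) - window - horizon + 1):
        atoms.append(series[t:t + window])
        targets.append(series[t + window:t + window + horizon])
    # Columns are atoms and their corresponding target vectors.
    return np.array(atoms).T, np.array(targets).T

def sparse_code_forecast(series, window=12, horizon=6, alpha=0.01):
    D, T = build_dictionary(series, window, horizon)
    query = series[-window:]                          # most recent lag window
    # Code the query as a sparse combination of dictionary atoms (Lasso / L1 penalty).
    coder = Lasso(alpha=alpha, fit_intercept=False, max_iter=10000)
    coder.fit(D, query)
    weights = coder.coef_                             # sparse weights, one per atom
    # Apply the same sparse weights to the atoms' target vectors -> multi-step forecast.
    return T @ weights

# Toy usage: forecast 6 steps ahead of a noisy seasonal series.
rng = np.random.default_rng(0)
y = np.sin(np.arange(200) * 2 * np.pi / 12) + 0.1 * rng.standard_normal(200)
print(sparse_code_forecast(y, window=12, horizon=6))
```

Swapping Lasso for scikit-learn's ElasticNet in the same sketch would correspond to the Elastic Net formulation mentioned in the highlights; the paper itself compares several such sparse coding formulations.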
ISSN: 1568-4946, 1872-9681
DOI: 10.1016/j.asoc.2018.04.017