
Variable Split Convolutional Attention: A novel Deep Learning model applied to the household electric power consumption

Bibliographic Details
Published in:Energy (Oxford) 2023-07, Vol.274, p.127321, Article 127321
Main Authors: Gonçalves, Rui, Ribeiro, Vitor Miguel, Pereira, Fernando Lobo
Format: Article
Language:English
Description
Summary: The accurate prediction of electric power consumption in the residential sector is a desirable action to minimize potential energy losses and maximize social welfare. The goal of this study is to propose a new Deep Learning Neural Network architecture for multivariate time series problems, which includes a novel attention mechanism applied to the Convolutional Long Short-Term Memory Network model. The new attention mechanism is implemented with convolutional layers, splits the data by explanatory variable, incorporates the cyclical segmentation of data by day, and uses causal and roll padding to ensure proper information augmentation before convolutional operations. The output of the attention block is a bi-dimensional context map for each explanatory variable. Considering the Household Electric Power Consumption data set provided by the repository of the University of California at Irvine, the proposed Variable Split Convolutional Attention model is trained, tested, and compared with several alternatives. The main result of this study reveals that the innovative model exhibits the lowest forecasting error.

Highlights:
•Transform a 2D input map into a 3D input map for cyclic segmentation.
•Roll padding for multivariate time-series analysis.
•2D convolutional attention inside the attention block.
•2D maps of attention weights generated per variable.
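The cyclic segmentation and padding steps described in the summary can be sketched as follows. This is a minimal illustration of the general idea only, not the authors' implementation: the function names, the hourly sampling rate, and the padding widths are assumptions; the paper's data set is minute-level and its padding sizes depend on the convolution kernels used.

```python
import numpy as np

def cyclic_segment(series, steps_per_day):
    """Reshape a 1-D series for one explanatory variable into a
    2-D (days x steps-per-day) map, i.e. the cyclic segmentation by day."""
    n_days = len(series) // steps_per_day
    return series[: n_days * steps_per_day].reshape(n_days, steps_per_day)

def causal_pad(x, pad):
    """Zero-pad at the start of the day axis, so a convolution over
    consecutive days never sees future days (causal padding)."""
    return np.pad(x, ((pad, 0), (0, 0)), mode="constant")

def roll_pad(x, pad):
    """Wrap-around padding along the intra-day axis: late-night samples
    precede early-morning ones, reflecting the daily cycle (roll padding)."""
    return np.concatenate([x[:, -pad:], x, x[:, :pad]], axis=1)

# One week of hypothetical hourly readings for a single variable.
series = np.arange(7 * 24, dtype=float)
daymap = cyclic_segment(series, steps_per_day=24)    # shape (7, 24)
padded = roll_pad(causal_pad(daymap, pad=1), pad=1)  # shape (8, 26)
```

Applied per explanatory variable, each such padded 2-D map would then feed the convolutional layers of the attention block, which the paper says produces a bi-dimensional context map per variable.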
ISSN:0360-5442
DOI:10.1016/j.energy.2023.127321