
Missing-Insensitive Short-Term Load Forecasting Leveraging Autoencoder and LSTM

Bibliographic Details
Published in: IEEE Access, 2020, Vol. 8, p. 206039-206048
Main Authors: Park, Kyungnam; Jeong, Jaeik; Kim, Dongjoo; Kim, Hongseok
Format: Article
Language: English
Description
Summary: In most deep learning-based load forecasting, an intact dataset is required. Since many real-world datasets contain missing values for various reasons, missing-value imputation using deep learning is actively studied. However, missing imputation and load forecasting have been considered independently so far. In this article, we provide a deep learning framework that jointly considers missing imputation and load forecasting. We consider a family of autoencoder/long short-term memory (LSTM) combined models for missing-insensitive load forecasting. Specifically, the autoencoder (AE), denoising autoencoder (DAE), convolutional autoencoder (CAE), and denoising convolutional autoencoder (DCAE) are considered for feature extraction, and their encoded outputs are fed into the LSTM input. Our experiments show that the proposed DCAE/LSTM combined model significantly improves forecasting accuracy over the baseline LSTM, regardless of the missing rate or missing type (random missing or consecutive block missing).
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2020.3036885
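
As a rough illustration of the encoder-then-LSTM pipeline described in the summary, the sketch below encodes each (possibly missing-corrupted) daily load profile with a small 1-D convolutional encoder and feeds the sequence of encoded vectors into an LSTM forecaster. All layer sizes, sequence lengths, and names (ConvEncoder, EncoderLSTMForecaster) are illustrative assumptions, not taken from the paper.

# Hypothetical sketch of the DCAE/LSTM idea: a convolutional encoder compresses
# each daily load profile (missing entries assumed zero-filled, denoising-style),
# and an LSTM forecasts the next profile from the sequence of encoded features.
# Sizes and names are assumptions for illustration only.
import torch
import torch.nn as nn

class ConvEncoder(nn.Module):
    """1-D convolutional encoder for a single load profile (e.g. 24 hourly points)."""
    def __init__(self, profile_len=24, latent_dim=8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool1d(2),                              # 24 -> 12
            nn.Conv1d(16, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool1d(2),                              # 12 -> 6
            nn.Flatten(),
            nn.Linear(8 * (profile_len // 4), latent_dim),
        )

    def forward(self, x):                                 # x: (batch, 1, profile_len)
        return self.net(x)                                # (batch, latent_dim)

class EncoderLSTMForecaster(nn.Module):
    """Encodes each past profile, then runs an LSTM over the encoded sequence."""
    def __init__(self, profile_len=24, latent_dim=8, hidden=32):
        super().__init__()
        self.encoder = ConvEncoder(profile_len, latent_dim)
        self.lstm = nn.LSTM(latent_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, profile_len)        # forecast next-day profile

    def forward(self, x):                                 # x: (batch, seq_len, profile_len)
        b, s, p = x.shape
        z = self.encoder(x.reshape(b * s, 1, p)).reshape(b, s, -1)
        out, _ = self.lstm(z)                             # (batch, seq_len, hidden)
        return self.head(out[:, -1])                      # forecast from last hidden state

# Example: 7 past daily profiles per sample, missing values zero-filled upstream
past_profiles = torch.rand(4, 7, 24)
forecast = EncoderLSTMForecaster()(past_profiles)         # shape: (4, 24)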