
Optimizing electricity demand scheduling in microgrids using deep reinforcement learning for cost‐efficiency

Bibliographic Details
Published in:IET Generation, Transmission & Distribution, 2023-06, Vol.17 (11), p.2535-2544
Main Authors: Xiong, Baoyin, Guo, Yiguo, Zhang, Liyang, Li, Jianbin, Liu, Xiufeng, Cheng, Long
Format: Article
Language:English
Description
Summary:Renewable energy sources (RES) are increasingly being developed and used to address the energy crisis and protect the environment. However, the large‐scale integration of wind and solar energy into the power grid remains challenging and limits the adoption of these new energy sources. Microgrids (MGs) are small‐scale power generation and distribution systems that can effectively integrate renewable energy, electric loads, and energy storage systems (ESS). By using MGs, it is possible to consume renewable energy locally and reduce energy losses from long‐distance transmission. This paper proposes a deep reinforcement learning (DRL)‐based energy management system (EMS), called DRL‐MG, to process and schedule customers' energy purchase requests in real time. Specifically, the approach uses a Deep Q‐learning Network (DQN) model to enhance the quality of service (QoS) for customers and reduce their electricity costs. The experimental results indicate that the proposed method significantly outperforms commonly used real‐time scheduling methods.
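The abstract describes a scheduler that learns when to serve or defer purchase requests so as to trade electricity cost against quality of service. As a rough illustration of that idea only (not the paper's actual model, which uses a deep Q-network), the toy sketch below substitutes tabular Q-learning; the 24-hour price profile and the deferral penalty are invented for the example:

```python
import random

# Illustrative sketch: tabular Q-learning on a toy request-scheduling problem.
# All prices, penalties, and hyperparameters below are made up for illustration.

HOURS = 24
ACTIONS = 2            # 0 = defer the purchase request, 1 = serve it now
# Assumed price profile: cheap at night, expensive during the evening peak.
PRICE = [0.10 if 0 <= h < 7 else 0.30 if 17 <= h < 21 else 0.18
         for h in range(HOURS)]
DEFER_PENALTY = 0.05   # QoS cost of delaying a customer request by one hour

ALPHA, GAMMA, EPSILON = 0.1, 0.95, 0.1
Q = [[0.0, 0.0] for _ in range(HOURS)]  # Q[hour][action]

def step(hour, action):
    """Return (reward, next_hour, done). Serving the request ends the episode."""
    if action == 1:
        return -PRICE[hour], (hour + 1) % HOURS, True
    return -DEFER_PENALTY, (hour + 1) % HOURS, False

random.seed(0)
for _ in range(20000):
    hour, done = random.randrange(HOURS), False
    while not done:
        # Epsilon-greedy action selection.
        if random.random() < EPSILON:
            a = random.randrange(ACTIONS)
        else:
            a = max(range(ACTIONS), key=lambda x: Q[hour][x])
        r, nxt, done = step(hour, a)
        target = r + (0 if done else GAMMA * max(Q[nxt]))
        Q[hour][a] += ALPHA * (target - Q[hour][a])
        hour = nxt

# Greedy policy per hour: requests tend to be served in cheap hours
# and deferred through the evening price peak.
policy = [max(range(ACTIONS), key=lambda a: Q[h][a]) for h in range(HOURS)]
print(policy)
```

The paper's DQN replaces the Q-table with a neural network so the state can include richer real-time information than the hour index used here.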
ISSN:1751-8687
1751-8695
DOI:10.1049/gtd2.12866