
State-of-Health Prediction of Lithium-Ion Batteries Using Exponential Smoothing Transformer With Seasonal and Growth Embedding

Bibliographic Details
Published in: IEEE Access, 2024, Vol. 12, pp. 14659-14670
Main Authors: Fauzi, Muhammad Rifqi; Yudistira, Novanto; Mahmudy, Wayan Firdaus
Format: Article
Language: English
Description
Summary: Lithium-ion batteries dominate modern energy storage, offering rechargeability, sustainability, and long-term storage, but their lifespan is finite, which calls for accurate prediction of remaining life under various conditions. Deep learning is well suited to this task, and the Transformer architecture has emerged as a powerful tool for time series forecasting. This research covers data collection, processing, model design, training, and evaluation, making key methodological contributions to battery life prediction. Its central contribution is SGEformer, a Transformer enhanced with growth and seasonal embeddings. A comparison of SGEformer with ETSformer, Informer, Reformer, Transformer, and LSTM shows its strengths: with an MSE of 0.000117, SGEformer proves a highly effective tool for battery life prediction, highlighting the value of growth and seasonal embeddings in improving accuracy. This research advances the state of the art in lithium-ion battery state-of-health prediction, offering a robust methodological foundation for precise and reliable forecasts. Code can be accessed at https://github.com/MRifqiFz/SGEformer.
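
Below is a minimal, hypothetical PyTorch sketch of the idea the abstract describes: a Transformer encoder whose input tokens are augmented with a growth embedding (here, exponentially smoothed first differences of the state-of-health series) and a seasonal embedding (here, a learned vector indexed by cycle position within a fixed period). The class names, the smoothing factor alpha, the period of 24, and the one-step prediction head are illustrative assumptions, not the authors' implementation; the actual SGEformer code is available at the repository linked above.

    # Hypothetical sketch, not the authors' exact SGEformer implementation:
    # a Transformer encoder over value + growth + seasonal embeddings.
    import torch
    import torch.nn as nn


    class GrowthEmbedding(nn.Module):
        """Projects exponentially smoothed first differences to d_model."""

        def __init__(self, d_model: int, alpha: float = 0.3):  # alpha is an assumed value
            super().__init__()
            self.alpha = alpha
            self.proj = nn.Linear(1, d_model)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, seq_len) state-of-health values
            diff = torch.diff(x, dim=1, prepend=x[:, :1])        # first differences
            smoothed = torch.zeros_like(diff)
            prev = diff[:, 0]
            for t in range(diff.size(1)):                        # simple exponential smoothing
                prev = self.alpha * diff[:, t] + (1 - self.alpha) * prev
                smoothed[:, t] = prev
            return self.proj(smoothed.unsqueeze(-1))             # (batch, seq_len, d_model)


    class SeasonalEmbedding(nn.Module):
        """Learned embedding for a cycle's position inside an assumed fixed period."""

        def __init__(self, d_model: int, period: int = 24):
            super().__init__()
            self.period = period
            self.embed = nn.Embedding(period, d_model)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            idx = torch.arange(x.size(1), device=x.device) % self.period
            return self.embed(idx).unsqueeze(0).expand(x.size(0), -1, -1)


    class SGEStyleForecaster(nn.Module):
        """Transformer encoder over value + growth + seasonal embeddings (illustrative)."""

        def __init__(self, d_model: int = 64, nhead: int = 4, layers: int = 2, period: int = 24):
            super().__init__()
            self.value_embed = nn.Linear(1, d_model)
            self.growth = GrowthEmbedding(d_model)
            self.seasonal = SeasonalEmbedding(d_model, period)
            enc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
            self.encoder = nn.TransformerEncoder(enc_layer, num_layers=layers)
            self.head = nn.Linear(d_model, 1)                    # one-step SoH prediction

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            h = self.value_embed(x.unsqueeze(-1)) + self.growth(x) + self.seasonal(x)
            h = self.encoder(h)
            return self.head(h[:, -1])                           # predict the next SoH value


    if __name__ == "__main__":
        model = SGEStyleForecaster()
        soh = torch.rand(8, 48)                                  # 8 toy sequences of 48 cycles
        pred = model(soh)
        loss = nn.functional.mse_loss(pred, torch.rand(8, 1))    # MSE, the metric reported above
        print(pred.shape, loss.item())

The design point this sketch illustrates is that the growth and seasonal components are added to the value embedding before attention, so the encoder sees trend and periodic structure explicitly rather than having to infer it from raw values alone.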
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3357736