Novel training algorithms for long short-term memory neural network

Bibliographic Details
Published in: IET Signal Processing, 2019-05, Vol. 13 (3), p. 304-308
Main Authors: Li, Xiaodong, Yu, Changjun, Su, Fulin, Quan, Taifan, Yang, Xuguang
Format: Article
Language: English
Description
Summary: More recently, due to the enormous potential of the long short-term memory (LSTM) neural network in various fields, several efficient training algorithms have been developed, including the extended Kalman filter (EKF)-based training algorithm and the particle filter (PF)-based training algorithm. However, it should be noted that if the system is highly non-linear, the linearisation employed in the EKF may cause instability. Moreover, the PF usually suffers from particle degeneracy, so the PF-based training algorithm may only find a poor local optimum. To solve these problems, an unscented Kalman filter (UKF)-based training algorithm is proposed. The UKF employs a deterministic sampling method; hence it involves no linearisation and does not suffer from degeneracy. Moreover, the computational complexity of the UKF is of the same order as that of the EKF. To further reduce the computational complexity, the authors propose a minimum norm UKF (MN-UKF) to obtain a good trade-off between performance and complexity. To the best of the authors' knowledge, this is the first reported solution to this problem. Simulations using both a benchmark synthetic signal and a real-world signal illustrate the potential of the developed algorithms.
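As a rough illustration of the idea described in the abstract (not the authors' code), the sketch below treats the flattened LSTM weight vector as the state of a UKF with a random-walk model: sigma points are propagated through the network's forward pass instead of linearising it, and the filter update plays the role of the weight update. The network sizes, noise variances, sigma-point weighting, and the toy sine-wave data are all illustrative assumptions.

```python
# Hedged sketch: UKF-based weight estimation for a tiny LSTM (assumed setup).
import numpy as np

rng = np.random.default_rng(0)
IN, HID = 1, 2                                   # assumed input / hidden sizes
N_W = 4 * HID * (IN + HID + 1) + HID + 1         # LSTM gate weights + linear readout

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_predict(w, x_seq):
    """Run a single-layer LSTM over x_seq and return a scalar prediction."""
    k = 4 * HID * (IN + HID + 1)
    W = w[:k].reshape(4 * HID, IN + HID + 1)     # gates (i, f, o, g) incl. bias column
    w_out, b_out = w[k:k + HID], w[k + HID]
    h, c = np.zeros(HID), np.zeros(HID)
    for x in x_seq:
        z = W @ np.concatenate(([x], h, [1.0]))
        i, f, o = sigmoid(z[:HID]), sigmoid(z[HID:2*HID]), sigmoid(z[2*HID:3*HID])
        g = np.tanh(z[3*HID:])
        c = f * c + i * g
        h = o * np.tanh(c)
    return w_out @ h + b_out

def ukf_step(w, P, x_seq, y, q=1e-5, r=1e-2, kappa=1.0):
    """One UKF update treating the weight vector as the state (scalar measurement)."""
    n = len(w)
    P = P + q * np.eye(n)                        # random-walk process noise
    S = np.linalg.cholesky((n + kappa) * P)
    sigmas = np.vstack([w, w + S.T, w - S.T])    # 2n + 1 deterministic sigma points
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    wm[0] = kappa / (n + kappa)
    preds = np.array([lstm_predict(s, x_seq) for s in sigmas])
    y_hat = wm @ preds
    P_yy = wm @ (preds - y_hat) ** 2 + r         # innovation variance
    P_wy = (sigmas - w).T @ (wm * (preds - y_hat))
    K = P_wy / P_yy                              # Kalman gain
    w_new = w + K * (y - y_hat)
    P_new = P - np.outer(K, K) * P_yy
    return w_new, 0.5 * (P_new + P_new.T)        # symmetrise for numerical safety

# Toy usage: learn to predict the next sample of a noisy sine wave.
t = np.arange(200) * 0.1
series = np.sin(t) + 0.01 * rng.standard_normal(len(t))
w, P = 0.1 * rng.standard_normal(N_W), 0.1 * np.eye(N_W)
for k in range(10, len(series)):
    w, P = ukf_step(w, P, series[k - 10:k], series[k])
```

The MN-UKF variant proposed in the article further reduces the cost of this type of update; the sketch above only shows the plain UKF formulation.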
ISSN: 1751-9675
1751-9683
DOI: 10.1049/iet-spr.2018.5240