A Novel Self-Organizing Fuzzy Neural Network to Learn and Mimic Habitual Sequential Tasks

Bibliographic Details
Published in: IEEE Transactions on Cybernetics, 2022-01, Vol. 52 (1), pp. 323-332
Main Authors: Salimi-Badr, Armin, Ebadzadeh, Mohammad Mehdi
Format: Article
Language:English
Description
Summary: In this article, a new self-organizing fuzzy neural network (FNN) model is presented that is able to simultaneously and accurately learn and reproduce different sequences. Multiple sequence learning is important in performing habitual and skillful tasks, such as writing, signing signatures, and playing the piano. Generally, it is indispensable for pattern-generation applications. Since multiple sequences have similar parts, local information, such as a few previous samples, is not sufficient to reproduce them efficiently. Instead, it is necessary to consider global, discriminative information, possibly contained in the very first samples of each sequence, to first recognize a sequence and then predict its next sample from the current local information. Therefore, the structure of the proposed network consists of two parts: 1) a sequence identifier, which computes a novel sequence-identity value from the initial samples of a sequence and detects the sequence identity using appropriate fuzzy rules, and 2) a sequence locator, which locates the input sample within the sequence. By integrating the outputs of these two parts in fuzzy rules, the network is able to produce the proper output based on the current state of each sequence. To learn the proposed structure, a gradual learning procedure is proposed. First, structure learning is performed by adding new fuzzy rules, based on a coverage measure, using the available correct data. Next, the initialized parameters are fine-tuned by the gradient-descent algorithm, with the approximated network output fed back as the next input. The proposed method has a dynamic structure and can learn new sequences online. Finally, to investigate the effectiveness of the presented approach, it is used to simultaneously learn and reproduce multiple sequences in different applications, including sequences with similar parts, different patterns, and writing different letters.
The performance of the proposed method is evaluated and compared with other existing methods, including the adaptive network-based fuzzy inference system (ANFIS), GDFNN, CFNN, and long short-term memory (LSTM). According to these experiments, the proposed method outperforms traditional FNNs and LSTM in learning multiple sequences.
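The two-part idea in the abstract — a global identity cue from a sequence's first samples combined with local fuzzy rules for the next-sample prediction — can be illustrated with a minimal sketch. This is not the paper's actual network: the Gaussian membership functions, the mean-of-initial-samples identity statistic, the rule structure, and the `TwoPartSequenceModel` class are all illustrative assumptions; the paper's coverage-based rule addition and gradient-descent fine-tuning are omitted.

```python
import numpy as np

def gaussian(x, center, width):
    """Gaussian fuzzy membership value of x around a rule center."""
    return np.exp(-((x - center) ** 2) / (2 * width ** 2))

class TwoPartSequenceModel:
    """Sketch: a sequence identifier (global information from the first
    samples) combined with a sequence locator (the current local sample).
    Each fuzzy rule fires on both, so two sequences that share a local
    value are disambiguated by their identity values."""

    def __init__(self, width=0.2):
        self.width = width
        # Each rule: (identity center, current-sample center, next sample).
        self.rules = []

    @staticmethod
    def identity_value(initial_samples):
        # Assumed identity statistic: mean of the first few samples.
        return float(np.mean(initial_samples))

    def learn_sequence(self, seq, n_initial=2):
        # Simplified structure learning: one rule per observed transition
        # (the paper instead adds rules only where coverage is insufficient).
        ident = self.identity_value(seq[:n_initial])
        for x, x_next in zip(seq[:-1], seq[1:]):
            self.rules.append((ident, x, x_next))
        return ident

    def predict(self, identity, sample):
        # Fire every rule on the global identity AND the local sample,
        # then return the firing-strength-weighted average of consequents.
        strengths = np.array([
            gaussian(identity, ic, self.width) * gaussian(sample, sc, self.width)
            for ic, sc, _ in self.rules
        ])
        outputs = np.array([nxt for _, _, nxt in self.rules])
        return float(strengths @ outputs / strengths.sum())

# Two toy sequences sharing the local value 0.5 but continuing differently:
seq_a = [0.0, 0.5, 1.0]   # ...0.5 is followed by 1.0
seq_b = [2.0, 0.5, 0.0]   # ...0.5 is followed by 0.0

model = TwoPartSequenceModel()
id_a = model.learn_sequence(seq_a)
id_b = model.learn_sequence(seq_b)

# The identity cue resolves the ambiguity at the shared sample 0.5:
print(model.predict(id_a, 0.5))  # close to 1.0
print(model.predict(id_b, 0.5))  # close to 0.0
```

Local information alone (the sample 0.5) cannot decide between the two continuations; weighting each rule by the identity membership is what makes the shared part reproducible, which is the motivation the abstract gives for the sequence-identifier branch.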
ISSN: 2168-2267, 2168-2275
DOI: 10.1109/TCYB.2020.2984646