Probabilistic Adaptive Slow Feature Analysis for State Estimation and Classification
| Published in: | IEEE Transactions on Instrumentation and Measurement, 2024, Vol. 73, pp. 1-15 |
| --- | --- |
| Main Authors: | |
| Format: | Article |
| Language: | English |
| Summary: | Slow feature analysis (SFA), as a method for learning slowly varying features in classification and signal analysis, has attracted increasing attention in recent years. It is guaranteed to find the optimal solution within a family of transformations and can learn to extract a large number of such "slow" features that are additionally orthogonal. While being effective, SFA suffers from three major shortcomings. First, it is not adaptive, in that the notion of "slowness" is fixed to be the average of the squared derivative of a signal. Second, its formulation assumes deterministic features/observations, which renders it ineffective for applications involving stochastic features/measurement uncertainties. Third, SFA is not necessarily suited for nonlinear processes. In this work, all these issues are addressed by proposing adaptive SFA and its probabilistic version, termed the ASFA and probabilistic ASFA (PASFA), respectively. The ASFA essentially introduces an optimally tunable filter that accommodates task-dependent slowness. We provide guidelines and a procedure for tuning the ASFA filter for a given application. We prove that linear SFA is a special case of ASFA for a specific choice of filter and demonstrate that it outperforms the standard SFA. PASFA extends ASFA to accommodate measurement and process noises, wherein the features are modeled as filtered Gaussian white noise (GWN) sequences, and the measurement noise is assumed to be GWN. Furthermore, it accommodates nonlinear generative models. A variational expectation-maximization (EM) algorithm is used for estimating the filter parameters and the statistical properties of noise sequences. Frequency-domain interpretations of ASFA are also provided to enhance interpretability. Case studies involving applications to signal analysis and classification are presented to demonstrate the effectiveness and superiority of the proposed methods. |
| ISSN: | 0018-9456; 1557-9662 |
| DOI: | 10.1109/TIM.2024.3353267 |
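
For orientation only: the summary contrasts the proposed ASFA/PASFA with standard linear SFA, whose fixed slowness criterion is the average squared temporal derivative of the output under unit-variance and decorrelation constraints. The sketch below is a minimal NumPy implementation of that standard linear SFA baseline, not of the paper's ASFA or PASFA; the function name `linear_sfa` and the toy signals are illustrative assumptions, not code from the article.

```python
import numpy as np

def linear_sfa(X, n_features=2):
    """Minimal standard linear SFA: find projections whose outputs vary as
    slowly as possible (smallest mean squared first difference) while having
    unit variance and being mutually decorrelated."""
    # Centre the observations (rows of X are time steps).
    Xc = X - X.mean(axis=0)

    # Whiten so the projected data has (approximately) identity covariance.
    cov = np.cov(Xc, rowvar=False)
    w, U = np.linalg.eigh(cov)
    w = np.maximum(w, 1e-12)                   # guard against tiny eigenvalues
    whiten = U @ np.diag(1.0 / np.sqrt(w)) @ U.T
    Z = Xc @ whiten

    # Standard SFA slowness: average squared first difference of the signal.
    dZ = np.diff(Z, axis=0)
    dcov = np.cov(dZ, rowvar=False)

    # Eigenvectors of the difference covariance with the smallest eigenvalues
    # give the slowest, mutually decorrelated directions.
    dvals, dvecs = np.linalg.eigh(dcov)        # eigenvalues in ascending order
    V = dvecs[:, :n_features]

    W = whiten @ V                             # overall projection matrix
    return W, Xc @ W                           # projection and slow features


# Toy usage: recover a slowly varying sine mixed with a fast one.
t = np.linspace(0, 2 * np.pi, 500)
slow, fast = np.sin(t), np.sin(23 * t)
X = np.column_stack([slow + 0.3 * fast, fast - 0.2 * slow])
W, Y = linear_sfa(X, n_features=1)   # Y[:, 0] tracks the slow sine (up to sign/scale)
```

In this baseline the slowness measure is hard-wired to the first-difference operator; the abstract's ASFA replaces that fixed differencing step with a tunable filter, and PASFA additionally treats the features and measurements as noisy (filtered GWN plus GWN measurement noise) with parameters estimated by variational EM.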