An online sparse estimation-based classification approach for real-time monitoring in advanced manufacturing processes from heterogeneous sensor data
Published in: IIE Transactions, 2016-07, Vol. 48 (7), pp. 579-598
Main Authors: , ,
Format: Article
Language: English
Summary: The objective of this work is to realize real-time monitoring of process conditions in advanced manufacturing using multiple heterogeneous sensor signals. To achieve this objective we propose an approach invoking the concept of sparse estimation, called online sparse estimation-based classification (OSEC). The novelty of the OSEC approach lies in representing data from sensor signals as an underdetermined linear system of equations and then solving that system using a newly developed greedy Bayesian estimation method. We apply the OSEC approach to two advanced manufacturing scenarios, namely, a fused filament fabrication additive manufacturing process and an ultraprecision semiconductor chemical-mechanical planarization process. Using the proposed OSEC approach, process drifts are detected and classified with a fidelity approaching 90% (F-score). In comparison, conventional signal analysis techniques (e.g., neural networks, support vector machines, quadratic discriminant analysis, naïve Bayes) achieved F-scores in the range of 40% to 70%. (A minimal code sketch of the underdetermined sparse-recovery idea follows this record.)
ISSN: 0740-817X, 2472-5854, 1545-8830, 2472-5862
DOI: 10.1080/0740817X.2015.1122254
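
The abstract's core idea is to cast each incoming sensor sample as y = Dβ, where D is an underdetermined ("fat") dictionary built from training signals of each process state, and then to recover a sparse β with a greedy solver. The paper's greedy Bayesian estimator is not reproduced here; the sketch below substitutes orthogonal matching pursuit as a generic greedy stand-in, and the energy-based labeling rule, the `omp` helper, and the synthetic data are illustrative assumptions in the style of sparse-representation classification, not the authors' exact method.

```python
import numpy as np

def omp(D, y, k):
    """Greedy sparse recovery (orthogonal matching pursuit).

    Illustrative stand-in for the paper's greedy Bayesian estimator:
    finds a k-sparse beta such that y ~= D @ beta, where D is an
    underdetermined dictionary of training signals (columns).
    """
    residual = y.copy()
    support = []
    beta = np.zeros(D.shape[1])
    for _ in range(k):
        # Pick the column most correlated with the current residual.
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        # Re-fit least squares on the active set; update the residual.
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    beta[support] = coef
    return beta

# Synthetic example: two process states, dictionary wider than tall.
rng = np.random.default_rng(0)
m, n_per_class = 20, 40                      # 20 features, 40 signals/class
normal = rng.normal(0.0, 1.0, (m, n_per_class))
drift = rng.normal(0.8, 1.0, (m, n_per_class))
D = np.hstack([normal, drift])               # 20 equations, 80 unknowns
D /= np.linalg.norm(D, axis=0)               # unit-norm columns

y = rng.normal(0.8, 1.0, m)                  # incoming (drifted) sample
beta = omp(D, y, k=5)

# Assumed labeling rule: the class whose training columns absorb the
# most coefficient energy wins.
energy = [np.sum(beta[:n_per_class] ** 2), np.sum(beta[n_per_class:] ** 2)]
print("predicted state:", ["normal", "drift"][int(np.argmax(energy))])
```

Because only correlations and small least-squares fits on the active set are needed per sample, a greedy solver of this kind can keep up with streaming sensor data, which is what makes the sparse-estimation framing attractive for online monitoring.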