Classification of primitive manufacturing tasks from filtered event data
Published in: Journal of Manufacturing Systems, 2023-06, Vol. 68, pp. 12–24
Format: Article
Language: English
Summary: Collaborative robots are increasingly present in industry to support human activities. However, to make the human–robot collaborative process more effective, several challenges must be addressed. Collaborative robotic systems need to be aware of human activities in order to (1) anticipate collaborative/assistive actions, (2) learn by demonstration, and (3) activate safety procedures in shared workspaces. This study proposes an action classification system to recognize primitive assembly tasks from human-motion event data captured by a Dynamic and Active-pixel Vision Sensor (DAVIS). Several filters are compared and combined to remove noise from the event data. Task patterns are classified from a continuous stream of event data using deep learning and recurrent networks to capture spatial and temporal features. Experiments were conducted on a novel dataset, the Dataset of Manufacturing Tasks (DMT22), featuring 5 classes of representative manufacturing primitives (PickUp, Place, Screw, Hold, Idle) from 5 participants. Results show that the proposed filters remove about 65% of all events (noise) per recording, yielding a classification accuracy of up to 99.37% for subjects who trained the system and 97.08% for new subjects. Data from a left-handed subject were successfully classified using only right-handed training data. These results are object-independent.
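The noise filtering the summary refers to is commonly done with a background-activity filter: an event is kept only if a neighbouring pixel fired recently, since genuine motion produces spatially and temporally correlated events while sensor noise does not. The sketch below is a minimal, illustrative implementation of that general idea, not the paper's actual filter combination; the function name, the default DAVIS346 resolution, and the parameter values are assumptions.

```python
import numpy as np

def background_activity_filter(events, tau=5000, radius=1, width=346, height=260):
    """Keep events with a recent neighbouring event within `tau` microseconds.

    `events` is an array of (x, y, t) rows sorted by timestamp t (in microseconds).
    The default resolution matches a DAVIS346 sensor; all names are illustrative.
    """
    last_ts = np.full((height, width), -np.inf)  # last event time seen at each pixel
    keep = np.zeros(len(events), dtype=bool)
    for i, (x, y, t) in enumerate(events):
        x, y = int(x), int(y)
        # bounds-clipped spatial neighbourhood around (x, y)
        x0, x1 = max(0, x - radius), min(width, x + radius + 1)
        y0, y1 = max(0, y - radius), min(height, y + radius + 1)
        neigh = last_ts[y0:y1, x0:x1]
        # correlated event: some neighbour fired within the time window
        keep[i] = bool((t - neigh <= tau).any())
        last_ts[y, x] = t
    return events[keep]
```

On a stream where two events occur at adjacent pixels 100 µs apart, the second is kept while an isolated event elsewhere is dropped, which is how such a filter can discard a large fraction of raw events while preserving motion structure.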
Highlights:
- Classification of primitive assembly tasks from event data.
- Combination of multiple event filters to remove noise.
- A novel event-based dataset for manufacturing primitive tasks.
- Deep learning and recurrent networks to classify spatial and temporal features.
- Classification independent of the objects being manipulated.
ISSN: 0278-6125, 1878-6642
DOI: 10.1016/j.jmsy.2023.03.001