Heart Rate and Accelerometer Data Fusion for Activity Assessment of Rescuers During Emergency Interventions
Published in: IEEE Journal of Biomedical and Health Informatics, 2010-05, Vol. 14 (3), pp. 702–710
Main Authors: , , , , , ,
Format: Article
Language: English
Summary: The current state of the art in wearable electronics is the integration of very small devices into textile fabrics, the so-called "smart garment." The ProeTEX project is one of many initiatives dedicated to the development of smart garments specifically designed for people who risk their lives in the line of duty, such as firefighters and Civil Protection rescuers. These garments have integrated multipurpose sensors that monitor the wearer's activities while in action. To this aim, we have developed an algorithm that combines features extracted from the signal of a triaxial accelerometer and one ECG lead. Microprocessors integrated in the garments detect the signal magnitude area of inertial acceleration, step frequency, trunk inclination, heart rate (HR), and HR trend in real time. Given these inputs, a classifier assigns these signals to nine classes differentiating between physical activities (walking, running, moving on site), intensities (intense, mild, or at rest), and postures (lying down, standing up). Specific classes are identified as dangerous to the rescuer during operation, such as "subject motionless lying down" or "subject resting with abnormal HR." Laboratory tests were carried out on seven healthy adult subjects, with the collection of over 4.5 h of data. The results were very positive, achieving an overall classification accuracy of 88.8%.
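Two of the accelerometer features named in the abstract, signal magnitude area (SMA) and trunk inclination, have standard textbook definitions. The sketch below illustrates those standard forms only; the paper's exact window length, sampling rate, and gravity-compensation steps are not given in this record, so treat the details as assumptions.

```python
import math

def signal_magnitude_area(ax, ay, az):
    """Standard SMA of a triaxial accelerometer window: the mean of the
    summed absolute accelerations over the window. Higher values indicate
    more intense movement (illustrative definition, not the paper's exact
    preprocessing)."""
    n = len(ax)
    return sum(abs(x) + abs(y) + abs(z) for x, y, z in zip(ax, ay, az)) / n

def trunk_inclination_deg(ax, ay, az):
    """Angle between the mean acceleration vector (which approximates
    gravity when the subject is near-static) and the sensor's z axis,
    in degrees: ~0 deg when upright, ~90 deg when lying down. Assumes
    the z axis is aligned with the trunk."""
    gx = sum(ax) / len(ax)
    gy = sum(ay) / len(ay)
    gz = sum(az) / len(az)
    norm = math.sqrt(gx * gx + gy * gy + gz * gz)
    # Clamp to guard acos against floating-point round-off.
    return math.degrees(math.acos(max(-1.0, min(1.0, gz / norm))))
```

For example, a window recorded while standing still, with gravity on the z axis (ax = ay = 0 g, az = 1 g), yields an SMA of 1.0 and an inclination of 0 degrees, while gravity on the x axis yields 90 degrees.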
ISSN: 1089-7771, 2168-2194, 1558-0032, 2168-2208
DOI: 10.1109/TITB.2010.2047727