Video-based Indoor Human Gait Recognition Using Depth Imaging and Hidden Markov Model: A Smart System for Smart Home
Published in: Indoor + Built Environment, 2011-02, Vol. 20 (1), pp. 120-128
Main Authors:
Format: Article
Language: English
Summary: Smart homes that are capable of home healthcare and e-Health services are receiving much attention due to their potential for better care of the elderly and disabled in an indoor environment. Recently, the Center for Sustainable Healthy Buildings at Kyung Hee University has developed a novel indoor human activity recognition methodology based on depth imaging of a user's activities. This system utilizes Independent Component Analysis to extract spatiotemporal features from a series of depth silhouettes of various activities. To recognise the activities from the spatiotemporal features, trained Hidden Markov Models of the activities are used. In this study, this technique has been extended to recognise human gaits, both normal and abnormal. Since this system could be of great significance for the care of the elderly, promoting and preserving their health and independence, the gait recognition system is considered a primary function of the smart system for smart homes. The indoor gait recognition system is trained to detect abnormal gait patterns and generate warnings. The system works in real-time and is intended for installation in smart homes. This paper provides information for further development and future application of the system.
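The recognition pipeline described in the summary scores a sequence of silhouette-derived features against per-class Hidden Markov Models and picks the most likely class. The toy sketch below illustrates only that final step, not the authors' implementation: the ICA feature extraction is replaced with hand-quantized posture symbols (0/1), and the two 2-state HMMs and all of their parameters are hypothetical. "Normal" gait is modelled as regular left-right alternation, "abnormal" gait as irregular (sticky) dwelling in one posture.

```python
import numpy as np

def forward_log_likelihood(obs, pi, A, B):
    """Scaled forward algorithm: log P(obs | HMM) for a discrete-output HMM.
    pi: initial state probabilities, A: state transitions, B: emissions."""
    log_lik = 0.0
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        s = alpha.sum()
        log_lik += np.log(s)
        alpha = (alpha / s) @ A * B[:, o]  # predict next state, then weight by emission
    return log_lik + np.log(alpha.sum())

# Hypothetical 2-state HMMs over quantized posture symbols (0/1).
pi = np.array([0.5, 0.5])
B = np.array([[0.9, 0.1],    # state 0 mostly emits symbol 0
              [0.1, 0.9]])   # state 1 mostly emits symbol 1
models = {
    "normal":   np.array([[0.1, 0.9], [0.9, 0.1]]),  # states alternate each frame
    "abnormal": np.array([[0.9, 0.1], [0.1, 0.9]]),  # states rarely change
}

def classify_gait(obs):
    """Pick the gait model under which the sequence is most likely."""
    return max(models, key=lambda m: forward_log_likelihood(obs, pi, models[m], B))

print(classify_gait([0, 1, 0, 1, 0, 1, 0, 1]))  # regular alternation -> "normal"
print(classify_gait([0, 0, 0, 0, 1, 1, 1, 1]))  # long dwell times -> "abnormal"
```

In a full system, each candidate class would have an HMM trained on labelled feature sequences, and a low likelihood under every normal-gait model could trigger the warning the summary mentions.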
ISSN: 1420-326X, 1423-0070
DOI: 10.1177/1420326X10391140