Gait-based person fall prediction using deep learning approach
Published in: Soft Computing (Berlin, Germany), 2022-12, Vol. 26 (23), p. 12933-12941
Main Authors: , ,
Format: Article
Language: English
Subjects:
Citations: Items that this one cites; Items that cite this one
Online Access: Get full text
Summary: Technology development and digital techniques provide wide opportunities to develop automatic systems. With the help of automated assessment systems, falls can be predicted for elderly people or persons with walking disabilities. Compared with conventional manual video assessment, prediction accuracy can be improved when the person's gait model is analyzed. Various gait analysis models have evolved in recent years using support vector machines, artificial neural networks, and backpropagation neural networks. However, analyzing gait energy images for fall prediction has been addressed in only a limited number of research works. Considering this research gap, this work proposes a gait-based fall prediction model using a deep learning approach to identify early falls of persons with walking disabilities. The gait energy image is used as input to the proposed deep convolutional neural network (DCNN) to predict early falls. The proposed DCNN model attains a classification accuracy of 99.1% and a prediction ratio of 98.64%, which is considerably better than conventional ResNet-50 and CNN-based gait analysis methods. Other parameters, such as specificity, sensitivity, and detection accuracy, are also analyzed to validate the performance of the proposed model.
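The record does not include implementation details, so the sketch below is only a rough illustration of the pipeline the summary describes: a small deep convolutional classifier that takes a gait energy image (GEI, an average of aligned silhouettes over a gait cycle) and outputs a fall/no-fall probability. The layer sizes, the 128×88 GEI resolution, and the binary output head are assumptions made for illustration, not the architecture reported in the article.

```python
# Minimal sketch of a GEI-based fall classifier (assumed architecture,
# not the DCNN reported in the article).
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

GEI_HEIGHT, GEI_WIDTH = 128, 88  # assumed gait energy image resolution


def build_fall_classifier() -> tf.keras.Model:
    """Small CNN: grayscale GEI -> probability of an impending fall."""
    model = models.Sequential([
        layers.Input(shape=(GEI_HEIGHT, GEI_WIDTH, 1)),
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(128, 3, activation="relu", padding="same"),
        layers.GlobalAveragePooling2D(),
        layers.Dense(64, activation="relu"),
        layers.Dropout(0.3),
        layers.Dense(1, activation="sigmoid"),  # fall / no-fall
    ])
    model.compile(
        optimizer="adam",
        loss="binary_crossentropy",
        metrics=[
            "accuracy",
            tf.keras.metrics.Recall(name="sensitivity"),
            tf.keras.metrics.Precision(name="precision"),
        ],
    )
    return model


if __name__ == "__main__":
    # Placeholder data standing in for normalised gait energy images.
    x = np.random.rand(16, GEI_HEIGHT, GEI_WIDTH, 1).astype("float32")
    y = np.random.randint(0, 2, size=(16, 1))
    model = build_fall_classifier()
    model.fit(x, y, epochs=1, batch_size=8, verbose=0)
    print(model.predict(x[:2]))
```

The recall and precision metrics stand in for the sensitivity and specificity analysis mentioned in the summary; in practice the classifier would be trained on labelled GEIs extracted from video of subjects with and without fall events.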
ISSN: 1432-7643, 1433-7479
DOI: 10.1007/s00500-021-06125-1