Efficiency investigation from shallow to deep neural network techniques in human activity recognition
Published in: Cognitive Systems Research, Vol. 54 (May 2019), pp. 37–49
Main Authors:
Format: Article
Language: English
Summary: In recent years, several researchers have measured different recognition rates with different artificial neural network (ANN) techniques on public data sets for the human activity recognition (HAR) problem. However, no overall investigation exists in the literature, and the efficiency of complex and deeper ANNs over shallow networks is not clear. The purpose of this paper is to investigate the recognition rate and time requirement of different kinds of ANN approaches in HAR. This work examines the performance of shallow ANN architectures with different hyper-parameters, ANN ensembles, binary ANN classifier groups, and convolutional neural networks on two public databases. Although the popularity of binary classifiers, classifier ensembles, and deep learning has been increasing significantly, this study shows that shallow ANNs with appropriate hyper-parameters, in combination with extracted features, can reach similar or higher recognition rates in less time than other artificial neural network methods in HAR. With a well-tuned ANN, we outperformed all previous results on two public databases. Consequently, instead of more complex ANN techniques, a simple ANN with two or three layers can be an appropriate choice for activity recognition.
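The kind of shallow network the abstract advocates can be sketched in a few lines. The example below is a minimal illustration, not the paper's actual model: the feature dimensionality, hidden-layer size, learning rate, and the synthetic two-class data are all assumptions chosen for a self-contained demo of a one-hidden-layer ANN trained on extracted feature vectors.

```python
import numpy as np

# Hypothetical setup: 200 samples of 10 extracted features,
# split evenly between two synthetic "activity" classes.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 0.5, (100, 10)),
               rng.normal(+1.0, 0.5, (100, 10))])
y = np.array([0] * 100 + [1] * 100)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Shallow architecture: one hidden layer with 16 units (illustrative size).
W1 = rng.normal(0, 0.1, (10, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.1, (16, 1));  b2 = np.zeros(1)

lr = 0.5
for _ in range(1000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2).ravel()
    # Backward pass: gradients of mean cross-entropy loss
    d_out = (p - y)[:, None] / len(y)
    dW2 = h.T @ d_out
    db2 = d_out.sum(0)
    d_h = (d_out @ W2.T) * h * (1 - h)
    dW1 = X.T @ d_h
    db1 = d_h.sum(0)
    # Full-batch gradient-descent update
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

acc = np.mean((p > 0.5) == y)  # training accuracy on the toy data
```

On well-separated features like these, even this tiny network reaches high training accuracy quickly, which is the intuition behind the paper's claim that a well-tuned two- or three-layer ANN over extracted features can be competitive with deeper models at a fraction of the training cost.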
ISSN: 1389-0417
DOI: 10.1016/j.cogsys.2018.11.009