Motion Classification and Features Recognition of a Traditional Chinese Sport (Baduanjin) Using Sampled-Based Methods

Bibliographic Details
Published in: Applied Sciences, 2021-08, Vol. 11 (16), p. 7630
Main Authors: Li, Hai; Yap, Hwa Jen; Khoo, Selina
Format: Article
Language: English
Summary: This study recognized the motions and assessed the motion accuracy of a traditional Chinese sport (Baduanjin), using data from an inertial measurement unit (IMU) system and sampled-based methods. Fifty-three participants were recruited in two batches. Motion data of participants practicing Baduanjin were captured by the IMU system. By extracting features from the motion data and benchmarking them against a teacher’s assessment of motion accuracy, the study verified how effectively different classifiers assess the motion accuracy of Baduanjin. Based on the same extracted features, the effectiveness of different classifiers at recognizing Baduanjin motions was also verified. For accuracy assessment, the k-Nearest Neighbor (k-NN) classifier offered both high accuracy (more than 85%) and a short average processing time (0.008 s). For motion recognition, the One-Dimensional Convolutional Neural Network (1D-CNN) achieved the highest accuracy of all verified classifiers (99.74%). The results show that, with features extracted from the IMU motion data, an appropriately selected classifier can effectively recognize the motions and, hence, assess the motion accuracy of Baduanjin.
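
As a minimal sketch of the kind of pipeline the summary describes (time-domain features extracted from windowed IMU signals, fed to a k-NN classifier), the following Python outline may be helpful. It is not the authors' code: the feature set, window shape, labels, and k value are all illustrative assumptions.

```python
# Illustrative sketch (not from the paper): k-NN classification of
# motion-accuracy labels from features extracted from IMU windows.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def extract_features(window: np.ndarray) -> np.ndarray:
    """Simple time-domain features per channel for one motion window.

    `window` has shape (samples, channels), e.g. tri-axial accelerometer
    and gyroscope readings from the IMU (channel layout is an assumption).
    """
    return np.concatenate([
        window.mean(axis=0),  # mean per channel
        window.std(axis=0),   # standard deviation per channel
        window.min(axis=0),   # minimum per channel
        window.max(axis=0),   # maximum per channel
    ])

# Synthetic stand-in data: 200 windows, 100 samples each, 6 IMU channels.
rng = np.random.default_rng(0)
windows = rng.normal(size=(200, 100, 6))
labels = rng.integers(0, 2, size=200)  # e.g. accurate vs. inaccurate motion

X = np.stack([extract_features(w) for w in windows])
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.25, random_state=0)

# Scale features, then classify with k-NN (k=5 is an assumption).
model = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
model.fit(X_train, y_train)
print(f"Test accuracy: {model.score(X_test, y_test):.2f}")
```

Wrapping the scaler and classifier in one pipeline keeps the scaler fitted on training data only, which matters for a distance-based classifier such as k-NN. For the recognition task, the paper instead reports a 1D-CNN operating on the motion data as the most accurate classifier.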
ISSN: 2076-3417
DOI: 10.3390/app11167630