Data-driven Online Learning Engagement Detection via Facial Expression and Mouse Behavior Recognition Technology

Bibliographic Details
Published in: Journal of Educational Computing Research, 2020-03, Vol. 58 (1), p. 63-86
Main Authors: Zhang, Zhaoli; Li, Zhenhua; Liu, Hai; Cao, Taihe; Liu, Sannyuya
Format: Article
Language: English
Description
Summary: Online learning engagement detection is a fundamental problem in educational information technology. Efficient detection of students’ learning states can give teachers the information they need to identify struggling students in real time. To improve the accuracy of engagement detection, we collected two kinds of student behavior data: face data (using adaptive weighted Local Gray Code Patterns for facial expression recognition) and mouse interaction data. In this article, we propose a novel learning engagement detection algorithm based on these behavior data, which come from the cameras and the mouse in the online learning environment. The cameras captured students’ face images while the mouse movement data were recorded simultaneously. During image labeling, we built two datasets for classifier training and testing: one used the mouse movement data as a reference, while the other did not. Experiments with several methods on the two datasets showed that the classifier trained on the former performed better, with a recognition rate of 94.60% versus 91.51%.
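The facial descriptor named in the abstract, adaptive weighted Local Gray Code Patterns, belongs to the LBP family of texture features. As a rough Python sketch of the idea rather than the authors’ exact formulation, the snippet below thresholds each pixel’s 8 neighbors against the center, converts the resulting 8-bit binary code to its Gray-code equivalent, and concatenates weighted per-block histograms; the neighbor ordering, 4x4 grid, and uniform default weights are all illustrative assumptions.

```python
import numpy as np

def local_gray_code_patterns(img):
    """LBP-style code map: threshold the 8 neighbors of each pixel against
    the center, then convert the 8-bit binary code to its Gray-code form."""
    img = img.astype(np.int32)
    h, w = img.shape
    center = img[1:-1, 1:-1]
    # 8 neighbors, clockwise from the top-left (bit ordering is an assumption)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(center)
    for bit, (dy, dx) in enumerate(offsets):
        neighbor = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        code |= (neighbor >= center).astype(np.int32) << bit
    return code ^ (code >> 1)  # binary-to-Gray conversion: g = b XOR (b >> 1)

def weighted_lgcp_histogram(img, grid=(4, 4), weights=None):
    """Concatenate per-block 256-bin histograms of the LGCP code map, each
    scaled by a block weight (a stand-in for the paper's adaptive weighting)."""
    codes = local_gray_code_patterns(img)
    rows, cols = grid
    bh, bw = codes.shape[0] // rows, codes.shape[1] // cols
    if weights is None:
        weights = np.ones(rows * cols)  # uniform weights by default
    feats = []
    for i in range(rows):
        for j in range(cols):
            block = codes[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw]
            hist, _ = np.histogram(block, bins=256, range=(0, 256))
            feats.append(weights[i * cols + j] * hist / block.size)
    return np.concatenate(feats)
```

A feature vector produced this way, paired with engagement labels assigned with or without the mouse-movement reference, could then be fed to any standard classifier to mirror the two-dataset comparison described above.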
ISSN: 0735-6331
1541-4140
DOI: 10.1177/0735633119825575