
An Experimental Platform for Real-Time Students Engagement Measurements from Video in STEM Classrooms

The ability to measure students' engagement in an educational setting may facilitate timely intervention in both the learning and the teaching process in a variety of classroom settings. In this paper, a real-time automatic student engagement measure is proposed through investigating two of the main components of engagement: behavioral engagement and emotional engagement. A biometric sensor network (BSN) consisting of web cameras, a wall-mounted camera, and a high-performance computing machine was designed to capture students' head poses, eye gaze, body movements, and facial emotions. These low-level features are used to train an AI-based model to estimate behavioral and emotional engagement in the class environment. A set of experiments was conducted to compare the proposed technology with state-of-the-art frameworks. The proposed framework shows better accuracy in estimating both behavioral and emotional engagement. In addition, it offers superior flexibility to work in any educational environment. Further, this approach allows a quantitative comparison of teaching methods.
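
The abstract describes a pipeline in which low-level visual cues (head pose, eye gaze, body motion, facial emotion) are mapped to per-student engagement estimates. The sketch below is a minimal, hypothetical illustration of that general idea using a generic classifier on synthetic features; it is not the authors' model, and the feature layout, thresholds, and labels are assumptions made purely for demonstration.

```python
# Hypothetical sketch: feature-based engagement scoring (not the paper's model).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Assumed per-frame, per-student features:
#   [head_yaw_deg, head_pitch_deg, gaze_on_instructor (0/1),
#    body_motion_magnitude, facial_emotion_valence (-1..1)]
n = 500
X = np.column_stack([
    rng.normal(0, 25, n),      # head yaw (degrees)
    rng.normal(-5, 15, n),     # head pitch (degrees)
    rng.integers(0, 2, n),     # gaze toward instructor/board
    rng.exponential(0.3, n),   # body motion magnitude
    rng.uniform(-1, 1, n),     # facial-emotion valence
])

# Synthetic "behavioral engagement" labels: roughly facing forward with gaze on task.
y = ((np.abs(X[:, 0]) < 20) & (X[:, 2] == 1)).astype(int)

# Generic classifier standing in for the paper's AI-based estimator.
clf = LogisticRegression(max_iter=1000).fit(X, y)

# Score one new frame for one student.
frame = np.array([[5.0, -3.0, 1, 0.1, 0.4]])
print("behavioral engagement score:", clf.predict_proba(frame)[0, 1])
```

In the paper itself such cues feed separate behavioral and emotional engagement measures computed in real time over a biometric sensor network; the snippet only shows the general shape of a feature-to-score mapping.
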
Bibliographic Details
Published in: Sensors (Basel, Switzerland), 2023-02, Vol.23 (3), p.1614
Main Authors: Alkabbany, Islam; Ali, Asem M; Foreman, Chris; Tretter, Thomas; Hindy, Nicholas; Farag, Aly
Format: Article
Language: English
Subjects: Affect (Psychology); Automation; Behavior; behavioral engagement; Biometrics; Classrooms; College students; Education; emotional engagement; Emotions; Engineering; Eye movements; Feedback; Head movement; Humans; Learning; Mathematics; Methods; Real time; School environment; Sensors; student engagement; Student participation; Student retention; Students; Students - psychology; Teaching; Teaching methods; Time measurement
Online Access: Get full text
ISSN: 1424-8220
EISSN: 1424-8220
DOI: 10.3390/s23031614
PMID: 36772654
Publisher: MDPI AG (Switzerland)