Automatic Neonatal Alertness State Classification Based on Facial Expression Recognition
Premature babies are admitted to the neonatal intensive care unit (NICU) for several weeks and are generally placed under close medical supervision. The NICU environment is considered to have a harmful influence on the formation of the neonate's sleep-wake cycle, known as the circadian rhythm, because patient monitoring and treatment equipment emit light and noise throughout the day. To improve the neonatal environment, researchers have investigated the effects of light and noise on neonates. Some methods and devices exist to measure neonatal alertness, but they place an additional burden on neonatal patients or nurses. This study therefore proposes an automatic, non-contact method for classifying the neonatal alertness state from video images. The proposed method consists of a face region-of-interest (ROI) location normalization step, feature extraction based on the histogram of oriented gradients (HOG) and gradient features, and a machine-learning classifier of the neonatal alertness state. Comparison experiments on 14 video recordings of 7 neonatal subjects showed that a weighted support vector machine (w-SVM) using the HOG feature with averaging merge achieved the highest classification performance (micro-F1 of 0.732). In clinical practice, body movement is evaluated primarily to classify waking states. Additional four-class classification experiments, in which the waking states were combined into a single class, suggest that the proposed facial-expression-based classification is suitable for the detailed classification of sleeping states.
Published in: Journal of Advanced Computational Intelligence and Intelligent Informatics, 2022-03, Vol. 26 (2), pp. 188-195
Main Authors: Morita, Kento; Shirai, Nobu C.; Shinkoda, Harumi; Matsumoto, Asami; Noguchi, Yukari; Shiramizu, Masako; Wakabayashi, Tetsushi
Format: Article
Language: English
Subjects: Alertness; Behavior rating scales; Circadian rhythms; Classification; Face recognition; Feature extraction; Histograms; Image classification; Machine learning; Medical imaging; Support vector machines; Video data
container_end_page | 195 |
container_issue | 2 |
container_start_page | 188 |
container_title | Journal of advanced computational intelligence and intelligent informatics |
container_volume | 26 |
creator | Morita, Kento; Shirai, Nobu C.; Shinkoda, Harumi; Matsumoto, Asami; Noguchi, Yukari; Shiramizu, Masako; Wakabayashi, Tetsushi |
description | Premature babies are admitted to the neonatal intensive care unit (NICU) for several weeks and are generally placed under close medical supervision. The NICU environment is considered to have a harmful influence on the formation of the neonate's sleep-wake cycle, known as the circadian rhythm, because patient monitoring and treatment equipment emit light and noise throughout the day. To improve the neonatal environment, researchers have investigated the effects of light and noise on neonates. Some methods and devices exist to measure neonatal alertness, but they place an additional burden on neonatal patients or nurses. This study therefore proposes an automatic, non-contact method for classifying the neonatal alertness state from video images. The proposed method consists of a face region-of-interest (ROI) location normalization step, feature extraction based on the histogram of oriented gradients (HOG) and gradient features, and a machine-learning classifier of the neonatal alertness state. Comparison experiments on 14 video recordings of 7 neonatal subjects showed that a weighted support vector machine (w-SVM) using the HOG feature with averaging merge achieved the highest classification performance (micro-F1 of 0.732). In clinical practice, body movement is evaluated primarily to classify waking states. Additional four-class classification experiments, in which the waking states were combined into a single class, suggest that the proposed facial-expression-based classification is suitable for the detailed classification of sleeping states. |
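Two techniques named in the abstract, the weighted SVM (w-SVM) and the micro-averaged F1 score, are standard and can be sketched in a few lines of plain Python. The inverse-frequency ("balanced") weighting heuristic and the example labels below are illustrative assumptions, not details taken from the paper:

```python
from collections import Counter

def balanced_class_weights(labels):
    """Inverse-frequency class weights: n_samples / (n_classes * count).
    Rare classes get a larger misclassification penalty, which is the
    usual motivation for a weighted SVM (w-SVM)."""
    counts = Counter(labels)
    n, k = len(labels), len(counts)
    return {c: n / (k * cnt) for c, cnt in counts.items()}

def micro_f1(y_true, y_pred):
    """Micro-averaged F1: pool TP/FP/FN over all classes, then compute F1."""
    classes = set(y_true) | set(y_pred)
    tp = sum(t == p for t, p in zip(y_true, y_pred))
    fp = sum(p == c and t != c for c in classes for t, p in zip(y_true, y_pred))
    fn = sum(t == c and p != c for c in classes for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical alertness-state labels for illustration only:
truth = [0, 1, 2, 3, 4, 2]
preds = [0, 1, 2, 1, 4, 2]
print(balanced_class_weights(truth))
print(micro_f1(truth, preds))  # 5 of 6 pooled decisions correct -> ~0.833
```

With scikit-learn available, `SVC(class_weight='balanced')` and `f1_score(..., average='micro')` provide the same behaviors; the sketch above only shows what those options compute.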
doi | 10.20965/jaciii.2022.p0188 |
format | article |
affiliations | Kagoshima Immaculate Heart University, 2365 Amatatsu-cho, Satsumasendai, Kagoshima 895-0011, Japan; St. Mary College, 422 Tubuku-Honmachi, Kurume, Fukuoka 830-8558, Japan; Graduate School of Engineering, Mie University, 1577 Kurimamachiya-cho, Tsu, Mie 514-8507, Japan; Suzuka University of Medical Science, 3500-3 Minamitamagaki, Suzuka, Mie 513-8670, Japan; Center for Information Technologies and Networks, Mie University, 1577 Kurimamachiya-cho, Tsu, Mie 514-8507, Japan; Kyushu University Hospital, 3-5-25 Maidashi, Higashi-ku, Fukuoka, Fukuoka 812-8582, Japan |
publisher | Tokyo: Fuji Technology Press Co., Ltd. |
rights | Copyright © 2022 Fuji Technology Press Ltd. |
identifier | ISSN: 1343-0130 |
ispartof | Journal of advanced computational intelligence and intelligent informatics, 2022-03, Vol.26 (2), p.188-195 |
issn | 1343-0130 1883-8014 |
language | eng |
source | Directory of Open Access Journals |
subjects | Alertness; Behavior rating scales; Circadian rhythms; Classification; Face recognition; Feature extraction; Histograms; Image classification; Machine learning; Medical imaging; Support vector machines; Video data |
title | Automatic Neonatal Alertness State Classification Based on Facial Expression Recognition |