An Investigation of Olfactory-enhanced Video on EEG-based Emotion Recognition
Collecting emotional physiological signals is essential for building affective human-computer interaction (HCI) systems. However, how to evoke subjects' emotions efficiently in EEG-based emotion experiments remains a challenge. In this work, we developed a novel experimental paradigm that allows...
Published in: IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2023-01, Vol. 31, p. 1-1
Main Authors: Wu, Minchao; Teng, Wei; Fan, Cunhang; Pei, Shengbing; Li, Ping; Lv, Zhao
Format: Article
Language: English
Abstract: Collecting emotional physiological signals is essential for building affective human-computer interaction (HCI) systems. However, how to evoke subjects' emotions efficiently in EEG-based emotion experiments remains a challenge. In this work, we developed a novel experimental paradigm that allows odors to participate dynamically in different stages of video-evoked emotions, in order to investigate the efficiency of olfactory-enhanced videos in inducing subjects' emotions. According to the period in which the odors participated, the stimuli were divided into four patterns: the olfactory-enhanced video in the early/late stimulus period (OVEP/OVLP) and the traditional video in the early/late stimulus period (TVEP/TVLP). The differential entropy (DE) feature and four classifiers were employed to test emotion-recognition efficiency. The best average accuracies of the OVEP, OVLP, TVEP, and TVLP were 50.54%, 51.49%, 40.22%, and 57.55%, respectively. The experimental results indicated that the OVEP significantly outperformed the TVEP in classification performance, while there was no significant difference between the OVLP and TVLP. In addition, olfactory-enhanced videos were more efficient than traditional videos at evoking negative emotions. Moreover, we found that the neural patterns in response to emotions were stable across the different stimulus methods, and that for Fp1, Fp2, and F7 there were significant differences depending on whether odors were adopted.
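The differential entropy (DE) feature mentioned in the abstract is conventionally computed per frequency band under a Gaussian assumption, where DE reduces to 0.5·ln(2πeσ²). A minimal sketch of that computation follows; the band boundaries, filter order, and sampling rate are illustrative assumptions, not values taken from the paper:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def differential_entropy(x):
    # DE of a Gaussian-distributed signal: 0.5 * ln(2 * pi * e * sigma^2)
    return 0.5 * np.log(2 * np.pi * np.e * np.var(x))

# Assumed band definitions (Hz); the paper may use different boundaries.
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 14),
         "beta": (14, 31), "gamma": (31, 50)}

def band_de_features(eeg, fs=200, bands=BANDS):
    """eeg: array of shape (n_channels, n_samples).
    Returns {band_name: (n_channels,) array of DE values}."""
    feats = {}
    for name, (lo, hi) in bands.items():
        # 4th-order Butterworth bandpass, zero-phase filtered
        b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        filtered = filtfilt(b, a, eeg, axis=-1)
        feats[name] = np.array([differential_entropy(ch) for ch in filtered])
    return feats
```

For a unit-variance Gaussian signal the DE is 0.5·ln(2πe) ≈ 1.42 nats, which gives a quick sanity check on the implementation.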
DOI: 10.1109/TNSRE.2023.3253866
Identifiers: ISSN 1534-4320; EISSN 1558-0210; PMID 37028354
Source: Alma/SFX Local Collection
Subjects: Brain; EEG; Efficiency; Electroencephalogram (EEG); Electroencephalography; Electroencephalography - methods; Emotion recognition; Emotions; Emotions - physiology; Entropy; Human computer interaction; Human-computer interface (HCI); Humans; Neural pattern; Odors; Olfactory; Olfactory-enhanced video; Physiology; Recognition, Psychology; Video; Visualization