
CNN-XGBoost fusion-based affective state recognition using EEG spectrogram image analysis

Bibliographic Details
Published in: Scientific Reports, 2022-08, Vol. 12 (1), p. 14122, Article 14122
Main Authors: Khan, Md. Sakib; Salsabil, Nishat; Alam, Md. Golam Rabiul; Dewan, M. Ali Akber; Uddin, Md. Zia
Format: Article
Language: English
Subjects: 639/166/985; 639/705; Arousal; Dominance; EEG; Emotional behavior; Fourier analysis; Fourier transforms; Humanities and Social Sciences; Image processing; multidisciplinary; Neural networks; Science (multidisciplinary); Support vector machines
Online Access: Get full text

Description: Recognizing the emotional state of a human from brain signals is an active research domain with several open challenges. In this research, we propose a signal-spectrogram-image-based CNN-XGBoost fusion method for recognising three dimensions of emotion, namely arousal (calm or excitement), valence (positive or negative feeling) and dominance (without control or empowered). We used a benchmark dataset called DREAMER, in which EEG signals were collected under multiple stimuli along with self-evaluation ratings.

In the proposed method, we first compute the Short-Time Fourier Transform (STFT) of the EEG signals and convert the result into RGB images to obtain the spectrograms. We then train a two-dimensional Convolutional Neural Network (CNN) on the spectrogram images and retrieve features from a dense layer of the trained network. An Extreme Gradient Boosting (XGBoost) classifier is applied to the extracted CNN features to classify the signals into arousal, valence and dominance levels of human emotion.

We compare our results with feature-fusion-based state-of-the-art approaches to emotion recognition. To do this, we applied various feature extraction techniques to the signals, including the Fast Fourier Transform, the Discrete Cosine Transform, Poincaré features, Power Spectral Density, Hjorth parameters and several statistical features. Additionally, we used Chi-square and Recursive Feature Elimination (RFE) techniques to select the discriminative features. We formed feature vectors by feature-level fusion and applied Support Vector Machine (SVM) and XGBoost classifiers to the fused features to classify the different emotion levels. The performance study shows that the proposed spectrogram-image-based CNN-XGBoost fusion method outperforms the feature-fusion-based SVM and XGBoost methods, obtaining accuracies of 99.712% for arousal, 99.770% for valence and 99.770% for dominance in human emotion detection.
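
For illustration, the sketch below shows how a single EEG channel could be turned into an RGB spectrogram image via the STFT, as the description outlines. It assumes the 128 Hz EEG sampling rate of the DREAMER dataset; the window length, overlap, output resolution and colour map are assumptions made for the sketch, not the parameters reported in the paper.

```python
# Sketch: convert one EEG channel to an RGB spectrogram image via the STFT.
# Window length, overlap and colormap are illustrative assumptions.
import numpy as np
from scipy.signal import stft
import matplotlib.pyplot as plt

fs = 128                          # DREAMER EEG sampling rate (Hz)
eeg = np.random.randn(fs * 60)    # placeholder for one 60 s EEG channel

f, t, Zxx = stft(eeg, fs=fs, nperseg=256, noverlap=192)
power_db = 20 * np.log10(np.abs(Zxx) + 1e-12)   # log-magnitude spectrogram

# Render as an RGB image (~224x224 px, a common 2-D CNN input size).
plt.figure(figsize=(2.24, 2.24), dpi=100)
plt.pcolormesh(t, f, power_db, shading="gouraud")
plt.axis("off")
plt.savefig("spectrogram.png", bbox_inches="tight", pad_inches=0)
plt.close()
```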
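The CNN-plus-XGBoost stage could then look roughly like the following. The record does not specify the paper's network architecture or hyperparameters, so the layer sizes, the dense "feature_layer", the XGBoost settings and the random placeholder data are all illustrative assumptions.

```python
# Sketch: train a small 2-D CNN on spectrogram images, take the activations
# of a dense layer as features, and classify those features with XGBoost.
# Architecture and hyperparameters are placeholders, not the paper's.
import numpy as np
from tensorflow.keras import layers, models
from xgboost import XGBClassifier

X = np.random.rand(200, 224, 224, 3).astype("float32")  # spectrogram images
y = np.random.randint(0, 2, size=200)                    # e.g. low/high arousal

inputs = layers.Input(shape=(224, 224, 3))
x = layers.Conv2D(16, 3, activation="relu")(inputs)
x = layers.MaxPooling2D()(x)
x = layers.Conv2D(32, 3, activation="relu")(x)
x = layers.MaxPooling2D()(x)
x = layers.Flatten()(x)
feat = layers.Dense(128, activation="relu", name="feature_layer")(x)
outputs = layers.Dense(2, activation="softmax")(feat)

cnn = models.Model(inputs, outputs)
cnn.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
cnn.fit(X, y, epochs=1, batch_size=16, verbose=0)

# Pull features from the trained dense layer and hand them to XGBoost.
feature_extractor = models.Model(inputs, feat)
features = feature_extractor.predict(X, verbose=0)

xgb = XGBClassifier(n_estimators=100, max_depth=4)
xgb.fit(features, y)
print("train accuracy:", xgb.score(features, y))
```

In this arrangement the CNN acts purely as a learned feature extractor and XGBoost replaces the softmax head as the final classifier, which is the "fusion" the title refers to.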
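The feature-fusion baseline described above (hand-crafted features, feature-level fusion, Chi-square or RFE selection, then SVM/XGBoost) might be prototyped as follows. The feature definitions are simplified stand-ins (a few FFT magnitudes, Hjorth parameters and basic statistics) rather than the paper's full feature set, the selection sizes are arbitrary, and only the SVM branch is shown; XGBoost would slot in the same way.

```python
# Sketch of the feature-fusion baseline: hand-crafted EEG features are
# concatenated (feature-level fusion), a subset is selected with Chi-square
# or RFE, and an SVM is trained on the selected features.
import numpy as np
from sklearn.feature_selection import SelectKBest, chi2, RFE
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC

def hjorth(x):
    # Hjorth activity, mobility, complexity of a 1-D signal.
    dx, ddx = np.diff(x), np.diff(x, 2)
    activity = np.var(x)
    mobility = np.sqrt(np.var(dx) / activity)
    complexity = np.sqrt(np.var(ddx) / np.var(dx)) / mobility
    return activity, mobility, complexity

def features(x):
    spectrum = np.abs(np.fft.rfft(x))                  # FFT magnitude features
    stats = [x.mean(), x.std(), x.min(), x.max()]      # simple statistics
    return np.concatenate([spectrum[:32], hjorth(x), stats])  # feature-level fusion

X = np.vstack([features(np.random.randn(1280)) for _ in range(200)])  # 10 s @ 128 Hz
y = np.random.randint(0, 2, size=200)

# Chi-square requires non-negative inputs, hence the min-max scaling.
X_scaled = MinMaxScaler().fit_transform(X)
X_chi = SelectKBest(chi2, k=20).fit_transform(X_scaled, y)

# Alternatively: recursive feature elimination with a linear SVM.
X_rfe = RFE(SVC(kernel="linear"), n_features_to_select=20).fit_transform(X_scaled, y)

svm = SVC(kernel="rbf").fit(X_chi, y)
print("SVM train accuracy:", svm.score(X_chi, y))
```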
DOI: 10.1038/s41598-022-18257-x
ISSN: 2045-2322
Source: Publicly Available Content Database; PubMed Central; Free Full-Text Journals in Chemistry; Springer Nature - nature.com Journals - Fully Open Access