
Fusion of Deep Features from 2D-DOST of fNIRS Signals for Subject-Independent Classification of Motor Execution Tasks

Functional near-infrared spectroscopy (fNIRS) is a low-cost and noninvasive method to measure the hemodynamic responses of cortical brain activity and has received great attention in brain-computer interface (BCI) applications. In this paper, we present a method based on deep learning and the time-frequency map (TFM) of fNIRS signals to classify three motor execution tasks: right-hand tapping, left-hand tapping, and foot tapping. To obtain the TFM while accounting for the correlation among channels, we propose to utilize the two-dimensional discrete orthonormal Stockwell transform (2D-DOST). TFMs are computed for oxygenated hemoglobin (HbO), reduced hemoglobin (HbR), and two linear combinations of them, and three fusion schemes are proposed for combining the deep features extracted from them by a convolutional neural network (CNN). Two CNNs, LeNet and MobileNet, are considered, and their structures are modified to maximize accuracy. Because too few signals are available to train the CNNs, data augmentation based on the Wasserstein generative adversarial network (WGAN) is performed. Simulations in three-class and binary scenarios show the efficiency of the proposed method, which outperforms recently introduced methods.

Bibliographic Details
Published in: International journal of intelligent systems, 2023, Vol. 2023 (1)
Main Authors: Khani, Pouya, Solouk, Vahid, Kalbkhani, Hashem, Ahmadi, Farid
Format: Article
Language:English
Subjects:
container_issue 1
container_title International journal of intelligent systems
container_volume 2023
creator Khani, Pouya
Solouk, Vahid
Kalbkhani, Hashem
Ahmadi, Farid
description Functional near-infrared spectroscopy (fNIRS) is a low-cost and noninvasive method to measure the hemodynamic responses of cortical brain activities and has received great attention in brain-computer interface (BCI) applications. In this paper, we present a method based on deep learning and the time-frequency map (TFM) of fNIRS signals to classify the three motor execution tasks including right-hand tapping, left-hand tapping, and foot tapping. To simultaneously obtain the TFM and consider the correlation among channels, we propose to utilize the two-dimensional discrete orthonormal Stockwell transform (2D-DOST). The TFMs for oxygenated hemoglobin (HbO), reduced hemoglobin (HbR), and two linear combinations of them are obtained and then we propose three fusion schemes for combining their deep information extracted by the convolutional neural network (CNN). Two CNNs, LeNet and MobileNet, are considered and their structures are modified to maximize the accuracy. Due to the lack of enough signals for training CNNs, data augmentation based on the Wasserstein generative adversarial network (WGAN) is performed. Several simulations are performed to assess the performance of the proposed method in three-class and binary scenarios. The results present the efficiency of the proposed method in different scenarios. Also, the proposed method outperforms the recently introduced methods.
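The 2D-DOST named in the description can be illustrated with a minimal sketch: a simplified dyadic DOST takes the orthonormal FFT of a power-of-two-length signal and inverse-FFTs each octave band, and the 2D version applies this separably along rows and then columns of a channel-by-time matrix. This is a hedged illustration of the general dyadic-band idea only, not the paper's exact 2D-DOST formulation (which treats positive and negative frequency bands symmetrically); the function names are illustrative.

```python
import numpy as np

def dost_1d(x):
    """Simplified dyadic DOST of a length-N signal, N a power of two.

    Orthonormal FFT, then an orthonormal inverse FFT of each octave band
    {0}, {1}, [2, 4), [4, 8), ... Each block is unitary, so the whole
    transform preserves signal energy.
    """
    N = len(x)
    X = np.fft.fft(x, norm="ortho")
    out = np.empty(N, dtype=complex)
    out[0], out[1] = X[0], X[1]          # width-1 bands: DC and first bin
    lo = 2
    while lo < N:
        hi = 2 * lo                       # octave band [lo, 2*lo)
        out[lo:hi] = np.fft.ifft(X[lo:hi], norm="ortho")
        lo = hi
    return out

def dost_2d(img):
    """Separable 2D-DOST sketch: 1D transform along rows, then columns."""
    rows = np.apply_along_axis(dost_1d, 1, img.astype(complex))
    return np.apply_along_axis(dost_1d, 0, rows)
```

Because every block of the transform is unitary, the output has the same total energy as the input, which is a quick sanity check for any DOST variant.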
doi_str_mv 10.1155/2023/3178284
format article
fullrecord ProQuest/CrossRef article record 2914324473 (source format XML)
contributor Khosravi, Mohammad R.
publisher Hindawi, New York
rights Copyright © 2023 Pouya Khani et al. This is an open access article distributed under the Creative Commons Attribution License: https://creativecommons.org/licenses/by/4.0
orcidid 0000-0001-8304-6394
orcidid 0009-0001-9963-7508
orcidid 0000-0003-2431-4920
orcidid 0000-0002-4291-1748
fulltext fulltext
identifier ISSN: 0884-8173
ispartof International journal of intelligent systems, 2023, Vol.2023 (1)
issn 0884-8173
1098-111X
language eng
recordid cdi_proquest_journals_2914324473
source Wiley Online Library Open Access; Publicly Available Content (ProQuest)
subjects Accuracy
Artificial neural networks
Brain research
Classification
Data augmentation
Datasets
Decomposition
Deep learning
Discriminant analysis
Electroencephalography
Fourier transforms
Generative adversarial networks
Hemodynamic responses
Hemodynamics
Hemoglobin
Human-computer interface
Infrared spectra
Machine learning
Medical imaging
Near infrared radiation
Neural networks
Neurosciences
Signal classification
Signal processing
Support vector machines
Wavelet transforms
title Fusion of Deep Features from 2D-DOST of fNIRS Signals for Subject-Independent Classification of Motor Execution Tasks