EEG-Based Brain-Computer Interface for Decoding Motor Imagery Tasks within the Same Hand Using Choi-Williams Time-Frequency Distribution

This paper presents an EEG-based brain-computer interface system for classifying eleven motor imagery (MI) tasks within the same hand. The proposed system utilizes the Choi-Williams time-frequency distribution (CWD) to construct a time-frequency representation (TFR) of the EEG signals. The constructed TFR is used to extract five categories of time-frequency features (TFFs). The TFFs are processed using a hierarchical classification model to identify the MI task encapsulated within the EEG signals. To evaluate the performance of the proposed approach, EEG data were recorded for eighteen intact subjects and four amputated subjects while imagining performing each of the eleven hand MI tasks. Two performance evaluation analyses, namely channel- and TFF-based analyses, are conducted to identify the best subset of EEG channels and the TFF category, respectively, that enable the highest classification accuracy between the MI tasks. In each evaluation analysis, the hierarchical classification model is trained using two training procedures, namely subject-dependent and subject-independent procedures. These two training procedures quantify the capability of the proposed approach to capture both intra- and inter-personal variations in the EEG signals for different MI tasks within the same hand. The results demonstrate the efficacy of the approach for classifying the MI tasks within the same hand. In particular, the classification accuracies obtained for the intact and amputated subjects are as high as 88.8% and 90.2%, respectively, for the subject-dependent training procedure, and 80.8% and 87.8%, respectively, for the subject-independent training procedure. These results suggest the feasibility of applying the proposed approach to control dexterous prosthetic hands, which can be of great benefit for individuals suffering from hand amputations.

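The record contains no code, but the pipeline summarized in the abstract begins by turning each EEG channel into a Choi-Williams time-frequency representation before features are extracted and classified. The following is a minimal NumPy/SciPy sketch of a discrete Choi-Williams distribution computed via the ambiguity domain; the kernel width sigma, the sampling rate, the normalization, and the synthetic test epoch are illustrative assumptions and do not reproduce the authors' implementation, channel selection, or their five feature categories.

```python
# Minimal sketch (not the authors' code): discrete Choi-Williams distribution
# of one EEG epoch, computed through the ambiguity domain.
import numpy as np
from scipy.signal import hilbert


def choi_williams(x, sigma=0.5):
    """Discrete Choi-Williams time-frequency distribution of a real 1-D signal.

    Returns an (N x N) real array: rows index frequency (roughly 0 .. fs/2 for
    the analytic signal), columns index time samples. Normalization and the
    Doppler/lag scaling of the kernel are simplified for readability.
    """
    z = hilbert(np.asarray(x, dtype=float))    # analytic signal reduces aliasing
    N = z.size

    # Instantaneous autocorrelation K[n, m] = z[n+m] * conj(z[n-m]),
    # with negative lags stored FFT-style at the end of the lag axis.
    K = np.zeros((N, N), dtype=complex)
    for n in range(N):
        mmax = min(n, N - 1 - n, N // 2 - 1)
        for m in range(-mmax, mmax + 1):
            K[n, m % N] = z[n + m] * np.conj(z[n - m])

    # Ambiguity function: FFT over time gives the Doppler axis for each lag.
    A = np.fft.fft(K, axis=0)

    # Choi-Williams kernel exp(-(theta * tau)^2 / sigma): close to 1 near the
    # Doppler and lag axes (auto-terms), small elsewhere (cross-terms damped).
    theta = 2.0 * np.pi * np.fft.fftfreq(N)    # angular Doppler (normalized)
    tau = np.fft.fftfreq(N) * N                # signed lag indices
    Phi = np.exp(-(np.outer(theta, tau) ** 2) / sigma)

    # Back to the time-lag plane, then FFT over lag yields time-frequency.
    K_smooth = np.fft.ifft(A * Phi, axis=0)
    tfr = np.real(np.fft.fft(K_smooth, axis=1))   # shape: (time, frequency)
    return tfr.T                                  # (frequency, time)


if __name__ == "__main__":
    # Illustrative 2-second epoch at 256 Hz: a 10 Hz (mu-band) burst plus noise.
    fs = 256
    t = np.arange(0, 2, 1.0 / fs)
    epoch = np.sin(2 * np.pi * 10 * t) * np.exp(-((t - 1.0) ** 2) / 0.1)
    epoch += 0.1 * np.random.randn(t.size)
    tfr = choi_williams(epoch, sigma=0.5)
    # Bin (k, n) corresponds to roughly k * fs / (2 * N) Hz at time n / fs s.
    print(tfr.shape)
```

Shrinking sigma strengthens the exponential kernel's suppression of cross-terms at the cost of some time-frequency resolution, which is the trade-off that distinguishes the Choi-Williams distribution from the plain Wigner-Ville distribution.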
Bibliographic Details
Published in: Sensors (Basel, Switzerland), 2017-08, Vol. 17 (9), p. 1937
Main Authors: Alazrai, Rami; Alwanni, Hisham; Baslan, Yara; Alnuman, Nasim; Daoud, Mohammad I
Format: Article
Language: English
DOI: 10.3390/s17091937
PMID: 28832513
ISSN: 1424-8220
EISSN: 1424-8220
Source: Publicly Available Content (ProQuest); PubMed Central
Subjects: Brain-Computer Interfaces
Choi-Williams time-frequency distribution
Classification
Decoding
Electroencephalography
Frequency distribution
Hand
hierarchical classification
Human-computer interface
Humans
Imagery
Imagination
motor imagery
Performance evaluation
Prostheses
subject-independent analysis
support vector machines
time-frequency features
User-Computer Interface