
A Deep Transfer Convolutional Neural Network Framework for EEG Signal Classification

Nowadays, motor imagery (MI) electroencephalogram (EEG) signal classification has become a hotspot in the research field of brain computer interface (BCI). More recently, deep learning has emerged as a promising technique to automatically extract features of raw MI EEG signals and then classify them...

Bibliographic Details
Published in: IEEE Access, 2019, Vol.7, p.112767-112776
Main Authors: Xu, Gaowei, Shen, Xiaoang, Chen, Sirui, Zong, Yongshuo, Zhang, Canyang, Yue, Hongyang, Liu, Min, Chen, Fei, Che, Wenliang
Format: Article
Language:English
Description: Nowadays, motor imagery (MI) electroencephalogram (EEG) signal classification has become a hotspot in the research field of brain computer interface (BCI). More recently, deep learning has emerged as a promising technique to automatically extract features of raw MI EEG signals and then classify them. However, deep learning-based methods still face two challenging problems in practical MI EEG signal classification applications: (1) Generally, training a deep learning model successfully needs a large amount of labeled data. However, most of the EEG signal data is unlabeled and it is quite difficult or even impossible for human experts to label all the signal samples manually. (2) It is extremely time-consuming and computationally expensive to train a deep learning model from scratch. To cope with these two challenges, a deep transfer convolutional neural network (CNN) framework based on VGG-16 is proposed for EEG signal classification. The proposed framework consists of a VGG-16 CNN model pre-trained on ImageNet and a target CNN model which shares the same structure with VGG-16 except for the softmax output layer. The parameters of the pre-trained VGG-16 CNN model are directly transferred to the target CNN model used for MI EEG signal classification. Then, front-layers parameters in the target model are frozen, while later-layers parameters are fine-tuned on the target MI dataset. The target dataset is composed of time-frequency spectrum images of EEG signals. The performance of the proposed framework is verified on the public benchmark dataset 2b from the BCI competition IV. The experimental results show that the proposed framework improves the accuracy and efficiency of EEG signal classification compared with traditional methods, including support vector machine (SVM), artificial neural network (ANN), and standard CNN.
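The transfer scheme in the description — copy pre-trained parameters, freeze the front layers, and train only the later layers and new softmax head on the target data — can be illustrated with a toy sketch. This is a minimal numpy stand-in for the idea, not the paper's actual VGG-16 pipeline: the two-layer model, dataset sizes, learning rate, and iteration count below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for transfer learning: W1 plays the role of VGG-16's
# transferred-and-frozen front layers, W2 the re-trained softmax head.
W1 = rng.normal(size=(4, 8))
W2 = rng.normal(size=(8, 2))

def forward(X):
    h = np.maximum(X @ W1, 0.0)                  # frozen feature extractor
    logits = h @ W2
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return h, e / e.sum(axis=1, keepdims=True)   # softmax probabilities

def ce_loss(p, Y):
    return -np.mean(np.sum(Y * np.log(p + 1e-12), axis=1))

# Tiny target "dataset": 16 samples, 2 classes (hypothetical numbers).
X = rng.normal(size=(16, 4))
Y = np.eye(2)[rng.integers(0, 2, size=16)]

W1_before = W1.copy()
loss_before = ce_loss(forward(X)[1], Y)
for _ in range(500):                             # fine-tune the head only
    h, p = forward(X)
    W2 -= 0.05 * (h.T @ (p - Y)) / len(X)        # gradient step on W2 alone
loss_after = ce_loss(forward(X)[1], Y)
```

Because only `W2` is updated, the "front layers" keep their transferred values while the loss on the target data drops — the same division of labor the paper applies to VGG-16's convolutional stack versus its output layer.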
DOI: 10.1109/ACCESS.2019.2930958
ISSN: 2169-3536
Source: IEEE Xplore Open Access Journals
Subjects: Artificial neural networks
Brain modeling
Classification
Computational modeling
Datasets
Deep learning
electroencephalogram (EEG)
Electroencephalography
Feature extraction
Frequency spectrum
Human-computer interface
Image classification
Learning theory
Machine learning
Mathematical models
Motor imagery (MI)
Neural networks
Parameters
short time Fourier transform (STFT)
Signal classification
Support vector machines
Target recognition
Task analysis
Time-frequency analysis
transfer learning
VGG-16
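The "short time Fourier transform (STFT)" and "Time-frequency analysis" subjects refer to the preprocessing step in the description: EEG trials are converted to time-frequency spectrum images before being fed to the CNN. A minimal sketch with `scipy.signal.stft` on a synthetic single-channel trial — the 12 Hz tone, noise level, 4 s duration, and window length are illustrative assumptions, not the paper's exact settings (dataset 2b itself is sampled at 250 Hz):

```python
import numpy as np
from scipy.signal import stft

fs = 250                                   # Hz, sampling rate of dataset 2b
t = np.arange(4 * fs) / fs                 # one 4-second trial
rng = np.random.default_rng(0)
# Synthetic stand-in for a single-channel MI EEG trial: a 12 Hz
# mu-band oscillation buried in noise.
trial = np.sin(2 * np.pi * 12 * t) + 0.5 * rng.normal(size=t.size)

# Short-time Fourier transform -> time-frequency magnitude image,
# the kind of 2-D input a CNN can consume.
f, seg_times, Z = stft(trial, fs=fs, nperseg=64)
image = np.abs(Z)                          # shape: (n_freqs, n_segments)

# Energy should concentrate in the rows nearest 12 Hz.
peak_freq = f[image.mean(axis=1).argmax()]
```

Stacking such images (optionally per channel and per frequency band) yields the image-like inputs on which an ImageNet-pretrained network can be fine-tuned.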