
Uncertainty-Aware Multi-view Arrhythmia Classification from ECG


Bibliographic Details
Main Authors: Ashhad, Mohd; Rahmani, Sana; Fayiz, Mohammed; Etemad, Ali; Hashemi, Javad
Format: Conference Proceeding
Language: English
Subjects:
Online Access: Request full text
cited_by
cites
container_end_page 6
container_issue
container_start_page 1
container_title
container_volume
creator Ashhad, Mohd
Rahmani, Sana
Fayiz, Mohammed
Etemad, Ali
Hashemi, Javad
description We propose a deep neural architecture that performs uncertainty-aware multi-view classification of arrhythmia from ECG. Our method learns two different views (1D and 2D) of single-lead ECG to capture different types of information. We use a fusion technique to reduce the conflict between the different views caused by noise and artifacts in ECG data, thus incorporating uncertainty to obtain stronger final predictions. Our framework contains the following three modules: (1) a time-series module to learn the morphological features from ECG; (2) an image-space learning module to learn the spatiotemporal features; and (3) an uncertainty-aware fusion module to fuse the information from the two views. Experimental results on two real-world datasets demonstrate that our framework not only improves arrhythmia classification performance compared to the state of the art but also shows better robustness to noise and artifacts present in ECG. (An illustrative sketch of such a two-view, uncertainty-weighted design appears after the record fields below.)
doi_str_mv 10.1109/IJCNN60899.2024.10650766
format conference_proceeding
fulltext fulltext_linktorsrc
identifier EISSN: 2161-4407
ispartof 2024 International Joint Conference on Neural Networks (IJCNN), 2024, p.1-6
issn 2161-4407
language eng
recordid cdi_ieee_primary_10650766
source IEEE Xplore All Conference Series
subjects Arrhythmia
ECG
Electrocardiography
Fuses
heartbeat classification
multi-view learning
Neural networks
Noise
Robustness
Uncertainty
uncertainty-aware fusion
title Uncertainty-Aware Multi-view Arrhythmia Classification from ECG
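As an illustration of the architecture summarized in the description field above, the following is a minimal, hypothetical PyTorch-style sketch of a two-view (1D time-series and 2D image) ECG classifier with a simple uncertainty-weighted fusion. All class names, layer sizes, and the entropy-based fusion rule are assumptions made for this sketch and are not taken from the paper; the authors' actual modules and fusion method may differ.

# Illustrative sketch only (not the authors' implementation): a generic
# two-view ECG classifier with a simple uncertainty-weighted fusion.
# Module names, layer sizes, and the entropy-based weighting are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TimeSeriesBranch(nn.Module):
    """1D view: learns morphological features from the raw single-lead signal."""
    def __init__(self, n_feat=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(32, n_feat, kernel_size=7, padding=3), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
        )

    def forward(self, x):  # x: (batch, 1, n_samples)
        return self.net(x)


class ImageBranch(nn.Module):
    """2D view: learns spatiotemporal features from an image-like transform of the ECG."""
    def __init__(self, n_feat=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, n_feat, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )

    def forward(self, x):  # x: (batch, 1, height, width)
        return self.net(x)


class UncertaintyWeightedFusion(nn.Module):
    """Fuses per-view class distributions, weighting the less uncertain view higher.

    Predictive entropy is used here as a crude uncertainty proxy; the paper's
    fusion rule may differ.
    """
    def __init__(self, n_feat, n_classes):
        super().__init__()
        self.head_1d = nn.Linear(n_feat, n_classes)
        self.head_2d = nn.Linear(n_feat, n_classes)

    def forward(self, feat_1d, feat_2d):
        p1 = F.softmax(self.head_1d(feat_1d), dim=-1)
        p2 = F.softmax(self.head_2d(feat_2d), dim=-1)
        # Entropy per view; lower entropy -> larger fusion weight.
        h1 = -(p1 * p1.clamp_min(1e-8).log()).sum(dim=-1, keepdim=True)
        h2 = -(p2 * p2.clamp_min(1e-8).log()).sum(dim=-1, keepdim=True)
        w = torch.softmax(torch.cat([-h1, -h2], dim=-1), dim=-1)  # (batch, 2)
        return w[:, :1] * p1 + w[:, 1:] * p2  # fused class probabilities


class TwoViewECGClassifier(nn.Module):
    """Combines the 1D and 2D branches through the uncertainty-weighted fusion."""
    def __init__(self, n_classes=5, n_feat=64):
        super().__init__()
        self.view_1d = TimeSeriesBranch(n_feat)
        self.view_2d = ImageBranch(n_feat)
        self.fusion = UncertaintyWeightedFusion(n_feat, n_classes)

    def forward(self, signal, image):
        return self.fusion(self.view_1d(signal), self.view_2d(image))


# Usage with dummy shapes: a 1 s single-lead segment at 360 Hz and a 64x64 image view.
model = TwoViewECGClassifier(n_classes=5)
probs = model(torch.randn(8, 1, 360), torch.randn(8, 1, 64, 64))  # -> (8, 5)

In this sketch each view produces its own class distribution, and the lower-entropy (less uncertain) view receives the larger weight in the fused prediction, loosely mirroring the idea of down-weighting a view whose evidence conflicts because of noise or artifacts.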