
Refined Self-Attention Transformer Model for ECG-Based Arrhythmia Detection

Bibliographic Details
Published in: IEEE Transactions on Instrumentation and Measurement, 2024, Vol. 73, pp. 1-14
Main Authors: Tao, Yanyun; Xu, Biao; Zhang, Yuzhen
Format: Article
Language:English
Description
Summary: As the length of electrocardiogram (ECG) sequences increases, most current transformer models demand substantial computational resources for ECG arrhythmia detection. Additionally, conventional single-scale tokens have difficulty accommodating the varied patterns of arrhythmia. Thus, in this study, a refined-attention transformer model for arrhythmia detection is proposed. Our model introduces two refined attention mechanisms, namely refined diag- and gated linear (GAL) attention, which effectively alleviate the computational burden caused by unnecessary correlations between heartbeats. To address both rhythmic and beat-pattern arrhythmias, we use two refined transformer models with a collaborative block, leveraging coarse- and fine-grained tokens to capture inter- and intra-heartbeat correlations. The collaborative block between the two models facilitates the exchange of rhythm information, thereby improving the accuracy of beat detection. On the MIT-BIH dataset, our refined attentions yield more than a 65% reduction in computational cost compared with conventional self-attention. Notably, our refined transformer models achieve 96% accuracy in rhythm detection and rank among the top two performers for all heartbeat types. Moreover, the collaborative block improves recall by 8.8% and precision by 3.4% for atrial premature beat detection.
ISSN: 0018-9456
1557-9662
DOI: 10.1109/TIM.2024.3400302
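
Note: This record's abstract does not give the exact formulation of the paper's refined diag- and GAL attentions. As a rough illustration of where the reported compute savings can come from, below is a minimal PyTorch sketch of a generic gated linear-attention layer (the class name GatedLinearAttention and all details are hypothetical, not taken from the paper). Linear attention replaces the O(n^2 d) cost of softmax self-attention over n heartbeat tokens with an O(n d^2) computation, the kind of reduction consistent with the abstract's figure of over 65%.

import torch
import torch.nn as nn
import torch.nn.functional as F

class GatedLinearAttention(nn.Module):
    """Generic gated linear-attention sketch (not the authors' implementation).

    Standard linear attention computes phi(Q) (phi(K)^T V) instead of
    softmax(Q K^T) V, avoiding the n x n attention matrix over heartbeat
    tokens. A per-token sigmoid gate modulates the output.
    """

    def __init__(self, dim: int):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        self.gate = nn.Linear(dim, dim)  # per-token multiplicative gate

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_tokens, dim); one token per heartbeat (coarse scale)
        # or per intra-beat segment (fine scale), mirroring the two-branch
        # coarse/fine token design described in the abstract.
        q = F.elu(self.q(x)) + 1.0  # positive feature map phi(.)
        k = F.elu(self.k(x)) + 1.0
        v = self.v(x)
        kv = torch.einsum("bnd,bne->bde", k, v)                 # O(n d^2)
        z = 1.0 / (torch.einsum("bnd,bd->bn", q, k.sum(dim=1)) + 1e-6)
        out = torch.einsum("bnd,bde,bn->bne", q, kv, z)         # normalized
        return torch.sigmoid(self.gate(x)) * out                # gated output

# Usage: a batch of 2 recordings, 30 heartbeat tokens of width 64.
x = torch.randn(2, 30, 64)
attn = GatedLinearAttention(64)
print(attn(x).shape)  # torch.Size([2, 30, 64])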