ECG-Signal Multi-Classification Model Based on Squeeze-and-Excitation Residual Neural Networks
Published in: Applied Sciences, 2020-09, Vol. 10 (18), p. 6495
Main Authors:
Format: Article
Language: English
Subjects:
Summary: Accurate electrocardiogram (ECG) interpretation is crucial in the clinical ECG workflow because an abnormal ECG is often associated with a disease that can cause serious problems in the body. In this study, we proposed an ECG-signal multi-classification model using deep learning. We used a squeeze-and-excitation residual network (SE-ResNet), which is a residual network (ResNet) with a squeeze-and-excitation block. Experiments were performed on seven types of lead-II ECG data obtained from the Korea University Anam Hospital in South Korea: normal sinus rhythm, atrial fibrillation, atrial flutter, sinus bradycardia, sinus tachycardia, premature ventricular contraction, and first-degree atrioventricular block. We compared the SE-ResNet with a ResNet baseline at various depths (18/34/50/101/152 layers) and confirmed that the SE-ResNet achieved better classification performance than the ResNet at every depth. The SE-ResNet classifier with 152 layers achieved an F1 score of 97.05% for the seven-class classification, surpassing the ResNet baseline by 1.40%. Considering the F1 scores, the SE-ResNet may be better suited than the ResNet baseline for ECG-signal multi-classification.
ISSN: 2076-3417
DOI: 10.3390/app10186495
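For readers unfamiliar with the squeeze-and-excitation mechanism that distinguishes the SE-ResNet from the plain ResNet baseline, the sketch below shows a minimal 1-D SE block in PyTorch. It is an illustration only: the class name, channel counts, reduction ratio of 16, and the choice of PyTorch are assumptions, since the abstract does not specify the authors' implementation details.

```python
import torch
import torch.nn as nn


class SEBlock1d(nn.Module):
    """Squeeze-and-excitation block for 1-D feature maps (e.g., lead-II ECG).

    The reduction ratio of 16 follows the original SE-Net paper and is an
    assumption; the exact hyperparameters are not given in the abstract.
    """

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.squeeze = nn.AdaptiveAvgPool1d(1)      # global average over the time axis
        self.excite = nn.Sequential(                # two FC layers produce channel weights
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, length)
        b, c, _ = x.shape
        w = self.squeeze(x).view(b, c)              # squeeze: (batch, channels)
        w = self.excite(w).view(b, c, 1)            # excitation: weights in (0, 1)
        return x * w                                # recalibrate the channel responses


if __name__ == "__main__":
    # Hypothetical shapes: a 10-second lead-II segment at 500 Hz, mapped to
    # 64 feature channels by earlier convolutional layers.
    feats = torch.randn(8, 64, 5000)
    print(SEBlock1d(64)(feats).shape)               # torch.Size([8, 64, 5000])
```

In an SE-ResNet, a block like this is typically applied to the output of each residual block's convolutional branch before the skip connection is added, letting the network reweight feature channels at negligible extra cost.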