
MSLANet: multi-scale long attention network for skin lesion classification

Bibliographic Details
Published in: Applied Intelligence (Dordrecht, Netherlands), 2023-05, Vol. 53 (10), pp. 12580-12598
Main Authors: Wan, Yecong; Cheng, Yuanshuo; Shao, Mingwen
Format: Article
Language: English
Description
Summary: Skin cancer is one of the most widespread and deadly cancers. Convolutional neural networks (CNNs) have been widely used for classifying lesions in dermoscopy images. Despite remarkable breakthroughs, accurate classification of skin lesions remains challenging due to insufficient training data, the visual similarity between melanoma and nevus, and weak robustness. To address these issues, we propose a multi-scale long attention network (MSLANet) for skin lesion classification in dermoscopy images, composed of three long attention networks (LANets). Each LANet fuses context information and improves discriminative representation ability through the long attention mechanism. Moreover, multi-scale views of lesions can be extracted by self-supervised learning, with no special annotation needed, so MSLANet can simultaneously exploit feature-level and instance-level multi-scale information. In addition, we propose a depth data augmentation (DDA) strategy; training with DDA further improves the generalization ability of the model. Our method achieves a rank-1 average AUC of 93.7% on the ISIC 2017 dataset and an AUC of 92.4% on the SIIM-ISIC 2020 dataset, outperforming state-of-the-art methods.
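
Note: the abstract above describes the architecture only at a high level. The sketch below (PyTorch) is a minimal, hypothetical illustration of one way to wire three attention-gated CNN branches over multi-scale views and average their predictions. The class names LANetSketch and MultiScaleEnsembleSketch, all layer sizes, and the averaging fusion are assumptions for illustration only; they are not the paper's actual LANet, long attention mechanism, or DDA design.

import torch
import torch.nn as nn

class LANetSketch(nn.Module):
    """Hypothetical single-branch sketch: a small CNN backbone followed by
    a spatial attention gate that reweights its feature map. This is NOT
    the paper's LANet; it only illustrates the general attention-gating idea."""
    def __init__(self, num_classes=2):
        super().__init__()
        self.backbone = nn.Sequential(  # stand-in feature extractor
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.attn = nn.Conv2d(64, 1, 1)  # 1x1 conv -> spatial attention map
        self.head = nn.Linear(64, num_classes)

    def forward(self, x):
        f = self.backbone(x)              # (B, 64, H, W) features
        a = torch.sigmoid(self.attn(f))   # (B, 1, H, W) attention weights
        f = f * a                         # reweight features by attention
        f = f.mean(dim=(2, 3))            # global average pooling -> (B, 64)
        return self.head(f)

class MultiScaleEnsembleSketch(nn.Module):
    """Three branches, each fed a view of the lesion at a different scale;
    branch logits are fused by simple averaging (an assumed fusion rule)."""
    def __init__(self, num_classes=2):
        super().__init__()
        self.branches = nn.ModuleList(LANetSketch(num_classes) for _ in range(3))

    def forward(self, views):             # views: list of 3 image tensors
        logits = [b(v) for b, v in zip(self.branches, views)]
        return torch.stack(logits).mean(dim=0)

# Usage: three views of one dermoscopy image, e.g. crops at different scales.
model = MultiScaleEnsembleSketch()
views = [torch.randn(1, 3, 224, 224) for _ in range(3)]
print(model(views).shape)  # torch.Size([1, 2])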
ISSN: 0924-669X, 1573-7497
DOI: 10.1007/s10489-022-03320-x