MSLANet: multi-scale long attention network for skin lesion classification
Skin cancer is one of the most widespread and deadly cancers. Convolutional neural networks (CNNs) have been widely used for classifying dermoscopy lesions. Despite remarkable progress, accurate classification of skin lesions remains challenging due to insufficient training data, the similarity between melanoma and nevus, and weak robustness. To address these issues, we propose a multi-scale long attention network (MSLANet) for skin lesion classification in dermoscopy images, composed of three long attention networks (LANets). Each LANet fuses context information and improves discriminative representation ability through a long attention mechanism. Moreover, multi-scale views of lesions are extracted by self-supervised learning, requiring no additional annotation, so MSLANet simultaneously exploits feature-level and instance-level multi-scale information. In addition, we propose a depth data augmentation (DDA) strategy; training with DDA further improves the generalization ability of the model. Our method achieves a rank-1 average AUC of 93.7% on the ISIC 2017 dataset and an AUC of 92.4% on the SIIM-ISIC 2020 dataset, outperforming state-of-the-art methods.
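This record only summarizes the architecture, so the following is a speculative, minimal NumPy sketch of the general idea the abstract describes — an attention map re-weights spatial features within each scale, and per-scale class logits are fused across scales. Every function name, weight shape, and the averaging fusion rule here are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def lanet_branch(feat, w_attn, w_cls):
    """One hypothetical LANet branch: a global ("long") attention map
    over all spatial positions re-weights features before pooling."""
    # feat: (H*W, C) flattened spatial feature map
    scores = feat @ w_attn              # (H*W, 1) attention logits
    attn = softmax(scores, axis=0)      # normalize over spatial positions
    pooled = (attn * feat).sum(axis=0)  # attention-weighted global pooling -> (C,)
    return pooled @ w_cls               # class logits -> (n_classes,)

def mslanet(feats_per_scale, w_attn, w_cls):
    """Fuse the three scales by averaging branch logits (fusion rule assumed)."""
    logits = [lanet_branch(f, w_attn, w_cls) for f in feats_per_scale]
    return softmax(np.mean(logits, axis=0))

rng = np.random.default_rng(0)
C, n_classes = 8, 3
w_attn = rng.normal(size=(C, 1))
w_cls = rng.normal(size=(C, n_classes))
# three crops of the same lesion at different scales -> different spatial sizes
feats = [rng.normal(size=(hw, C)) for hw in (49, 25, 9)]
probs = mslanet(feats, w_attn, w_cls)
print(probs.shape)  # class probabilities over 3 classes, summing to 1
```

Attention-weighted pooling lets each branch focus on lesion regions regardless of crop size, which is why branches over crops of different spatial resolution can share the same classifier head.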
Published in: | Applied intelligence (Dordrecht, Netherlands), 2023-05, Vol.53 (10), p.12580-12598 |
---|---|
Main Authors: | Wan, Yecong; Cheng, Yuanshuo; Shao, Mingwen |
Format: | Article |
Language: | English |
container_end_page | 12598 |
container_issue | 10 |
container_start_page | 12580 |
container_title | Applied intelligence (Dordrecht, Netherlands) |
container_volume | 53 |
creator | Wan, Yecong; Cheng, Yuanshuo; Shao, Mingwen |
description | Skin cancer is one of the most widespread and deadly cancers. Convolutional neural networks (CNNs) have been widely used for classifying dermoscopy lesions. Despite remarkable progress, accurate classification of skin lesions remains challenging due to insufficient training data, the similarity between melanoma and nevus, and weak robustness. To address these issues, we propose a multi-scale long attention network (MSLANet) for skin lesion classification in dermoscopy images, composed of three long attention networks (LANets). Each LANet fuses context information and improves discriminative representation ability through a long attention mechanism. Moreover, multi-scale views of lesions are extracted by self-supervised learning, requiring no additional annotation, so MSLANet simultaneously exploits feature-level and instance-level multi-scale information. In addition, we propose a depth data augmentation (DDA) strategy; training with DDA further improves the generalization ability of the model. Our method achieves a rank-1 average AUC of 93.7% on the ISIC 2017 dataset and an AUC of 92.4% on the SIIM-ISIC 2020 dataset, outperforming state-of-the-art methods. |
doi_str_mv | 10.1007/s10489-022-03320-x |
format | article |
fulltext | fulltext |
identifier | ISSN: 0924-669X |
ispartof | Applied intelligence (Dordrecht, Netherlands), 2023-05, Vol.53 (10), p.12580-12598 |
issn | 0924-669X (print); 1573-7497 (electronic) |
language | eng |
recordid | cdi_proquest_journals_2816233575 |
source | ABI/INFORM Global; Springer Nature:Jisc Collections:Springer Nature Read and Publish 2023-2025: Springer Reading List |
subjects | Annotations; Artificial Intelligence; Artificial neural networks; Cancer; Classification; Computer Science; Data augmentation; Datasets; Image classification; Lesions; Machine learning; Machines; Manufacturing; Mechanical Engineering; Processes; Self-supervised learning; Skin cancer; Training |
title | MSLANet: multi-scale long attention network for skin lesion classification |
url | http://sfxeu10.hosted.exlibrisgroup.com/loughborough?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-28T02%3A12%3A20IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=MSLANet:%20multi-scale%20long%20attention%20network%20for%20skin%20lesion%20classification&rft.jtitle=Applied%20intelligence%20(Dordrecht,%20Netherlands)&rft.au=Wan,%20Yecong&rft.date=2023-05-01&rft.volume=53&rft.issue=10&rft.spage=12580&rft.epage=12598&rft.pages=12580-12598&rft.issn=0924-669X&rft.eissn=1573-7497&rft_id=info:doi/10.1007/s10489-022-03320-x&rft_dat=%3Cproquest_cross%3E2816233575%3C/proquest_cross%3E%3Cgrp_id%3Ecdi_FETCH-LOGICAL-c319t-affe57254f75b61acaca257bff0d883dc7d2e79da0bb623eb0431077031033b23%3C/grp_id%3E%3Coa%3E%3C/oa%3E%3Curl%3E%3C/url%3E&rft_id=info:oai/&rft_pqid=2816233575&rft_id=info:pmid/&rfr_iscdi=true |