Self-Supervised Learning to Increase the Performance of Skin Lesion Classification
Published in: | Electronics (Basel) 2020-11, Vol.9 (11), p.1930 |
---|---|
Main Authors: | Kwasigroch, Arkadiusz; Grochowski, Michał; Mikołajczyk, Agnieszka |
Format: | Article |
Language: | English |
Subjects: | Algorithms; Artificial neural networks; Classification; Datasets; Deep learning; Machine learning; Medical research; Neural networks; Self-supervised learning; Training |
container_issue | 11 |
---|---|
container_start_page | 1930 |
container_title | Electronics (Basel) |
container_volume | 9 |
creator | Kwasigroch, Arkadiusz; Grochowski, Michał; Mikołajczyk, Agnieszka
description | To successfully train a deep neural network, a large amount of human-labeled data is required. Unfortunately, in many areas, collecting and labeling data is a difficult and tedious task. Several ways have been developed to mitigate the problems caused by a shortage of data, the most common of which is transfer learning. However, in many cases, transfer learning alone is an insufficient remedy. In this study, we improve the training of deep neural models and increase classification accuracy under data scarcity by using self-supervised learning. Self-supervised learning allows an unlabeled dataset to be used for pretraining the network, as opposed to transfer learning, which requires labeled datasets. The pretrained network can then be fine-tuned on the annotated data. Moreover, we investigated the effect of combining self-supervised learning with transfer learning, and show that this combined strategy outperforms training the network from scratch or with transfer learning alone. The tests were conducted on an important and sensitive application (skin lesion classification), but the presented approach can be applied to a broader family of applications, especially in the medical domain, where the scarcity of data is a real problem. (A brief illustrative sketch of this pretraining-and-fine-tuning workflow appears after the record below.)
doi_str_mv | 10.3390/electronics9111930 |
format | article |
fulltext | fulltext |
identifier | ISSN: 2079-9292 |
ispartof | Electronics (Basel), 2020-11, Vol.9 (11), p.1930 |
issn | 2079-9292
language | eng |
recordid | cdi_proquest_journals_2462829696 |
source | Publicly Available Content (ProQuest) |
subjects | Algorithms; Artificial neural networks; Classification; Datasets; Deep learning; Machine learning; Medical research; Neural networks; Self-supervised learning; Training
title | Self-Supervised Learning to Increase the Performance of Skin Lesion Classification |
url | http://sfxeu10.hosted.exlibrisgroup.com/loughborough?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-02-05T23%3A59%3A10IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Self-Supervised%20Learning%20to%20Increase%20the%20Performance%20of%20Skin%20Lesion%20Classification&rft.jtitle=Electronics%20(Basel)&rft.au=Kwasigroch,%20Arkadiusz&rft.date=2020-11-01&rft.volume=9&rft.issue=11&rft.spage=1930&rft.pages=1930-&rft.issn=2079-9292&rft.eissn=2079-9292&rft_id=info:doi/10.3390/electronics9111930&rft_dat=%3Cproquest_cross%3E2462829696%3C/proquest_cross%3E%3Cgrp_id%3Ecdi_FETCH-LOGICAL-c319t-ebe411cabdb360f3e48148424008d8dd23b1af9117e42ad1558792a2f3bffda93%3C/grp_id%3E%3Coa%3E%3C/oa%3E%3Curl%3E%3C/url%3E&rft_id=info:oai/&rft_pqid=2462829696&rft_id=info:pmid/&rfr_iscdi=true |
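The description in this record outlines a two-stage workflow: self-supervised pretraining on unlabeled images, followed by fine-tuning on the labeled skin-lesion data, optionally starting from transfer-learned (ImageNet) weights. The sketch below illustrates only that general idea; the record does not specify the authors' pretext task, backbone, datasets, or hyperparameters, so the rotation-prediction task, ResNet-50 backbone, two-class lesion head, and learning rates used here are assumptions chosen purely for illustration (PyTorch/torchvision).

```python
# Hedged sketch of the pretrain-then-fine-tune idea described in the abstract.
# Assumptions (not from the paper): rotation-prediction pretext task, ResNet-50
# backbone with ImageNet weights, a binary lesion classifier, Adam optimizers.
import torch
import torch.nn as nn
from torchvision import models


def rotate_batch(images: torch.Tensor):
    """Build the 4-way rotation pretext task: each image is rotated by
    0/90/180/270 degrees and labeled with the index of its rotation."""
    rotated, labels = [], []
    for k in range(4):
        rotated.append(torch.rot90(images, k, dims=(2, 3)))
        labels.append(torch.full((images.size(0),), k, dtype=torch.long))
    return torch.cat(rotated), torch.cat(labels)


# 1) Self-supervised pretraining on unlabeled images, starting from ImageNet
#    weights (i.e. combining self-supervision with transfer learning).
backbone = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
backbone.fc = nn.Linear(backbone.fc.in_features, 4)  # 4 rotation classes
criterion = nn.CrossEntropyLoss()
pretrain_opt = torch.optim.Adam(backbone.parameters(), lr=1e-4)


def pretrain_step(unlabeled_images: torch.Tensor) -> float:
    """One optimization step on the pretext task; no human labels needed."""
    inputs, rot_labels = rotate_batch(unlabeled_images)
    pretrain_opt.zero_grad()
    loss = criterion(backbone(inputs), rot_labels)
    loss.backward()
    pretrain_opt.step()
    return loss.item()


# 2) Fine-tuning: swap the pretext head for a lesion classifier (assumed
#    benign vs. malignant) and continue training on the labeled subset.
backbone.fc = nn.Linear(backbone.fc.in_features, 2)
finetune_opt = torch.optim.Adam(backbone.parameters(), lr=1e-5)


def finetune_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    """One supervised optimization step on the annotated skin-lesion data."""
    finetune_opt.zero_grad()
    loss = criterion(backbone(images), labels)
    loss.backward()
    finetune_opt.step()
    return loss.item()
```

In this arrangement the pretext head is discarded after stage 1 and only the backbone weights carry over to the fine-tuning stage, which is how the unlabeled images end up contributing to the final lesion classifier.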