
Skin Cancer Diagnosis Based on Neutrosophic Features with a Deep Neural Network

Bibliographic Details
Published in: Sensors (Basel, Switzerland), 2022-08, Vol. 22 (16), p. 6261
Main Authors: Singh, Sumit Kumar; Abolghasemi, Vahid; Anisi, Mohammad Hossein
Format: Article
Language: English
Description: Recent years have witnessed an increase in the total number of skin cancer cases, a number that is projected to grow exponentially. This paper proposes a computer-aided diagnosis system for the classification of malignant lesions, in which the acquired image is first pre-processed using novel methods. Digital artifacts such as hair follicles and blood vessels are removed, and the image is then enhanced using a novel method of histogram equalization. The pre-processed image next undergoes the segmentation phase, where the suspected lesion is segmented using the neutrosophic technique. The segmentation method employs thresholding together with a pentagonal neutrosophic structure to form a segmentation mask of the suspected skin lesion. The paper also proposes a deep neural network based on Inception and residual blocks, with a softmax block after each residual block, which makes the layers wider and allows the key features to be learned more quickly. The proposed classifier was trained, tested, and validated on the PH2, ISIC 2017, ISIC 2018, and ISIC 2019 datasets. The proposed segmentation model yields accuracies of 99.50%, 99.33%, 98.56%, and 98.04% on these datasets, respectively. The datasets are augmented to a total of 103,554 training images, which helps the classifier produce enhanced classification results. The experimental results confirm that the proposed classifier yields accuracy scores of 99.50%, 99.33%, 98.56%, and 98.04% for PH2, ISIC 2017, ISIC 2018, and ISIC 2019, respectively, which is better than most pre-existing classifiers.
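
The description above outlines a concrete pipeline: artifact removal, contrast enhancement, and threshold-based lesion segmentation. The sketch below is a minimal illustration of that kind of pipeline using OpenCV; it substitutes a morphological black-hat filter plus inpainting for the paper's hair-removal step, CLAHE for its novel histogram-equalization method, and plain Otsu thresholding for its pentagonal neutrosophic thresholding. Every function name, parameter, and file name here is an illustrative assumption, not the authors' implementation.

```python
import cv2


def preprocess_and_segment(image_bgr):
    """Simplified pre-processing and segmentation sketch (illustrative only)."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)

    # 1) Artifact removal: a black-hat filter highlights thin dark structures
    #    (hairs); the highlighted pixels are filled in by inpainting.
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (9, 9))
    blackhat = cv2.morphologyEx(gray, cv2.MORPH_BLACKHAT, kernel)
    _, hair_mask = cv2.threshold(blackhat, 10, 255, cv2.THRESH_BINARY)
    clean = cv2.inpaint(image_bgr, hair_mask, 5, cv2.INPAINT_TELEA)

    # 2) Contrast enhancement: CLAHE stands in for the paper's novel
    #    histogram-equalization method.
    clean_gray = cv2.cvtColor(clean, cv2.COLOR_BGR2GRAY)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    enhanced = clahe.apply(clean_gray)

    # 3) Segmentation: Otsu thresholding stands in for the pentagonal
    #    neutrosophic thresholding; lesions are darker than surrounding skin,
    #    so the inverted threshold yields a white lesion mask.
    blurred = cv2.GaussianBlur(enhanced, (5, 5), 0)
    _, lesion_mask = cv2.threshold(blurred, 0, 255,
                                   cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    return enhanced, lesion_mask


if __name__ == "__main__":
    img = cv2.imread("lesion.jpg")  # hypothetical input dermoscopy image
    if img is None:
        raise SystemExit("lesion.jpg not found")
    enhanced, mask = preprocess_and_segment(img)
    cv2.imwrite("lesion_mask.png", mask)
```

In the paper itself, neutrosophic membership functions rather than Otsu's criterion drive the thresholding, and the resulting mask feeds an Inception/residual classifier; the sketch only mirrors the overall flow of the described pipeline.
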
DOI: 10.3390/s22166261
ISSN: 1424-8220
PMID: 36016022
Publisher: MDPI AG, Basel, Switzerland
Source: Publicly Available Content Database (ProQuest); PubMed Central (Open Access)
Subjects:
Accuracy
Algorithms
Cancer
Classification
Datasets
deep neural network
Dermoscopy - methods
Diagnosis
Digital imaging
Hair
Histograms
Humans
Image acquisition
Image classification
Image enhancement
image processing
Image Processing, Computer-Assisted - methods
Medical diagnosis
Medical imaging
Melanoma
Melanoma - diagnosis
Model accuracy
Neural networks
Neural Networks, Computer
neutrosophic
Partial differential equations
Skin cancer
Skin Neoplasms - diagnosis
Skin Neoplasms - pathology