A Novel Approach for Increased Convolutional Neural Network Performance in Gastric-Cancer Classification Using Endoscopic Images
Published in: IEEE Access, 2021, Vol. 9, pp. 51847-51854
Main Authors: , ,
Format: Article
Language: English
Summary: Gastric cancer is the third-most-common cause of cancer-related deaths in the world. Fortunately, it can be detected using endoscopy equipment. Computer-aided diagnosis (CADx) systems can help clinicians identify cancer among gastric diseases more accurately. In this paper, we present a CADx system that distinguishes and classifies gastric cancer from pre-cancerous conditions such as gastric polyps, gastric ulcers, gastritis, and bleeding. The system uses a deep-learning model, Xception, which is built on depth-wise separable convolutions, to classify cancerous and non-cancerous images. The proposed method consists of two preprocessing steps: Google's AutoAugment for data augmentation, and image segmentation using simple linear iterative clustering (SLIC) superpixels with the fast and robust fuzzy C-means (FRFCM) algorithm. Together, these steps yield a feasible method of distinguishing and classifying cancers from other gastric diseases. Based on biopsy-supported ground truth, the area under the receiver operating characteristic curve (Az) is measured on the test sets. The Az of the proposed classification model is 0.96, which is 0.06 higher than the Az of 0.90 obtained with the original data. Our method is fully automated, requiring no manual specification of regions of interest at test time, and images for model training are selected at random. This methodology may play a crucial role in selecting effective treatment options without the need for a surgical biopsy.
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2021.3069747
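
The summary names three concrete components: AutoAugment for augmentation, SLIC with FRFCM for segmentation, and an Xception classifier. The sketches below are illustrative only, not the authors' code; library choices, file names, and hyperparameters are assumptions. First, a minimal augmentation sketch using the torchvision implementation of Google's AutoAugment (the ImageNet policy and the 299x299 resize are assumptions; the record does not state which policy or input size was used):

```python
# Illustrative sketch only: Google's AutoAugment applied to endoscopic frames
# via torchvision. The ImageNet policy and the 299x299 resize are assumptions.
from PIL import Image
from torchvision import transforms
from torchvision.transforms import AutoAugment, AutoAugmentPolicy

augment = transforms.Compose([
    transforms.Resize((299, 299)),                   # Xception's default input size
    AutoAugment(policy=AutoAugmentPolicy.IMAGENET),  # learned augmentation policy
    transforms.ToTensor(),                           # HWC uint8 -> CHW float tensor
])

# Hypothetical usage on one frame (the file name is made up):
# augmented = augment(Image.open("endoscopy_frame_0001.png").convert("RGB"))
```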
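
Next, a minimal sketch of the SLIC superpixel part of the preprocessing, using scikit-image. The FRFCM clustering that the paper pairs with SLIC is not reproduced here, and the segment count and compactness are assumed values:

```python
# Illustrative sketch only: SLIC superpixel segmentation with scikit-image.
# The FRFCM clustering used alongside SLIC in the paper is not reproduced;
# n_segments and compactness are assumed values.
from skimage import io, segmentation, color

def slic_superpixels(image_path: str, n_segments: int = 200):
    """Return the superpixel label map and a mean-colour rendering of it."""
    image = io.imread(image_path)  # RGB endoscopic frame
    labels = segmentation.slic(
        image, n_segments=n_segments, compactness=10, start_label=1
    )
    # Replace each superpixel with its mean colour as a quick visual check.
    averaged = color.label2rgb(labels, image, kind="avg", bg_label=0)
    return labels, averaged
```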
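
Finally, a minimal sketch, assuming a Keras/TensorFlow setup, of a binary Xception classifier that tracks the area under the ROC curve (the Az metric reported in the summary). The ImageNet initialization, frozen backbone, and classification head are assumptions, not details from the paper:

```python
# Illustrative sketch only: an Xception-based cancer vs. non-cancer classifier
# in Keras, tracking AUC (the Az metric reported in the summary). ImageNet
# weights, the frozen backbone, and the head layers are assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

IMG_SHAPE = (299, 299, 3)  # Xception's default input resolution

def build_classifier() -> tf.keras.Model:
    base = tf.keras.applications.Xception(
        include_top=False, weights="imagenet", input_shape=IMG_SHAPE
    )
    base.trainable = False  # fine-tuning schedule is not specified in the record

    inputs = layers.Input(shape=IMG_SHAPE)
    x = tf.keras.applications.xception.preprocess_input(inputs)
    x = base(x, training=False)
    x = layers.GlobalAveragePooling2D()(x)
    x = layers.Dropout(0.3)(x)
    outputs = layers.Dense(1, activation="sigmoid")(x)  # cancer vs. non-cancer

    model = models.Model(inputs, outputs)
    model.compile(
        optimizer="adam",
        loss="binary_crossentropy",
        metrics=[tf.keras.metrics.AUC(name="Az")],  # area under the ROC curve
    )
    return model

model = build_classifier()
```

Monitoring AUC during training mirrors the Az figure the summary reports (0.96 with the augmented and segmented data versus 0.90 with the original data).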