
Semantic Segmentation of Outcrop Images using Deep Learning Networks Toward Realization of Carbon Capture and Storage

Bibliographic Details
Main Authors: Sato, Kodai, Madokoro, Hirokazu, Nagayoshi, Takeshi, Chiyonobu, Shun, Martizzi, Paolo, Nix, Stephanie, Woo, Hanwool, Saito, Takashi K., Sato, Kazuhito
Format: Conference Proceeding
Language: English
Description
Summary: This study classified outcrop images using semantic segmentation methods based on deep learning algorithms. Carbon capture and storage (CCS) is a promising approach for reducing greenhouse gases in the atmosphere. This study specifically examines outcrops because geological layer measurements can yield a highly accurate geological model for feasible CCS inspections. Using a digital monocular RGB camera, we obtained 13 outcrop images annotated with four classes according to strata. Subsequently, we compared segmentation accuracy across three input image sizes and four semantic segmentation backbones: SegNet, U-Net, ResNet-18, and Xception-65; the ResNet-18 and Xception-65 backbones were implemented using DeepLabv3+. Experimental results demonstrated that data augmentation with random sampling improved accuracy. Regarding evaluation metrics, global accuracy and local accuracy were higher than the mean intersection over union (mIoU) for our outcrop image dataset, which has unequal numbers of pixels in the respective classes. These results also revealed that resizing of input images is unnecessary for our method.
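The abstract notes that global accuracy exceeds mIoU on this class-imbalanced dataset. A minimal sketch (not the authors' evaluation code; metric definitions are the standard ones) showing why pixel-wise global accuracy can be high while mIoU stays low when one class dominates:

```python
import numpy as np

def confusion_matrix(y_true, y_pred, num_classes):
    """Per-pixel confusion matrix: rows = true class, cols = predicted class."""
    cm = np.zeros((num_classes, num_classes), dtype=np.int64)
    for t, p in zip(y_true.ravel(), y_pred.ravel()):
        cm[t, p] += 1
    return cm

def global_accuracy(cm):
    """Fraction of all pixels classified correctly (dominated by large classes)."""
    return np.trace(cm) / cm.sum()

def mean_iou(cm):
    """Mean intersection over union: TP / (TP + FP + FN), averaged over classes.

    Every class counts equally, so rare, poorly segmented strata drag the
    score down even when the dominant class is segmented almost perfectly.
    """
    tp = np.diag(cm).astype(float)
    fp = cm.sum(axis=0) - tp
    fn = cm.sum(axis=1) - tp
    denom = tp + fp + fn
    iou = np.where(denom > 0, tp / np.where(denom > 0, denom, 1), 0.0)
    return iou.mean()

# Toy 2x4 label maps: class 0 dominates (6 of 8 pixels), classes 1-3 are rare.
y_true = np.array([[0, 0, 0, 0],
                   [0, 0, 1, 2]])
y_pred = np.array([[0, 0, 0, 0],
                   [0, 0, 0, 3]])
cm = confusion_matrix(y_true, y_pred, num_classes=4)
print(global_accuracy(cm))  # 0.75
print(mean_iou(cm))         # ~0.214: (6/7 + 0 + 0 + 0) / 4
```

Here three of four classes are missed entirely, yet global accuracy still reports 0.75 because the dominant class is mostly correct; mIoU exposes the failure.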
ISSN:2642-3901
DOI:10.23919/ICCAS52745.2021.9649777