Deep learning based neural network application for automatic ultrasonic computed tomographic bone image segmentation
Published in: Multimedia Tools and Applications, 2022, Vol. 81 (10), pp. 13537-13562
Main Authors:
Format: Article
Language: English
Summary: Deep-learning techniques have driven technological progress in medical image segmentation, especially in the ultrasound domain. The main goal of this study is to optimize a deep-learning-based neural network architecture for fast, automatic segmentation of Ultrasonic Computed Tomography (USCT) bone images. The proposed method is based on an end-to-end neural network architecture. First, as a novel contribution, an improved Variable Structure Model of Neuron (VSMN) is trained for both USCT noise removal and dataset augmentation. Second, a VGG-SegNet neural network architecture is trained for automatic bone segmentation and tested on previously unseen USCT images; a free USCT dataset is also provided. In addition, the proposed model is implemented on both CPU and GPU, achieving 97.38% training accuracy and 96% validation accuracy, together with high segmentation accuracy at test time (error of 0.006) in a short processing time, thereby improving on previous works. The suggested method demonstrates its ability to augment USCT data and then automatically segment USCT bone structures with excellent accuracy, outperforming the state of the art.
ISSN: 1380-7501; 1573-7721
DOI: 10.1007/s11042-022-12322-3
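
For a concrete picture of the segmentation backbone named in the summary, the following is a minimal sketch of a VGG-SegNet-style encoder-decoder in PyTorch. It is not the authors' implementation: the single-channel input, the two output classes (bone vs. background), and the exact layer configuration are assumptions made for illustration; only the VGG16-style encoder and SegNet's max-unpooling with stored pooling indices are taken from the architecture named in the abstract.

```python
# Minimal sketch of a VGG-SegNet-style encoder-decoder (assumed configuration,
# not the paper's implementation): single-channel USCT input, 2 output classes.
import torch
import torch.nn as nn


def conv_block(in_ch, out_ch, n_convs):
    """n_convs conv-BN-ReLU layers, mirroring one VGG16 encoder stage."""
    layers = []
    for i in range(n_convs):
        layers += [
            nn.Conv2d(in_ch if i == 0 else out_ch, out_ch, 3, padding=1),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        ]
    return nn.Sequential(*layers)


class VGGSegNet(nn.Module):
    def __init__(self, in_ch=1, n_classes=2):
        super().__init__()
        # Encoder stages follow the VGG16 convolutional configuration.
        enc_cfg = [(in_ch, 64, 2), (64, 128, 2), (128, 256, 3), (256, 512, 3), (512, 512, 3)]
        self.enc = nn.ModuleList([conv_block(i, o, n) for i, o, n in enc_cfg])
        self.pool = nn.MaxPool2d(2, 2, return_indices=True)   # keep indices for unpooling
        self.unpool = nn.MaxUnpool2d(2, 2)
        # Decoder mirrors the encoder, reducing channels back to 64.
        dec_cfg = [(512, 512, 3), (512, 256, 3), (256, 128, 3), (128, 64, 2), (64, 64, 2)]
        self.dec = nn.ModuleList([conv_block(i, o, n) for i, o, n in dec_cfg])
        self.classifier = nn.Conv2d(64, n_classes, 1)          # per-pixel class scores

    def forward(self, x):
        indices, sizes = [], []
        for block in self.enc:                  # VGG16-style encoder
            x = block(x)
            sizes.append(x.size())              # remember pre-pool size
            x, idx = self.pool(x)
            indices.append(idx)
        for block in self.dec:                  # SegNet decoder: unpool with stored indices
            x = self.unpool(x, indices.pop(), output_size=sizes.pop())
            x = block(x)
        return self.classifier(x)


if __name__ == "__main__":
    model = VGGSegNet()
    out = model(torch.randn(1, 1, 256, 256))    # e.g. one 256x256 USCT slice
    print(out.shape)                            # torch.Size([1, 2, 256, 256])
```

In this kind of architecture the decoder upsamples with the pooling indices saved during encoding instead of learned transposed convolutions, which keeps boundary locations sharp at low memory cost; a per-pixel cross-entropy (or Dice) loss over the class map would then train it for bone segmentation.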