EMED-UNet: An Efficient Multi-Encoder-Decoder Based UNet for Chest X-ray Segmentation
Format: Conference Proceeding
Language: English
Summary: The current state-of-the-art works in chest x-ray segmentation are based on the U-Net architecture, originally designed and developed for semantic segmentation. The base U-Net model uses a large number of filters, causing high computational complexity: its parameter count reaches 31 million, with very high floating-point operations (FLOPs) and a large model size. This makes real-time implementation difficult, as such big models cannot be deployed. It also uses only 3×3 convolution kernels to capture information from the image and generate features; fixed-size kernels throughout the architecture are suitable only when the salient regions in the images are all of the same size. To overcome these two problems, we propose a Multi-Encoder-Decoder UNet architecture that can extract features at multiple spatial extents in an efficient way. We evaluate our framework on two datasets: the Montgomery County and Shenzhen chest x-ray datasets, and compare our proposed EMED-UNet network architecture with U-Net. EMED-UNet exceeds the accuracy of U-Net while reducing parameters from 31.043M to 6.72M, FLOPs from 386 G to 114 G, and model size from 386 MB to 80 MB. This work can also be treated as a step toward creating real-time systems for biomedical image analysis.
ISSN: 2642-6102
DOI: 10.1109/TENSYMP54529.2022.9864556
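
As a rough illustration of the multi-scale idea in the abstract above, the following sketch shows a parallel-branch encoder block in PyTorch. The kernel sizes (3×3 and 5×5), the even channel split, and the layer ordering are assumptions made for the example; the abstract does not disclose the actual EMED-UNet configuration, so this is not the authors' implementation.

```python
# Minimal sketch, assuming a two-branch multi-kernel encoder block.
# Kernel sizes, channel split, and normalization are illustrative
# assumptions, not the published EMED-UNet configuration.
import torch
import torch.nn as nn


class MultiKernelEncoderBlock(nn.Module):
    """Extracts features at two spatial extents by running parallel
    convolution branches with different kernel sizes, then fusing them."""

    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        # Small receptive field, as in the standard U-Net encoder.
        self.branch3 = nn.Sequential(
            nn.Conv2d(in_ch, out_ch // 2, kernel_size=3, padding=1),
            nn.BatchNorm2d(out_ch // 2),
            nn.ReLU(inplace=True),
        )
        # Larger receptive field for bigger salient regions.
        self.branch5 = nn.Sequential(
            nn.Conv2d(in_ch, out_ch // 2, kernel_size=5, padding=2),
            nn.BatchNorm2d(out_ch // 2),
            nn.ReLU(inplace=True),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Fuse the two scales along the channel axis.
        return torch.cat([self.branch3(x), self.branch5(x)], dim=1)


if __name__ == "__main__":
    block = MultiKernelEncoderBlock(1, 64)      # single-channel chest x-ray
    y = block(torch.randn(1, 1, 256, 256))
    print(y.shape)                              # torch.Size([1, 64, 256, 256])
```

Splitting the output channels between the two branches keeps the block cheaper than running both kernel sizes at full width, which is consistent in spirit with the parameter reduction the abstract reports.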