Segmentation Guided Registration for 3D Spectral-Domain Optical Coherence Tomography Images
Published in: | IEEE Access, 2019, Vol. 7, pp. 138833-138845 |
---|---|
Main Authors: | , , |
Format: | Article |
Language: | English |
Summary: | Medical image registration can be used to combine information from multiple imaging modalities and to monitor changes in size, shape, or image intensity over time. However, developing such techniques is challenging for 3D spectral-domain optical coherence tomography (SD-OCT) imaging, because SD-OCT images are inherently noisy and their high resolution makes non-rigid registration computationally complex. In this paper, a new segmentation-guided approach is reported for registration of retinal OCT data. The proposed method models the 3D registration as a two-stage process: x-y direction registration followed by z direction registration. In the x-y stage, the vessel maps of the OCT projection images of the template and the subject are registered to estimate the x-y displacement. A multi-scale vessel enhancement filter and morphological thinning are used to extract the vessel maps from the projection images of the 3D OCT scans, and the x-y displacement is then estimated by matching Speeded-Up Robust Features (SURF) of the vessel maps. In the z stage, A-scans are aligned using the tissue map instead of the original intensity image to obtain the local displacements in the z direction. The proposed method was evaluated on 45 longitudinal retinal OCT scans from 15 subjects. Experimental results show that it is accurate and very efficient. |
---|---|
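The x-y stage described in the summary (multi-scale vessel enhancement, morphological thinning, then matching of the vessel maps) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the function names and parameters are assumptions, the Frangi filter stands in for the unspecified multi-scale vessel enhancement filter, and phase correlation replaces SURF matching (SURF is patent-encumbered and absent from stock OpenCV builds).

```python
# Hedged sketch of the x-y registration stage: extract a skeletonized
# vessel map from each OCT projection image, then estimate the lateral
# shift between the two maps. Phase correlation is used here in place
# of SURF matching; parameters are illustrative.
import numpy as np
from skimage.filters import frangi, threshold_otsu
from skimage.morphology import skeletonize
from skimage.registration import phase_cross_correlation

def vessel_map(projection):
    """Binary, thinned vessel map of a 2D OCT projection image."""
    # Multi-scale vesselness (bright ridges in this toy example; real OCT
    # projections usually show vessels as dark shadows, black_ridges=True).
    enhanced = frangi(projection, sigmas=range(1, 5), black_ridges=False)
    binary = enhanced > threshold_otsu(enhanced)   # vessel mask
    return skeletonize(binary)                     # morphological thinning

def xy_displacement(template_proj, subject_proj):
    """(dy, dx) to apply to the subject to align it with the template."""
    shift, _, _ = phase_cross_correlation(
        vessel_map(template_proj).astype(float),
        vessel_map(subject_proj).astype(float),
    )
    return shift

# Synthetic projection with one bright "vessel", plus a shifted copy.
rng = np.random.default_rng(0)
proj = rng.normal(0.0, 0.02, (96, 96))
rows = np.arange(20, 80)
proj[rows, rows] += 1.0                       # diagonal vessel
subject = np.roll(proj, (3, 5), axis=(0, 1))  # simulated eye motion
dy, dx = xy_displacement(proj, subject)
```

On real data, the paper's SURF-based matching also tolerates rotation and scale, which plain phase correlation does not; this sketch only recovers a pure translation of the vessel maps.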
ISSN: | 2169-3536 |
DOI: | 10.1109/ACCESS.2019.2943172 |
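The z-direction stage can likewise be sketched as a per-A-scan 1D alignment of binary tissue maps. Again a hedged sketch, not the paper's method: the (Z, Y, X) array layout, the function name, and the use of plain 1D cross-correlation are assumptions, since the summary only states that A-scans are aligned on the tissue map rather than on the raw intensities.

```python
# Hedged sketch of the z registration stage: for each (y, x) position,
# find the axial lag that best aligns the subject's binary tissue A-scan
# with the template's, via 1D cross-correlation. (Z, Y, X) layout assumed.
import numpy as np

def z_displacement_map(template_tissue, subject_tissue):
    """Per-A-scan axial shift to apply to the subject tissue map."""
    depth = template_tissue.shape[0]
    dz = np.zeros(template_tissue.shape[1:], dtype=int)
    for y in range(template_tissue.shape[1]):
        for x in range(template_tissue.shape[2]):
            t = template_tissue[:, y, x].astype(float)
            s = subject_tissue[:, y, x].astype(float)
            corr = np.correlate(t, s, mode="full")  # lags -(Z-1)..Z-1
            dz[y, x] = corr.argmax() - (depth - 1)  # best-matching lag
    return dz

# Synthetic tissue maps: a retinal band displaced by +3 voxels in z.
template = np.zeros((32, 4, 4), dtype=bool)
template[10:20] = True                      # tissue between z=10 and z=19
subject = np.roll(template, 3, axis=0)      # subject displaced by +3 in z
dz = z_displacement_map(template, subject)  # -3 everywhere: shift back up
```

Working on the binary tissue map rather than the noisy intensity A-scans is what makes this per-column correlation stable; the sketch returns an integer displacement per A-scan, while a real implementation would likely regularize the resulting displacement field.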