Deep learning for improving non-destructive grain mapping in 3D
| Published in: | IUCrJ 2021-09, Vol. 8 (Pt 5), p. 719-731 |
|---|---|
| Format: | Article |
| Language: | English |
| Summary: | Laboratory X-ray diffraction contrast tomography (LabDCT) is a novel imaging technique for non-destructive 3D characterization of grain structures. Accurate grain reconstruction relies critically on precise segmentation of diffraction spots in the LabDCT images. The conventional method, which applies a series of filters, generally suffices for segmenting sharp spots and therefore serves as a standard routine, but it frequently over- or under-segments spots, especially those with low signal-to-noise ratios and/or small sizes, and it also requires fine tuning of the filtering parameters. To overcome these challenges, a deep learning neural network is presented that efficiently and accurately removes the background noise, thereby easing spot segmentation. The network is first trained on input images synthesized using a forward simulation model for LabDCT in combination with a generic approach for extracting features of experimental backgrounds. It is then applied to remove the background noise from experimental images measured under different geometrical conditions for different samples. Comparisons of both the processed images and the resulting grain reconstructions show that the deep learning method outperforms the standard routine, yielding significantly better grain mapping. |
| ISSN: | 2052-2525 |
| DOI: | 10.1107/S2052252521005480 |
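
The summary describes training a denoising network on forward-simulated LabDCT images with synthetic backgrounds, then applying it to experimental data before spot segmentation. The sketch below illustrates that general idea only; it is not the authors' implementation. The architecture, the toy `synthetic_batch` data generator, and all hyperparameters are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of the pipeline the summary describes: a convolutional
# network learns to map diffraction images with simulated background noise
# to their clean counterparts. Everything here (DenoiserCNN, the synthetic
# generator, hyperparameters) is a hypothetical stand-in, not the paper's code.
import torch
import torch.nn as nn

class DenoiserCNN(nn.Module):
    """Small fully convolutional network predicting a background-free image."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, kernel_size=3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

def synthetic_batch(batch_size=8, size=128):
    """Stand-in for forward-simulated LabDCT images: Gaussian 'diffraction
    spots' on a zero background, plus an additive term mimicking the
    experimental background noise."""
    yy, xx = torch.meshgrid(torch.arange(size), torch.arange(size), indexing="ij")
    clean = torch.zeros(batch_size, 1, size, size)
    for b in range(batch_size):
        for _ in range(torch.randint(5, 15, (1,)).item()):
            cy, cx = torch.randint(0, size, (2,))
            sigma = 1.0 + 2.0 * torch.rand(1)
            clean[b, 0] += torch.exp(
                -(((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * sigma ** 2))
            )
    noisy = clean + 0.1 * torch.rand(batch_size, 1, size, size) \
                  + 0.05 * torch.randn(batch_size, 1, size, size)
    return noisy, clean

model = DenoiserCNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(200):  # train on freshly synthesized (noisy, clean) pairs
    noisy, clean = synthetic_batch()
    optimizer.zero_grad()
    loss = loss_fn(model(noisy), clean)
    loss.backward()
    optimizer.step()

# After training, the model would be applied to experimental LabDCT
# projections; thresholding the denoised output then yields spot masks.
```

Training purely on simulated pairs, as the summary indicates, sidesteps the need for hand-labelled experimental images: the forward model supplies unlimited ground-truth spot images, and the background model makes the synthetic inputs resemble real measurements.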