Neural Networks for Hyperspectral Imaging of Historical Paintings: A Practical Review

Bibliographic Details
Published in: Sensors (Basel, Switzerland), 2023-02, Vol.23 (5), p.2419
Main Authors: Liu, Lingxi, Miteva, Tsveta, Delnevo, Giovanni, Mirri, Silvia, Walter, Philippe, de Viguerie, Laurence, Pouyet, Emeline
Format: Article
Language:English
Description
Summary: Hyperspectral imaging (HSI) has become widely used in cultural heritage (CH). Although highly efficient for artwork analysis, the method generates large amounts of spectral data, and the effective processing of such heavy spectral datasets remains an active research area. Alongside firmly established statistical and multivariate analysis methods, neural networks (NNs) represent a promising alternative in the field of CH. Over the last five years, the application of NNs to pigment identification and classification based on HSI datasets has expanded drastically, owing to the flexibility of the data types they can process and their superior ability to extract the structures contained in raw spectral data. This review provides an exhaustive analysis of the literature on NNs applied to HSI data in the CH field. We outline the existing data processing workflows and propose a comprehensive comparison of the applications and limitations of the various input dataset preparation methods and NN architectures. By reviewing NN strategies for CH, the paper contributes to a wider and more systematic application of this novel data analysis approach.
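
Illustrative note: the workflows surveyed in the review classify each pixel's reflectance spectrum into a pigment class. The following is a minimal sketch of such a per-pixel spectral classifier, written in PyTorch; it is not taken from the reviewed paper, and the band count, number of pigment classes, and synthetic training data are illustrative assumptions.

    # Minimal sketch (assumed, not from the reviewed paper): per-pixel pigment
    # classification of hyperspectral reflectance spectra with a small 1D CNN.
    import torch
    import torch.nn as nn

    N_BANDS = 200      # hypothetical number of spectral bands per pixel
    N_PIGMENTS = 8     # hypothetical number of pigment classes

    class SpectralCNN(nn.Module):
        """Maps a single reflectance spectrum to pigment-class logits."""
        def __init__(self, n_bands: int, n_classes: int):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv1d(1, 16, kernel_size=7, padding=3),
                nn.ReLU(),
                nn.MaxPool1d(2),
                nn.Conv1d(16, 32, kernel_size=5, padding=2),
                nn.ReLU(),
                nn.AdaptiveAvgPool1d(1),   # global pooling over the spectral axis
            )
            self.classifier = nn.Linear(32, n_classes)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, n_bands) spectra -> add a channel dim for Conv1d
            x = self.features(x.unsqueeze(1))
            return self.classifier(x.squeeze(-1))

    # Synthetic stand-in for a labelled HSI cube flattened to (pixels, bands).
    spectra = torch.rand(512, N_BANDS)
    labels = torch.randint(0, N_PIGMENTS, (512,))

    model = SpectralCNN(N_BANDS, N_PIGMENTS)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    for epoch in range(5):                 # short demonstration loop
        optimizer.zero_grad()
        loss = loss_fn(model(spectra), labels)
        loss.backward()
        optimizer.step()

    # At inference, each pixel's spectrum receives a predicted pigment class,
    # which can be reshaped back into a classification map of the painting.
    predicted = model(spectra).argmax(dim=1)

In practice, as the review discusses, the input preparation step (spectral preprocessing, patch extraction, dimensionality reduction) and the choice of NN architecture vary widely between studies; the sketch above only fixes the overall per-pixel classification pattern.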
ISSN:1424-8220
DOI:10.3390/s23052419