
Surface-enhanced Raman spectroscopy-based metabolomics for the discrimination of Keemun black teas coupled with chemometrics

Bibliographic Details
Published in: Food science & technology, 2023-05, Vol. 181, p. 114742, Article 114742
Main Authors: Ren, Yin-feng, Ye, Zhi-hao, Liu, Xiao-qian, Xia, Wei-jing, Yuan, Yan, Zhu, Hai-yan, Chen, Xiao-tong, Hou, Ru-yan, Cai, Hui-mei, Li, Da-xiang, Granato, Daniel, Peng, Chuan-yi
Format: Article
Language: English
Description
Summary: In the present study, a surface-enhanced Raman spectroscopy (SERS)-based metabolomics approach coupled with chemometrics was developed to determine the geographic origin of Keemun black tea. The SERS peaks enhanced by Ag nanoparticles at Δv = 555, 644, 731, 955, 1240, 1321, and 1539 cm−1 were selected, and their intensities were calculated for chemometric analysis. Linear discriminant analysis (LDA) achieved an average discrimination accuracy of 86.3%, with 84.3% accuracy under cross-validation. The recognition rates of three machine learning algorithms, namely a feedforward neural network (FNN), random forest (RF), and K-nearest neighbor (KNN), were 93.5%, 93.5%, and 87.1%, respectively. This study demonstrates the potential of the SERS technique coupled with AgNPs and chemometrics as an accessible, prompt method for discriminating the geographic origins of teas.

Highlights:
• Keemun black teas were authenticated by SERS-based metabolomics fingerprints.
• The SERS peaks at Δv = 555, 644, 731, 955, 1240, 1321, and 1539 cm−1 were selected.
• LDA presented an 86.3% discrimination accuracy with 84.3% cross-validation accuracy.
• The recognition rates of FNN, RF, and KNN were 93.5%, 93.5%, and 87.1%, respectively.
ISSN: 0023-6438, 1096-1127
DOI: 10.1016/j.lwt.2023.114742
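The chemometric workflow summarized above — classifying tea samples by origin from the intensities of selected SERS peaks — can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' dataset or code; only the seven peak positions (555, 644, 731, 955, 1240, 1321, 1539 cm−1) come from the abstract, and the class structure, sample counts, and noise model are assumptions.

```python
# Hypothetical sketch: origin discrimination from SERS peak intensities
# using LDA, random forest, and KNN with cross-validation (scikit-learn).
# All intensity data below are synthetic placeholders.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
peaks_cm1 = [555, 644, 731, 955, 1240, 1321, 1539]  # reported SERS peaks
n_origins, n_per_origin = 3, 30                      # assumed design

# Each (hypothetical) origin gets its own mean spectrum plus noise.
means = rng.uniform(0.5, 2.0, size=(n_origins, len(peaks_cm1)))
X = np.vstack([m + rng.normal(0.0, 0.15, size=(n_per_origin, len(peaks_cm1)))
               for m in means])
y = np.repeat(np.arange(n_origins), n_per_origin)

# Compare classifiers by mean 5-fold cross-validated accuracy;
# a feedforward neural network (e.g. sklearn's MLPClassifier) could be
# added the same way.
for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                  ("RF", RandomForestClassifier(random_state=0)),
                  ("KNN", KNeighborsClassifier(n_neighbors=5))]:
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.3f}")
```

In practice the feature matrix would hold baseline-corrected peak intensities measured from the Ag-nanoparticle-enhanced spectra, with one row per tea sample and one column per selected wavenumber.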