
The Novel Combination of Nano Vector Network Analyzer and Machine Learning for Fruit Identification and Ripeness Grading

Bibliographic Details
Published in: Sensors (Basel, Switzerland), 2023-01, Vol.23 (2), p.952
Main Authors: Tran, Van Lic, Doan, Thi Ngoc Canh, Ferrero, Fabien, Huy, Trinh Le, Le-Thanh, Nhan
Format: Article
Language:English
Description
Summary: Fruit classification is required in many smart-farming and industrial applications. In a supermarket, a fruit classification system may help cashiers and customers identify a fruit's species, origin, ripeness, and price. Some methods, such as image processing and NIRS (near-infrared spectroscopy), are already used to classify fruit. In this paper, we propose a fast and cost-effective method based on a low-cost Vector Network Analyzer (VNA) augmented by K-nearest neighbor (KNN) and Neural Network models. S-parameter features are selected that capture signal amplitude or phase information in the frequency domain, including the reflection coefficient and the transmission coefficient. This approach was experimentally tested on two separate datasets of five types of fruit (Apple, Avocado, Dragon Fruit, Guava, and Mango), for fruit recognition as well as ripeness grading. The classification accuracy of the Neural Network model was higher than that of KNN, with 98.75% and 99.75% on the first dataset, whereas KNN was more effective at classifying ripeness, reaching 98.4% compared to 96.6% for the Neural Network.
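
The summary describes the pipeline only at a high level. As a rough illustration (not the authors' code), the sketch below shows how S-parameter sweeps could be classified with KNN and a small neural network using scikit-learn; the feature layout (concatenated reflection- and transmission-coefficient magnitudes over a frequency sweep), the synthetic data, and the hyperparameters are assumptions made for illustration, not details taken from the paper.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_points = 101  # frequency points per sweep (assumed, not from the paper)
classes = ["apple", "avocado", "dragon fruit", "guava", "mango"]

# Placeholder data: each row is |S11| (reflection) and |S21| (transmission)
# concatenated over one sweep; real rows would come from VNA measurements.
X = rng.normal(size=(500, 2 * n_points))
y = rng.integers(len(classes), size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

knn = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
mlp = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500,
                                  random_state=0))

for name, model in [("KNN", knn), ("Neural Network", mlp)]:
    model.fit(X_train, y_train)  # train on the S-parameter feature vectors
    print(name, "test accuracy:", model.score(X_test, y_test))

With real measurements, X would hold features extracted from the VNA's S-parameters rather than random numbers, and the relative performance of the two classifiers would depend on the data, as the reported results suggest.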
ISSN:1424-8220
DOI:10.3390/s23020952