Classification of large green chilli maturity using deep learning
Published in: IOP Conference Series: Earth and Environmental Science, 2021-11, Vol. 924 (1), p. 012009
Main Authors:
Format: Article
Language: English
Summary: Chili (Capsicum annuum L.) is a source of various nutraceutical small molecules, such as ascorbic acid (vitamin C), carotenoids, tocopherols, flavonoids, and capsinoids. The purpose of this study was to classify large green chili into three maturity levels, i.e. maturity 1 (maturity index 1 / 34 days after anthesis (DAA)), maturity 2 (maturity index 3 / 47 DAA), and maturity 3 (maturity index 5 / 60 DAA), using convolutional neural network (CNN)-based deep learning and computer vision. Four pre-trained CNNs were used in this study: SqueezeNet, GoogLeNet, ResNet50, and AlexNet. In the overall sensitivity analysis, the highest maturity classification accuracy for large green chili was 93.89%, achieved using GoogLeNet with the SGDM optimizer and a learning rate of 0.00005. However, in further testing on the testing-set data, the highest classification accuracy based on the confusion matrix reached 91.27%, using the SqueezeNet CNN model with the RMSProp optimizer and a learning rate of 0.0001. The combination of the CNN model and a low-cost commercial digital camera can later be used to detect the maturity of large green chili, with the advantages of being non-destructive, rapid, accurate, low-cost, and real-time.
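The abstract reports its testing-set accuracy "based on the confusion matrix". A minimal sketch of that computation for a three-class maturity problem is below; the counts in `cm` are hypothetical placeholders, not the paper's actual results.

```python
# Sketch: overall accuracy from a 3-class confusion matrix, as in the
# paper's testing-set evaluation. The counts below are hypothetical.

def confusion_matrix_accuracy(cm):
    """Overall accuracy = trace / total for a square confusion matrix."""
    correct = sum(cm[i][i] for i in range(len(cm)))
    total = sum(sum(row) for row in cm)
    return correct / total

# Rows = true maturity level, columns = predicted level.
cm = [
    [120,   8,   2],   # maturity 1 (index 1, 34 DAA)
    [  6, 115,   9],   # maturity 2 (index 3, 47 DAA)
    [  1,   7, 122],   # maturity 3 (index 5, 60 DAA)
]

print(f"overall accuracy: {confusion_matrix_accuracy(cm):.4f}")
```

Per-class precision and recall follow the same pattern (column sums and row sums respectively); the paper's headline figures are overall accuracies of this trace-over-total form.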
ISSN: 1755-1307, 1755-1315
DOI: 10.1088/1755-1315/924/1/012009