
A high-quality rice leaf disease image data augmentation method based on a dual GAN

Bibliographic Details
Published in: IEEE Access, 2023, Vol. 11, p. 1-1
Main Authors: Zhang, Zhao; Gao, Quan; Liu, Lirong; He, Yun
Format: Article
Language: English
Description
Summary: Deep learning models require sufficient training samples; otherwise they overfit and fail to generalize. In smart agriculture, however, high-quality disease samples are difficult and costly to obtain. To address this problem, this paper proposes a high-quality image augmentation (HQIA) method for generating high-quality rice leaf disease images based on a dual generative adversarial network (GAN). First, the original samples are used to train a Wasserstein GAN with gradient penalty (WGAN-GP) to generate pseudo-data samples. These pseudo-data samples are then fed into an Optimized Real-ESRGAN (Opt-Real-ESRGAN) to produce high-quality pseudo-data samples. Finally, the high-quality pseudo-data samples are passed to a disease-classification convolutional neural network, and the effectiveness of the method is verified with standard metrics. Experimental results show that the method generates high-quality rice leaf disease images: recognition accuracy with the augmented samples was 4.57% higher than with the original training set alone on ResNet18, and 4.1% higher on VGG11. Compared with augmentation by WGAN-GP alone, accuracy increased by 3.08% on ResNet18 and 3.55% on VGG11. These results demonstrate the effectiveness of the proposed method under limited training data.
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2023.3251098
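
The dual-GAN augmentation pipeline summarized above can be sketched in a few lines of PyTorch. The code below is only an illustrative outline of the three stages (WGAN-GP-style generation, super-resolution enhancement, classification): ToyGenerator and ToyUpscaler are hypothetical stand-ins for the paper's WGAN-GP and Opt-Real-ESRGAN components, not the authors' implementation, and the classifier is an off-the-shelf torchvision ResNet18.

import torch
import torch.nn as nn
from torchvision.models import resnet18

class ToyGenerator(nn.Module):
    # Stand-in for the WGAN-GP generator: maps noise vectors to 64x64 RGB images.
    def __init__(self, z_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose2d(z_dim, 256, 4, 1, 0), nn.ReLU(True),  # 1x1 -> 4x4
            nn.ConvTranspose2d(256, 128, 4, 2, 1), nn.ReLU(True),    # 4x4 -> 8x8
            nn.ConvTranspose2d(128, 64, 4, 2, 1), nn.ReLU(True),     # 8x8 -> 16x16
            nn.ConvTranspose2d(64, 32, 4, 2, 1), nn.ReLU(True),      # 16x16 -> 32x32
            nn.ConvTranspose2d(32, 3, 4, 2, 1), nn.Tanh(),           # 32x32 -> 64x64
        )

    def forward(self, z):
        return self.net(z)

class ToyUpscaler(nn.Module):
    # Stand-in for Opt-Real-ESRGAN: here simply 4x bilinear upsampling.
    def forward(self, x):
        return nn.functional.interpolate(x, scale_factor=4, mode="bilinear",
                                         align_corners=False)

def augment_and_classify(num_fake=16, num_classes=4):
    generator, upscaler = ToyGenerator(), ToyUpscaler()
    classifier = resnet18(num_classes=num_classes)

    z = torch.randn(num_fake, 128, 1, 1)      # random noise input
    with torch.no_grad():
        pseudo = generator(z)                 # step 1: pseudo disease samples (N, 3, 64, 64)
        pseudo_hq = upscaler(pseudo)          # step 2: enhanced pseudo samples (N, 3, 256, 256)
        logits = classifier(pseudo_hq)        # step 3: feed augmented samples to the classifier
    return logits.shape

if __name__ == "__main__":
    print(augment_and_classify())             # expected: torch.Size([16, 4])

In practice the augmented pseudo samples would be mixed with the real training images before training or fine-tuning the classifier; the forward pass here only illustrates the data flow between the two GAN stages and the recognition network.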