Tiny Criss-Cross Network for segmenting paddy panicles using aerial images
Published in: Computers & Electrical Engineering, 2023-05, Vol. 108, Article 108728
Main Authors:
Format: Article
Language: English
Summary: Panicle segmentation is a critical step in crop phenotyping research for investigating physiological and structural traits of crops. Unmanned aerial vehicle (UAV)-based image acquisition integrated with deep learning algorithms replaces labour-intensive field investigation and crop monitoring. Drones have limited onboard resources, so they require small yet efficient deep learning models. This paper proposes the Tiny Criss-Cross Network (TinyCCNET), a novel deep learning model based on criss-cross attention and a tiny approach. The criss-cross attention mechanism captures contextual information from all pixels, improving segmentation accuracy, while the tiny approach is applied to the ResNet50 backbone to reduce its size. The proposed model outperforms existing models, achieving 86.5% accuracy and 81.6% mIoU at 12.3 GFLOPS. TinyCCNET combines high accuracy with resource efficiency, making it well suited to agricultural drone applications.
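The abstract gives no implementation details, but criss-cross attention itself is a published mechanism (CCNet, Huang et al.): each pixel attends only to the pixels in its own row and column, and stacking two such passes propagates context across the whole image at far lower cost than dense self-attention. Below is a minimal PyTorch sketch of that mechanism for illustration only; the module name, the channel-reduction factor, and the omission of CCNet's diagonal-masking trick are assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CrissCrossAttention(nn.Module):
    """Criss-cross attention sketch: every pixel attends to the pixels
    sharing its row and its column (H + W positions instead of H * W)."""

    def __init__(self, in_channels: int, reduction: int = 8):
        super().__init__()
        self.query = nn.Conv2d(in_channels, in_channels // reduction, 1)
        self.key = nn.Conv2d(in_channels, in_channels // reduction, 1)
        self.value = nn.Conv2d(in_channels, in_channels, 1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learnable residual weight

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        q, k, v = self.query(x), self.key(x), self.value(x)

        # Column affinities: for each pixel, similarity to pixels in its column.
        q_h = q.permute(0, 3, 2, 1).reshape(b * w, h, -1)               # (b*w, h, c')
        k_h = k.permute(0, 3, 1, 2).reshape(b * w, -1, h)               # (b*w, c', h)
        energy_h = torch.bmm(q_h, k_h).view(b, w, h, h).permute(0, 2, 1, 3)  # (b, h, w, h)

        # Row affinities: for each pixel, similarity to pixels in its row.
        q_w = q.permute(0, 2, 3, 1).reshape(b * h, w, -1)               # (b*h, w, c')
        k_w = k.permute(0, 2, 1, 3).reshape(b * h, -1, w)               # (b*h, c', w)
        energy_w = torch.bmm(q_w, k_w).view(b, h, w, w)                 # (b, h, w, w)

        # Joint softmax over the criss-cross (column + row) neighbourhood.
        attn = F.softmax(torch.cat([energy_h, energy_w], dim=3), dim=3)
        attn_h = attn[..., :h].permute(0, 2, 1, 3).reshape(b * w, h, h)
        attn_w = attn[..., h:].reshape(b * h, w, w)

        # Aggregate values along the column and row directions.
        v_h = v.permute(0, 3, 1, 2).reshape(b * w, c, h)
        out_h = torch.bmm(v_h, attn_h.transpose(1, 2)).view(b, w, c, h).permute(0, 2, 3, 1)
        v_w = v.permute(0, 2, 1, 3).reshape(b * h, c, w)
        out_w = torch.bmm(v_w, attn_w.transpose(1, 2)).view(b, h, c, w).permute(0, 2, 1, 3)

        return self.gamma * (out_h + out_w) + x


# Toy check on a 64-channel feature map.
cca = CrissCrossAttention(64)
feat = torch.randn(2, 64, 32, 32)
out = cca(cca(feat))  # two passes cover the full image, as in CCNet
print(out.shape)      # torch.Size([2, 64, 32, 32])
```

Applying the module twice (recurrence R = 2), as in CCNet, lets every output position aggregate information from every input position.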
Highlights:
• Deep learning models deployed on UAVs require a tiny model with high performance.
• The TinyCCNET model, based on criss-cross attention and the tiny approach, is proposed (a slimmed-backbone sketch follows this list).
• It is evaluated on a paddy rice imagery dataset consisting of 8,400 images.
• It achieves 86.5% accuracy and 81.6% mIoU at 12.3 GFLOPS.
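The highlights, like the abstract, do not specify what the tiny approach does to ResNet50. Purely as an illustration of how a ResNet50-style bottleneck block can be slimmed for on-board inference, the sketch below scales the bottleneck width by a hypothetical width_mult factor; the class name, the multiplier value, and the parameter comparison at the end are assumptions, not the authors' method.

```python
import torch
import torch.nn as nn


class SlimBottleneck(nn.Module):
    """Illustrative 'tiny' residual block: the same 1x1-3x3-1x1 structure as
    ResNet50's bottleneck, with the inner width scaled by width_mult."""

    def __init__(self, in_ch: int, mid_ch: int, out_ch: int,
                 width_mult: float = 0.5, stride: int = 1):
        super().__init__()
        mid = max(8, int(mid_ch * width_mult))  # shrink the bottleneck width
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, mid, 1, bias=False), nn.BatchNorm2d(mid), nn.ReLU(inplace=True),
            nn.Conv2d(mid, mid, 3, stride=stride, padding=1, bias=False),
            nn.BatchNorm2d(mid), nn.ReLU(inplace=True),
            nn.Conv2d(mid, out_ch, 1, bias=False), nn.BatchNorm2d(out_ch),
        )
        self.skip = (nn.Identity() if stride == 1 and in_ch == out_ch
                     else nn.Sequential(nn.Conv2d(in_ch, out_ch, 1, stride=stride, bias=False),
                                        nn.BatchNorm2d(out_ch)))
        self.act = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.act(self.body(x) + self.skip(x))


# Rough parameter comparison of a full-width block versus a slimmed one.
full = SlimBottleneck(256, 64, 256, width_mult=1.0)
slim = SlimBottleneck(256, 64, 256, width_mult=0.5)
count = lambda m: sum(p.numel() for p in m.parameters())
print(count(full), count(slim))  # the slim variant uses noticeably fewer parameters
```

Stacking such slimmed blocks in place of the standard ones is one plausible way to trade a small amount of accuracy for the lower FLOP budget that drone deployment demands.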
ISSN: 0045-7906, 1879-0755
DOI: 10.1016/j.compeleceng.2023.108728