Graph Neural Networks for lightweight plant organ tracking
Published in: Computers and Electronics in Agriculture, 2024-10, Vol. 225, Article 109294
Format: Article
Language: English
Summary: Many specific problems within the domain of high-throughput phenotyping require the accurate localization of plant organs. To track and count plant organs, we propose GCNNMatch++, a Graph Convolutional Neural Network (GCNN) capable of tracking objects online in videos. Built upon the GCNNMatch tracker with an improved CensNet GNN, our end-to-end tracking approach achieves fast inference. To adapt this approach to flower counting, we collected a large, high-quality dataset of cotton flower videos using our custom-built MARS-X robotic platform. Our system counts cotton flowers in the field with 80% accuracy, achieving a Higher-Order Tracking Accuracy (HOTA) of 51.09 and outperforming more generic tracking methods. Without any optimization (such as employing TensorRT), our association model runs in 44 ms on a central processing unit (CPU). On appropriate hardware, our model holds promise for real-time counting performance when coupled with a fast detector. Overall, our approach is useful for counting cotton flowers and other relevant plant organs for both breeding programs and yield estimation.
•A new graph-based multi-object tracking method was developed to count cotton flowers.
•The method’s performance was assessed on image data collected with a ground robot.
•The model’s potential for phenotyping flowering traits was investigated.
ISSN: 0168-1699
DOI: 10.1016/j.compag.2024.109294
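
As background on the kind of graph-based association the abstract describes, the sketch below illustrates the general idea used by trackers of this family: existing tracks and new detections form the two sides of a bipartite graph, a small learned network scores each edge, and an assignment step resolves matches. This is not the paper's GCNNMatch++/CensNet architecture; the `EdgeScorer` MLP, the 32-dimensional appearance features, the IoU edge feature, and the 0.5 match threshold are illustrative assumptions.

```python
# Minimal sketch of graph-based track-detection association.
# NOT the paper's GCNNMatch++/CensNet model; shapes and features are assumed.
import numpy as np
import torch
import torch.nn as nn
from scipy.optimize import linear_sum_assignment


def iou(box_a, box_b):
    """Intersection-over-union of two [x1, y1, x2, y2] boxes."""
    x1, y1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    x2, y2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter + 1e-9)


class EdgeScorer(nn.Module):
    """Tiny MLP scoring track-detection edges from concatenated node
    (appearance) features plus a geometric edge feature (IoU)."""

    def __init__(self, feat_dim=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * feat_dim + 1, 64), nn.ReLU(),
            nn.Linear(64, 1), nn.Sigmoid(),
        )

    def forward(self, track_feats, det_feats, iou_mat):
        n_t, n_d = iou_mat.shape
        # One edge per (track, detection) pair -> a dense bipartite graph.
        t = track_feats.unsqueeze(1).expand(n_t, n_d, -1)
        d = det_feats.unsqueeze(0).expand(n_t, n_d, -1)
        edges = torch.cat([t, d, iou_mat.unsqueeze(-1)], dim=-1)
        return self.net(edges).squeeze(-1)  # (n_t, n_d) match probabilities


if __name__ == "__main__":
    torch.manual_seed(0)
    # Fake appearance embeddings for 3 existing flower tracks and 4 detections.
    track_feats, det_feats = torch.randn(3, 32), torch.randn(4, 32)
    track_boxes = np.array([[10, 10, 50, 50], [60, 60, 90, 90],
                            [100, 10, 140, 40]], float)
    det_boxes = np.array([[12, 11, 52, 49], [58, 63, 92, 88],
                          [200, 200, 230, 230], [101, 12, 139, 41]], float)
    iou_mat = torch.tensor([[iou(t, d) for d in det_boxes] for t in track_boxes],
                           dtype=torch.float32)

    scores = EdgeScorer()(track_feats, det_feats, iou_mat).detach().numpy()
    # Hungarian assignment on (1 - score); reject weak matches below a threshold.
    rows, cols = linear_sum_assignment(1.0 - scores)
    matches = [(int(r), int(c)) for r, c in zip(rows, cols) if scores[r, c] > 0.5]
    print("matched (track, detection) pairs:", matches)
    # Unmatched detections would start new tracks; unmatched tracks age out.
```

In a full tracker the edge scorer would be trained on labeled track-detection pairs, and the per-frame match counts would accumulate into the flower count reported over a video.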