
High-Throughput Plot-Level Quantitative Phenotyping Using Convolutional Neural Networks on Very High-Resolution Satellite Images

To ensure global food security, crop breeders conduct extensive trials across various locations to discover new crop varieties that grow more robustly, have higher yields, and are resilient to local stress factors. These trials consist of thousands of plots, each containing a unique crop variety monitored at intervals during the growing season, requiring considerable manual effort. In this study, we combined satellite imagery and deep learning techniques to automatically collect plot-level phenotypes from plant breeding trials in South Australia and Sonora, Mexico. We implemented two novel methods, utilising state-of-the-art computer vision architectures, to predict plot-level phenotypes: flowering, canopy cover, greenness, height, biomass, and normalised difference vegetation index (NDVI). The first approach uses a classification model to predict for just the centred plot. The second approach predicts per-pixel and then aggregates predictions to determine a value per plot. Using a modified ResNet18 model to predict the centred plot was found to be the most effective method. These results highlight the exciting potential for improving crop trials with remote sensing and machine learning.

Saved in:
Bibliographic Details
Published in: Remote sensing (Basel, Switzerland), 2024-01, Vol.16 (2), p.282
Main Authors: Victor, Brandon, Nibali, Aiden, Newman, Saul Justin, Coram, Tristan, Pinto, Francisco, Reynolds, Matthew, Furbank, Robert T, He, Zhen
Format: Article
Language:English
Subjects:
container_issue 2
container_start_page 282
container_title Remote sensing (Basel, Switzerland)
container_volume 16
creator Victor, Brandon
Nibali, Aiden
Newman, Saul Justin
Coram, Tristan
Pinto, Francisco
Reynolds, Matthew
Furbank, Robert T
He, Zhen
description To ensure global food security, crop breeders conduct extensive trials across various locations to discover new crop varieties that grow more robustly, have higher yields, and are resilient to local stress factors. These trials consist of thousands of plots, each containing a unique crop variety monitored at intervals during the growing season, requiring considerable manual effort. In this study, we combined satellite imagery and deep learning techniques to automatically collect plot-level phenotypes from plant breeding trials in South Australia and Sonora, Mexico. We implemented two novel methods, utilising state-of-the-art computer vision architectures, to predict plot-level phenotypes: flowering, canopy cover, greenness, height, biomass, and normalised difference vegetation index (NDVI). The first approach uses a classification model to predict for just the centred plot. The second approach predicts per-pixel and then aggregates predictions to determine a value per-plot. Using a modified ResNet18 model to predict the centred plot was found to be the most effective method. These results highlight the exciting potential for improving crop trials with remote sensing and machine learning.
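The description above contrasts two prediction strategies: the first predicts one value directly from an image crop centred on a plot, while the second predicts per-pixel and then aggregates to a per-plot value. A minimal sketch of the aggregation step in the second approach is shown below; the function name, the use of an integer plot-ID mask, and mean aggregation are illustrative assumptions, not the authors' implementation:

```python
# Hypothetical sketch (not the authors' code) of the second approach described
# above: a model produces a per-pixel prediction map (e.g. NDVI), and the
# predictions are averaged within each plot to yield one value per plot.
# Plot membership is given by an integer mask of the same shape (0 = background);
# the mask layout and all names here are assumptions for illustration.

def aggregate_per_plot(pixel_preds, plot_ids):
    """Average per-pixel predictions within each labelled plot (id 0 = background)."""
    sums, counts = {}, {}
    for pred_row, id_row in zip(pixel_preds, plot_ids):
        for pred, pid in zip(pred_row, id_row):
            if pid == 0:
                continue
            sums[pid] = sums.get(pid, 0.0) + pred
            counts[pid] = counts.get(pid, 0) + 1
    return {pid: sums[pid] / counts[pid] for pid in sums}

# Toy 4x4 prediction map covering two plots.
preds = [[0.2, 0.2, 0.8, 0.8],
         [0.2, 0.2, 0.8, 0.8],
         [0.0, 0.0, 0.0, 0.0],
         [0.0, 0.0, 0.0, 0.0]]
mask  = [[1, 1, 2, 2],
         [1, 1, 2, 2],
         [0, 0, 0, 0],
         [0, 0, 0, 0]]
print(aggregate_per_plot(preds, mask))  # plot 1 averages ~0.2, plot 2 ~0.8
```

The first approach skips this aggregation entirely: a classification-style model (a modified ResNet18, per the abstract) maps the centred plot's image crop straight to a single phenotype value, which the study found most effective.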
doi_str_mv 10.3390/rs16020282
format article
publisher Basel: MDPI AG
eissn 2072-4292
fulltext fulltext
identifier ISSN: 2072-4292
ispartof Remote sensing (Basel, Switzerland), 2024-01, Vol.16 (2), p.282
issn 2072-4292
2072-4292
language eng
recordid cdi_doaj_primary_oai_doaj_org_article_a6b464d38ff045cb8698d88f4849cf65
source Publicly Available Content Database (Proquest) (PQ_SDU_P3)
subjects Agricultural production
agriculture
Artificial neural networks
Classification
Comparative analysis
Computer vision
Crops
Data collection
Deep learning
Flowering
Food security
Genotype & phenotype
Growing season
Identification and classification
Image resolution
Machine learning
Machine vision
Measurement
Neural networks
Normalized difference vegetative index
object-based image analysis
optical imagery
Phenotypes
Phenotyping
Physiology
Plant breeding
Remote sensing
Satellite imagery
Satellite imaging
title High-Throughput Plot-Level Quantitative Phenotyping Using Convolutional Neural Networks on Very High-Resolution Satellite Images