Deep learning on edge: Extracting field boundaries from satellite images with a convolutional neural network
Published in: | Remote sensing of environment 2020-08, Vol.245, p.111741, Article 111741 |
---|---|
Main Authors: | Waldner, François; Diakogiannis, Foivos I. |
Format: | Article |
Language: | English |
container_start_page | 111741 |
container_title | Remote sensing of environment |
container_volume | 245 |
creator | Waldner, François; Diakogiannis, Foivos I. |
description | Applications of digital agricultural services often require either farmers or their advisers to provide digital records of their field boundaries. Automatic extraction of field boundaries from satellite imagery would reduce the reliance on manual input of these records, which is time consuming, and would underpin the provision of remote products and services. The lack of current field boundary data sets seems to indicate low uptake of existing methods, presumably because of expensive image preprocessing requirements and local, often arbitrary, tuning. In this paper, we propose a data-driven, robust and general method to facilitate field boundary extraction from satellite images. We formulated this task as a multi-task semantic segmentation problem. We used ResUNet-a, a deep convolutional neural network with a fully connected UNet backbone that features dilated convolutions and conditioned inference, to identify: 1) the extent of fields; 2) the field boundaries; and 3) the distance to the closest boundary. By asking the algorithm to reconstruct three correlated outputs, the model's performance and its ability to generalise greatly improve. Segmentation of individual fields was then achieved by post-processing the three model outputs, e.g., via thresholding or watershed segmentation. Using a single monthly composite image from Sentinel-2 as input, our model was highly accurate in mapping field extent, field boundaries and, consequently, individual fields. Replacing the monthly composite with a single-date image close to the compositing period marginally decreased accuracy. We then showed in a series of experiments that, without recalibration, the same model generalised well across resolutions (10 m to 30 m), sensors (Sentinel-2 to Landsat-8), space and time. Building consensus by averaging model predictions from at least four images acquired across the season is paramount to reducing the temporal variations of accuracy. Our convolutional neural network is capable of learning complex hierarchical contextual features from the image to accurately detect field boundaries and discard irrelevant boundaries, thereby outperforming conventional edge filters. By minimising over-fitting and image preprocessing requirements, and by replacing local arbitrary decisions by data-driven ones, our approach is expected to facilitate the extraction of individual crop fields at scale.
• We extract field boundaries from Sentinel-2 data using a convolutional neural network. • High thematic and geometric accuracies were obtained using a composite image. • The same model generalised well across sensors, resolution, space and time. • Building consensus by averaging predictions from multiple dates improves accuracy. |
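The post-processing step the abstract describes (thresholding the network's three outputs, then delineating individual fields) can be sketched as follows. This is a minimal illustration of the thresholding variant only; the function name, thresholds, and toy arrays are assumptions for the example, not the authors' implementation:

```python
# Illustrative sketch of thresholding-based post-processing: the extent and
# boundary probability maps are thresholded, boundary pixels are cut out of
# the extent mask, and the remaining connected components become fields.
# Names and thresholds are hypothetical, not taken from the paper's code.
import numpy as np
from scipy import ndimage

def fields_from_outputs(extent_prob, boundary_prob, t_extent=0.5, t_boundary=0.5):
    extent = extent_prob >= t_extent          # pixels belonging to any field
    boundary = boundary_prob >= t_boundary    # pixels on a predicted edge
    interior = extent & ~boundary             # field pixels away from edges
    labels, n_fields = ndimage.label(interior)  # 4-connected components
    return labels, n_fields

# Toy example: a 5x5 scene of two fields split by a one-pixel boundary.
extent_prob = np.ones((5, 5))
boundary_prob = np.zeros((5, 5))
boundary_prob[:, 2] = 1.0
labels, n_fields = fields_from_outputs(extent_prob, boundary_prob)  # n_fields == 2

# Consensus across dates, as the abstract recommends: average probability
# maps from several acquisitions before thresholding.
consensus = np.mean(np.stack([extent_prob, extent_prob]), axis=0)
```

The abstract also mentions watershed segmentation as an alternative; in that variant the third output, the distance to the closest boundary, would naturally serve as the (inverted) elevation surface.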
doi_str_mv | 10.1016/j.rse.2020.111741 |
format | article |
identifier | ISSN: 0034-4257 |
ispartof | Remote sensing of environment, 2020-08, Vol.245, p.111741, Article 111741 |
issn | 0034-4257; 1879-0704 |
language | eng |
source | Elsevier |
subjects | Agriculture; Algorithms; Artificial neural networks; Boundaries; Computer vision; Crop fields; Deep learning; Digital imaging; Field boundaries; Generalisation; Image acquisition; Image processing; Image segmentation; Instance segmentation; Landsat; Landsat satellites; Mapping; Multitasking; Neural networks; Post-production processing; Preprocessing; Remote sensing; Satellite imagery; Satellites; Semantic segmentation; Sentinel-2; Temporal variations |
title | Deep learning on edge: Extracting field boundaries from satellite images with a convolutional neural network |