
GEOREFERENCING UAS DERIVATIVES THROUGH POINT CLOUD REGISTRATION WITH ARCHIVED LIDAR DATASETS

Georeferencing gathered images is a common step before performing spatial analysis and other processes on datasets acquired using unmanned aerial systems (UAS). Spatial information is typically applied to aerial images or their derivatives through onboard GPS (Global Positioning System) geotagging, or by tying models to ground control points (GCPs) acquired in the field. Currently, UAS derivatives are limited to meter-level accuracy when generated without points of known position on the ground. The use of ground control points established with survey-grade GPS or GNSS receivers can reduce model errors to centimeter level, but this comes with additional costs, not only for instrument acquisition and survey operations but also in time spent in the field. This study uses a workflow for cloud-based post-processing of UAS data in combination with existing LiDAR data. The UAV point cloud is georeferenced using the Iterative Closest Point (ICP) algorithm, applied through the open-source CloudCompare software (Girardeau-Montaut, 2006) on a 'skeleton point cloud' of manually extracted features that are consistent across the LiDAR and UAV data; roads and buildings with minimal deviation despite the differing acquisition dates are considered consistent. Transformation parameters computed for the skeleton cloud are then applied to the whole UAS dataset. In addition, a separate cloud of non-vegetation features, automatically derived using the CANUPO classification algorithm (Brodu and Lague, 2012), was used to generate a second set of parameters. A ground survey was conducted to validate the transformed cloud. An RMSE value of around 16 centimeters was found when comparing validation data to the models georeferenced using the CANUPO cloud and the manual skeleton cloud. Cloud-to-cloud distance computations between the CANUPO and manual skeleton clouds yielded values of around 0.67 meters for both, with a standard deviation of 1.73.
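
The core of the workflow (registering a manually extracted skeleton cloud to the archived LiDAR with ICP, then applying the resulting rigid transformation to the full UAS cloud) can be sketched in a few lines. The paper performs this step in CloudCompare; the snippet below is only an illustrative approximation using the Open3D Python library, with hypothetical file names and an assumed correspondence-distance threshold.

# Minimal sketch of the registration workflow described in the abstract, using
# Open3D instead of CloudCompare's GUI (an assumption for illustration only).
# File names and the correspondence-distance threshold are placeholders.
import numpy as np
import open3d as o3d

# 1. Load the manually extracted "skeleton" clouds (roads/buildings common to both epochs).
uav_skeleton = o3d.io.read_point_cloud("uav_skeleton.ply")      # unregistered UAS cloud
lidar_skeleton = o3d.io.read_point_cloud("lidar_skeleton.ply")  # archived LiDAR reference

# 2. Estimate a rigid transformation with point-to-point ICP.
icp = o3d.pipelines.registration.registration_icp(
    uav_skeleton, lidar_skeleton,
    max_correspondence_distance=1.0,   # meters; placeholder threshold
    init=np.eye(4),
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint(),
)
print("fitness:", icp.fitness, "inlier RMSE:", icp.inlier_rmse)

# 3. Apply the same transformation to the full UAS point cloud.
uav_full = o3d.io.read_point_cloud("uav_full.ply")
uav_full.transform(icp.transformation)
o3d.io.write_point_cloud("uav_full_georeferenced.ply", uav_full)

# 4. Nearest-neighbour cloud-to-cloud distances against the LiDAR reference,
#    analogous to the paper's cloud-to-cloud comparison.
lidar_full = o3d.io.read_point_cloud("lidar_full.ply")
d = np.asarray(uav_full.compute_point_cloud_distance(lidar_full))
print("mean C2C distance: %.2f m, std: %.2f m" % (d.mean(), d.std()))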


Bibliographic Details
Published in: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, 2016-10, Vol. IV-2/W1, p. 195-199
Main Authors: Magtalas, M. S. L. Y., Aves, J. C. L., Blanco, A. C.
Format: Article
Language: English
Subjects: Datasets; Derivatives; Feature extraction; Global positioning systems; GPS; Ground based control; Iterative algorithms; Lidar; Mathematical models; Measuring instruments; Post-production processing; Quality; Satellite navigation systems; Source code; Spatial analysis; Spatial data; Three dimensional models; Unmanned aerial vehicles; Workflow
DOI: 10.5194/isprs-annals-IV-2-W1-195-2016
ISSN: 2194-9042; EISSN: 2194-9050
Publisher: Copernicus GmbH, Göttingen
Online Access: Get full text