A Fast and Robust Attitude Estimation Method Based on Vision-Inertial Tight Coupling With Unknown Correspondences

Simultaneous pose and correspondence determination (SPCD) plays a crucial role in attitude estimation with unknown correspondences. In contrast to conventional loose integration, in which inertial information is used only to optimize the initial pose selection of SPCD algorithms, we refine the data collected by the inertial sensors into motion information and tightly fuse it with the feature-point information provided by the camera. As a result, the accuracy and computational speed of the attitude estimation are improved. Specifically, we propose a novel SPCD method that combines motion information-assisted feature point hybrid tracking (MIFHT) with a cascaded square-root cubature Kalman filter (CSRCKF), within a vision-inertial tightly coupled framework incorporating robust strategies. Our method maintains high tracking accuracy even under visual occlusion and is robust against outliers. Comparative experiments conducted on our measurement system validate the effectiveness and superiority of the method: while maintaining a convergence rate of 95%, it improves computational speed by 55.11% over the fastest existing SPCD method.

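The abstract names two core components. The first is the cascaded square-root cubature Kalman filter (CSRCKF) used for state estimation. As an illustration of the underlying technique only, here is a minimal, generic cubature Kalman filter step in NumPy; it is a single-stage, non-square-root variant, and the process model `f`, measurement model `h`, and noise covariances are placeholders, not the paper's actual models.

```python
# Minimal cubature Kalman filter (CKF) step -- an illustrative sketch only.
# The paper's CSRCKF is a cascaded *square-root* variant; this plain CKF
# refactorizes the covariance each step instead of propagating a Cholesky
# factor, and the models f and h are hypothetical.
import numpy as np

def cubature_points(mean, cov):
    """Return the 2n cubature points mean +/- sqrt(n) * (chol(cov) @ e_i)."""
    n = mean.size
    S = np.linalg.cholesky(cov)                             # lower-triangular sqrt
    unit = np.sqrt(n) * np.hstack([np.eye(n), -np.eye(n)])  # (n, 2n) directions
    return mean[:, None] + S @ unit                         # (n, 2n) points

def ckf_step(x, P, z, f, h, Q, R):
    """One predict + update cycle of a cubature Kalman filter."""
    n = x.size
    # Prediction: push cubature points through the process model f.
    Xp = np.apply_along_axis(f, 0, cubature_points(x, P))
    x_pred = Xp.mean(axis=1)
    dXp = Xp - x_pred[:, None]
    P_pred = dXp @ dXp.T / (2 * n) + Q
    # Update: push fresh points through the measurement model h.
    Xu = cubature_points(x_pred, P_pred)
    Zu = np.apply_along_axis(h, 0, Xu)
    z_pred = Zu.mean(axis=1)
    dZ = Zu - z_pred[:, None]
    dX = Xu - x_pred[:, None]
    S_zz = dZ @ dZ.T / (2 * n) + R      # innovation covariance
    P_xz = dX @ dZ.T / (2 * n)          # state-measurement cross covariance
    K = P_xz @ np.linalg.inv(S_zz)      # Kalman gain
    return x_pred + K @ (z - z_pred), P_pred - K @ S_zz @ K.T
```

A production square-root filter would propagate the Cholesky factor directly (e.g., via QR updates) for numerical stability rather than refactorizing the covariance; the cascaded structure of CSRCKF is not detailed in this record.

The second component is motion information-assisted feature point hybrid tracking (MIFHT), which exploits inertial motion information to keep tracking feature points when vision alone is ambiguous or occluded. The sketch below shows only the general idea of a motion-prior-gated correspondence search: an inertially predicted rotation increment projects known 3-D points into the image, and each projection is matched to the nearest detection within a gate. The camera matrix `K`, rotation increment `dR`, and `gate` threshold are assumptions, and the paper's tracker is substantially more elaborate.

```python
# Hypothetical motion-prior-gated correspondence search (not the paper's MIFHT).
import numpy as np

def project(K, R, points_3d):
    """Pinhole projection of rotated 3-D points (translation omitted)."""
    pc = (K @ (R @ points_3d.T)).T      # (N, 3) homogeneous image coordinates
    return pc[:, :2] / pc[:, 2:3]       # (N, 2) pixel coordinates

def match_with_motion_prior(K, R_prev, dR, points_3d, detections, gate=15.0):
    """Pair each model point with its nearest detection inside the gate."""
    pred = project(K, dR @ R_prev, points_3d)   # inertially predicted pixels
    pairs = []
    for i, p in enumerate(pred):
        dists = np.linalg.norm(detections - p, axis=1)
        j = int(np.argmin(dists))
        if dists[j] < gate:                     # reject implausible matches
            pairs.append((i, j))
    return pairs
```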

Bibliographic Details
Published in: IEEE Transactions on Instrumentation and Measurement, 2024, Vol. 73, p. 1-12
Main Authors: Li, Yue; Sun, Changku; Lu, Yutai; Wang, Dawei; Wang, Peng; Fu, Luhua
Format: Article
Language: English
DOI: 10.1109/TIM.2024.3472799
ISSN: 0018-9456
EISSN: 1557-9662
Subjects: Algorithms; Attitude estimation; Attitudes; Cameras; feature point tracking; Inertial sensing devices; Integrated circuit modeling; Kalman filter; Kalman filters; Noise; Occlusion; Outliers (statistics); Pose estimation; Robustness; Solid modeling; Symbols; Target tracking; Three-dimensional displays; Tracking; unknown correspondences; Vision; vision-inertial fusion; Visualization