
Stereo dense depth tracking based on optical flow using frames and events

Event cameras are biologically inspired sensors that asynchronously detect brightness changes in the scene independently for each pixel. Their output is a stream of events reported with low latency and microsecond temporal resolution, making them superior to standard cameras in highly dynamic scenarios, where the latter suffer from motion blur. Event cameras can be used in a wide range of applications, one of them being depth estimation, in both stereo and monocular settings. However, most known event-based depth estimation methods yield sparse depth maps due to the sparse nature of the event stream. We present a novel method that fuses information from events, standard frames, and odometry to exploit the advantages of both sensors. We propose to estimate dense disparity from standard frames as they become available, predict the disparity using odometry information, and track the disparity asynchronously between frames using the optical flow of events. We evaluate the method through several experiments in various setups, including synthetic data, the KITTI dataset enhanced with events, the MVSEC dataset, and our own stereo event camera recordings.
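The abstract outlines a three-step loop: compute dense disparity whenever a stereo frame pair arrives, predict it forward using odometry, and track it asynchronously between frames along the optical flow of events. The sketch below illustrates only the skeleton of that loop and is not the authors' implementation: OpenCV's SGBM matcher stands in for the frame-based disparity estimator, Farneback flow computed on event-count images stands in for the paper's event-based optical flow, the odometry prediction step is omitted, and all input data is synthetic.

```python
# Illustrative skeleton of the frame/event disparity-tracking loop, under
# the assumptions stated above. NOT the authors' implementation.
import numpy as np
import cv2

def dense_disparity(left, right):
    """Dense disparity from a rectified grayscale stereo pair via SGBM."""
    sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=7)
    # SGBM returns fixed-point disparity scaled by 16.
    return sgbm.compute(left, right).astype(np.float32) / 16.0

def propagate_disparity(disparity, flow):
    """Warp a disparity map along a dense optical-flow field."""
    h, w = disparity.shape
    xs, ys = np.meshgrid(np.arange(w, dtype=np.float32),
                         np.arange(h, dtype=np.float32))
    # Sample the previous disparity at the position each pixel came from.
    return cv2.remap(disparity, xs - flow[..., 0], ys - flow[..., 1],
                     interpolation=cv2.INTER_LINEAR)

# --- synthetic demo data standing in for real frames and event slices ---
rng = np.random.default_rng(0)
left = rng.integers(0, 255, (120, 160), dtype=np.uint8)
right = np.roll(left, -4, axis=1)               # fake 4-pixel disparity
ev_prev = (rng.integers(0, 5, (120, 160)) * 50).astype(np.uint8)
ev_next = np.roll(ev_prev, 2, axis=1)           # fake motion of the events

disparity = dense_disparity(left, right)        # step 1: at frame time
flow = cv2.calcOpticalFlowFarneback(ev_prev, ev_next, None,
                                    0.5, 3, 15, 3, 5, 1.2, 0)
disparity = propagate_disparity(disparity, flow)  # step 3: between frames
print(disparity.shape, float(disparity.mean()))
```

In the method the abstract describes, the flow would come from the asynchronous event stream itself, so the disparity map can be updated at a much higher rate than the frame rate.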

Bibliographic Details
Published in: Advanced Robotics, 2021-02, Vol. 35 (3-4), pp. 141-152
Main Authors: Hadviger, Antea; Marković, Ivan; Petrović, Ivan
Format: Article
Language: English
Subjects: depth estimation; event cameras; stereo vision
Publisher: Taylor & Francis
ISSN: 0169-1864
EISSN: 1568-5535
DOI: 10.1080/01691864.2020.1821770