
DOTIE - Detecting Objects through Temporal Isolation of Events using a Spiking Architecture

Vision-based autonomous navigation systems rely on fast and accurate object detection algorithms to avoid obstacles. Algorithms and sensors designed for such systems need to be computationally efficient, due to the limited energy of the hardware used for deployment. Biologically inspired event cameras are a good candidate as a vision sensor for such systems due to their speed, energy efficiency, and robustness to varying lighting conditions. However, traditional computer vision algorithms fail to work on event-based outputs, as they lack photometric features such as light intensity and texture. In this work, we propose a novel technique that utilizes the temporal information inherently present in the events to efficiently detect moving objects. Our technique consists of a lightweight spiking neural architecture that is able to separate events based on the speed of the corresponding objects. These separated events are then further grouped spatially to determine object boundaries. This method of object detection is both asynchronous and robust to camera noise. In addition, it shows good performance in scenarios with events generated by static objects in the background, where existing event-based algorithms fail. We show that by utilizing our architecture, autonomous navigation systems can have minimal latency and energy overheads for performing object detection.
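The abstract describes two stages: a lightweight spiking layer that separates events generated by fast-moving objects from those generated by slower or static structure, followed by spatial grouping of the surviving events into object boundaries. The sketch below is only a minimal illustration of that idea under stated assumptions, not the authors' implementation: it assumes a per-pixel leaky integrate-and-fire filter and an off-the-shelf DBSCAN clustering step, and every function name and parameter value (tau, threshold, eps, min_samples, sensor resolution) is a hypothetical choice made for illustration.

```python
# Illustrative sketch only; parameters and structure are assumptions, not taken from the paper.
import numpy as np
from sklearn.cluster import DBSCAN

def filter_fast_events(events, tau=5e-3, threshold=2.0, height=480, width=640):
    """Keep events whose pixel's leaky 'membrane potential' crosses `threshold`.

    `events` is a time-ordered iterable of (t, x, y, polarity) tuples, t in seconds.
    Dense event trains (fast objects) cross the threshold; sparse ones
    (slow objects, static background, noise) leak away first.
    """
    potential = np.zeros((height, width))
    last_t = np.zeros((height, width))
    kept = []
    for t, x, y, _polarity in events:
        # Leak since the last event at this pixel, then integrate the new event.
        potential[y, x] *= np.exp(-(t - last_t[y, x]) / tau)
        potential[y, x] += 1.0
        last_t[y, x] = t
        if potential[y, x] >= threshold:   # dense event train -> fast-moving object
            kept.append((x, y))
            potential[y, x] = 0.0          # reset after the "spike"
    return np.array(kept).reshape(-1, 2)

def group_into_boxes(points, eps=8.0, min_samples=10):
    """Cluster surviving event coordinates; each cluster becomes one bounding box."""
    if len(points) == 0:
        return []
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit(points).labels_
    boxes = []
    for label in set(labels) - {-1}:       # label -1 is DBSCAN's noise bucket
        cluster = points[labels == label]
        (x_min, y_min), (x_max, y_max) = cluster.min(axis=0), cluster.max(axis=0)
        boxes.append((x_min, y_min, x_max, y_max))
    return boxes
```

In this toy version, events from slow or static background structure decay before the threshold is reached, so only events from faster objects survive to the clustering stage; calling group_into_boxes(filter_fast_events(events)) on a time-ordered event stream yields one box per detected cluster, loosely mirroring the speed-based separation and spatial grouping described in the abstract.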

Bibliographic Details
Published in: arXiv.org, 2022-10
Main Authors: Nagaraj, Manish; Liyanagedera, Chamika Mihiranga; Roy, Kaushik
Format: Article
Language: English
EISSN: 2331-8422
Subjects: Algorithms; Autonomous navigation; Cameras; Computer vision; Luminous intensity; Moving object recognition; Navigation systems; Obstacle avoidance; Spiking; Static objects
Source: Publicly Available Content Database
Online Access: Get full text