
Multi-sensor fusion strategy to obtain 3-D occupancy profile


Bibliographic Details
Main Authors: Kumar, M., Garg, D.P., Zachery, R.
Format: Conference Proceeding
Language:English
Subjects:
Online Access: Request full text
container_start_page 6 pp.
creator Kumar, M.
Garg, D.P.
Zachery, R.
description This paper presents a strategy to fuse information from two vision sensors and one infrared proximity sensor to obtain a three-dimensional occupancy profile of robotic workspace, identify key features, and obtain a 3-D model of the objects in the work space. The two vision sensors are mounted on a stereo rig on the sidewall of the robotic workcell. The IR sensor is mounted on the wrist of the robot. The vision sensors on the stereo rig provide information about the three-dimensional position of any point in the robotic workspace. The IR sensor provides the distance of an object from the sensor. The information from these sensors has been fused using a probabilistic approach based on Bayesian formalism in an occupancy grid framework to obtain a 3-D occupancy model of the workspace. The proposed fusion and sensor modeling scheme is demonstrated to reduce individual sensor uncertainties and perform at a superior level as compared to some other schemes.
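The abstract describes fusing the stereo-vision and IR range estimates with a Bayesian update in an occupancy grid. As a minimal illustrative sketch of that general technique (not the authors' implementation; the sensor probabilities, uniform prior, and independence assumption here are hypothetical), a per-cell fusion can be done in log-odds form, where independent sensor evidence simply adds:

```python
import numpy as np

def logodds(p):
    """Convert an occupancy probability to log-odds."""
    return np.log(p / (1.0 - p))

def fuse_cell(prior, sensor_probs):
    """Bayesian fusion of independent per-sensor occupancy estimates
    for a single grid cell: evidence accumulates additively in
    log-odds space, then is mapped back to a probability."""
    l = logodds(prior) + sum(logodds(p) - logodds(prior) for p in sensor_probs)
    return 1.0 / (1.0 + np.exp(-l))

# Hypothetical readings for the same voxel: stereo rig says 0.7,
# wrist-mounted IR sensor says 0.8, prior is uninformative (0.5).
p_stereo, p_ir = 0.7, 0.8
p_fused = fuse_cell(0.5, [p_stereo, p_ir])  # higher than either sensor alone
```

Under the independence assumption, agreeing sensors reinforce each other, so the fused estimate exceeds either individual reading; this is the mechanism by which fusion reduces the individual sensor uncertainties the abstract refers to.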
doi_str_mv 10.1109/IECON.2005.1569225
format conference_proceeding
identifier ISSN: 1553-572X
ispartof 31st Annual Conference of IEEE Industrial Electronics Society, 2005. IECON 2005, 2005, p.6 pp.
issn 1553-572X
language eng
recordid cdi_ieee_primary_1569225
source IEEE Xplore All Conference Series
subjects Bayesian methods
Fuses
Infrared sensors
Orbital robotics
Robot sensing systems
Robot vision systems
Sensor fusion
Sensor phenomena and characterization
Uncertainty
Wrist
title Multi-sensor fusion strategy to obtain 3-D occupancy profile