SparseRadNet: Sparse Perception Neural Network on Subsampled Radar Data
Radar-based perception has gained increasing attention in autonomous driving, yet the inherent sparsity of radars poses challenges. Radar raw data often contains excessive noise, whereas radar point clouds retain only limited information. In this work, we holistically treat the sparse nature of radar data by introducing an adaptive subsampling method together with a tailored network architecture that exploits the sparsity patterns to discover global and local dependencies in the radar signal. Our subsampling module selects a subset of pixels from range-doppler (RD) spectra that contribute most to the downstream perception tasks. To improve the feature extraction on sparse subsampled data, we propose a new way of applying graph neural networks on radar data and design a novel two-branch backbone to capture both global and local neighbor information. An attentive fusion module is applied to combine features from both branches. Experiments on the RADIal dataset show that our SparseRadNet exceeds state-of-the-art (SOTA) performance in object detection and achieves close to SOTA accuracy in freespace segmentation, meanwhile using sparse subsampled input data.
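The abstract describes an adaptive subsampling module that keeps only those range-doppler (RD) spectrum pixels that matter most for the downstream task. As a rough illustration only, and not the authors' learned, task-driven module, the minimal sketch below keeps the top-k pixels of an RD map by magnitude and returns their sparse coordinates; the array shape, the keep ratio, and the magnitude criterion are assumptions made for this example.

```python
# Minimal sketch (not the paper's implementation): magnitude-based top-k
# subsampling of a range-Doppler (RD) spectrum as a stand-in for the learned,
# task-driven subsampling described in the abstract.
import numpy as np

def subsample_rd_spectrum(rd_spectrum: np.ndarray, keep_ratio: float = 0.05):
    """Return (indices, values) of the strongest RD pixels.

    rd_spectrum : 2-D complex or real array, shape (num_range_bins, num_doppler_bins)
    keep_ratio  : fraction of pixels to retain (sparsity level, assumed here)
    """
    magnitude = np.abs(rd_spectrum)
    k = max(1, int(keep_ratio * magnitude.size))
    # Take the k largest responses, then recover 2-D (range, Doppler) indices.
    flat_idx = np.argpartition(magnitude.ravel(), -k)[-k:]
    rows, cols = np.unravel_index(flat_idx, magnitude.shape)
    indices = np.stack([rows, cols], axis=1)   # (k, 2) sparse coordinates
    values = rd_spectrum[rows, cols]           # (k,) retained responses
    return indices, values

# Example: a random 512 x 256 RD map, keeping 5% of its pixels.
rng = np.random.default_rng(0)
rd = rng.standard_normal((512, 256)) + 1j * rng.standard_normal((512, 256))
idx, vals = subsample_rd_spectrum(rd)
print(idx.shape, vals.shape)   # (6553, 2) (6553,)
```

The sparse coordinate/value pairs produced this way are the kind of input on which the paper's graph-neural-network branch and two-branch backbone would then operate.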
Published in: | arXiv.org, 2024-07 |
---|---|
Main Authors: | Wu, Jialong; Meuter, Mirko; Schoeler, Markus; Rottmann, Matthias |
Format: | Article |
Language: | English |
Subjects: | Graph neural networks; Modules; Neural networks; Object recognition; Perception; Radar data; Sparsity |
Online Access: | Get full text |
container_title | arXiv.org |
---|---|
creator | Wu, Jialong; Meuter, Mirko; Schoeler, Markus; Rottmann, Matthias |
description | Radar-based perception has gained increasing attention in autonomous driving, yet the inherent sparsity of radars poses challenges. Radar raw data often contains excessive noise, whereas radar point clouds retain only limited information. In this work, we holistically treat the sparse nature of radar data by introducing an adaptive subsampling method together with a tailored network architecture that exploits the sparsity patterns to discover global and local dependencies in the radar signal. Our subsampling module selects a subset of pixels from range-doppler (RD) spectra that contribute most to the downstream perception tasks. To improve the feature extraction on sparse subsampled data, we propose a new way of applying graph neural networks on radar data and design a novel two-branch backbone to capture both global and local neighbor information. An attentive fusion module is applied to combine features from both branches. Experiments on the RADIal dataset show that our SparseRadNet exceeds state-of-the-art (SOTA) performance in object detection and achieves close to SOTA accuracy in freespace segmentation, meanwhile using sparse subsampled input data. |
format | article |
fulltext | fulltext |
identifier | EISSN: 2331-8422 |
ispartof | arXiv.org, 2024-07 |
issn | 2331-8422 |
language | eng |
recordid | cdi_proquest_journals_3069649920 |
source | Publicly Available Content Database |
subjects | Graph neural networks; Modules; Neural networks; Object recognition; Perception; Radar data; Sparsity |
title | SparseRadNet: Sparse Perception Neural Network on Subsampled Radar Data |