
Adversarial Computer Vision via Acoustic Manipulation of Camera Sensors

Autonomous vehicles increasingly rely on camera-based computer vision systems to perceive environments and make critical driving decisions. To improve image quality, image stabilizers with inertial sensors are added to reduce image blurring caused by camera jitters. However, this trend creates a new attack surface. This paper identifies a system-level vulnerability resulting from the combination of emerging image stabilizer hardware susceptible to acoustic manipulation and computer vision algorithms subject to adversarial examples. By emitting deliberately designed acoustic signals, an adversary can control the output of an inertial sensor, which triggers unnecessary motion compensation and results in a blurred image, even when the camera is stable. These blurred images can induce object misclassification, affecting safety-critical decision-making. We model the feasibility of such acoustic manipulation and design an attack framework that can accomplish three types of attacks: hiding, creating, and altering objects. Evaluation results demonstrate the effectiveness of our attacks against five object detectors (YOLO V3/V4/V5, Faster R-CNN, and Apollo) and two lane detectors (UFLD and LaneAF). We further introduce the concept of AMpLe attacks, a new class of system-level security vulnerabilities resulting from a combination of adversarial machine learning and physics-based injection of information-carrying signals into hardware.
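
A minimal, illustrative sketch of the attack pipeline the abstract describes (a spoofed inertial reading triggering unnecessary motion compensation that blurs a frame from a still camera) is given below. This is not the authors' implementation; the uniform blur-kernel model and the angular-rate, exposure, and focal-length values are assumptions made purely for illustration.

```python
# Illustrative sketch only (not the paper's code): simulate how a spurious
# gyroscope reading -- e.g., one induced by acoustic resonance -- could drive
# unnecessary motion compensation and blur a frame from a perfectly still camera.
import numpy as np
from scipy.ndimage import convolve


def blur_kernel_from_gyro(omega_rad_s, exposure_s, focal_px, size=31):
    """Build a horizontal motion-blur kernel whose streak length approximates the
    image-plane shift produced by compensating for a (false) rotation omega
    during one exposure (small-angle approximation; values are assumptions)."""
    shift_px = abs(omega_rad_s) * exposure_s * focal_px
    length = max(1, min(size, int(round(shift_px))))
    kernel = np.zeros((1, size), dtype=np.float32)
    start = (size - length) // 2
    kernel[0, start:start + length] = 1.0 / length   # uniform streak
    return kernel


def apply_false_compensation(image, omega_rad_s, exposure_s=1 / 30, focal_px=1200.0):
    """Blur a grayscale frame as if the stabilizer chased a spoofed gyro signal."""
    k = blur_kernel_from_gyro(omega_rad_s, exposure_s, focal_px)
    return convolve(image.astype(np.float32), k, mode="nearest")


if __name__ == "__main__":
    frame = np.random.rand(240, 320).astype(np.float32)   # stand-in for a camera frame
    attacked = apply_false_compensation(frame, omega_rad_s=0.5)
    # Blurring lowers local contrast; a detector fed `attacked` instead of `frame`
    # may then hide, create, or alter objects, as the abstract reports.
    print(f"std before: {frame.std():.3f}, after: {attacked.std():.3f}")
```

In practice one would pass the blurred frame to an off-the-shelf detector (e.g., a YOLO or Faster R-CNN model) and compare its detections against those on the clean frame; the paper's evaluation covers five object detectors and two lane detectors.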

Bibliographic Details
Published in: IEEE Transactions on Dependable and Secure Computing, 2024-07, Vol. 21 (4), p. 3734-3750
Main Authors: Cheng, Yushi; Ji, Xiaoyu; Zhu, Wenjun; Zhang, Shibo; Fu, Kevin; Xu, Wenyuan
Format: Article
Language: English
Online Access: Get full text
DOI: 10.1109/TDSC.2023.3334618
ISSN: 1545-5971
EISSN: 1941-0018
Source: IEEE Xplore (Online service)
Subjects: Acoustics; Adversarial machine learning; Algorithms; Automobiles; Blurring; Cameras; Computer vision; Detectors; Hardware; Image manipulation; Image quality; Image stabilizers; Inertial sensing devices; Inertial sensors; intelligent vehicle security; Machine learning; Motion compensation; Safety critical; Sensor systems; Sensors; Vision systems