Hidden in Plain Sight: Exploring Privacy Risks of Mobile Augmented Reality Applications
| Published in: | ACM Transactions on Privacy and Security, 2022-11, Vol. 25 (4), p. 1-35 |
|---|---|
| Main Authors: | , , , , |
| Format: | Article |
| Language: | English |
| Summary: | Mobile augmented reality systems are becoming increasingly common and powerful, with applications in such domains as healthcare, manufacturing, education, and more. This rise in popularity is thanks in part to the functionalities offered by commercially available vision libraries such as ARCore, Vuforia, and Google's ML Kit; however, these libraries also give rise to the possibility of a *hidden operations threat*, that is, the ability of a malicious or incompetent application developer to conduct additional vision operations behind the scenes of an otherwise honest AR application without alerting the end-user. In this article, we present the privacy risks associated with the hidden operations threat and propose a framework for application development and runtime permissions targeted specifically at preventing the execution of hidden operations. We follow this with a set of experimental results, exploring the feasibility and utility of our system in differentiating between user-expectation-compliant and non-compliant AR applications during runtime testing, for which preliminary results demonstrate accuracy of up to 71%. We conclude with a discussion of open problems in the areas of software testing and privacy standards in mobile AR systems. |
| ISSN: | 2471-2566, 2471-2574 |
| DOI: | 10.1145/3524020 |
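
The summary names ARCore and Google's ML Kit as the kind of vision libraries that enable a hidden operation. The Kotlin sketch below is purely illustrative and not taken from the article: it shows one way such a threat could look in practice, with an AR app that performs its advertised plane-based rendering while also forwarding each ARCore camera frame to an undisclosed ML Kit face detector. The class and helper names (`HiddenOperationExample`, `renderPlacedObjects`) and the fixed rotation value are hypothetical placeholders.

```kotlin
// Illustrative sketch of a "hidden operations" pattern (not from the article):
// the app's visible purpose is AR object placement, but every acquired camera
// frame is also routed to an extra vision pipeline the user never consented to.
import android.media.Image
import com.google.ar.core.Frame
import com.google.ar.core.exceptions.NotYetAvailableException
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.face.FaceDetection

class HiddenOperationExample {
    // Extra detector; the camera permission alone gives no hint that it exists.
    private val faceDetector = FaceDetection.getClient()

    fun onArFrame(frame: Frame) {
        // Expected, user-visible behaviour: use the frame for tracking/rendering.
        renderPlacedObjects(frame)

        // Hidden behaviour: the same CPU image feeds an additional vision operation.
        val image: Image = try {
            frame.acquireCameraImage()
        } catch (e: NotYetAvailableException) {
            return // no CPU image available for this frame
        }
        val input = InputImage.fromMediaImage(image, /* rotationDegrees (assumed) = */ 0)
        faceDetector.process(input)
            .addOnSuccessListener { faces ->
                // e.g. count or profile bystanders with no UI indication
            }
            .addOnCompleteListener { image.close() } // release the camera image
    }

    private fun renderPlacedObjects(frame: Frame) { /* legitimate AR rendering */ }
}
```

Because both the legitimate rendering path and the hidden detector consume the same camera frames under the same permission, this is the kind of behaviour the article's proposed development framework and runtime permissions aim to surface and block.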