Adaptive Driving Assistant Model (ADAM) for Advising Drivers of Autonomous Vehicles
Published in: ACM Transactions on Interactive Intelligent Systems, 2022-07, Vol. 12 (3), p. 1-28, Article 21
Main Authors: Hsieh, Sheng-Jen; Wang, Andy R.; Madison, Anna; Tossell, Chad; de Visser, Ewart
Format: Article
Language: English
Subjects: Applied computing; Computing methodologies; Emerging interfaces; Engineering; Hardware; Human-centered computing; Interactive systems and tools; Model verification and validation
container_end_page | 28 |
container_issue | 3 |
container_start_page | 1 |
container_title | ACM transactions on interactive intelligent systems |
container_volume | 12 |
creator | Hsieh, Sheng-Jen; Wang, Andy R.; Madison, Anna; Tossell, Chad; de Visser, Ewart |
description | Fully autonomous driving is on the horizon; vehicles with advanced driver assistance systems (ADAS) such as Tesla's Autopilot are already available to consumers. However, all currently available ADAS applications require a human driver to be alert and ready to take control if needed. Partially automated driving introduces new complexities to human interactions with cars and can even increase collision risk. A better understanding of drivers’ trust in automation may help reduce these complexities. Much of the existing research on trust in ADAS has relied on use of surveys and physiological measures to assess trust and has been conducted using driving simulators. There have been relatively few studies that use telemetry data from real automated vehicles to assess trust in ADAS. In addition, although some ADAS technologies provide alerts when, for example, drivers’ hands are not on the steering wheel, these systems are not personalized to individual drivers. Needed are adaptive technologies that can help drivers of autonomous vehicles avoid crashes based on multiple real-time data streams. In this paper, we propose an architecture for adaptive autonomous driving assistance. Two layers of multiple sensory fusion models are developed to provide appropriate voice reminders to increase driving safety based on predicted driving status. Results suggest that human trust in automation can be quantified and predicted with 80% accuracy based on vehicle data, and that adaptive speech-based advice can be provided to drivers with 90 to 95% accuracy. With more data, these models can be used to evaluate trust in driving assistance tools, which can ultimately lead to safer and appropriate use of these features. |
doi_str_mv | 10.1145/3545994 |
format | article |
rights | ACM acknowledges that this contribution was authored or co-authored by an employee, contractor, or affiliate of the United States government. As such, the United States government retains a nonexclusive, royalty-free right to publish or reproduce this article, or to allow others to do so, for government purposes only. |
orcidid | 0000-0002-8804-4715; 0000-0003-1116-0774; 0000-0002-0856-3743; 0000-0003-1662-9308; 0000-0001-9238-9081 |
identifier | ISSN: 2160-6455 |
ispartof | ACM transactions on interactive intelligent systems, 2022-07, Vol.12 (3), p.1-28, Article 21 |
issn | 2160-6455; 2160-6463 |
language | eng |
source | Association for Computing Machinery:Jisc Collections:ACM OPEN Journals 2023-2025 (reading list) |
subjects | Applied computing; Computing methodologies; Emerging interfaces; Engineering; Hardware; Human-centered computing; Interactive systems and tools; Model verification and validation |
title | Adaptive Driving Assistant Model (ADAM) for Advising Drivers of Autonomous Vehicles |
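The abstract describes a two-layer architecture: fusion models that predict a driving status from vehicle telemetry, and a second layer that turns the predicted status into a voice reminder. A minimal sketch of that flow is below; the feature names (`speed_var`, `steering_torque`, `hands_off_sec`), status labels, and advice strings are hypothetical stand-ins for illustration, not the paper's actual models or data.

```python
# Hypothetical sketch of the two-layer pipeline from the abstract:
# layer 1 predicts a driving-status label from fused vehicle telemetry,
# layer 2 maps the predicted status to a speech-based advisory.
# All thresholds, features, and messages here are illustrative assumptions.

STATUSES = ["attentive", "distracted", "over-reliant"]

ADVICE = {
    "attentive": None,  # no reminder needed
    "distracted": "Please keep your eyes on the road.",
    "over-reliant": "Please keep your hands on the steering wheel.",
}

def predict_status(speed_var, steering_torque, hands_off_sec):
    """Layer 1: a rule-based stand-in for the sensory-fusion model."""
    if hands_off_sec > 5.0:
        return "over-reliant"
    if speed_var > 2.5 and steering_torque < 0.1:
        return "distracted"
    return "attentive"

def advise(telemetry):
    """Layer 2: turn the predicted driving status into a voice reminder."""
    status = predict_status(**telemetry)
    return status, ADVICE[status]

if __name__ == "__main__":
    sample = {"speed_var": 3.1, "steering_torque": 0.05, "hands_off_sec": 1.2}
    print(advise(sample))  # ('distracted', 'Please keep your eyes on the road.')
```

In the paper the first layer would be trained predictive models over real telemetry streams rather than fixed rules; the layering itself (status prediction feeding advice selection) is the point this sketch illustrates.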