Argument Schemes and a Dialogue System for Explainable Planning
Published in: ACM Transactions on Intelligent Systems and Technology, 2023-09, Vol. 14, No. 5, pp. 1-25, Article 89
Main Authors:
Format: Article
Language: English
Summary: Artificial Intelligence (AI) is being increasingly deployed in practical applications. However, a major concern is whether AI systems will be trusted by humans. To establish trust in AI systems, users need to understand the reasoning behind their solutions; therefore, systems should be able to explain and justify their output. Explainable AI Planning is a field concerned with explaining the outputs, i.e., the solution plans produced by AI planning systems, to a user. The main goal of a plan explanation is to help humans understand the reasoning behind the plans produced by the planners. In this article, we propose an argument scheme-based approach to providing explanations in the domain of AI planning. We present novel argument schemes for creating arguments that explain a plan and its key elements, together with a set of critical questions that allow interaction between the arguments and enable the user to obtain further information about the key elements of the plan. Furthermore, we present a novel dialogue system that uses the argument schemes and critical questions to provide interactive dialectical explanations.
ISSN: 2157-6904, 2157-6912
DOI: 10.1145/3610301
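The summary above describes argument schemes paired with critical questions as the building blocks of the explanations. The paper's own schemes and dialogue protocol are not reproduced in this record; the following is only a minimal, hypothetical Python sketch of how such a scheme-plus-critical-questions structure could be represented, with all names and example statements invented for illustration.

```python
# Hypothetical sketch (not taken from the paper): one possible representation of
# an argument scheme with critical questions for explaining a plan element.
from dataclasses import dataclass, field
from typing import List


@dataclass
class ArgumentScheme:
    """A stereotypical pattern of reasoning used to justify part of a plan."""
    name: str                      # label of the scheme (invented here)
    premises: List[str]            # statements the argument relies on
    conclusion: str                # the claim the scheme supports
    critical_questions: List[str] = field(default_factory=list)  # ways a user can challenge it


# Illustrative instance: an argument for why an action appears in a plan.
action_scheme = ArgumentScheme(
    name="Argument for action inclusion",
    premises=[
        "Goal g must be achieved",
        "Action a establishes a precondition of a later action that achieves g",
    ],
    conclusion="Action a should be included in the plan",
    critical_questions=[
        "Is there an alternative action that establishes the same precondition?",
        "Is that precondition actually required to achieve g?",
    ],
)

# In an interactive dialogue, posing a critical question would trigger a further
# explanatory move; here we simply list the available challenges.
for cq in action_scheme.critical_questions:
    print(cq)
```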