Explainable AI tools for legal reasoning about cases: A study on the European Court of Human Rights

Published in: Artificial Intelligence, 2023-04, Vol. 317, p. 103861, Article 103861
Main Authors:
Format: Article
Language: English
Summary: In this paper we report on a significant research project undertaken to design, implement and evaluate explainable decision-support tools for deciding legal cases. We provide a model of a legal domain, Article 6 of the European Convention on Human Rights, constructed using a methodology from the field of computational models of argument. We describe how the formal model was developed, extended and transformed into practical tools, which were then used in evaluation exercises to determine their effectiveness and usability. The underpinning AI techniques yield a level of explanation that is firmly grounded in legal reasoning and is also digestible by the target end users, as demonstrated through our evaluation activities. The results of our experimental evaluation show that, on the first pass, our tool achieved an accuracy of 97% in matching the actual decisions of the cases, and the user studies gave highly encouraging results with respect to usability. As such, our project demonstrates how trustworthy AI tools can be built for a real-world legal domain in which the critical needs of the end users are accounted for.
ISSN: 0004-3702; 1872-7921
DOI: 10.1016/j.artint.2023.103861