How should I explain? A comparison of different explanation types for recommender systems
Published in: International Journal of Human-Computer Studies, 2014-04, Vol. 72 (4), pp. 367-382
Main Authors:
Format: Article
Language: English
Summary: Recommender systems help users locate possible items of interest more quickly by filtering and ranking them in a personalized way. Some of these systems provide the end user not only with such a personalized item list but also with an explanation which describes why a specific item is recommended and why the system supposes that the user will like it. Besides helping the user understand the output and rationale of the system, the provision of such explanations can also improve the general acceptance, perceived quality, or effectiveness of the system.
In recent years, the question of how to automatically generate and present system-side explanations has attracted increased interest in research. Today some basic explanation facilities are already incorporated in e-commerce Web sites such as Amazon.com. In this work, we continue this line of recent research and address the question of how explanations can be communicated to the user in a more effective way.
In particular, we present the results of a user study in which users of a recommender system were provided with different types of explanation. We experimented with 10 different explanation types and measured their effects along several dimensions. The explanation types used in the study include both known visualizations from the literature and two novel interfaces based on tag clouds. Our study reveals that the content-based tag cloud explanations are particularly helpful for increasing the user-perceived level of transparency and user satisfaction, even though they demand higher cognitive effort from the user. Based on these insights and observations, we derive a set of possible guidelines for designing or selecting suitable explanations for recommender systems.
• We adopt a two-phase approach to analyze the effects of 10 different explanation types on users.
• We conduct a laboratory study to evaluate the explanation types with respect to several quality factors.
• We use qualitative semi-structured interviews to interpret and validate the results of the quantitative study.
• We propose a set of guidelines on how to build explanation interfaces for recommender systems that support the user in the decision-making process.
ISSN: 1071-5819, 1095-9300
DOI: 10.1016/j.ijhcs.2013.12.007