Performance rather than Reputation Affects Humans’ Trust towards an Artificial Agent

Bibliographic Details
Published in: Computers in Human Behavior: Artificial Humans, 2025-01, p. 100122, Article 100122
Main Authors: Becker, Fritz, Spannagl, Celine Ina, Buder, Jürgen, Huff, Markus
Format: Article
Language:English
Summary: To succeed in teamwork with artificial agents, humans have to calibrate their trust in agents based on information they receive about an agent before interaction (reputation information) as well as on experiences they have during interaction (agent performance). This study (N = 253) focused on the influence of a virtual agent's reputation (high/low) and actual observed performance (high/low) on a human user's behavioral trust (delegation behavior) and self-reported trust (questionnaires) in a cooperative Tetris game. The main findings suggest that agent reputation influences self-reported trust prior to interaction. However, the effect of reputation was immediately overridden by the agent's performance during the interaction. The agent's performance during the interactive task influenced delegation behavior, as well as self-reported trust measured post-interaction. The pre- to post-interaction change in self-reported trust was significantly larger when reputation and performance were incongruent. We conclude that reputation may have had a smaller than expected influence on behavior in the presence of a novel tool that afforded exploration. Our research contributes to understanding trust and delegation dynamics, which is crucial for the design and adequate use of artificial agent team partners in a world of digital transformation.

Highlights:
• We measure trust towards an artificial agent using behavioral and self-reported data.
• The agent's reputation influences self-reported trust but not behavior.
• The agent's performance dominates trust towards it during an interaction.
• Following an interaction, the influence of reputation on self-reported trust is no longer detectable and is replaced by the influence of the agent's performance.
• First-hand experience is more relevant for trusting an agent than information from secondary sources.
ISSN: 2949-8821
DOI: 10.1016/j.chbah.2025.100122