
Bots with Feelings: Should AI Agents Express Positive Emotion in Customer Service?

The rise of emotional intelligence technology and the recent debate about the possibility of a “sentient” artificial intelligence (AI) highlight the need to study the role of emotion during people’s interactions with AIs. In customer service, human employees are increasingly replaced by AI agents, such as chatbots, and often these AI agents are equipped with emotion-expressing capabilities to replicate the positive impact of human-expressed positive emotion. But is it indeed beneficial? This research explores how, when, and why an AI agent’s expression of positive emotion affects customers’ service evaluations. Through controlled experiments in which the subjects interacted with a service agent (AI or human) to resolve a hypothetical service issue, we provide answers to these questions. We show that AI-expressed positive emotion can influence customers affectively (by evoking customers’ positive emotions) and cognitively (by violating customers’ expectations) in opposite directions. Thus, positive emotion expressed by an AI agent (versus a human employee) is less effective in facilitating service evaluations. We further underscore that, depending on customers’ expectations toward their relationship with a service agent, AI-expressed positive emotion may enhance or hurt service evaluations. Overall, our work provides useful guidance on how and when companies can best deploy emotion-expressing AI agents.

Customer service employees are generally advised to express positive emotion during their interactions with customers. The rise and maturity of artificial intelligence (AI)–powered conversational agents, also known as chatbots, raise the question: should AI agents be equipped with the ability to express positive emotion during customer service interactions? This research explores how, when, and why an AI agent’s expression of positive emotion affects customers’ service evaluations. We argue that AI-expressed positive emotion can influence customers via dual pathways: an affective pathway of emotional contagion and a cognitive pathway of expectation–disconfirmation. We propose that positive emotion expressed by an AI agent (versus a human employee) is less effective in facilitating service evaluations because of a heightened level of expectation–disconfirmation. We further introduce a novel individual difference variable, customers’ relationship norm orientation, which affects their expectations toward the AI agent and moderates the cognitive pathway. Results from three laboratory experiments substantiate our claims. By revealing a distinctive impact of positive emotion expressed by an AI agent compared with a human employee, these findings deepen our understanding of customers’ reactions to emotional AIs, and they offer valuable insights for the deployment of AIs in customer service.

History: Deepa Mani, Senior Editor; Pallab Sanyal, Associate Editor.
Supplemental Material: The online appendix is available at https://doi.org/10.1287/isre.2022.1179.

Bibliographic Details
Published in: Information systems research, 2023-09, Vol. 34 (3), p. 1296-1311
Main Authors: Han, Elizabeth; Yin, Dezhi; Zhang, Han
Format: Article
Language: English
Subjects: Agents (artificial intelligence); Artificial intelligence; chatbot; Chatbots; Cognitive ability; conversational agent; Conversational artificial intelligence; COVID-19; customer service; Customer services; emotional artificial intelligence; emotional contagion; Emotions; expectation–disconfirmation; relationship norm orientation
ISSN: 1047-7047
EISSN: 1526-5536
DOI: 10.1287/isre.2022.1179
Publisher: INFORMS (Linthicum)