
What Holds Attention? Linguistic Drivers of Engagement

Bibliographic Details
Published in: Journal of Marketing, 2023-09, Vol. 87 (5), p. 793-809
Main Authors: Berger, Jonah, Moe, Wendy W., Schweidel, David A.
Format: Article
Language: English
Description
Summary: From advertisers and marketers to salespeople and leaders, everyone wants to hold attention. They want to make ads, pitches, presentations, and content that captivates audiences and keeps them engaged. But not all content has that effect. What makes some content more engaging? A multimethod investigation combines controlled experiments with natural language processing of 600,000 reading sessions from over 35,000 pieces of content to examine what types of language hold attention and why. Results demonstrate that linguistic features associated with processing ease (e.g., concrete or familiar words) and emotion both play an important role. Rather than simply being driven by valence, though, the effects of emotional language are driven by the degree to which different discrete emotions evoke arousal and uncertainty. Consistent with this idea, anxious, exciting, and hopeful language holds attention while sad language discourages it. Experimental evidence underscores emotional language's causal impact and demonstrates the mediating role of uncertainty and arousal. The findings shed light on what holds attention; illustrate how content creators can generate more impactful content; and, as shown in a stylized simulation, have important societal implications for content recommendation algorithms.
ISSN: 0022-2429, 1547-7185
DOI: 10.1177/00222429231152880