
Evidence-Based Misinformation Interventions: Challenges and Opportunities for Measurement and Collaboration

The lingering coronavirus pandemic has only underscored the need to find effective interventions to help internet users evaluate the credibility of the information before them. Yet a divide remains between researchers within digital platforms and those in academia and other research professions who are analyzing interventions. Beyond issues related to data access, a challenge deserving papers of its own, opportunities exist to clarify the core competencies of each research community and to build bridges between them in pursuit of the shared goal of improving user-facing interventions that address misinformation online.

Bibliographic Details
Published in: Policy File 2023
Main Authors: Green, Yasmin, Gully, Andrew, Roth, Yoel, Roy, Abhishek, Tucker, Joshua A, Wanless, Alicia
Format: Report
Language:English
Subjects: Big Data
Online Access: Request full text
container_title Policy File
creator Green, Yasmin
Gully, Andrew
Roth, Yoel
Roy, Abhishek
Tucker, Joshua A
Wanless, Alicia
description The lingering coronavirus pandemic has only underscored the need to find effective interventions to help internet users evaluate the credibility of the information before them. Yet a divide remains between researchers within digital platforms and those in academia and other research professions who are analyzing interventions. Beyond issues related to data access, a challenge deserving papers of its own, opportunities exist to clarify the core competencies of each research community and to build bridges between them in pursuit of the shared goal of improving user-facing interventions that address misinformation online. This paper attempts to contribute to such bridge-building by posing questions for discussion: How do different incentive structures determine the selection of outcome metrics and the design of research studies by academics and platform researchers, given the values and objectives of their respective institutions? What factors affect the evaluation of intervention feasibility for platforms but not for academics (for example, platform users' perceptions, measurability at scale, and interaction and longitudinal effects on metrics introduced in real-world deployments)? And what are the mutually beneficial opportunities for collaboration (such as increased insight-sharing from platforms to researchers about user feedback on a diversity of intervention designs)? Finally, we introduce a measurement attributes framework to aid the development of feasible, meaningful, and replicable metrics for researchers and platform practitioners to consider when developing, testing, and deploying misinformation interventions.
format report
ispartof Policy File, 2023
language eng
source Policy File Index
subjects Big Data
Carnegie Endowment for International Peace - US
title Evidence-Based Misinformation Interventions: Challenges and Opportunities for Measurement and Collaboration