Comparing meta-analyses and preregistered multiple-laboratory replication projects

Bibliographic Details
Published in: Nature Human Behaviour, 2020-04, Vol. 4 (4), p. 423-434
Main Authors: Kvarven, Amanda, Strømland, Eirik, Johannesson, Magnus
Format: Article
Language:English
Description
Summary: Many researchers rely on meta-analysis to summarize research evidence. However, there is a concern that publication bias and selective reporting may lead to biased meta-analytic effect sizes. We compare the results of meta-analyses to large-scale preregistered replications in psychology carried out at multiple laboratories. The multiple-laboratory replications provide precisely estimated effect sizes that do not suffer from publication bias or selective reporting. We searched the literature and identified 15 meta-analyses on the same topics as multiple-laboratory replications. We find that meta-analytic effect sizes are significantly different from replication effect sizes for 12 out of the 15 meta-replication pairs. These differences are systematic and, on average, meta-analytic effect sizes are almost three times as large as replication effect sizes. We also implement three methods of correcting meta-analysis for bias, but these methods do not substantively improve the meta-analytic results.

Kvarven, Strømland and Johannesson compare meta-analyses to multiple-laboratory replication projects and find that meta-analyses overestimate effect sizes by a factor of almost three. Commonly used methods of adjusting for publication bias do not substantively improve results.
ISSN: 2397-3374
DOI: 10.1038/s41562-019-0787-z