
Data sharing upon request and statistical consistency errors in psychology: A replication of Wicherts, Bakker and Molenaar (2011)

Sharing research data allows the scientific community to verify and build upon published work. However, data sharing is not yet common practice. The reasons for not sharing data are myriad: some are practical, others are more fear-related. One particular fear is that a reanalysis may expose errors. For this explanation to hold, it matters whether authors who do not share data genuinely make more errors than authors who do. Wicherts, Bakker and Molenaar (2011) examined errors that can be detected from the published manuscript alone, because unavailable data cannot be reanalyzed. They found a higher prevalence of such errors in papers for which the data were not shared. However, Nuijten et al. (2017) found no support for this result in three large studies. To shed more light on this relation, we conducted a replication of the study by Wicherts et al. (2011). Our study consisted of two parts. In the first part, we reproduced the analyses of Wicherts et al. (2011) to verify the results, and carried out several alternative analytical approaches to evaluate their robustness against other analytical decisions. In the second part, we used a unique and larger data set on data sharing upon request for reanalysis, originating from Vanpaemel et al. (2015), to replicate the findings of Wicherts et al. (2011). We applied statcheck to detect consistency errors in all included papers and manually corrected false positives. Finally, we again assessed the robustness of the replication results against other analytical decisions. Taken together, we found no robust empirical evidence for the claim that not sharing research data for reanalysis is associated with consistency errors.
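The abstract mentions statcheck, an R package that recomputes p-values from reported test statistics and degrees of freedom and flags reported p-values that do not match. As a rough illustration of that kind of consistency check (not statcheck itself), here is a minimal Python sketch for two-sided t-tests; the function names and the rounding tolerance are assumptions made for this example:

```python
import math

def t_pdf(x, df):
    """Density of Student's t distribution with df degrees of freedom."""
    c = math.exp(math.lgamma((df + 1) / 2) - math.lgamma(df / 2))
    return c / math.sqrt(df * math.pi) * (1 + x * x / df) ** (-(df + 1) / 2)

def two_sided_p(t, df, upper=100.0, steps=20_000):
    """Two-sided p-value 2 * P(T > |t|), via Simpson's rule on the upper tail.

    The tail mass beyond `upper` is negligible for the df values seen in
    APA-style reports; `steps` must be even for Simpson's rule.
    """
    a = abs(t)
    h = (upper - a) / steps
    s = t_pdf(a, df) + t_pdf(upper, df)
    for i in range(1, steps):
        s += (4 if i % 2 else 2) * t_pdf(a + i * h, df)
    return 2 * s * h / 3

def consistent(t, df, reported_p, decimals=2):
    """Does the reported p match the recomputed one, allowing for rounding?"""
    return abs(two_sided_p(t, df) - reported_p) <= 0.5 * 10 ** -decimals

# "t(28) = 2.20, p = .04" -> recomputed p is roughly .036, so consistent
# "t(28) = 2.20, p = .02" -> flagged as a consistency error
```

statcheck itself additionally parses APA-formatted results out of manuscripts and handles t, F, chi-square, r, and z tests; the half-a-decimal rounding tolerance above only mimics its basic idea.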

Bibliographic Details
Published in: PLoS ONE, 2023-01, Vol. 18 (4)
Main Authors: Aline Claesen, Wolf Vanpaemel, Anne-Sofie Maerten, Thomas Verliefde, Francis Tuerlinckx, Tom Heyman
Format: Article
Language:English
identifier EISSN: 1932-6203
publisher Public Library of Science (PLoS)