
On the evaluation of research software: the CDUR procedure [version 2; peer review: 2 approved]

Bibliographic Details
Published in: F1000 research, 2019, Vol. 8, p. 1353
Main Authors: Gomez-Diaz, Teresa; Recio, Tomas
Format: Article
Language: English
Description: Background: Evaluation of the quality of research software is a challenging and relevant issue, still not sufficiently addressed by the scientific community. Methods: Our contribution begins by defining, precisely yet broadly, the notions of research software and of its authors, followed by a study of the evaluation issues, as the basis for proposing a sound assessment protocol: the CDUR procedure. Results: CDUR comprises four steps: Citation, to deal with correct research software (RS) identification; Dissemination, to measure good dissemination practices; Use, devoted to the evaluation of usability aspects; and Research, to assess the impact of the scientific work. Conclusions: Some conclusions and recommendations are finally included. The evaluation of research is the keystone to boost the evolution of Open Science policies and practices. It is also our belief that research software evaluation is a fundamental step towards inducing better research software practices and, thus, towards more efficient science.
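The four CDUR steps named in the abstract (Citation, Dissemination, Use, Research) form an assessment protocol. The sketch below models that protocol as a simple pass/fail rubric; the criteria comments and the scoring scheme are illustrative assumptions, not the paper's own checklist.

```python
from dataclasses import dataclass


@dataclass
class CDURAssessment:
    """One boolean per CDUR step; the criteria hints below are hypothetical examples."""
    citation: bool = False       # C: is the software correctly identified and citable?
    dissemination: bool = False  # D: does it follow good dissemination practices (license, repository)?
    use: bool = False            # U: is it usable (documentation, installation, tests)?
    research: bool = False       # R: does it demonstrably support the scientific work?

    def score(self) -> int:
        """Number of CDUR steps satisfied, from 0 to 4."""
        return sum([self.citation, self.dissemination, self.use, self.research])

    def summary(self) -> str:
        """Compact report: the step's letter if satisfied, '-' otherwise."""
        steps = zip("CDUR", [self.citation, self.dissemination, self.use, self.research])
        return "".join(letter if ok else "-" for letter, ok in steps)


assessment = CDURAssessment(citation=True, dissemination=True, use=False, research=True)
print(assessment.score())    # 3
print(assessment.summary())  # CD-R
```

A real CDUR evaluation would weigh each step with multiple criteria rather than a single boolean; this rubric only illustrates the four-step structure of the procedure.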
DOI: 10.12688/f1000research.19994.2
PMID: 31814965
Publisher: Faculty of 1000 Ltd, London
ORCID iDs: 0000-0002-1011-295X; 0000-0002-7834-145X
Rights: Copyright © 2019 Gomez-Diaz T and Recio T. Published under the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/).
ISSN: 2046-1402
Source: Open Access: PubMed Central; Publicly Available Content (ProQuest)
Subjects: Careers
Citations
Cognitive science
Computer programs
Computer science
Funding
Humanities and Social Sciences
Library and information sciences
Open access
Open source software
Public domain
Researchers
Scholarly communication
Science
Scientists
Software
Software quality