Outcomes for implementation science: an enhanced systematic review of instruments using evidence-based rating criteria

High-quality measurement is critical to advancing knowledge in any field. New fields, such as implementation science, are often beset with measurement gaps and poor quality instruments, a weakness that can be more easily addressed in light of systematic review findings. Although several reviews of quantitative instruments used in implementation science have been published, no studies have focused on instruments that measure implementation outcomes. Proctor and colleagues established a core set of implementation outcomes including: acceptability, adoption, appropriateness, cost, feasibility, fidelity, penetration, sustainability (Adm Policy Ment Health Ment Health Serv Res 36:24-34, 2009).

The Society for Implementation Research Collaboration (SIRC) Instrument Review Project employed an enhanced systematic review methodology (Implement Sci 2: 2015) to identify quantitative instruments of implementation outcomes relevant to mental or behavioral health settings. Full details of the enhanced systematic review methodology are available (Implement Sci 2: 2015). To increase the feasibility of the review, and consistent with the scope of SIRC, only instruments that were applicable to mental or behavioral health were included. The review, synthesis, and evaluation included the following: (1) a search protocol for the literature review of constructs; (2) the literature review of instruments using Web of Science and PsycINFO; and (3) data extraction and instrument quality ratings to inform knowledge synthesis. Our evidence-based assessment rating criteria quantified fundamental psychometric properties as well as a crude measure of usability. Two independent raters applied the evidence-based assessment rating criteria to each instrument to generate a quality profile.

We identified 104 instruments across eight constructs, with nearly half (n = 50) assessing acceptability and 19 identified for adoption, with all other implementation outcomes revealing fewer than 10 instruments. Only one instrument demonstrated at least minimal evidence for psychometric strength on all six of the evidence-based assessment criteria. The majority of instruments had no information regarding responsiveness or predictive validity.

Implementation outcomes instrumentation is underdeveloped with respect to both the sheer number of available instruments and the psychometric quality of existing instruments. Until psychometric strength is established, the field will struggle to identify which implementation strategies work best, for which organizations, and under what conditions.

Bibliographic Details
Published in: Implementation science : IS, 2015-11, Vol. 10 (156), p. 155-155, Article 155
Main Authors: Lewis, Cara C; Fischer, Sarah; Weiner, Bryan J; Stanick, Cameo; Kim, Mimi; Martinez, Ruben G
Format: Article
Language: English
Subjects: Analysis; Behavior; Behavioral health care; Bibliographic data bases; Cognition & reasoning; Collaboration; Diffusion of Innovation; Evidence-Based Practice; Funding; Humans; Mental health; Mental Health Services - organization & administration; Methods; Program Evaluation - methods; Program Evaluation - standards; Psychometrics; Quantitative psychology; Reproducibility of Results; Science; Studies; Systematic Review
DOI: 10.1186/s13012-015-0342-x
PMID: 26537706
ISSN: 1748-5908 (EISSN: 1748-5908)