Assessing the assessment in emergency care training

Each year over 1.5 million health care professionals attend emergency care courses. Despite the high stakes for patients and the extensive resources involved, little evidence exists on the quality of assessment. The aim of this study was to evaluate the validity and reliability of commonly used formats for assessing emergency care skills. Residents were assessed at the end of a 2-week emergency course; a subgroup was videotaped. Psychometric analyses were conducted to assess the validity and inter-rater reliability of the assessment instrument, which included a checklist, a 9-item competency scale and a global performance scale. A group of 144 residents and 12 raters participated in the study; 22 residents were videotaped and re-assessed by 8 raters. The checklists showed limited validity and poor inter-rater reliability for the dimensions "correct" and "timely" (ICC = .30 and .39, respectively). The competency scale had good construct validity, consisting of a clinical and a communication subscale. The internal consistency of the (sub)scales was high (α = .93/.91/.86). Inter-rater reliability was moderate for the clinical competency subscale (.49) and the global performance scale (.50), but poor for the communication subscale (.27). A generalizability study showed that a reliable assessment requires 5-13 raters when using checklists, and four when using the clinical competency scale or the global performance scale. This study shows poor validity and reliability for assessing emergency skills with checklists, but good validity and moderate reliability with clinical competency or global performance scales. Involving more raters can improve reliability substantially. Recommendations are made to improve this high-stakes skill assessment.
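The rater numbers quoted above come from the authors' generalizability (decision) study. As a rough, non-authoritative illustration of that kind of projection (not the paper's actual analysis), the sketch below uses the Spearman-Brown prophecy formula to estimate how many raters must be averaged to reach a target reliability, starting from the single-rater reliabilities reported in the abstract; the 0.80 target and the choice of formula are assumptions made here.

    # Illustrative sketch only; it does not reproduce the authors' variance-components
    # analysis. It projects, via the Spearman-Brown prophecy formula, how many raters
    # must be averaged to reach an assumed target reliability of 0.80, starting from
    # the single-rater reliabilities quoted in the abstract.
    import math

    def raters_needed(single_rater_reliability: float, target: float = 0.80) -> int:
        """Smallest number of raters whose averaged ratings reach `target` reliability."""
        r = single_rater_reliability
        # Spearman-Brown solved for the number of raters k:
        #   target = k*r / (1 + (k - 1)*r)  =>  k = target*(1 - r) / (r*(1 - target))
        k = (target * (1 - r)) / (r * (1 - target))
        return math.ceil(k)

    # Single-rater reliabilities reported in the abstract
    estimates = {
        "checklist, 'correct' dimension (ICC = .30)": 0.30,
        "checklist, 'timely' dimension (ICC = .39)": 0.39,
        "clinical competency subscale (.49)": 0.49,
        "global performance scale (.50)": 0.50,
    }

    for label, r in estimates.items():
        print(f"{label}: about {raters_needed(r)} raters for reliability >= 0.80")

Under these assumptions the checklist dimensions project to roughly 7-10 raters and the two scales to about 4-5, broadly in line with the 5-13 and four raters reported; the paper's own generalizability model may yield somewhat different figures.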

Bibliographic Details
Published in: PLoS ONE, 2014-12-18, Vol. 9 (12), p. e114663
Main Authors: Dankbaar, Mary E W; Stegers-Jager, Karen M; Baarveld, Frank; van Merrienboer, Jeroen J G; Norman, Geoff R; Rutten, Frans L; van Saase, Jan L C M; Schuit, Stephanie C E
Format: Article
Language: English
Subjects: Airway management; Analysis; Biology and Life Sciences; Check lists; Communication; Education, Medical, Continuing; Educational Measurement - methods; Educational Measurement - standards; Emergencies; Emergency medical care; Emergency Medical Services; Emergency medicine; Emergency services; Health care; Health education; Humans; Medical personnel; Midwifery education; Patients; Physicians; Quality assessment; Reliability analysis; Skills; Social Sciences; Studies; Surveys and Questionnaires - standards; Validity
Contributor: Tractenberg, Rochelle E.
Publisher: Public Library of Science (United States)
ISSN: 1932-6203
EISSN: 1932-6203
DOI: 10.1371/journal.pone.0114663
PMID: 25521702