
Initial Comparison of Resident and Attending Milestones Evaluations in Plastic Surgery

Bibliographic Details
Published in: Journal of surgical education 2017-09, Vol.74 (5), p.773-779
Main Authors: Yao, Amy, BS, Massenburg, Benjamin B., BA, Silver, Lester, MD, MS, Taub, Peter J., MD
Format: Article
Language:English
container_end_page 779
container_issue 5
container_start_page 773
container_title Journal of surgical education
container_volume 74
creator Yao, Amy, BS
Massenburg, Benjamin B., BA
Silver, Lester, MD, MS
Taub, Peter J., MD
description Background Graduate medical education has recently undergone a major archetypal shift toward competency-based evaluations of residentsʼ performance. The implementation of the Milestones program by the Accreditation Council for Graduate Medical Education (ACGME) is a core component of the shift, designed to ensure uniformity in measuring residency knowledge using a series of specialty-specific achievements. This study evaluates the correlation between residentsʼ self-evaluations and program directorsʼ assessments of their performance. Methods The study population comprised 12 plastic surgery residents, ranging from postgraduate year 1 to postgraduate year 6, enrolled in an integrated residency program at a single institution. Results Overall, average attending scores were lower than average resident scores at all levels except postgraduate year 6. Correlation between resident and attending evaluations ranged from 0.417 to 0.957, with the correlation of average scores of Patient Care (0.854) and Medical Knowledge (0.816) Milestones significantly higher than those of professional skillsets (0.581). “Patient care, facial esthetics” was the Milestone with the lowest average scores from both groups. Residents scored themselves notably higher than their attendingsʼ evaluations in Practice-based Learning and Improvement categories (+0.958) and notably lower in Medical Knowledge categories such as “Cosmetic Surgery, Trunk and Lower Extremities” (−0.375) and “Non-trauma hand” (−0.208). The total possible number of participants in this study was 12. The actual number of participants was 12 (100%). Conclusions The remarkable range of correlations suggests that expectations for performance standards may vary widely between residents and program directors. Understanding gaps between expectations and performance is vital to inform current and future residents as the restructuring of the accreditation process continues.
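As a rough illustration of the comparison the abstract describes, the sketch below computes a correlation and a mean resident-minus-attending gap for a small set of Milestone scores. This is a minimal sketch only: it assumes a Pearson correlation (the abstract does not state which coefficient was used), and the score arrays and variable names are invented for illustration rather than taken from the study's data.

```python
# Minimal sketch with hypothetical data: correlating resident self-assessed
# Milestone scores against attending-assigned scores, analogous to the
# abstract's reported correlations (0.417 to 0.957) and category gaps.
import numpy as np

# Hypothetical scores for one resident across several Milestone items,
# on the ACGME 1-5 scale in 0.5-point increments (illustrative values only).
resident_scores = np.array([3.0, 2.5, 3.5, 4.0, 3.0, 2.5])
attending_scores = np.array([2.5, 2.5, 3.0, 4.0, 2.5, 2.0])

# Pearson correlation coefficient between the two raters' scores
# (assumption: the study's correlations are of this kind).
r = np.corrcoef(resident_scores, attending_scores)[0, 1]

# Mean difference (resident minus attending), analogous to the per-category
# gaps such as +0.958 or -0.375 reported in the abstract.
mean_gap = float(np.mean(resident_scores - attending_scores))

print(f"Pearson r = {r:.3f}, mean resident-attending gap = {mean_gap:+.3f}")
```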
doi_str_mv 10.1016/j.jsurg.2017.02.001
format article
publisher United States: Elsevier Inc
pmid 28259488
rights Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
orcidid https://orcid.org/0000-0003-1751-9901
fulltext fulltext
identifier ISSN: 1931-7204
ispartof Journal of surgical education, 2017-09, Vol.74 (5), p.773-779
issn 1931-7204
1878-7452
language eng
recordid cdi_proquest_miscellaneous_1874782411
source ScienceDirect Journals
subjects Adult
Analysis of Variance
Clinical Competence
Competency-Based Education - organization & administration
education
Education, Medical, Graduate - organization & administration
Female
Humans
Internship and Residency - organization & administration
Interpersonal and Communication Skills
Male
Medical Knowledge
milestones
Patient Care
Practice-Based Learning and Improvement
Professionalism
Program Evaluation
programs
resident
Self-Assessment
Surgery
Surgery, Plastic - education
Systems-Based Practice
United States
title Initial Comparison of Resident and Attending Milestones Evaluations in Plastic Surgery