Application of a Utility Analysis to Evaluate a Novel Assessment Tool for Clinically Oriented Physiology and Pharmacology

Bibliographic Details
Published in: Advances in Physiology Education, 2016-09, Vol. 40 (3), p. 304-312
Main Authors: Cramer, Nicholas, Asmar, Abdo, Gorman, Laurel, Gros, Bernard, Harris, David, Howard, Thomas, Hussain, Mujtaba, Salazar, Sergio, Kibble, Jonathan D
Format: Article
Language:English
Summary: Multiple-choice questions are a gold-standard tool in medical school for the assessment of knowledge and are the mainstay of licensing examinations. However, multiple-choice items can be criticized for lacking the ability to test higher-order learning or integrative thinking across multiple disciplines. Our objective was to develop a novel assessment that would address understanding of pathophysiology and pharmacology, evaluate learning at the levels of application, evaluation, and synthesis, and allow students to demonstrate clinical reasoning. The rubric assesses student write-ups of clinical case problems. The method is based on the physician's traditional postencounter Subjective, Objective, Assessment, and Plan (SOAP) note. Students were required to correctly identify subjective and objective findings in authentic clinical case problems, to ascribe pathophysiological as well as pharmacological mechanisms to these findings, and to justify a list of differential diagnoses. A utility analysis was undertaken to evaluate the new assessment tool by appraising its reliability, validity, feasibility, cost effectiveness, acceptability, and educational impact using a mixed-method approach. The SOAP assessment tool scored highly in terms of validity and educational impact and had acceptable levels of statistical reliability, but it was limited in terms of acceptability, feasibility, and cost effectiveness due to high time demands on expert graders and workload concerns from students. We conclude by making suggestions for improving the tool and recommend deploying the instrument for low-stakes summative or formative assessment.
ISSN: 1043-4046
1522-1229
DOI: 10.1152/advan.00140.2015
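As an illustration of the kind of statistical reliability check the summary alludes to, the sketch below computes Cronbach's alpha for a set of rubric item scores. It is a minimal, hypothetical example: the function name, the 0-5 item scale, and all score values are invented for illustration and are not taken from the study.

    # Hypothetical sketch: internal-consistency reliability (Cronbach's alpha)
    # for rubric scores. All data below are invented for illustration only.
    import numpy as np

    def cronbach_alpha(scores: np.ndarray) -> float:
        """scores: 2-D array, rows = students, columns = rubric items."""
        k = scores.shape[1]                          # number of rubric items
        item_vars = scores.var(axis=0, ddof=1)       # variance of each item across students
        total_var = scores.sum(axis=1).var(ddof=1)   # variance of students' total scores
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Made-up scores for 6 students on 4 rubric items (0-5 scale)
    rubric_scores = np.array([
        [4, 3, 5, 4],
        [2, 2, 3, 2],
        [5, 4, 4, 5],
        [3, 3, 2, 3],
        [4, 4, 5, 4],
        [1, 2, 1, 2],
    ])
    print(f"Cronbach's alpha = {cronbach_alpha(rubric_scores):.2f}")

Internal consistency is only one facet of the utility analysis described in the summary; inter-rater agreement, feasibility, acceptability, cost effectiveness, and educational impact would each be appraised separately.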