
Examining the Accuracy of a Conversation-Based Assessment in Interpreting English Learners' Written Responses. Research Report. ETS RR-21-03

Bibliographic Details
Published in: ETS Research Report Series, 2021-12
Main Authors: Lopez, Alexis A.; Guzman-Orth, Danielle; Zapata-Rivera, Diego; Forsyth, Carolyn M.; Luce, Christine
Format: Article
Language: English
Description
Summary: Substantial progress has been made toward applying technology-enhanced conversation-based assessments (CBAs) to measure the English-language proficiency of English learners (ELs). CBAs are systems that use conversations among computer-animated agents and a test taker. We expanded the design and capability of prior conversation-based instructional and assessment systems and developed a CBA designed to measure the English language skills and mathematics knowledge of middle school ELs. The prototype CBA simulates an authentic, engaging mathematics classroom in which the test taker interacts with two virtual agents to solve math problems. We embedded feedback and supports that are triggered by how the CBA interprets students' written responses. In this study, we administered the CBA to middle school ELs (N = 82) residing in the United States and examined the extent to which the CBA system was able to consistently interpret the students' responses (722 responses from the 82 students). The study findings helped us understand the factors that affect the accuracy of the CBA system's interpretations and shed light on how to improve CBA systems that incorporate scaffolding.
ISSN: 2330-8516