Examining the Accuracy of a Conversation-Based Assessment in Interpreting English Learners' Written Responses. Research Report. ETS RR-21-03
Published in: | ETS research report series 2021-12 |
---|---|
Main Authors: | Lopez, Alexis A.; Guzman-Orth, Danielle; Zapata-Rivera, Diego; Forsyth, Carolyn M.; Luce, Christine |
Format: | Article |
Language: | English |
Subjects: | Accuracy; Communicative Competence (Languages); Computer Assisted Testing; English (Second Language); English Language Learners; Item Analysis; Language Proficiency; Language Tests; Mathematics Tests; Middle School Students; Peer Relationship; Responses; Second Language Learning; Task Analysis; Teacher Student Relationship; Test Format; Test Items; Written Language |
Online Access: | Get full text |
container_title | ETS research report series |
---|---|
creator | Lopez, Alexis A.; Guzman-Orth, Danielle; Zapata-Rivera, Diego; Forsyth, Carolyn M.; Luce, Christine |
description | Substantial progress has been made toward applying technology-enhanced conversation-based assessments (CBAs) to measure the English-language proficiency of English learners (ELs). CBAs are conversation-based systems that use conversations among computer-animated agents and a test taker. We expanded the design and capability of prior conversation-based instructional and assessment systems and developed a CBA designed to measure the English language skills and the mathematics knowledge of middle school ELs. The prototype CBA simulates an authentic and engaging mathematics classroom where the test taker interacts with two virtual agents to solve math problems. We embedded feedback and supports that are triggered by how the CBA interprets students' written responses. In this study, we administered the CBA to middle school ELs (N = 82) residing in the United States. We examined the extent to which the CBA system was able to consistently interpret the students' responses (722 responses for the 82 students). The study findings helped us to understand the factors that affect the accuracy of the CBA system's interpretations and shed light on how to improve CBA systems that incorporate scaffolding. |
format | article |
publisher | Educational Testing Service |
identifier | ISSN: 2330-8516 |
ispartof | ETS research report series, 2021-12 |
issn | 2330-8516 |
language | eng |
recordid | cdi_eric_primary_EJ1340994 |
source | Wiley Online Library Journals; EZB-FREE-00999 freely available EZB journals |
subjects | Accuracy; Communicative Competence (Languages); Computer Assisted Testing; English (Second Language); English Language Learners; Item Analysis; Language Proficiency; Language Tests; Mathematics Tests; Middle School Students; Peer Relationship; Responses; Second Language Learning; Task Analysis; Teacher Student Relationship; Test Format; Test Items; Written Language |
title | Examining the Accuracy of a Conversation-Based Assessment in Interpreting English Learners' Written Responses. Research Report. ETS RR-21-03 |
url | http://sfxeu10.hosted.exlibrisgroup.com/loughborough?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-29T18%3A36%3A54IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-eric&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Examining%20the%20Accuracy%20of%20a%20Conversation-Based%20Assessment%20in%20Interpreting%20English%20Learners'%20Written%20Responses.%20Research%20Report.%20ETS%20RR-21-03&rft.jtitle=ETS%20research%20report%20series&rft.au=Lopez,%20Alexis%20A&rft.date=2021-12&rft.issn=2330-8516&rft_id=info:doi/&rft_dat=%3Ceric%3EEJ1340994%3C/eric%3E%3Cgrp_id%3Ecdi_FETCH-eric_primary_EJ13409943%3C/grp_id%3E%3Coa%3E%3C/oa%3E%3Curl%3E%3C/url%3E&rft_id=info:oai/&rft_id=info:pmid/&rft_ericid=EJ1340994&rfr_iscdi=true |