
Evaluating Scoring Procedures for Context-Dependent Item Sets

Evaluated two strategies for scoring context-dependent test items: ignoring the dependence and scoring dichotomously, or modeling the dependence through polytomous scoring. Results for data from 38,965 examinees taking a professional examination show that dichotomous scoring may overestimate test information, whereas polytomous scoring may underestimate it.
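The contrast described in the abstract can be sketched numerically. The following is an illustrative sketch only, not taken from the article: it uses hypothetical item parameters for a three-item set, compares the summed 2PL item information (dichotomous scoring, dependence ignored) with the information of a single Samejima graded-response item built from the same difficulty values as category thresholds, and shows the direction of the difference the study reports.

```python
import math

def p2(theta, a, b):
    """2PL probability of a correct/boundary response."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def info_dichotomous(theta, items):
    """Test information when each item in the set is scored 0/1
    independently: sum of 2PL item informations a^2 * P * (1 - P)."""
    return sum(a * a * p2(theta, a, b) * (1.0 - p2(theta, a, b))
               for a, b in items)

def info_graded(theta, a, thresholds):
    """Information of one graded-response (polytomous) item that
    replaces the whole set, with category boundaries at `thresholds`."""
    # Cumulative boundary probabilities, padded with 1 and 0.
    pstar = [1.0] + [p2(theta, a, b) for b in thresholds] + [0.0]
    # Boundary derivatives: d/dtheta P*_k = a * P*_k * (1 - P*_k).
    dstar = [a * p * (1.0 - p) for p in pstar]
    info = 0.0
    for k in range(len(thresholds) + 1):
        pk = pstar[k] - pstar[k + 1]  # category probability
        if pk > 1e-12:
            info += (dstar[k] - dstar[k + 1]) ** 2 / pk
    return info

# Hypothetical three-item set sharing one passage (a, b per item).
items = [(1.2, -0.5), (1.2, 0.0), (1.2, 0.7)]
theta = 0.0
d = info_dichotomous(theta, items)
g = info_graded(theta, 1.2, [b for _, b in items])
print(f"dichotomous: {d:.3f}  polytomous: {g:.3f}")
```

With these made-up parameters, the dichotomous total exceeds the polytomous information at the same ability level, matching the pattern the abstract describes: treating locally dependent items as independent inflates apparent information.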


Bibliographic Details
Published in: Applied measurement in education, 2003, Vol. 16 (3), p. 207
Main Authors: Keller, Lisa A, Swaminathan, Hariharan, Sireci, Stephen G
Format: Article
Language:English
container_issue 3
container_start_page 207
container_title Applied measurement in education
container_volume 16
creator Keller, Lisa A
Swaminathan, Hariharan
Sireci, Stephen G
description Evaluated two strategies for scoring context-dependent test items: ignoring the dependence and scoring dichotomously, or modeling the dependence through polytomous scoring. Results for data from 38,965 examinees taking a professional examination show that dichotomous scoring may overestimate test information, whereas polytomous scoring may underestimate it. (SLD)
doi_str_mv 10.1207/S15324818AME1603_3
format article
identifier ISSN: 0895-7347
ispartof Applied measurement in education, 2003, Vol.16 (3), p.207
issn 0895-7347
language eng
recordid cdi_eric_primary_EJ678485
source ERIC; Taylor and Francis Social Sciences and Humanities Collection
subjects Adults
Context Dependence
Dichotomous Scoring
Licensing Examinations (Professions)
Polytomous Scoring
Scoring
Test Items
title Evaluating Scoring Procedures for Context-Dependent Item Sets