
Assessing Content Validity and Content Equivalence Using Structural Equation Modeling

Bibliographic Details
Published in: Structural Equation Modeling, 2002-04, Vol. 9 (2), p. 283-297
Main Authors: Ding, Cody S., Hershberger, Scott L.
Format: Article
Language:English
Summary: Content validity is rarely evaluated on empirical data independently of the judgments of a panel of content experts. Typically, procedures construct parallel test forms based on statistical equivalence rather than content equivalence. This article describes an alternative approach to assessing content validity and content equivalence in terms of item-content structures and content area constructs. Structural equation modeling is applied to item-response data from 2 Regents College examinations to empirically verify content constructs developed by a panel of content experts and to examine content equivalence across parallel test forms. The results suggest differing degrees of inconsistency and bias among content experts in assigning items to their corresponding content areas. The results also show that content equivalence across parallel test forms is disputable when forms are constructed by random-splitting methods. The implications and importance of the study are discussed in terms of test development.
ISSN:1070-5511
1532-8007
DOI:10.1207/S15328007SEM0902_7