
Measuring Response Style Stability Across Constructs With Item Response Trees

Bibliographic Details
Published in: Educational and Psychological Measurement, 2022-04, Vol. 82 (2), p. 281-306
Main Author: Ames, Allison J.
Format: Article
Language: English
Description
Summary: Individual response style behaviors, unrelated to the latent trait of interest, may influence responses to ordinal survey items. Response style can introduce bias in the total score with respect to the trait of interest, threatening valid interpretation of scores. Despite claims of response style stability across scales, there has been little research into stability across multiple scales from the perspective of item response trees (IRTrees). This study examines an extension of the IRTree methodology to mixed item formats, providing an empirical example of responses to three scales measuring perceptions of social media, climate change, and medical marijuana use. Results show that extreme and midpoint response styles were not stable across scales within a single administration, and that 5-point Likert-type items elicited higher levels of extreme response style than the 4-point items. Estimation of the latent trait of interest varied across response style models, particularly at the lower end of the score distribution, demonstrating that an appropriate response style model is important for adequate trait estimation using Bayesian Markov chain Monte Carlo estimation.
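
For orientation, a minimal sketch of the generic IRTree decomposition common in this literature is given below. The notation, the 2PL node model, and the node interpretations in the comments are illustrative assumptions, not the article's exact specification or tree structure.

% Generic IRTree decomposition (illustrative sketch).
% Y_{pi}: observed response category c of person p to item i.
% Z_{pik}: latent binary decision at tree node k; m_{ck} is the mapping-matrix entry
% (0/1 if node k lies on the path to category c, NA otherwise).
\[
  P(Y_{pi} = c \mid \boldsymbol{\theta}_p)
    = \prod_{k \,:\, m_{ck} \neq \mathrm{NA}}
      \pi_{pik}^{\,m_{ck}} \, (1 - \pi_{pik})^{\,1 - m_{ck}},
  \qquad
  \pi_{pik} = \frac{\exp\{a_{ik}(\theta_{pk} - b_{ik})\}}{1 + \exp\{a_{ik}(\theta_{pk} - b_{ik})\}}.
\]
% Each node k is governed by its own latent trait theta_{pk} (e.g., the substantive trait,
% an extreme response style trait, or a midpoint response style trait), which is what allows
% response style to be separated from the trait of interest and compared across scales.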
ISSN: 0013-1644
1552-3888
DOI: 10.1177/00131644211020103