Using hidden information and performance level boundaries to study student–teacher assignments: implications for estimating teacher causal effects
Published in: Journal of the Royal Statistical Society: Series A (Statistics in Society), 2020-10, Vol. 183 (4), pp. 1333-1362
Main Authors:
Format: Article
Language: English
Summary: A common problem in educational evaluation is estimating causal effects of interventions from non-experimental data on students. Scores from standardized achievement tests often are used to adjust for differences in background characteristics of students in different non-experimental groups. An open question is whether, and how, these adjustments should account for the errors in test scores as measures of latent achievement. The answer depends on what information was used to assign students to non-experimental groups. Using a case-study of estimating teacher effects on student achievement, we develop two novel empirical tests of what information is used to assign students to teachers. We demonstrate that assignments are influenced both by information that is unobserved by the researcher and by error-prone test scores. We develop a model that is appropriate for this complex selection mechanism and compare its results with those of common simpler estimators. We discuss implications for the broader problem of causal modelling with error-prone confounders.
ISSN: 0964-1998, 1467-985X
DOI: 10.1111/rssa.12533
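
The abstract's central point, that whether adjusting for an error-prone test score removes confounding depends on what information drove the student-teacher assignment, can be illustrated with a small simulation. The sketch below is not the paper's model or its empirical tests; it is a generic measurement-error illustration under assumed values (a binary "teacher" assignment, a linear outcome model, a prior-year score with assumed reliability 0.8, and an assumed teacher effect of 0.25), written in Python with NumPy.

```python
# Minimal sketch (illustrative assumptions only): how bias from adjusting for an
# error-prone test score depends on whether assignment used the observed score
# or the latent achievement it measures.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
true_teacher_effect = 0.25   # assumed effect of assignment to "teacher 1"
beta = 0.8                   # assumed effect of latent prior achievement on the outcome
reliability = 0.8            # assumed reliability of the prior-year test score

theta = rng.normal(0.0, 1.0, n)                               # latent prior achievement
x = theta + rng.normal(0.0, np.sqrt(1 / reliability - 1), n)  # error-prone observed score

def estimated_effect(assignment_variable):
    """OLS estimate of the teacher effect, adjusting for the observed score x."""
    z = (assignment_variable > np.median(assignment_variable)).astype(float)
    y = true_teacher_effect * z + beta * theta + rng.normal(0.0, 0.5, n)
    design = np.column_stack([np.ones(n), z, x])
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    return coef[1]

# Assignment driven by the observed, error-prone score: adjusting for x suffices.
print("assigned on observed score:   ", round(estimated_effect(x), 3))
# Assignment driven by latent achievement (information hidden from the researcher):
# the same adjustment leaves residual confounding, so the estimate is biased.
print("assigned on latent achievement:", round(estimated_effect(theta), 3))
```

Under these assumptions, the first estimate recovers the assumed effect of 0.25 because assignment is a function of the very score being adjusted for, while the second is biased upward because assignment used latent information that the error-prone score captures only partially.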