Assessing scientific reasoning: a comprehensive evaluation of item features that affect item difficulty
Published in: Assessment and Evaluation in Higher Education, 2016-07, Vol. 41(5), pp. 721-732
Main Authors:
Format: Article
Language: English
Summary: The aim of this study was to improve the criterion-related test score interpretation of a text-based assessment of scientific reasoning competencies in higher education by evaluating factors which systematically affect item difficulty. To provide evidence about the specific demands which test items of various difficulty make on pre-service teachers' scientific reasoning competencies, we applied a general linear mixed model which allows estimation of the impact of item features on the response observations. The item features had been identified during a standard-setting process. Results indicate important predictive potential of one formal item feature (length of response options), two features based on cognitive demands (processing data from tables, processing abstract concepts) and one feature based on solid knowledge (specialist terms). The revealed predictive potential of item features was in accordance with the cognitive demands operationalised in our competence model. Thus, we conclude that the findings support the validity of our interpretation of the test scores as measures of scientific reasoning competencies.
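The modelling approach described in the summary can be illustrated with a short sketch. The following is a minimal, hypothetical example, not the authors' code: binary item responses are regressed on item features as fixed effects, with a random person intercept as a variance component, using statsmodels' Bayesian mixed GLM. All feature names, effect sizes and data below are simulated for illustration only.

```python
# Illustrative sketch of an item-feature regression in the spirit of the
# study: a mixed logistic model in which fixed effects capture item
# features (length of response options, tables, abstract concepts,
# specialist terms) and a random intercept captures person ability.
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

rng = np.random.default_rng(0)
n_persons, n_items = 200, 30

# Hypothetical binary item features (1 = feature present)
features = pd.DataFrame({
    "option_length": rng.integers(0, 2, n_items),
    "table_data": rng.integers(0, 2, n_items),
    "abstract_concepts": rng.integers(0, 2, n_items),
    "specialist_terms": rng.integers(0, 2, n_items),
})
ability = rng.normal(0, 1, n_persons)  # person random effect
easiness = 0.8 - (0.6 * features["option_length"]
                  + 0.5 * features["table_data"]
                  + 0.7 * features["abstract_concepts"]
                  + 0.4 * features["specialist_terms"])

# Long format: one row per person-item response observation
rows = []
for p in range(n_persons):
    for i in range(n_items):
        logit = ability[p] + easiness[i]
        y = rng.binomial(1, 1 / (1 + np.exp(-logit)))
        rows.append({"person": p, "correct": y, **features.iloc[i].to_dict()})
df = pd.DataFrame(rows)

# Mixed logistic model: item features as fixed effects,
# person as a variance component (random intercept)
model = BinomialBayesMixedGLM.from_formula(
    "correct ~ option_length + table_data + abstract_concepts + specialist_terms",
    {"person": "0 + C(person)"},
    df,
)
result = model.fit_vb()   # variational Bayes fit
# Negative fixed-effect coefficients indicate difficulty-increasing features,
# since the model predicts the probability of a correct response.
print(result.summary())
```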
ISSN: 0260-2938 (print); 1469-297X (online)
DOI: 10.1080/02602938.2016.1164830