Abstract

The impact of item-level cognitive supports is of considerable interest to assessment developers. The Alternate Assessment based on Modified Achievement Standards (AA-MAS) currently permits the modification of item design, and next-generation assessment systems will permit the presentation of accommodations based on the personal needs of each student. Understanding the impact of these cognitive supports, however, is difficult under descriptive Item Response Theory (IRT). This study uses explanatory modeling, specifically the Random Effects Linear Logistic Test Model with interaction components, to understand the impact of covariates on item response. The covariates considered include mathematics domain, cognitive complexity, type of cognitive support (hint, scaffold, language reduction, and general formatting), student AA-MAS eligibility status, and the interaction between eligibility status and each respective cognitive support. Results suggest that embedded hints or scaffolds do not significantly influence item response at a group level. Language reduction and general formatting, however, influence response, with the degree of impact contingent on student eligibility status. To facilitate discussion, the author incorporates student responses to post-pilot survey items and the operational AA-MAS performance standards. These additional sources of information assist in considering the results in the context of next-generation assessment systems. To facilitate explanatory item response modeling among large-scale assessment researchers, all results are presented sequentially and the practical problem of predictor missingness is addressed.
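For readers unfamiliar with the model class, the Random Effects Linear Logistic Test Model with interaction components is conventionally written along the following lines. This is a generic sketch of the model family, not the author's exact specification; the symbols (item covariates q, eligibility indicator z, interaction effects gamma) are assumptions chosen to mirror the covariates listed in the abstract.

```latex
\operatorname{logit}\,\Pr(Y_{pi} = 1)
  = \theta_p
  + \sum_{k} \beta_k\, q_{ik}
  + \sum_{k} \gamma_k\, z_p\, q_{ik}
  + \varepsilon_i,
\qquad
\theta_p \sim N(0, \sigma_\theta^2),\quad
\varepsilon_i \sim N(0, \sigma_\varepsilon^2)
```

Here $\theta_p$ is the person ability random effect, $q_{ik}$ codes item property $k$ (e.g., mathematics domain, cognitive complexity, type of cognitive support), $z_p$ is student AA-MAS eligibility status, $\gamma_k$ captures the eligibility-by-support interaction, and $\varepsilon_i$ is the item-level random residual that distinguishes the random-effects LLTM from the fixed LLTM.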
