Date of Award
12-2018
Degree Name
Doctor of Philosophy
Department
Educational Leadership, Research and Technology
First Advisor
Dr. Richard Zinser
Second Advisor
Dr. Louann Bierlein Palmer
Third Advisor
Dr. Sango Otieno
Keywords
multiple choice assessment, career and technical education, vocational assessment, classical test theory, hierarchical linear modeling (HLM)
Abstract
Critical decisions in Career and Technical Education (CTE) are often based on assessment outcomes, so continuous improvement efforts that increase the validity of multiple choice (MC) assessments are essential. Previous research has shown that flawed test items negatively impact student success; in response, an MC item-writing taxonomy has been established and adopted in the assessment industry. While some rigorous and successful CTE programs enroll high-achieving academic students, many CTE programs enroll students across a wide spectrum of academic ability. Because many CTE students have weaker reading skills, the lower-performing academic population must be taken into account when developing MC assessments for CTE programs.
This quantitative study examines the relationship between MC item difficulty and MC item length (specifically, stem length and response option length) for a national CTE employability assessment taken by nearly 3,500 CTE students. Hierarchical Linear Modeling (HLM) tests the significance of the relationships between the outcome variables (proportion correct, discrimination index, and point-biserial correlation) and the predictor variables (stem length, response option length, Bloom's taxonomy level, readability level, and student demographics) for both the total group and a lower-scoring group of CTE students.
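As an illustration only (not code from the dissertation), the three classical test theory outcome variables named above can be computed from a scored response matrix. The function name, the 27% upper/lower grouping rule, and the use of the uncorrected total score in the point-biserial are conventional choices assumed here, not details taken from the study:

```python
import numpy as np

def item_analysis(responses):
    """Classical test theory item statistics for a 0/1-scored matrix.

    responses: 2-D array, rows = test-takers, columns = MC items,
    entries 1 (correct) or 0 (incorrect). Returns per-item proportion
    correct, discrimination index (upper vs. lower 27% score groups),
    and point-biserial correlation with the (uncorrected) total score.
    """
    responses = np.asarray(responses, dtype=float)
    n, k = responses.shape
    total = responses.sum(axis=1)

    # Proportion correct: the share of examinees answering each item right.
    p = responses.mean(axis=0)

    # Discrimination index: p(upper 27%) minus p(lower 27%), groups
    # formed by total score (a common convention, assumed here).
    cut = max(1, int(round(0.27 * n)))
    order = np.argsort(total)
    lower, upper = order[:cut], order[-cut:]
    d = responses[upper].mean(axis=0) - responses[lower].mean(axis=0)

    # Point-biserial: Pearson correlation between each 0/1 item score
    # and the total score across examinees.
    r_pb = np.array(
        [np.corrcoef(responses[:, j], total)[0, 1] for j in range(k)]
    )
    return p, d, r_pb
```

With this sketch, a lengthy or flawed item would show up as a low proportion correct together with a weak discrimination index and point-biserial, which is the pattern of "difficulty" the abstract refers to.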
The results show statistically significant correlations between MC item length and item analysis outcomes. Longer MC items increase difficulty on most of the item analysis outcomes, and the effect is stronger for the lower-scoring group of test-takers. Best practices therefore suggest that MC item writers develop concise items and avoid extraneous wording. This research serves as a case study in assessment analysis within CTE and reinforces established best practices. When assessment providers are mindful of the test-taking population, bias may be reduced through solutions such as writing shorter, more concise MC items for lower-level readers.
Access Setting
Dissertation-Open Access
Recommended Citation
Koepf, Tina M., "The Effect of Item Stem and Response Option Length on the Item Analysis Outcomes of a Career and Technical Education Multiple Choice Assessment" (2018). Dissertations. 3366.
https://scholarworks.wmich.edu/dissertations/3366