Date of Award
6-2004
Degree Name
Doctor of Philosophy
Department
Educational Studies
First Advisor
Dr. Brooks Applegate
Second Advisor
Dr. Jianping Shen
Third Advisor
Dr. Frances Lawrenz
Abstract
Two fundamental purposes exist for program evaluation: to document program results and to improve programs. These purposes are commonly called summative and formative evaluation. The fundamental questions related to these purposes are (1) what occurred in the program? and (2) how can the program be improved? The answer to the second question implies the need to explain why program results occurred.
This dissertation developed an approach to support formative evaluation and to answer the question of why program results occur. This approach integrated multisite evaluation, theory-based evaluation, and structural equation modeling. The context for this dissertation was the National Science Foundation's Advanced Technological Education (ATE) program. ATE program logic indicated that project characteristics and organizational practices were positively related to levels of collaboration; levels of collaboration were positively related to productivity in materials development, professional development, and program improvement; and program improvement was positively related to student impact.
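The hypothesized program logic described above can be sketched as a set of structural equations. The symbols below are illustrative and are not the dissertation's own notation: project characteristics and organizational practices are exogenous constructs, and the hypothesized paths are all positive.

```latex
% Illustrative structural form of the ATE program logic model.
% xi  = exogenous constructs (project characteristics, organizational practices)
% eta = endogenous constructs; gamma, beta = path coefficients; zeta = disturbances
\begin{align}
  \eta_{\text{collab}} &= \gamma_1\,\xi_{\text{proj}} + \gamma_2\,\xi_{\text{org}} + \zeta_1 \\
  \eta_{\text{materials}} &= \beta_1\,\eta_{\text{collab}} + \zeta_2 \\
  \eta_{\text{profdev}} &= \beta_2\,\eta_{\text{collab}} + \zeta_3 \\
  \eta_{\text{progimpr}} &= \beta_3\,\eta_{\text{collab}} + \zeta_4 \\
  \eta_{\text{student}} &= \beta_4\,\eta_{\text{progimpr}} + \zeta_5
\end{align}
```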
This study asked (1) whether the ATE program logic model fit the empirical data available from an annual survey of ATE projects; (2) if the model could be fitted, what could be concluded about the relationships between program characteristics and results; and (3) if the model fit was not optimal, whether the program logic model could be modified to improve the fit.
Robust maximum likelihood estimation, as implemented in LISREL 8.54, was used to assess model fit. The ATE program logic model provided an overall acceptable fit to the data, though standardized path coefficients indicated that some components of the model were not supported. Results also suggested that program characteristics, organizational practices, and materials development were poorly measured. An alternative model, in which collaboration drove program improvement and professional development results and program improvement drove student impact, provided the strongest fit to the data.
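For context, maximum likelihood estimation in SEM minimizes the discrepancy between the sample covariance matrix and the covariance matrix implied by the model. The following is the standard textbook ML fit function, not a formula taken from this dissertation; robust (e.g., Satorra-Bentler) estimation then rescales the resulting chi-square statistic to correct for non-normality in the data.

```latex
% Standard ML discrepancy function for SEM:
% S = sample covariance matrix, Sigma(theta) = model-implied covariance
% matrix, p = number of observed variables.
F_{\mathrm{ML}}(\theta)
  = \ln\lvert\Sigma(\theta)\rvert
  + \operatorname{tr}\!\left(S\,\Sigma(\theta)^{-1}\right)
  - \ln\lvert S\rvert - p
```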
The implication of this study is that structural equation modeling represents a promising analytical approach to support formative evaluation in multisite evaluation. The chief challenges in implementing the approach are articulating and measuring the program logic model and achieving a sufficient sample size. Recommendations for evaluation practitioners and future research are provided.
Access Setting
Dissertation-Open Access
Recommended Citation
Hanssen, Carl Edward, "Structural Equation Modeling as a Tool for Multisite Evaluation" (2004). Dissertations. 1104.
https://scholarworks.wmich.edu/dissertations/1104