Date of Award

12-2012

Degree Name

Doctor of Philosophy

Department

Educational Leadership, Research and Technology

First Advisor

Dr. Brooks Applegate

Second Advisor

Dr. Marianne DiPierro

Third Advisor

Dr. Michael Bamberger

Keywords

Health evaluation, research on evaluation, validity and adequacy, international health, public health evaluation, research design

Abstract

Poorly conducted international health intervention evaluations (IHIEs) may place the health of the intervention's recipients at risk (Stoto & Cosler, 2005). Moreover, few IHIEs meet globally established minimum criteria for sound methodology (Forssman, Gupta, & Burgess, 2006; Tones, Tilford, & Robinson, 1990). This research investigates the level of rigor in a sample of IHIEs and delineates the institutional policies governing those evaluations. Specifically, it seeks to answer four questions: (1) What policies, guidelines, and requirements for program evaluation and evaluation reports do international donors set for evaluators? (2) What research designs are commonly used to evaluate international health interventions? (3) What are the common components and contents of reports from evaluations of international health interventions? (4) What is the level of rigor of the designs used to evaluate international health interventions?

A sample of 55 online evaluation reports, published between 2005 and 2010 and representing seven organizations that fund IHIEs, is reviewed. The Evaluation Report Checklist (Miron, 2004) and the Checklist for Assessing Threats to Evaluation Validity (Bamberger, Rugh, & Mabry, 2012) are used to quantify the adequacy of the reports.

Findings show considerable variability and flexibility among the organizational policies and guidelines governing the IHIEs. Checklist analysis of the reports reveals that very few of the evaluations use rigorous designs to address program impact. Furthermore, the written reports reflect only the information required by their specific policies and guidelines. Unfortunately, this information is often not adequate to assess whether the purpose of the intervention has been achieved, thus compromising the transparency of the evaluation report.

While this study examines a limited set of IHIE evaluation reports, its implications suggest, first, that international funding bodies need explicit policies and procedures that guide both program evaluation design and evaluation reporting. Second, greater attention to both evaluation design and the components of the written report is needed to adequately represent program impacts. Third, coupling more rigorous evaluation designs that can fully address program impact with more systematic and comprehensive reporting will result in greater transparency, an element that all IHIE funders espouse.

Access Setting

Dissertation-Campus Only

Restricted to Campus until

12-15-2032
