Can Experimental Comparison Group Methods Match the Findings From a Random Assignment Evaluation of Mandatory Welfare-to-Work Programs?
- Bloom, Howard, Charles Michalopoulos, Carolyn Hill, Ying Lei
- Manpower Demonstration Research Corporation
for useful discussion. This paper was written while the authors were working at the national headquarters of Progresa in Mexico City; the views in the paper do not reflect the official position of Progresa.

Abstract

Two previous studies have used data from a social experiment to measure the impact of Progresa on children's height. However, most large-scale nutrition interventions cannot afford a costly social experiment to measure program impact, raising the question of whether impact can be adequately estimated using non-experimental techniques. This study uses anthropometric data on beneficiary children collected at health centers to estimate program impact and compares these estimates with those from the social experiment. The clinic-based estimates are significantly smaller than the experimental ones, and this gap is driven by two factors. First, substantial differences between actual and listed treatment status bias the impact estimator based on listed treatment toward zero (measurement error). Second, the clinic-based data do not allow the inclusion of an extensive set of control variables beyond a child's age and sex. The omitted variables are positively associated with program participation but negatively associated with child height, biasing the estimated program effects further downward.
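The two sources of downward bias described above can be illustrated with a small simulation. This is a hedged sketch, not the authors' estimation procedure: the effect size, the 30% misclassification rate, and the poverty-style confounder `z` are all hypothetical values chosen only to show the direction of each bias.

```python
import random

random.seed(7)

def diff_in_means(y, d):
    """Mean outcome of the d==1 (treated) group minus the d==0 group."""
    t = [yi for yi, di in zip(y, d) if di == 1]
    c = [yi for yi, di in zip(y, d) if di == 0]
    return sum(t) / len(t) - sum(c) / len(c)

TRUE_EFFECT = 1.0   # hypothetical program effect on height (arbitrary units)
N = 40000

# --- Bias 1: measurement error in treatment status -------------------
# Actual treatment is randomly assigned, but 30% of clinic records
# list the wrong treatment status.
actual = [random.random() < 0.5 for _ in range(N)]
flip = [random.random() < 0.3 for _ in range(N)]
listed = [a != f for a, f in zip(actual, flip)]
height = [TRUE_EFFECT * a + random.gauss(0, 1) for a in actual]

est_actual = diff_in_means(height, actual)  # close to TRUE_EFFECT
est_listed = diff_in_means(height, listed)  # attenuated toward zero
                                            # (roughly (1 - 2*0.3) = 0.4)

# --- Bias 2: omitted variable ----------------------------------------
# A confounder z (think: local poverty) raises participation but
# lowers height; omitting it biases the estimate downward.
z = [random.random() for _ in range(N)]
treat = [random.random() < 0.3 + 0.4 * zi for zi in z]
height2 = [TRUE_EFFECT * t - 0.8 * zi + random.gauss(0, 1)
           for t, zi in zip(treat, z)]

est_naive = diff_in_means(height2, treat)   # below TRUE_EFFECT
```

With symmetric misclassification at rate m and a 50/50 split, the listed-treatment estimator recovers only a (1 − 2m) fraction of the true effect, which is why the clinic-based estimates understate the experimental ones even before the omitted-variable problem enters.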