Causal inferences on the effectiveness of complex social programs: Navigating assumptions, sources of complexity and evaluation design challenges |
| |
Affiliation: | 1. University of North Carolina at Chapel Hill, UNC Eshelman School of Pharmacy, Division of Pharmaceutical Outcomes and Policy, Asheville, NC 28804, United States; 2. Georgia Southern University, Jiann-Ping Hsu College of Public Health, Department of Community Health Behavior & Education, Statesboro, GA 30460, United States; 3. University of South Florida, Department of Educational Measurement and Research, Tampa, FL 33620, United States; 1. Oregon State University, 118B Milam Hall, Corvallis, OR 97331, United States; 2. California State University, Los Angeles, Los Angeles, CA, United States; 3. University of Arizona, Tucson, AZ, United States; 4. University of Maryland, College Park, Columbia, MD, United States; 5. Yale University, CT, United States; 1. #506 188 15th Avenue SW, Calgary, T2R 1S4, Canada; 2. Department of Sociology, University of Calgary, 2500 University Dr NW, Calgary, AB T2N 1N4, Canada |
| |
Abstract: | This paper explores avenues for navigating the evaluation design challenges posed by complex social programs (CSPs) and their environments when conducting studies that call for generalizable, causal inferences about an intervention’s effectiveness. A definition of a CSP is provided, drawing on examples from different fields, and an evaluation case is analyzed in depth to derive seven major sources of complexity that typify CSPs and threaten the assumptions of textbook-recommended experimental designs for impact evaluations. Theoretically supported alternative methodological strategies are discussed for navigating these assumptions and countering the design challenges posed by the complex configurations and ecology of CSPs. Specific recommendations include: sequential refinement of the evaluation design through systems thinking and systems-informed logic modeling, and use of extended-term, mixed-methods (ETMM) approaches with exploratory and confirmatory phases of the evaluation. In the proposed approach, logic models are refined through direct induction and interactions with stakeholders. To better guide assumption evaluation, question framing, and selection of appropriate methodological strategies, a multiphase evaluation design is recommended. |
| |
Keywords: | Impact evaluations; Experimental designs; Mixed methods; Causal inferences; Complex social programs |
|