Title: Can a workbook work? Examining whether a practitioner evaluation toolkit can promote instrumental use
Institution:
1. University of Kansas Medical Center, Department of Preventive Medicine and Public Health, Kansas City, KS 66160, United States
2. Marshfield Clinic Research Foundation, Center for Clinical Epidemiology and Population Health, Marshfield, WI 54449, United States
3. University of Nebraska Medical Center, Division of Endocrinology, Omaha, NE 68198, United States
4. University of Kansas Medical Center, Department of Family Medicine, Kansas City, KS 66160, United States
5. University of Kansas Medical Center, Department of Biostatistics, Kansas City, KS 66160, United States
6. CUNY School of Public Health, New York, NY 10016, United States
7. University of Florida, College of Public Health and Health Professions, PO Box 100185, Gainesville, FL 32610, United States
Abstract: In large-scale, multi-site contexts, developing and disseminating practitioner-oriented evaluation toolkits has become an increasingly common strategy for building evaluation capacity. Toolkits explain the evaluation process, present evaluation design choices, and offer step-by-step guidance to practitioners. To date, there has been limited research on whether such resources truly foster the successful design, implementation, and use of evaluation findings. In this paper, we describe a multi-site project in which we developed a practitioner evaluation toolkit and then studied the extent to which the toolkit and accompanying technical assistance were effective in promoting successful completion of local-level evaluations and fostering instrumental use of the findings (i.e., whether programs directly used their findings to improve practice; see Patton, 2008). Forensic nurse practitioners from six geographically dispersed service programs completed methodologically rigorous evaluations; furthermore, all six programs used the findings to create programmatic and community-level changes to improve local practice. Implications for evaluation capacity building are discussed.
Keywords: Evaluation capacity building; Multi-site evaluations; Evaluation toolkits; Instrumental use