What do you want to learn about your program? A key activity during the evaluation Planning Stage is articulating the questions that you would like the evaluation to answer. The logic model is a helpful way to think through the different categories of questions you may have. Each element of the logic model is listed below with accompanying sample evaluation questions. These questions illustrate the range of questions an evaluation could address; you and your evaluator should also discuss any additional questions you want the evaluation to answer.
Context
- What are the political, economic, and social characteristics of the community in which the program operates?
- How (well) does the program align with the sponsoring organization's mission and priorities?
Problem and Need
- Why do you think the problem your MRE program is addressing is amenable to intervention?
- Why do you think your program will help?
- What is your underlying "program theory" regarding why your program will address or prevent the problem in the target population?
- Does the target population need and want your program? What evidence do you have (statistics, anecdotes, personal experience) to suggest this?
Inputs
- What resources (money, staff, facilities, technology, curriculum materials) are being used to run the program? Are these resources enough to deliver the kind of program you want?
- Who is staffing the program, and are they appropriately qualified?
Interventions and Activities
- What are the major interventions, or components, of the program provided to participants (e.g., educational classes, support groups, mentoring)? What are the primary activities?
- What is the program's service delivery model? That is, where, when, how often, in what ways, and by whom are services provided?
Outputs
- To what extent is the program recruiting and enrolling the intended target population? If enrollment falls short of expectations, why?
- What are the characteristics of those enrolling in program services? How do these compare with the characteristics of those who are recruited but do not enroll in the program?
Immediate Outcomes
- Before the intervention, how do participants rate on immediate outcomes (i.e., knowledge, skills, attitudes towards marriage, intentions, and behavior explicitly addressed in program content)? On which outcomes do participants appear to be faring well prior to intervention? Which outcome levels are of concern? Following program completion, how are participants faring on these outcomes?
- What are participants and instructors saying about their experiences with the program? What do they like? What don't they like? What suggestions do they have to improve the program? Would participants recommend the program to others?
Subsequent Outcomes
- Before the intervention, how do participants rate on key subsequent outcomes (e.g., relationship satisfaction, couple communication)? On which outcomes do participants appear to be faring well prior to intervention? Which outcome levels are of concern?
- On which subsequent outcomes did participants experience statistically significant changes during the program or after program completion? Are these the outcomes most in need of change? Was the amount of change sufficient to bring them up to a desired level of functioning?
- Why are you intervening? What is the problem that you are trying to address or prevent, and for whom is it a problem?