Impact Evaluation for Evidence-Based Policy Making
Arianna Legovini
Lead, Africa Impact Evaluation Initiative
Answer Three Questions
• Why is evaluation valuable?
• What makes a good impact evaluation?
• How to implement evaluation?
IE answers: How do we turn this teacher… …into this teacher?
Why Evaluate?
• Need evidence on what works
– Allocate a limited budget
– Fiscal accountability
• Improve programs and policies over time
– Operational research
– Managing by results
• Information is key to sustainability
– Negotiating budgets
– Informing constituents and managing the press
– Informing donors
Traditional M&E and Impact Evaluation
• Monitoring tracks implementation efficiency (input → output)
• Impact evaluation measures effectiveness (output → outcome)
[Results-chain diagram: $$$ → INPUTS → OUTPUTS → OUTCOMES. Monitoring covers efficiency, from inputs to outputs; evaluation covers effectiveness, via behavior change, from outputs to outcomes.]
Question types and methods
• Process evaluation / monitoring (descriptive analysis):
▫ Is the program being implemented efficiently?
▫ Is the program targeting the right population?
▫ Are outcomes moving in the right direction?
• Impact evaluation (causal analysis):
▫ What was the effect of the program on outcomes?
▫ How would outcomes change under alternative program designs?
▫ Does the program impact people differently (e.g. females, the poor, minorities)?
▫ Is the program cost-effective?
Which can be answered by traditional M&E and which by IE?
• Are books being delivered as planned? → M&E
• Does de-worming increase school attendance? → IE
• What is the correlation between enrollment and school quality? → M&E
• Does decentralized school management lead to an increase in learning achievement? → IE
Types of Impact Evaluation
• Efficacy:
– Proof of Concept
– Pilot under ideal conditions
• Effectiveness:
– At scale
– Normal circumstances & capabilities
– Lower or higher impact?
– Higher or lower costs?
So, use impact evaluation to…
• Test innovations
• Scale up what works (e.g. de-worming)
• Cut or change what does not (e.g. HIV counseling)
• Measure the effectiveness of programs (e.g. JTPA)
• Find the best tactics for changing people's behavior (e.g. getting people to come to the clinic)
• Manage expectations (e.g. PROGRESA/OPORTUNIDADES in Mexico)
Next question, please
• Why is evaluation valuable?
• What makes a good impact evaluation?
• How to implement evaluation?
Assessing impact
• Examples
– How much do girl scholarships increase school enrollment?
– What is beneficiaries' learning achievement with the program compared to without it?
• Ideally, compare the same individual with and without the program at the same point in time
• But we can never observe the same individual with and without the program at the same point in time
Solving the evaluation problem
• Counterfactual: what would have happened without the program
• We need to estimate the counterfactual
– i.e. find a control or comparison group
• Counterfactual criteria (illustrated in the sketch below):
– Treated and counterfactual groups have identical initial characteristics on average
– The only reason for a difference in outcomes is the intervention
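A minimal Python sketch of this potential-outcomes logic (not from the presentation; all numbers are hypothetical): when assignment is unrelated to people's characteristics, the control group's mean stands in for the missing counterfactual and a simple difference in means recovers the true effect.

```python
import numpy as np

# Hypothetical potential outcomes: y0 = outcome without the program,
# y1 = outcome with it. No individual ever reveals both.
rng = np.random.default_rng(0)
n = 100_000
y0 = rng.normal(50, 10, n)      # e.g. a test score without the program
true_effect = 5.0
y1 = y0 + true_effect           # the same score with the program

# A valid counterfactual group: assignment is independent of y0, so the
# two groups have identical initial characteristics on average.
treated = rng.random(n) < 0.5
estimate = y1[treated].mean() - y0[~treated].mean()
print(f"true effect {true_effect:.2f}, estimated effect {estimate:.2f}")
```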
2 “Counterfeit” Counterfactuals
• Before and after:
– The same individual before the treatment
• Non-participants:
– Those who choose not to enroll in the program
– Those who were not offered the program
Before and After Example
• Food aid
– Compare mortality before and after
– Find an increase in mortality
– Did the program fail?
– “Before” was a normal year, but “after” was a famine year
– Cannot separate (identify) the effect of food aid from the effect of the drought
Before and After
• Compare Y before and after the intervention
– B is the before-after counterfactual; A − B is the estimated impact
• Control for time-varying factors
– C is the true counterfactual; A − C is the true impact
– Here A − B under-estimates the true impact (a numeric sketch follows below)
[Figure: outcome Y plotted over time, with treatment at time t. A is the observed outcome at t, B is the pre-treatment level at t−1, and C is where the outcome would have been at t without the program.]
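The same picture in numbers, as a hypothetical Python sketch (the trend and effect sizes are invented): when outcomes would have worsened without the program, as in the famine example, the before-after comparison A − B understates the true impact A − C.

```python
import numpy as np

# Point B: outcomes observed before the intervention (t-1).
rng = np.random.default_rng(1)
n = 50_000
before = rng.normal(50, 5, n)

# Without the program, outcomes would have fallen (e.g. a famine year).
trend = -8.0
true_effect = 5.0
after = before + trend + true_effect       # point A: observed at time t

before_after = after.mean() - before.mean()        # A - B
true_counterfactual = before.mean() + trend        # point C
true_impact = after.mean() - true_counterfactual   # A - C
print(f"before-after estimate (A-B): {before_after:+.1f}")  # about -3
print(f"true impact (A-C):           {true_impact:+.1f}")   # about +5
```

The before-after estimate even flips sign here: the program looks harmful although it raised outcomes by 5 points relative to the true counterfactual.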
Non-Participants…
• Compare non-participants to participants
• Counterfactual: non-participant outcomes
• Problem: why did they not participate?
Exercise: Why do participants and non-participants differ?
• Children who come to school and children who do not?
– Access to school; poverty
• Communities that applied for funds for a new classroom and communities that did not?
– Unmet demand; a more organized community
• Children who received scholarships and children who did not?
– Achievement; poverty; gender
Literacy program example
• Treatment is offered
• Who signs up?
– Those who are illiterate
– They have lower education than those who do not sign up
• So educated people are a poor estimate of the counterfactual
What's wrong?
• Selection bias: people choose to participate for specific reasons (simulated in the sketch below)
• Often those reasons are directly related to the outcome of interest
• We cannot separately identify the impact of the program from these other factors/reasons
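A hypothetical Python sketch of that selection problem, in the spirit of the literacy example (the cutoff and effect size are invented): because the less literate select into the program, the naive participant versus non-participant gap is large and negative even though the program works.

```python
import numpy as np

# Baseline literacy scores; the program raises scores by 5 points.
rng = np.random.default_rng(2)
n = 100_000
baseline = rng.normal(50, 10, n)
true_effect = 5.0

# Selection: only the less literate sign up.
signs_up = baseline < 45
outcome = baseline + true_effect * signs_up

naive = outcome[signs_up].mean() - outcome[~signs_up].mean()
print(f"true effect:                    {true_effect:+.1f}")
print(f"participant vs non-participant: {naive:+.1f}")  # strongly negative
```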
Program placement example
• Government offers a school-inputs program to schools with low infrastructure
• Compare achievement in schools offered the program with achievement in schools not offered it
• The program is targeted based on lack of inputs, so
– Treatment schools have low achievement
– Counterfactual schools have high achievement
• Cannot separately identify the program impact from these initial differences
Need to know…
• Why some get the program and others do not
• How some get into the treatment group and others into the control group
• If those reasons are correlated with the outcome, we cannot identify/separate the program impact from other explanations for differences in outcomes
Possible Solutions…
• Guarantee the comparability of treatment and control groups
• The ONLY remaining difference should be the intervention
• In this workshop we will consider
– Experimental design/randomization
– Quasi-experiments
• Regression discontinuity
• Double differences (a sketch follows below)
– Instrumental variables
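As a taste of the double-difference method listed above, here is a hypothetical Python sketch (all levels, trends, and effects are invented): treated and control areas start at different levels and share a common time trend; differencing within each group and then across groups removes both, leaving the program effect.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50_000
treated = rng.random(n) < 0.5

# Groups differ at baseline, and everyone is hit by a common trend.
level = np.where(treated, 40.0, 50.0)
y_pre = level + rng.normal(0, 5, n)
trend, true_effect = 3.0, 5.0
y_post = level + trend + true_effect * treated + rng.normal(0, 5, n)

# Double difference: (post - pre) for treated minus (post - pre) for control.
did = ((y_post[treated].mean() - y_pre[treated].mean())
       - (y_post[~treated].mean() - y_pre[~treated].mean()))
print(f"diff-in-diff estimate: {did:.1f}")  # about 5 despite the level gap
```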
These solutions all involve…
• Randomization
– Give everyone an equal chance of being in the control or treatment group
– Guarantees that all factors/characteristics will be equal between the groups on average (see the balance check sketched below)
– The only difference is the intervention
• If randomization is not possible, we need transparent and observable criteria for who is offered the program
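A quick balance check, sketched in Python with hypothetical covariates: under random assignment, observed characteristics such as age and income, and unobserved ones such as motivation, end up with nearly identical means in the two groups.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 10_000
age = rng.normal(35, 8, n)
income = rng.lognormal(7, 0.5, n)
motivation = rng.normal(0, 1, n)   # normally unobservable to the evaluator

# Give everyone an equal chance of treatment: randomly pick half.
treat = rng.permutation(n) < n // 2

for name, x in [("age", age), ("income", income), ("motivation", motivation)]:
    print(f"{name:>10}: treated mean {x[treat].mean():9.2f} | "
          f"control mean {x[~treat].mean():9.2f}")
```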
The Last Question
• Why is evaluation valuable?
• What makes a good impact evaluation?
• How to implement evaluation?
Implementation Issues
• Political economy
• Policy context
• Finding a good control
– Retrospective versus prospective designs
– Making the design compatible with operations
– Ethical issues
• Relationship to “results” monitoring
Political Economy
• What is the policy purpose?
– In the USA: test innovations to national policy; defend the budget
– In RSA: answer to the electorate
– In Mexico: allocate the budget across poverty programs
– In an IDA country: pressure to demonstrate aid effectiveness and scale up
– In a poor country: hard constraints and ambitious targets; how to reach those targets?
Evidence culture and incentives for change
• Cultural shift
– From retrospective evaluation: look back and judge
– To prospective evaluation: decide what we need to learn; experiment with alternatives; measure and inform; adopt better alternatives over time
• Change in incentives
– Rewards for changing programs that do not work
– Rewards for generating knowledge
The Policy Context
• Address policy-relevant questions:
– What policy questions need answers?
– What outcomes answer those questions?
– What indicators measure those outcomes?
– How much of a change in the outcomes would count as success?
• Example: teacher performance-based pay
– Scale up the pilot?
– Criterion: need at least a 10% increase in …
Opportunities for good designs
• Use opportunities to generate good control groups
• Most programs cannot deliver benefits to all those eligible
– Budgetary limitations:
• Eligible units that get the program are potential treatments
• Eligible units that do not are potential controls
– Logistical limitations:
• Those who go first are potential treatments
• Those who go later are potential controls
Who gets the program? Who goes first?
• Eligibility criteria
– Are benefits targeted?
– How are they targeted?
– Can we rank eligible units by priority?
– Are the measures good enough for fine rankings?
• Roll-out
– Does everyone get an equal chance to go first, second, third? (see the roll-out sketch below)
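One transparent way to run such a roll-out, sketched in Python (the 300 schools and three phases are hypothetical): shuffle the eligible units once, then split them into phases, so every unit has the same chance of going first.

```python
import numpy as np

rng = np.random.default_rng(5)
schools = np.arange(300)                 # IDs of eligible schools

# A random order gives every school an equal chance to go first.
shuffled = rng.permutation(schools)
phase1, phase2, phase3 = np.split(shuffled, 3)

# Phase 1 is treated first; phases 2 and 3 serve as comparison groups
# until their own roll-out, preserving an experimental contrast.
print(len(phase1), len(phase2), len(phase3))  # 100 100 100
```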
Ethical Considerations
• Do not delay benefits: base the roll-out on budget and administrative constraints
• Equity: equally deserving beneficiaries deserve an equal chance of going first
• Use a transparent and accountable method
– Give everyone eligible an equal chance
– If ranking is based on some criteria, then …
Retrospective Designs
• Hard to find good control groups
– Must live with arbitrary or unobservable allocation rules
• Administrative data
– Must be good enough to show the program was implemented as described
• Need a pre-intervention baseline survey
– On both controls and treatments
– With covariates to control for initial differences (a regression sketch follows below)
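One way to use such baseline covariates, sketched here with plain least squares on hypothetical data (this is an illustration of covariate adjustment, not a method prescribed by the slides, and it is only credible if the allocation rule is fully captured by the covariates):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 20_000
baseline = rng.normal(50, 10, n)

# Program placed where baseline performance was low, as in the
# program placement example above.
treated = (baseline < 50).astype(float)
true_effect = 5.0
outcome = 0.9 * baseline + true_effect * treated + rng.normal(0, 5, n)

# A naive comparison confounds the effect with the targeting rule.
naive = outcome[treated == 1].mean() - outcome[treated == 0].mean()

# OLS of outcome on treatment plus the baseline covariate.
X = np.column_stack([np.ones(n), treated, baseline])
coef, *_ = np.linalg.lstsq(X, outcome, rcond=None)
print(f"naive gap:            {naive:+.1f}")    # strongly negative
print(f"adjusted coefficient: {coef[1]:+.1f}")  # about +5
```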
Manage for results
• Retrospective evaluation cannot be used to manage for results
• Use resources wisely: choose a prospective evaluation design
– Better methods
– More tailored policy questions
– Precise estimates
– Timely feedback and program changes
– Improved results on the ground
Monitoring Systems
• Projects and programs regularly collect data for management purposes
• Typical content
– Lists of beneficiaries
– Distribution of benefits
– Expenditures
– Outputs
– Ongoing process evaluation
• This information is needed for impact evaluation
Evaluation uses administrative information to:
• Verify who the beneficiaries are
• Verify when the program started
• Verify what benefits were actually delivered
A necessary condition for the program to have an impact:
• benefits need to reach the targeted beneficiaries
Improve use of administrative data for IE
• Program monitoring data are usually collected only where the program is active
– Collect a baseline for control areas as well
• Very cost-effective, since there is little need for additional special surveys
– Add a couple of outcome indicators
• Most IEs use only monitoring data
Overall Messages
• Impact evaluation is useful for
– Validating program design
– Adjusting program structure
– Communicating to the finance ministry and civil society
• A good evaluation design requires estimating the counterfactual
– What would have happened to beneficiaries if they had not received the program
Design Messages
• Address policy questions
– What is interesting is what the government needs and will use
• Secure stakeholder buy-in
• Prospective designs are easiest to use
• Good monitoring systems and administrative data can improve IE and lower costs