
Strong Evaluation Designs for Programs with Unexpected Consequences
Presented to the United Nations Development Programme
February 20th, 2014
Jonathan A. Morell, Ph.D.
Director of Evaluation – Fulcrum Corporation
[email protected]
http://evaluationuncertainty.com
(734) 646-8622
© 2012 Jonathan Morell
The Essence of the Problem
Complex system behavior drives unexpected outcomes (a small illustrative simulation follows this list):
· Network effects
· Power law distributions
· Ignoring bifurcation points
· State changes and phase shifts
· Uncertain and evolving environments
· Feedback loops with different latencies
· Self-organization and emergent behavior
· Ignoring the full range of stable and unstable conditions in a system
· Etc.
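Not part of the original slides: a minimal, illustrative Python sketch of why such systems resist point prediction. It uses the standard logistic map, x[t+1] = r·x[t]·(1−x[t]), to show how a small change in a single parameter can carry a system across a bifurcation point, from a stable equilibrium to oscillation to chaos.

```python
# Illustrative sketch only (not from the slides): the logistic map,
# x[t+1] = r * x[t] * (1 - x[t]), is a standard toy model of how a small
# change in a system parameter (r) can push behavior across a bifurcation
# point: from a stable equilibrium, to oscillation, to chaos.

def logistic_tail(r, x0=0.4, steps=60):
    """Return the last few values of the logistic map for growth rate r."""
    x = x0
    values = []
    for _ in range(steps):
        x = r * x * (1 - x)
        values.append(x)
    return values[-4:]  # long-run behavior, after transients have died out

for r in (2.8, 3.2, 3.9):  # below, just above, and well past the bifurcation
    tail = ", ".join(f"{v:.3f}" for v in logistic_tail(r))
    print(f"r = {r}: long-run values {tail}")

# r = 2.8 converges to one value, r = 3.2 alternates between two values,
# and r = 3.9 never settles: the same rule, qualitatively different outcomes.
```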
Guaranteed evaluation solution:
· Post-test only
· Treatment group only
· Unstructured data collection
But we lose many evaluation tools:
· Time series data
· Comparison groups
· Specially developed surveys and interview protocols
· Qualitative and quantitative data collection at specific times in a project's life cycle
· Etc.
Why the loss? Because establishing evaluation mechanisms requires:
· Time
· Effort
· Money
· Negotiations with program participants, stakeholders, and other parties
Some Examples of the Kinds of Problems We May Run Into

Program: Free and reduced fees for postnatal services
Outcome evaluation is looking for (survey/interview):
· Health indicators for mother and child
· Child development indicators
Possible unexpected outcomes:
· Drug and supply hoarding
· New sets of informal fees
· Lower than expected use of service
Evaluation design weakness:
· No interview or observation to estimate amount of fees
· No way to correlate fees with attendance or client characteristics

Program: Improve agricultural yield
Outcome evaluation is looking for (records, interviews, observations):
· Yield
· New system cost
· Profit
Possible unexpected outcomes:
· Perverse effects of increased wealth disparities
Evaluation design weakness:
· No other communities to check on other reasons for disparity
· No interviews to check on consequences of disparities

Program: Improve access to primary education
Outcome evaluation is looking for (records, surveys):
· Attendance
· Graduation
· Life trajectory
Possible unexpected outcomes:
· Interaction with other civil society development projects
· Networking effects of connections
Evaluation design weakness:
· No census of other civil society projects
· No data on interaction among projects
· No data on consequences of interaction
Adding “Surprise” to Evaluation Planning
· Funding
· Deadlines
· Logic models
· Measurement
· Program theory
· Research design
· Information use plans
· Defining role of evaluator
· Logistics of implementation
· Planning to anticipate and respond to surprise
Overall Summary: Methods

Foreseeable:
· Get lucky
· Knowledge from stakeholders
· Good program theory
· Use research literature
· Use experts

Unforeseeable:
· Complex system behavior makes prediction impossible no matter how clever we are. (PS – do not assume that complex systems are always unpredictable!)

Methods:
· Theory
· Limiting time frames
· Exploiting past experience
· Forecasting & program monitoring
· System based logic modeling
· Retooling program theory
· Agile methodology
· Data choices
These methods are most useful early in the evaluation life cycle

Foreseeable:
· Get lucky
· Knowledge from stakeholders
· Good program theory
· Use research literature
· Use experts

Unforeseeable:
· Complex system behavior makes prediction impossible no matter how clever we are. (PS – do not assume that complex systems are always unpredictable!)

Most useful early:
· Theory
· Limiting time frames
· Exploiting past experience
Let's look at this one.
Example: Improve Access to Primary Education

Outcome evaluated for (records, surveys):
· Attendance
· Graduation
· Life trajectory

Possible unexpected outcomes:
· Interaction with other civil society development projects
· Networking effects of connections

Evaluation design weakness:
· No census of other civil society projects
· No data on interaction among projects
· No data on consequences of interaction

A relevant theory: we know about phase shifts when network connections increase (a small illustrative simulation follows this slide).

Evaluation redesign:
· Identify other civil society programs
· Measure connections
· Ignore details of which programs are connected
· Collect data frequently to detect timing of change
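Not part of the original slides: a minimal, illustrative Python sketch of the phase-shift idea above. Node counts and connection levels are arbitrary; the point is that the largest connected cluster in a random network stays small until the average number of connections per node crosses a threshold, then jumps abruptly, which is why the redesign measures connections and collects data frequently.

```python
# Illustrative sketch only (not from the slides): in a random network, the
# largest connected cluster stays small until the average number of links
# per node passes roughly 1, then jumps abruptly. Node counts and link
# levels below are arbitrary.
import random

def largest_cluster_fraction(n_nodes, avg_links, seed=1):
    """Fraction of nodes in the largest cluster of a random network."""
    rng = random.Random(seed)
    parent = list(range(n_nodes))

    def find(i):  # union-find root lookup with path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for _ in range(int(avg_links * n_nodes / 2)):  # add random links
        a, b = rng.randrange(n_nodes), rng.randrange(n_nodes)
        parent[find(a)] = find(b)

    sizes = {}
    for i in range(n_nodes):
        root = find(i)
        sizes[root] = sizes.get(root, 0) + 1
    return max(sizes.values()) / n_nodes

for avg_links in (0.5, 0.9, 1.1, 1.5, 2.0):
    frac = largest_cluster_fraction(2000, avg_links)
    print(f"average links per node = {avg_links}: largest cluster = {frac:.0%}")

# Below about 1 link per node the largest cluster is tiny; just above it, a
# large connected cluster appears. That is a phase shift, not a gradual change.
```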
These methods are most useful for detecting leading indicators

Foreseeable:
· Get lucky
· Knowledge from stakeholders
· Good program theory
· Use research literature
· Use experts

Unforeseeable:
· Complex system behavior makes prediction impossible no matter how clever we are. (PS – do not assume that complex systems are always unpredictable!)

Most useful for detecting leading indicators:
· Forecasting & program monitoring (a small monitoring sketch follows this slide)
· System based logic modeling
Let's look at this one.
The trick is to do a little better than the Delphic oracle.
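Not part of the original slides: a minimal, illustrative Python sketch of the program monitoring idea, flagging an indicator that drifts outside a baseline band as a possible leading indicator of change. All figures are invented.

```python
# Illustrative sketch only (not from the slides): a bare-bones monitoring
# check that flags an indicator drifting outside a baseline band, as one way
# to notice a leading indicator early. All figures are invented monthly values.
from statistics import mean, stdev

baseline = [104, 98, 101, 97, 103, 100]        # observations before the program
monitoring = [99, 102, 96, 88, 84, 79]          # monthly monitoring observations

center, spread = mean(baseline), stdev(baseline)
lower, upper = center - 2 * spread, center + 2 * spread

for month, value in enumerate(monitoring, start=1):
    outside = not (lower <= value <= upper)
    note = "  <-- outside the baseline band, worth investigating" if outside else ""
    print(f"month {month}: {value}{note}")
```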
Example: Agricultural Yield

Outcome evaluated for (records, interviews, observations):
· Yield
· New system cost
· Profit

Possible unexpected outcomes:
· Perverse effects of increased wealth disparities

Evaluation design weakness:
· No other communities to check on other reasons for disparity
· No interviews to check on consequences of disparities

Evaluation methodology: expand monitoring outside the borders of the agriculture program.

Evaluation redesign: adopt a “whole community” perspective
· Identify a wide range of social indicators (one candidate disparity indicator is sketched below)
· Identify a diverse set of key informants
· Conduct regular open-ended interviewing
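Not part of the original slides: a minimal, illustrative Python sketch of one candidate social indicator for the “whole community” perspective, a Gini coefficient of household income as a trackable measure of wealth disparity. All household figures are invented.

```python
# Illustrative sketch only (not from the slides): a Gini coefficient computed
# from household income or asset data collected each monitoring round is one
# candidate indicator of the wealth disparities mentioned above. Household
# figures below are invented.

def gini(values):
    """Gini coefficient: 0 means perfect equality, values near 1 mean high disparity."""
    xs = sorted(values)
    n = len(xs)
    weighted_sum = sum(rank * x for rank, x in enumerate(xs, start=1))
    return (2 * weighted_sum) / (n * sum(xs)) - (n + 1) / n

round_1 = [220, 250, 240, 260, 230, 255, 245, 235]   # household incomes, round 1
round_2 = [210, 240, 230, 620, 225, 540, 240, 230]   # round 2: a few large gainers

print(f"Gini, round 1: {gini(round_1):.2f}")
print(f"Gini, round 2: {gini(round_2):.2f}")

# A rising Gini across monitoring rounds would be an early signal of the
# perverse effects of increased wealth disparities the redesign watches for.
```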
Agile Evaluation: how can an evaluation be designed to change?

Foreseeable:
· Get lucky
· Knowledge from stakeholders
· Good program theory
· Use research literature
· Use experts

Unforeseeable:
· Complex system behavior makes prediction impossible no matter how clever we are. (PS – do not assume that complex systems are always unpredictable!)

For designing an evaluation to change:
· Data choices
· Agile methodology
· Retooling program theory
Let's look at this one.
Example: Free / Reduced Fees for Post-Natal Services

Outcome evaluated for (survey/interview):
· Health indicators for mother and child
· Child development indicators

Possible unexpected outcomes:
· Drug and supply hoarding
· New sets of informal fees
· Lower than expected use of service

Evaluation design weakness:
· No interview or observation to estimate amount of fees
· No way to correlate fees with attendance or client characteristics

Add a process component to the evaluation design (a small illustrative sketch follows):
· Survey of mothers to assess the total cost of service
· Open-ended interviews with clinic staff about consequences of the new system for their work lives
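Not part of the original slides: a minimal, illustrative Python sketch of how the new survey data could address the "no way to correlate fees with attendance" weakness, using a simple Pearson correlation. All figures are invented.

```python
# Illustrative sketch only (not from the slides): once the mothers' survey
# captures total out-of-pocket cost, the "no way to correlate fees with
# attendance" weakness can be addressed with a simple correlation between
# reported fees and completed visits. All figures are invented.
from statistics import correlation  # Pearson correlation, Python 3.10+

reported_fees = [0, 2, 5, 8, 10, 12, 15, 20]     # informal fees reported per visit
visits_completed = [6, 6, 5, 5, 4, 4, 3, 2]       # postnatal visits completed

r = correlation(reported_fees, visits_completed)
print(f"Correlation between informal fees and attendance: {r:.2f}")

# A strongly negative value would support the hypothesis that informal fees
# are suppressing use of the service.
```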
Nice to say, but agile evaluation can be expensive. Do we want both of these tactics, or only one? These are the kinds of questions that have to be added to all the other decisions we make when designing an evaluation.