
Integrating Evaluation into the
Design of the Minnesota
Demonstration Project
Paint Product Stewardship Initiative
St. Paul, MN
May 1, 2008
Matt Keene, Evaluation Support Division
National Center for Environmental Innovation
Office of Policy, Economics and Innovation
US Environmental Protection Agency
Purpose of the Presentation
 Provide the PPSI with an understanding of the work of the evaluation committee and the process of integrating evaluation into the design of the Minnesota Demonstration Project.
Presentation Outline
1. Introduction to Program Evaluation
2. The Evaluation Committee & Goal 6
3. Integrating Evaluation into MN
4. Questions, Comments, and Next Steps
Program Evaluation
 Definition
• A systematic study that uses measurement and analysis to answer specific questions about how well a program is working to achieve its outcomes and why.
 Orientation/Approaches to Evaluation
• Accountability – External Audience
• Learning & Program Improvement – Internal/External Audiences
Measurement and Evaluation
Program Evaluation: A systematic study that uses measurement and analysis to answer specific questions about how well a program is working to achieve its outcomes and why.
Performance Measurement: A basic and necessary component of program evaluation that consists of the ongoing monitoring and reporting of program progress and accomplishments, using pre-selected performance measures.
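As a purely illustrative sketch of the distinction above (the class names, measure, and figures are invented placeholders, not PPSI data): performance measurement tracks pre-selected measures over time, while evaluation uses that data to answer a specific question.

```python
from dataclasses import dataclass, field

@dataclass
class PerformanceMeasure:
    """Ongoing monitoring: a pre-selected measure tracked over time."""
    name: str
    unit: str
    readings: list = field(default_factory=list)  # (period, value) pairs

    def record(self, period, value):
        self.readings.append((period, value))

@dataclass
class EvaluationQuestion:
    """Evaluation: a specific question answered by analyzing measures."""
    text: str
    measures: list = field(default_factory=list)

# Invented example data for illustration only
gallons = PerformanceMeasure("Leftover paint collected", "gallons")
gallons.record("2008-Q1", 1200)
gallons.record("2008-Q2", 1450)

q = EvaluationQuestion(
    "How effective are the paint management systems?",
    measures=[gallons],
)
# Measurement supplies the data; evaluation interprets it against the question.
```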
The Evaluation Committee & Goal 6
 Evaluation Team (Committee)
 Purpose and Function of the Evaluation Team
 Funding and Support
 The Work Plan and Goal 6
The Evaluation Committee & Goal 6
 What will we evaluate?
• Paint, Management Systems, Education, Markets
 Why are we evaluating the program?
• Leadership, Learning, Transfer
 Can we evaluate this project?
• We must integrate evaluation into the project
• We need a framework to follow…and we are building it as we go
• Initially, integrating evaluation into your program is a design and planning activity
Integrating Evaluation into The Minnesota Project

[Diagram: "Integrating Evaluation into Program Design" at the center of a cycle with four elements:]
 Program: 1. Team, 2. Mission, 3. Goals & Objectives, 4. Logic Model
 Questions: 1. Context, 2. Audience, 3. Communication, 4. Use
 Measures: 1. Data Sources, 2. Collection Methods & Strategy, 3. Analysis Tools, 4. Data Collection, 5. Data Management
 Documentation: 1. Evaluation Methodology, 2. Evaluation Policy
Select and Describe the Program

Describing the MN program:
 Mission
 Goals and objectives
 Logic model
Evaluation Questions
 What are the critical questions to understanding the success of the MN program?
Examples of Draft Questions
 Has this been a collaborative and cooperative process?
 How successful is the PSO?
 How effective are education and outreach materials?
 How effective are the paint management systems?
 What are the best options for a national system?
Evaluation Questions
 What contextual factors may influence the answers to each question?
 Who are the audiences for each question?
• What’s the best way to communicate with each audience?
• How might each audience use the answer to each question?
Evaluation Questions
 What are the critical questions to understanding the success of the MN program?
 What contextual factors may influence the answers to each question?
 Who are the audiences for each question?
• What’s the best way to communicate with each audience?
• How might each audience use the answer to each question?
Performance Measures
 What can we measure to answer each question?
 Where can we find the information for each measure?
 How can we collect the information?
 Given our questions and information to be collected, what will be an effective collection strategy?
Performance Measures
 What analytical tools will give us the most useful information?
 How will we implement the collection strategy?
 How will we manage the data?
Performance Measures
 What can we measure to answer each question?
 Where can we find the information for each measure?
 What methods are best suited for each measure?
 What analytical tools will give us the most useful information?
 Given our questions and information to be collected, what will be our collection strategy?
• How will we implement the collection strategy?
• How will we manage the data?
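The measures questions above can be captured as a simple planning table linking each evaluation question to its measures, data sources, collection method, analysis tool, and data management. A minimal sketch (all row content is an invented placeholder, not the PPSI's actual plan), with a check that flags incomplete rows:

```python
# Each row links one evaluation question to the plan elements from the slides.
collection_plan = [
    {
        "question": "How effective are education and outreach materials?",
        "measures": ["retailer awareness rate"],
        "data_sources": ["retailer survey"],
        "collection_method": "quarterly survey",
        "analysis_tool": "descriptive statistics",
        "data_management": "shared project workbook",
    },
    {
        "question": "How effective are the paint management systems?",
        "measures": ["gallons of leftover paint collected"],
        "data_sources": [],        # gap: no source identified yet
        "collection_method": "",   # gap: method not yet chosen
        "analysis_tool": "trend analysis",
        "data_management": "shared project workbook",
    },
]

def find_gaps(plan):
    """Return (question, field) pairs where a plan row is incomplete."""
    gaps = []
    for row in plan:
        for key in ("measures", "data_sources", "collection_method",
                    "analysis_tool", "data_management"):
            if not row.get(key):
                gaps.append((row["question"], key))
    return gaps
```

Checking the logic and flow of the plan then reduces to running `find_gaps` and resolving any rows it returns before data collection begins.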
Documentation: Methodology & Policy
 Evaluation Methodology
• The process of integrating evaluation generates a framework for our methodology
 Evaluation Policy
• Guides MN & PPSI
• Guides strategy and planning for evaluation and program management
Check the Logic and Flow
 Revisit the process and the decisions made
 Look for the flow in the process and identify potential breaks
 Identify potential obstacles to our approach to understanding and managing the performance of the MN demonstration program
 1st cycle is integrating – next cycle begins implementation
Questions, Comments and Clarifications
1. Questions for the Demonstration Committee
2. Questions for the Evaluation Committee

Outline recap:
1. Introduction to Program Evaluation
2. The Evaluation Committee & Goal 6
3. Integrating Evaluation into MN
Thank You!
Evaluation Support Division
National Center for Environmental Innovation
Office of Policy, Economics and Innovation
U.S. Environmental Protection Agency
Matt Keene
(202) 566-2240
[email protected]
www.epa.gov/evaluate