PPA 722 – Quantitative Analysis
Professors Allard and Duncombe

Objectives for Session Eleven
Turn in Data Assignment
Discuss Hypothetical Formative Evaluation Design
Analyzing Qualitative Data
Report Writing, Organization, and Utilization
Strengths and Weaknesses of Different Data Collection Strategies
What Findings or Patterns Emerged from Your Analyses?
STOP-DWI in NY State
STOP-DWI is implemented in NY to reduce drunk driving and alcohol-related accidents
Program theory
Implementation/Service Delivery
– Checkpoints, tougher penalties, public service announcements, and education programs
How would we know if it was being implemented well?
Evaluation Design Proposal for STOP-DWI in NY State
Program Description
– Who, where, what, how
– Steps in delivery, logic model, resources/inputs
Evaluation Questions and Justification
Data Collection Strategy
– Focus on 4 stages: checkpoints, adjudication in courts, advertisements, and education programs
– Select methods that are appropriate for a given stage
– Mix of methods – quantitative and qualitative
– Validity and reliability
– Sampling (see the sketch after this list)
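Sampling for a multi-site program like STOP-DWI is often stratified so that every region of the state is represented. The sketch below is a minimal Python illustration of that idea; the site frame, column names, and counties are hypothetical assumptions, not actual STOP-DWI data.

```python
import pandas as pd

# Hypothetical frame of DWI checkpoint sites; the counties and
# site counts here are illustrative assumptions only.
sites = pd.DataFrame({
    "site_id": range(1, 201),
    "county": ["Onondaga", "Erie", "Monroe", "Albany"] * 50,
})

# Stratify by county and draw 10% of sites within each stratum,
# so smaller counties are not crowded out by larger ones.
sample = sites.groupby("county").sample(frac=0.10, random_state=42)

print(sample["county"].value_counts())
```
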
Analyzing Data
Approach to Quantitative Analysis (see the sketch after this slide)
– Missing data and outliers
– Description of respondents
– Dimensions of interest/relationships among key variables – move to more sophisticated analysis
Approach to Qualitative Analysis
– Confidence in your data?
– What do you know about the program?
– What questions or relationships are you most interested in? What do your data tell you about these questions or relationships?
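As a concrete companion to the quantitative bullets above, here is a minimal pandas sketch of those three steps; the file name, variable names, and the 3-standard-deviation outlier rule are illustrative assumptions, not part of the actual assignment.

```python
import pandas as pd

# Hypothetical respondent-level file; all column names are assumed.
df = pd.read_csv("survey_responses.csv")

# 1. Missing data and outliers: count missing values per column,
#    then flag observations more than 3 standard deviations from the mean.
print(df.isna().sum())
z = (df["dwi_arrests"] - df["dwi_arrests"].mean()) / df["dwi_arrests"].std()
print(df[z.abs() > 3])

# 2. Description of respondents: basic univariate summaries.
print(df.describe())
print(df["county"].value_counts())

# 3. Relationships among key variables: a simple cross-tabulation
#    before moving to more sophisticated analysis.
print(pd.crosstab(df["checkpoint_exposure"], df["supports_program"]))
```
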
Strengths and Weaknesses of Different Qualitative Techniques
Differences across observation techniques
Differences between focus groups and interviews
Document review is highly variable
Note that a variety of techniques are useful in a formative evaluation setting
Integrating Qualitative and Quantitative Data
New programs
Program effects or measures are hard to quantify
Qualitative data to provide insight into or corroborate quantitative findings
Demands of client and context
Report Writing
Description of the Program
Presentation of Analysis
– Implemented as planned or intended?
– What affected implementation?
Interpretation of the Findings
– What do your data suggest or mean for the stakeholders?
– Justify and support with data
Conclusions and Recommendations
– Prioritize recommendations and assess their feasibility
Factors Undermining Utility of Evaluation Reports
Poor quality
Lack of generalizability
Lack of program description
Mismatch between research questions and questions the data can answer
Not completed on time
Characteristics and commitment of the organization
Readings for Next Time
Leon, Dziegielewski, and Tubiak. “A Program Evaluation of a Juvenile Halfway House: Considerations for Strengthening Program Components.”
D’Emidio-Caston and Brown. “The Other Side of the Story: Student Narratives on the California Drug, Alcohol, and Tobacco Education Programs.”