Transcript: Day 1

Communicating evaluation findings
effectively
GIZ evaluation workshop
14-16 October 2014
Kampala, Uganda
Glenn O’Neil
Workshop schedule
Day 1: communication theory, communication
in evaluation, the communication plan and the
evaluation report
Day 2: Promotion, messages, tools and
presentations
Day 3: Practical implementation
Workshop objective
Participants improve their know-how and
skills in analysis, report writing and
promotional tactics in order to communicate
evaluation findings effectively
Exercise
What are the challenges you face in
communication and evaluation?
Note down keywords on Post-It notes
and post them on the whiteboards
1. Definitions and
communication theory
Evaluation
The systematic and objective assessment of
an on-going or completed project,
programme or policy, its design,
implementation and results. The aim is to
determine the relevance and fulfilment of
objectives, efficiency, effectiveness, impact
and sustainability. (DAC-OECD, 2002)
Communication
The process of people sharing thoughts,
ideas and feelings with each other in
commonly understood ways
Organisational communication:
Activities that are dedicated to the management of
communications between an organisation and its
publics.
Why communicate evaluation findings?
Feedback
Validate
Inform
Improve
Aid
Influence
Change
Communication theory
• Silver bullet theory: a message passes directly from the organisation to the public
• Minimal effects theory
• Two-way symmetrical communications (win-win)
Communication process
The sender encodes a message and sends it through a channel; the receiver decodes it and provides feedback. Noise, the environment and each party's frame of reference all shape the process.
Theory to practice
What can communication theory and its study tell
us?
1. Environment and sender are important
2. The more targeted, the more effective
3. The level of interactivity dictates the level of influence
4. What influences understanding is not only the content but the way the message is communicated
5. To make people aware, they need to be reached through different means
6. Communication rarely works 100%!
Exercise
Communication challenges in the evaluation
process
Read the scenario assigned to you and
determine the communication challenge:
is it the channel, the sender, the encoding,
etc.?
2. Communicating before,
during and after the evaluation
What makes an effective evaluation?
1. Takes into account the context of the evaluation
2. Identifies the evaluation audiences and involves
them early
3. Communicates frequently and reports interim
results
4. Tailors reporting to audience needs
5. Reports results in a timely manner and to a variety
of audiences
6. Presents vivid and concrete illustrations of findings
7. Uses clear and jargon-free language
Source: Torres, R. T., Preskill, H. S. & Piontek, M. E. (2005). Evaluation strategies for communicating and reporting: Enhancing learning in organizations. Sage Publications.
Before, during and after
• Communication needs to be planned and
thought about in the initial stages of the
evaluation
• Responsibility for communication is often shared
between the commissioner and the evaluator(s)
• A lot of communication is internal (i.e. to staff,
partners and governing bodies) rather than external
• We often think only about communicating findings,
but we need to communicate before and during
the evaluation process as well
Source: Torres, R. T., Preskill, H. S. & Piontek, M. E. (2005). Evaluation strategies for communicating and reporting: Enhancing learning in organizations. Sage Publications.
WHY?
Before, during and after

Before:
• Develop ownership
• Create awareness
• Build relationships
• Set expectations

During:
• Support access
• Facilitate progress
• Communicate progress
• Present initial findings (validation)

After:
• Disseminate findings
• Create awareness
• Promote usage

Source: Torres, R. T., Preskill, H. S. & Piontek, M. E. (2005). Evaluation strategies for communicating and reporting: Enhancing learning in organizations. Sage Publications.
Before – key points
• A lot of communication before involves planning
for during and after – but not only
• Communicating before often involves a lot of
direct contact and discussion with stakeholders
who are part of the evaluation process
• Issuing the Terms of Reference is an opportunity
to communicate to broader audiences
During - key points
• The level of communications during the
evaluation will vary depending upon the
participative nature of the evaluation
• As the evaluation team moves towards results, it
needs to consider how much of its initial results
it wants to share
• Communication during the evaluation needs to be
flexible – e.g. a delay in data collection may mean
more regular updates are needed
Before, during and after scheme

Communication objectives (examples):
• Before: develop ownership, create awareness, build relationships, set expectations
• During: support access, facilitate process, communicate progress, support validation of findings
• After: disseminate findings, create awareness, encourage use

Standard evaluation products:
• Before: terms of reference
• During: inception report, preliminary findings
• After: evaluation report

Communication activities (examples): workshop, website text, 1-1 discussions, online discussion group, blog, email updates, webinar, social media page, video report, infographic, media articles, 1 page snapshot

Communication planning steps: design communication plan of action; consider key messages; refine communication plan

Main lead for communication: commissioner and evaluator(s)
Exercise
Before, during and after
- Break into groups
- Read the case study
- Identify 3 opportunities for communication in this evaluation
- Identify 3 challenges for communication in this evaluation
3. The communication plan
The communication plan
Start with two questions:
1. What is particular about this evaluation context?
--> situation analysis
2. How can communications support this evaluation?
--> objectives
The communication plan
Stakeholder/audience analysis
1. Primary: Management, sponsors, donors,
programme staff, governance
2. Secondary: Partners, related-programme staff
3. Tertiary: Like-minded bodies, potential partners,
academics, persons working in sector
The communication plan
Communication activities
• These are not your daily activities to keep the
evaluation running, but rather activities planned
systematically to support your objectives
• The choice of activities depends upon many factors,
including:
- Suitability for audiences
- Level of interactivity desired
- "Depth" of information to be communicated
- Budget available
Activities - communication plan

Communication activity / tool (interactivity / reach):
• 1-1 discussion: high / personal
• Workshop: high / personal
• Presentation: high / personal
• Email update: high / personal
• Online discussion group: high / medium
• Webinar: high / medium
• Social media page: high / mass
• E-newsletter: medium / medium
• Web-based text: medium / mass
• Press release: low / mass
• Brochure & flyer: low / medium
• Blog: high / mass
• Broadcast media (e.g. radio programme): low / mass
• Interim report: low / medium
• Interactive web page: high / mass
• Drama / theatre: high / medium
• Photostory: medium / medium
• Video report: medium / mass
• Final report: low / mass
• Executive summary: low / mass
• 1 page snapshot (of findings): low / mass
• Infographics / scorecards: low / mass
• Opinion editorial (media article): low / mass
The communication plan - template
• Situation analysis
• Objectives
• Stakeholders/audiences - for each of primary, secondary and tertiary: main needs; before / during / after
• Communication activities - for each activity/tool: audience(s); when; which objective; by whom?
Exercise
The communication plan
- Break into groups
- Create a communication plan!
- Report back to plenary
4. Creating the evaluation
report
Exercise
What are the Do’s and Don’ts of the
evaluation report?
Note down keywords on Post-It notes
and post them on the whiteboards
The evaluation report
• The report remains a key communication tool
• It also serves as a source from which other
summaries and tools will be drawn
• We can consider best practices in terms of:
• Structure
• Content
• Design
The evaluation report - structure
• There is no set structure, but most reports contain:
- Executive summary
- Introduction
- Short description/background of the project/programme
- Methodology description
- Findings
- Conclusions and recommendations (lessons learnt - optional)
- Annexes
• Findings can be organised on the basis of the evaluation
questions/criteria, the theory of change, major programme
activities or themes that emerge from the evaluation
• Regardless of the structure, you need to respond to the
evaluation questions!
The evaluation report - content
• Write in a logical manner
• Avoid compartmentalizing your findings
• Link findings to conclusions to recommendations
The evaluation report - content
Example of a logical sequence (margin annotations: answer the question directly; provide evidence to support your answer; display key data in graphic form; expand upon the initial findings; provide supporting evidence; indicate what this means):

Q. Is the Central Registry an effective tool in obtaining information and extracting disaster management information?

This review has found that the Central Registry has been limited in its effectiveness in providing disaster management information. The CR has not substantially contributed to users obtaining information and consequently using this information to facilitate the rapid delivery of emergency assistance.

The out-of-date nature of the information was raised consistently by survey and interview respondents as a major impediment to the CR being an effective tool. What is the extent of out-of-date information on the CR? Based on a content analysis of contact data for six CR directories, it was found that on average only 34% of the content had been updated in 2013 and 2014, with wide variation per directory as shown below.

Updated content: Directory 1: 19%; Directory 2: 59%; Directory 3: 4%; Directory 4: 62%; Directory 5: 26%; Directory 6: 36%

Further to the existence of other sources, users, potential users and stakeholders questioned whether the concept of the CR, an online centralised database, was the most efficient tool for obtaining and extracting information on disaster management… All major donor governments interviewed shared this opinion, as summarised by this comment from an emergency management official and CR user:

"The use of the CR seems a little bit outdated due to the fact that requests for assistance are not made out of the resources listed in the directory but out of needs in the actual situation and response is offered from where the resources are available."

Conclusion: The out-of-date nature, the incompleteness of the information in the CR and the poor user experience it offers (discussed further below) have all impeded the small number of users that sought to use the tool for the purpose for which it was conceived… Consequently, the CR is not an integral or central part of the emergency response processes of governments, NGOs and the UN, including OCHA.
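As a quick check on the 34% average cited in the Central Registry example, the six per-directory figures can be averaged directly. This is a minimal sketch; the directory names and percentages are taken from the example itself:

```python
# Per-directory share of content updated in 2013-2014,
# as reported in the Central Registry review example.
updated = {
    "Directory 1": 19,
    "Directory 2": 59,
    "Directory 3": 4,
    "Directory 4": 62,
    "Directory 5": 26,
    "Directory 6": 36,
}

# Arithmetic mean across the six directories.
average = sum(updated.values()) / len(updated)
print(f"Average updated content: {average:.0f}%")  # prints "Average updated content: 34%"
```

The spread (from 4% to 62%) is what the report describes as "wide variation per directory", which is why showing the individual figures alongside the average strengthens the finding.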
The evaluation report - design
• Use headings and sub-headings
• Use lists & boxes to break up text
• Create contrast between highlight boxes & text
• Align information
• Repeat colours & fonts
• Use tables, charts and illustrations to break up the text (examples on the next pages)
The evaluation report - design
Examples: showing a project's activities/progress over time
The evaluation report - design
Examples: summarising progress
The evaluation report - design
Examples: scorecards to summarise progress/rate activities

Scorecard columns: key initiatives; estimated progress; estimated contribution of organisation; strength of evidence.

Strand I: Improving and Making the Case for aid to fight hunger and poverty
I-1: The EU's leadership on quality and quantity of aid is rebuilt, putting pressure on other donors to follow suit
• Rebuild EU leadership on quality and quantity of aid
• Increase political and public support for ODA in Spain
• Increase political and public support for ODA in Germany
• Increase understanding of importance of EU Parliament elections

Type: Policy paper
Main use: Detailed position on policy
Rating (1-5): 4/5
Comment: High quality content; issues seen with follow-up and wider usage.

Type: Report
Main use: Often a direct response to a given process
Rating (1-5): 5/5
Comment: High quality products targeting specific processes.

Type: Guide
Main use: One guide has been produced, with a second one planned
Rating (1-5): 4/5
Comment: The guide on the UPR process was well appreciated; difficult to assess its full usage.

Type: Declaration / statement
Main use: Often released to summarise conclusions and recommendations of a conference, or as a comment on a process
Rating (1-5): 3/5
Comment: Statements often effective in taking a position on a key point; follow-up of declarations and their utility difficult to determine.
The evaluation report - design
Examples: illustrating processes

[Chart: sources of orders - direct orders from delegations 38%, promotional mail out 25%, OP/ASSIST promotion 22%, web orders 9%, fax orders 5%. Audiences: ICRC delegations, national societies, civil society, armed forces, IOs / NGOs]
Exercise
The evaluation report
- Break into groups
- Review the case materials provided
- Carry out the exercise
- Report back to the plenary
Summary of day one