Transcript Slide 1
Using Dashboards to present data
to your Board:
Quality and Patient Safety
Aunyika Moonan, PhD, MSPH, CPHQ
SCHA’s Director of Quality Measurement Services
SC AHQ, May 9, 2008
Objectives
• What is a dashboard?
• Making the case: why do boards need to be on board?
• How do you get your board to improve quality and patient safety?
• What data do you include in dashboards?
• Which performance improvement tools do you use with the board?
• How do you present your data to the board?
Purpose of a dashboard?
A dashboard is a powerful tool to keep leaders focused on the organization’s key issues and strategies. Well-chosen performance indicators displayed in an at-a-glance format help identify areas that are doing well and areas that need improvement. Dashboards can include indicators such as financial viability, clinical outcomes, patient safety, quality of care, or satisfaction rates.
Use of a Dashboard
• Focus senior executives’ attention
• Link to organization’s aims/goals and
strategic plan
• Few pages
• Show Improvement
Board Leadership is a critical ingredient to achieving better, safer care:
Survey results link better outcomes with hospitals where...
1. The board spends >25% of its time on quality issues (p = 0.009);
2. The board receives a formal quality performance measurement report (p = 0.005);
3. There is a high level of interaction between the board and the medical staff on quality strategy (p = 0.021);
4. The senior executives’ compensation is based in part on QI performance (p = 0.008);
5. The CEO is identified as the person with the greatest impact on QI (p = 0.01)
Kroch et al. Hospital Boards and Quality Dashboards. J Patient Safety. Volume 2, Number 1. March 2006
So…How do you get your Board to
improve quality and patient safety?
1. Board Recruitment: Choosing Board
members with the right stuff
2. Education: Educate the board
Bader and Associates Governance Consultants. Great Boards, Spring 2006, Volume VI, No.1
How do you get your Board to
improve quality and patient safety?
3. Measurement: Use measures to focus board work on what’s important
4. High Expectations: Pursue perfection
How do you get your Board to
improve quality and patient safety?
5. Culture Promotion: Pay more attention
to culture
6. Board Time: Exercise leaders’ powerful influence
7. Recognition and Rewards: Recognize
and reward excellence
What type of data do you include?
Boards ask two types of questions about quality and safety:
1. How good is our care?
– How do we compare to others like us?
2. Is our care getting better?
– Are we on an acceptable track to achieve our key quality and safety objectives, or do we need to change direction?
– If not, why not? Is the strategy wrong, or is it not being executed effectively?
The following slides are adapted from James L. Reinertsen, MD: Boards, Dashboards and Data (IHI)
Purpose of Measurement
• Research: Key question: “What is the truth?” Measurement requirements and characteristics: complete, accurate, controlled, glacial pace, expensive. Typical displays: comparison of control and experimental populations.
• Comparison or Accountability: Key question: “Are we better or worse than…?” Measurement requirements and characteristics: risk adjusted, with denominators, attributable to individuals or organizations, validity. Typical displays: performance relative to benchmarks and standards…
• Improvement: Key question: “Are we getting better?” Measurement requirements and characteristics: real time, raw counts, consistent operational definitions, utility. Typical displays: run charts, control charts, time between events…
Adapted from Solberg, Mosser, McDonald. Jt Comm J Qual Improv. 1997 Mar;23(3):135-47.
Example of an answer to “How good is our care?”
[Figure: comparison report dated October 24, 2006, showing performance compared to others. Callout: a hospital could be “green” but still be worse than the median of the comparison group.]
Another example of “How do we compare?”
[Figure: Hospital Adverse Events per 1,000 Patient Days, using the IHI Global Trigger Tool. The chart compares Our Hospital (May 2007) with the IHI Average and the Current IHI Best on an axis of 0 to 150 adverse events per 1,000 patient days.]
Adverse Events Include (but are not limited to):
• Allergic rash
• Excessive bleeding, unintentional trauma of a blood vessel
• Respiratory depression requiring intubation due to pain medications
• Hyperkalemia as the result of overdose of potassium
• Lethargy/shakiness associated with low serum glucose
• Drug-induced renal failure
• Surgical site infection, sepsis, infected lines, other hospital-acquired infections
• Internal bleeding following the first surgery and requiring a second surgery to stop
the bleeding
• Atelectasis, skin breakdown, pressure sores
• DVT or pulmonary embolism during a hospital stay
Source: Roger Resar, John Whittington, IHI Collaborative
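For orientation only (these numbers are hypothetical, not taken from the figure): if trigger tool review of sampled charts identifies 90 adverse events over 2,000 reviewed patient days, the rate is 90 ÷ 2,000 × 1,000 = 45 adverse events per 1,000 patient days.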
What Boards should know about data on
“How good are we and how do we compare to
others?”
Upside:
• Often risk adjusted
• Apples-to-apples comparisons
• Source of pride
• Source of energy for improvement
Downside:
• Time lag
• Static
• “The data must be wrong”
• You become complacent
• How you look depends on how others perform
• Standards and benchmarks are full of defects
Recommendations for Board use of
“How do we compare to others?”
1. Ask this question to help you set aims,
but don’t use these sorts of reports to
oversee and guide improvement at
each meeting.
2. Compare to the best, not the 50th %tile
3. Always make sure you know how
“Green” is determined
Boards ask two types of questions about
quality and safety
1. How good is our care?
– How do we compare to others like us?
2. Is our care getting better?
– Are we on an acceptable track to achieve
our key quality and safety objectives or
do we need to change direction?
– If not, why not? Is the strategy wrong, or
is it not being executed effectively?
What data should you include for your board?
The Board question “are we going to achieve our aims?” requires
management to have a strategic theory
• Big Dots (Pillars, BSC…): What are your key strategic aims? How good must we be, by when? What are the system-level measures of those aims?
• Drivers: Down deep, what really has to be changed, or put in place, in order to achieve each of these goals? What are you tracking to know whether these drivers are changing?
• Projects (Ops Plan): What set of projects will move the Drivers far enough, fast enough, to achieve your aims? How will we know if the projects are being executed?
Example Dashboard for Harm (System Level, for the 5M Lives Campaign)
[Figure: a one-page dashboard of small run charts, each plotted monthly from January through May, covering the Global Harm Trigger Tool, handwashing, culture of discipline on safety rules, teamwork, harm from high-alert meds, surgical complications, pressure ulcers, CHF readmissions, and MRSA.]
• System-level measure: Global Harm Trigger Tool
• Drivers: handwashing, culture of discipline, and teamwork
• Projects: high-alert meds, surgical complications, pressure ulcers, CHF, MRSA
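As a rough sketch of how the Big Dots / Drivers / Projects structure above could be held in one place for reporting (the nesting and field names are assumptions for illustration, not part of the presentation):

```python
# Hypothetical sketch of a harm dashboard organized as aim -> system-level measure -> drivers -> projects.
# Names follow the example above; the structure itself is illustrative, not prescribed by the source.

harm_dashboard = {
    "aim": "Reduce harm (5 Million Lives Campaign)",
    "system_level_measure": "Adverse events per 1,000 patient days (IHI Global Trigger Tool)",
    "drivers": [
        "Handwashing",
        "Culture of discipline on safety rules",
        "Teamwork",
    ],
    "projects": [
        "Harm from high-alert medications",
        "Surgical complications",
        "Pressure ulcers",
        "CHF readmissions",
        "MRSA",
    ],
}

# The Board packet would plot the system-level measure and drivers monthly;
# project measures stay with management unless the Board asks to see them.
for key in ("system_level_measure", "drivers", "projects"):
    print(key, "->", harm_dashboard[key])
```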
Performance Improvement Tools
to use with the Board
• Run or Trend Charts
• Control Charts
Control Chart
• Statistical process control: a dynamic view
• Types of variation
– Common cause variation: points between the control limits in no particular pattern; normally expected from the process
– Special cause variation: arises from sources not inherent in the process; points outside the limits or exhibiting special patterns
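As a minimal sketch of these ideas (not from the original presentation; the monthly values and the choice of an individuals/XmR-style chart are assumptions for illustration), the Python below computes control limits from the average moving range and flags points outside the limits as possible special cause variation:

```python
# Minimal control-chart sketch with hypothetical data (illustration only).
# Individuals (XmR) chart: limits = mean +/- 2.66 * average moving range.

falls_per_1000 = [7.1, 6.8, 7.4, 6.9, 7.2, 7.0, 11.5, 7.1, 6.7, 7.3]  # hypothetical monthly fall rates

mean = sum(falls_per_1000) / len(falls_per_1000)
moving_ranges = [abs(b - a) for a, b in zip(falls_per_1000, falls_per_1000[1:])]
avg_mr = sum(moving_ranges) / len(moving_ranges)

ucl = mean + 2.66 * avg_mr  # upper control limit
lcl = mean - 2.66 * avg_mr  # lower control limit

for month, value in enumerate(falls_per_1000, start=1):
    label = "possible special cause" if value > ucl or value < lcl else "common cause"
    print(f"Month {month:2d}: {value:5.1f}  ({label})")

print(f"mean = {mean:.2f}, UCL = {ucl:.2f}, LCL = {lcl:.2f}")
```

A run chart uses the same monthly display, but with a median center line and no control limits.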
Control Charts
Example dashboard (Measure; Current Performance; Goal for 2007). Callouts on the slide note: no display over time; a mix of system and project measures; mostly comparison measures; and low standards for “Green.”
• Acute MI Core Measures: current: 6th decile National, 4th decile State; 2007 goal: 2nd State decile or above
• Congestive Heart Failure Core Measures: current: 4th decile National, 2nd decile State; 2007 goal: 2nd State decile or above
• Pneumonia Core Measures: current: 3rd decile National, 1st decile State; 2007 goal: 2nd State decile or above
• Press-Ganey Patient Satisfaction: current: 57% rate us “Excellent”; 2007 goal: statistically significant improvement, i.e. 62% “Excellent” rating
• OR Turnover Time: current: 22 minutes; 2007 goal: 15 minutes
• Falls: current: 7 per 1,000 patient days; 2007 goal: less than 5 per 1,000 patient days
• Medication Errors: current: 5.1 per 1,000 patient days (from Nurse Variance Reports); 2007 goal: less than 7 per 1,000 patient days
• Total Knee and Hip Infection Rates: current: 1.2%; 2007 goal: less than 4.1%, i.e. better (lower) than the 50th %tile for NNIS
• Surgical Site Infection Rates for Cardiac Surgery: current: 4.2%; 2007 goal: less than 10.4%, i.e. better (lower) than the 50th %tile for NNIS
• Time to answer nurse call lights on all Med/Surg Units: current: we are developing a standard measure and will report to the Board on this initiative in future meetings; 2007 goal: we are aiming to achieve significant improvement in the timeliness of response to patient concerns
Is our quality and safety getting better?
Are we going to achieve our aims?
• To answer these questions for Boards…
– The aims should be clearly displayed and understood
– A few system-level measure(s) and drivers should be
graphically displayed over time
– The measures should be displayed monthly, at worst,
and should be close to “real time”
– Measures of critical initiatives (projects that must be
executed to achieve the aim) should be available if
needed to answer the Board’s questions
Data to include:
The full Board should review the System-level Measures (Big Dots). The Board, and mainly the Board Quality Committee, should review both the System-level Measures and the Key Drivers of those Measures. Occasionally, but not often, the Board will need to see measures of Key Projects, but these are generally the responsibility of management to oversee and execute.
Common Flaws in Dashboards
• No system-level measures or aims
• Hodge-podge of system, driver, and project measures
• Static measures
• Too many measures
• Mixture of “How do we compare to others” and “are we
getting better?” measures
• Low, unclear standards for “green”
Summary of Best Practices for Quality and
Safety Dashboards for Boards
• Separate the two types of oversight questions
– How good is our quality? How do we compare to others?
– Are we getting better? Are we on track to achieve our aims?
• Ask the comparison question annually, when setting quality
and safety aims. Avoid use of comparative data to track
improvement.
• Frame your aims with reference to the theoretical ideal, and
to the “best in the world,” not to benchmarks
Summary of Best Practices for Quality and
Safety Dashboards for Boards
• Ask the ‘improvement question’ at every meeting, and track
with a dashboard that shows real-time data on system level
and driver measures displayed on run charts
• Demand that management develop annual quality and safety
aims
• Do not put project-level measures (often about one unit, disease, or department) on the Board’s dashboard, but have them prepared in case the Board asks
Data Presentation to the board
Great data presented poorly will
not achieve your goals!
Include:
– Magnitude
– Direction
– Variability
– Rate
• Quick and easy format: callouts, annotations
• Provide conclusions with your data
• Connect data to organizational strategy
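As an illustrative sketch of this advice (a run chart with a median line, a callout annotation, and the conclusion stated in the title), the matplotlib example below uses entirely hypothetical data and labels:

```python
# Hypothetical annotated run chart for a Board packet (illustration only).
import matplotlib.pyplot as plt
from statistics import median

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
harm_rate = [48, 45, 46, 41, 38, 35]  # hypothetical adverse events per 1,000 patient days
x = list(range(len(months)))
center = median(harm_rate)

fig, ax = plt.subplots(figsize=(6, 3))
ax.plot(x, harm_rate, marker="o")
ax.axhline(center, linestyle="--", label=f"Median = {center:g}")
# Callout tied to a specific data point, so the Board can see what changed and when.
ax.annotate("Trigger-tool reviews expanded", xy=(3, 41), xytext=(0.3, 47),
            arrowprops=dict(arrowstyle="->"))
# State the conclusion with the data, and connect it to the organizational aim.
ax.set_title("Harm is trending down toward the aim of 30 per 1,000 patient days")
ax.set_ylabel("Adverse events per 1,000 patient days")
ax.set_xticks(x)
ax.set_xticklabels(months)
ax.legend()
fig.tight_layout()
plt.show()
```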
Aunyika Moonan
SCHA’s Director of Quality Measurement Services
803-796-3080
[email protected]