Boards, Dashboards, and Data
From the Top: Getting the Board on Board
1-3 p.m., June 11, 2007
Boston, Massachusetts
James L. Reinertsen, M.D.
Boards ask two types of questions
about quality and safety
1. How good is our care?
─ How do we compare to others like us?
2. Is our care getting better?
─ Are we on track to achieve our key quality
and safety objectives?
─ If not, why not? Is the strategy wrong, or is it
not being executed effectively?
For all of these questions…
In God we trust.
All others bring data.
Yes, but what data?
| Purpose of Measurement | Research | Comparison or Accountability | Improvement |
| --- | --- | --- | --- |
| Key question | “What is the truth?” | “Are we better or worse than…?” | “Are we getting better?” |
| Penalty for being wrong | Misdirection for the profession | Misdirected reward or punishment | Misdirection for an initiative |
| Measurement requirements and characteristics | Complete, accurate, controlled, glacial pace, expensive | Risk adjusted, with denominators, attributable to individuals or orgs, validity | Real time, raw counts, consistent operational definitions, utility |
| Typical displays | Comparison of control and experimental populations | Performance relative to benchmarks and standards… | Run charts, control charts, time between events… |

Adapted from Solberg, Mosser, and McDonald. Jt Comm J Qual Improv. 1997 Mar;23(3):135-47.
Example of an answer to
“How good is our care?”
[Comparative report, dated October 24, 2006, showing the hospital’s performance compared to others. Note: the hospital could be “green” but still worse than the median of the comparison group.]
Another example of “How do we compare?”
[Chart: Hospital Adverse Events per 1,000 Patient Days, using the IHI Global Trigger Tool. Scale 0–150, with markers for Our Hospital (May 2007), the IHI Average, and the Current IHI Best.]
Adverse Events Include (but are not limited to):
• Allergic rash
• Excessive bleeding, unintentional trauma of a blood vessel
• Respiratory depression requiring intubation due to pain medications
• Hyperkalemia as the result of overdose of potassium
• Lethargy/shakiness associated with low serum glucose
• Drug-induced renal failure
• Surgical site infection, sepsis, infected lines, other hospital-acquired infections
• Internal bleeding following the first surgery and requiring a second surgery to stop
the bleeding
• Atelectasis, skin breakdown, pressure sores
• DVT or pulmonary embolism during a hospital stay
Source: Roger Resar, John Whittington, IHI Collaborative
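The rate on this chart is simple arithmetic: adverse events divided by patient days, scaled to 1,000. A minimal Python sketch, with invented figures (the function name and numbers are illustrative, not from the slide):

```python
def adverse_event_rate(events: int, patient_days: int) -> float:
    """Adverse events per 1,000 patient days (hypothetical helper)."""
    return events / patient_days * 1000

# Illustrative only: 120 trigger-tool events found over 2,400 reviewed
# patient days gives a rate of 50.0 per 1,000 patient days.
print(adverse_event_rate(120, 2400))  # 50.0
```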
What Boards should know about data on
“How good are we and how do we compare to others?”

Upside
• Often risk adjusted
• Apples to apples
• Source of pride
• Source of energy for improvement

Downside
• Time lag (months)
• Static (no data over time)
• If you look bad, energy is wasted on “the data must be wrong”
• If you look good, you become complacent
• How you look depends on how others perform
• Standards and benchmarks are full of defects (“the cream of the crap”)
Recommendations for Board use of
“How do we compare to others?”
1. Ask this question to help you set aims,
and perhaps annually thereafter, but
don’t use these sorts of reports to
oversee and guide improvement at each
meeting.
2. Compare to the best, not the 50th %tile
• e.g., Toyota specs
3. Always make sure you know how “Green” is determined
Boards ask two types of questions
about quality and safety
1. How good is our care?
─ How do we compare to others like us?
2. Is our care getting better?
─ Are we on track to achieve our key quality
and safety objectives?
─ If not, why not? Is the strategy wrong, or is it
not being executed effectively?
Example: Immanuel St. Joseph’s-Mayo Health System Board’s answer to the question “Is our mortality rate getting better?” (strategic aim 1.1: Satisfy Our Patients)

[Run chart: Inpatient Mortality, Immanuel St. Joseph’s, in deaths per 1,000 discharges (scale 10–40). Two series plotted monthly from Mar 2003 through Dec 2006: the monthly rate and the 12-month rolling rate, shown against a benchmark line. Annotation on the final points: “Available in January 2007!”]

8/7/2006; prepared by Immanuel St. Joseph’s-Mayo Health System Quality Resources Department
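The chart’s two series differ in how they are computed. A minimal pandas sketch of how a monthly rate and a 12-month rolling rate could be derived; the data here are synthetic, not Immanuel St. Joseph’s figures:

```python
import pandas as pd

# Synthetic monthly counts for illustration only.
months = pd.period_range("2003-01", "2006-12", freq="M")
deaths = pd.Series([20 + (i % 6) for i in range(len(months))], index=months)
discharges = pd.Series([950 + 25 * (i % 4) for i in range(len(months))], index=months)

# Monthly rate: deaths per 1,000 discharges, point by point.
monthly_rate = deaths / discharges * 1000

# 12-month rolling rate: pool a year's deaths and discharges before dividing,
# which weights each month by its volume and smooths month-to-month noise.
rolling_rate = deaths.rolling(12).sum() / discharges.rolling(12).sum() * 1000

print(monthly_rate.tail(3).round(1))
print(rolling_rate.tail(3).round(1))
```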
Is our quality and safety getting better?
Are we going to achieve our aims?
• To answer these questions for Boards…
─ The aims should be clearly displayed and understood
─ A few system-level measure(s) should be graphically
displayed over time
─ The measures should be displayed monthly, at worst,
and should be close to “real time”
─ Measures do not necessarily need to be risk adjusted
─ Measures of critical initiatives (projects that must be
executed to achieve the aim) should be available if
needed to answer the Board’s questions
The Board question “Are we going to achieve our aims?”
requires management to have a strategic theory

Big Dots (Pillars, BSC…): What are your key strategic aims? How good must we be, by when? What are the system-level measures of those aims?

Drivers (Core Theory of Strategy): Down deep, what really has to be changed, or put in place, in order to achieve each of these goals? What are you tracking to know whether these drivers are changing?

Projects (Ops Plan): What set of projects will move the Drivers far enough, fast enough, to achieve your aims? How will we know if the projects are being executed?

The ideal dashboard will display a cascaded set of measures that reflect the “theory of the strategy.”
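One way to make the cascade concrete: each Big Dot aim carries a system-level measure, each driver carries its own measure, and projects hang off the drivers. A minimal Python sketch of such a structure; the class and field names are invented for illustration, not a standard schema:

```python
from dataclasses import dataclass, field

@dataclass
class Measure:
    name: str
    monthly_values: list = field(default_factory=list)  # run-chart series

@dataclass
class Driver:
    measure: Measure
    projects: list = field(default_factory=list)  # list of project Measures

@dataclass
class BigDot:
    aim: str                 # "How good must we be, by when?"
    system_measure: Measure  # the system-level measure of the aim
    drivers: list = field(default_factory=list)  # list of Drivers

# Hypothetical instance mirroring the harm dashboard on the next slide.
harm = BigDot(
    aim="Reduce harm (illustrative aim)",
    system_measure=Measure("Global Harm Trigger Tool"),
    drivers=[
        Driver(Measure("Handwashing"),
               projects=[Measure("Harm from high-alert meds")]),
    ],
)
```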
Example Dashboard for Harm
(for 5M Lives Campaign)

[Nine small run charts, Jan–May, cascaded in three tiers:
• System-level measure: Global Harm Trigger Tool
• Drivers: Handwashing; Culture of discipline on safety rules; Teamwork
• Projects: Harm from high-alert meds; Surgical complications; Pressure ulcers; CHF readmissions; MRSA]
The full Board should review the System-level Measures (Big Dots). The Board Quality Committee should review both the System-level Measures and the Key Drivers of those Measures. Occasionally, but not often, the Board will need to see measures of Key Projects, but these are generally the responsibility of management to oversee and execute.
Common Flaws in Dashboards
• No system-level measures or aims (so it’s possible for quality and safety to get worse, and yet to achieve “green” on all the measures the Board sees!)
• Hodge-podge of system, driver, and project measures (so the Board doesn’t know what’s important)
• Static measures (so the Board has to take management’s word that “we’re on track to achieve our aims”)
• Too many measures (so the Board doesn’t understand any
of them)
• Mixture of “How do we compare to others” and “are we
getting better?” measures (so the Board doesn’t know what
questions to ask)
• Low, unclear standards for “green” (so the Board becomes
complacent despite significant opportunities for
improvement!)
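As a rough illustration, these flaws can be screened for mechanically once each measure on a dashboard is described. A hypothetical Python sketch; the field names and the ten-measure threshold are invented, not from the presentation:

```python
def dashboard_flaws(measures: list) -> list:
    """Flag the flaws listed above. Each measure is a dict assumed to have:
    level ("system" | "driver" | "project"), over_time (bool), and
    comparative (bool, i.e., a "how do we compare?" measure)."""
    flaws = []
    if not any(m["level"] == "system" for m in measures):
        flaws.append("no system-level measures or aims")
    if any(m["level"] == "project" for m in measures):
        flaws.append("project measures mixed into the Board view")
    if not all(m["over_time"] for m in measures):
        flaws.append("static measures (no data over time)")
    if len(measures) > 10:  # arbitrary illustrative threshold
        flaws.append("too many measures")
    if len({m["comparative"] for m in measures}) > 1:
        flaws.append('mixes "how do we compare?" and "are we getting better?"')
    return flaws
```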
Can you identify the flaws in the
following “dashboard?”

| Measure | Current Performance | Goal for 2007 |
| --- | --- | --- |
| Acute MI Core Measures | 6th decile national, 4th decile state | 2nd state decile or above |
| Congestive Heart Failure Core Measures | 4th decile national, 2nd decile state | 2nd state decile or above |
| Pneumonia Core Measures | 3rd decile national, 1st decile state | 2nd state decile or above |
| Press-Ganey Patient Satisfaction | 57% rate us “Excellent” | Statistically significant improvement, i.e., 62% “Excellent” rating |
| OR Turnover Time | 22 minutes | 15 minutes |
| Falls | 7 per 1,000 patient days | Less than 5 per 1,000 patient days |
| Medication Errors | 5.1 per 1,000 patient days (from Nurse Variance Reports) | Less than 7 per 1,000 patient days |
| Total Knee and Hip Infection Rates | 1.2% | Less than 4.1%, i.e., better (lower) than 50th %tile for NNIS |
| Surgical Site Infection Rates for Cardiac Surgery | 4.2% | Less than 10.4%, i.e., better (lower) than 50th %tile for NNIS |
| Time to answer nurse call lights on all Med/Surg units | We are developing a standard measure, and will report to the Board on this initiative in future meetings | We are aiming to achieve significant improvement in timeliness of response to patients’ concerns |
The same “dashboard” again, with its flaws called out:
• No display over time
• Mix of system, driver, and project measures
• Mostly comparison measures
• Low standards for “Green”
Summary of Best Practices for Quality
and Safety Dashboards for Boards
• Separate the two types of oversight questions
─ How good is our quality? How do we compare to others?
─ Are we getting better? Are we on track to achieve our aims?
• Ask the comparison question annually, when setting quality
and safety aims. Avoid use of comparative data to track
improvement.
• Frame your aims with reference to the theoretical ideal, and to the “best in the world,” not to benchmarks
• Ask the “improvement question” at every meeting, and track it with a dashboard that shows real-time data on system-level and driver measures displayed on run charts
• Demand that management develop a “theory of the strategy” to achieve the annual quality and safety aims
• Do not put project-level measures (often about one unit, disease, or department) on the Board’s dashboard