Transcript Slide 1

Copyright © NCI 2009 Confidential and proprietary,
shall not be transferred or distributed
Understanding Medicare's Quality Indicators
Kristen Geissler, MS, PT, MBA, CPHQ
Associate Director, Navigant Consulting, Inc.
May 15, 2009
Table of contents
Section 1 » Background and Measures
Section 2 » Abstraction 101
Section 3 » Coding Guidelines versus HQA Spec Manual
Section 4 » Concurrent Review of Quality Indicators
Section 5 » Takeaways
Section 6 » Questions
Section 1
Background and Measures
Sponsorship

Regulatory:
– Sponsored by federal or state agencies:
  – Hospital Quality Alliance—Hospital Compare
– Usually consistent, consensus-built standards and definitions

Private/for-profit:
– Sponsored by non-governmental agencies:
  – HealthGrades®
  – U.S. News & World Report
– Methodology and definitions often not consensus-built and may not be public or reproducible
Background of quality measures in hospitals

Several different types of quality measures:
– Process: Was a specific recommendation done?
  – “Evaluation of left ventricular function”
– Outcome: What happened with the patient?
  – “30-day mortality of patients with pneumonia”
  – Much more complex, as risk adjustment must be used
– Patient-reported:
  – “HCAHPS Patient Perception Survey”
– Facility-reported:
  – Hospital infection rates, fall rates
  – Concerns with hospitals using different measurement methodology and intensity of review
Background of quality measures in hospitals

Two very different types of quality reporting:
– “Active” data capture and transmittal:
  – Used for Core Measures—abstractor reviews each chart for specific data elements
  – Very complex data abstraction rules
– “Passive” data retrieval:
  – Used by agencies such as HealthGrades
  – Used for CMS mortality measures and several new measures for FY10—readmission and AHRQ Patient Safety Indicators
  – Based strictly on administrative/coding data
Background of quality measures in hospitals

– The Joint Commission has been requiring hospitals to collect and submit data on Core Measures since 2001
– The Department of Health and Human Services (HHS) announced the Quality Initiative in 2001
– Now called the Hospital Quality Alliance, a collaborative effort among multiple organizations: The Joint Commission, CMS, AHA, and NQF
– Data is published on The Joint Commission’s Web site under “Quality Check” and on CMS’ “Hospital Compare”
– The CMS-specific program related to the IPPS (Inpatient Prospective Payment System) is called Reporting Hospital Quality Data for Annual Payment Update (RHQDAPU), pronounced ‘rackdapoo’
– Many current state and other payer efforts publish this and other data
CMS—Value-Based Purchasing

– CMS currently has a ‘pay-to-report’ model (RHQDAPU); a simple arithmetic sketch follows this list:
  – Hospitals that do not report data on the required measures forfeit 2 percentage points of the market basket update
– Will change to “pay-to-perform” with VBP (Value-Based Purchasing):
  – Payments to achieve and exceed
  – Possible penalties for low performance
– Possible timeline (as outlined in the November 2007 proposal to Congress):
  – FY2009—current RHQDAPU requirement of reporting 27 measures
  – FY2010—VBP: 100% based on public reporting
  – FY2011—VBP: 50% based on public reporting; 50% based on performance
  – FY2012—VBP: 100% based on performance
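To make the pay-to-report stakes concrete, here is a minimal arithmetic sketch. The dollar base and the market basket percentage are invented for illustration and are not actual CMS figures; only the 2-percentage-point reduction comes from the slide above.

    # Hypothetical arithmetic only: effect of the RHQDAPU reporting penalty on a
    # hospital's annual payment update. All dollar and percentage values below are
    # assumptions for illustration, not actual CMS amounts.
    base_payment = 10_000_000        # assumed annual IPPS payments
    market_basket_update = 0.034     # assumed full market basket update (3.4%)
    reporting_penalty = 0.02         # 2 percentage points withheld for non-reporting

    full_update = base_payment * (1 + market_basket_update)
    reduced_update = base_payment * (1 + market_basket_update - reporting_penalty)

    print(f"Reporting hospital:     ${full_update:,.0f}")
    print(f"Non-reporting hospital: ${reduced_update:,.0f}")
    print(f"Payment at risk:        ${full_update - reduced_update:,.0f}")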
Publicly reported quality data

Organization                          Web site
CMS                                   www.hospitalcompare.hhs.gov
The Joint Commission                  www.qualitycheck.org
HealthGrades®                         www.healthgrades.com
The Leapfrog Group                    www.leapfroggroup.org
Local state or regional initiatives   Search the Internet
Documentation and quality measures

Core Measures focus on quality-of-care processes (versus outcomes):
– Did we assess?
– Did we prescribe?
– Did we administer (a medication)?

Many times, this becomes more of a documentation issue.
FY 2009 quality measures

Heart Attack:
– AMI—1 - Aspirin at arrival
– AMI—2 - Aspirin prescribed at discharge
– AMI—3 - ACE inhibitor or ARB for left ventricular systolic dysfunction
– AMI—4 - Adult smoking cessation advice/counseling
– AMI—5 - Beta-blocker prescribed at discharge
– AMI—6 - Beta-blocker at arrival (to be retired after March 31, 2009)
– AMI—7a - Fibrinolytic agent received w/in 30 minutes of hospital arrival
– AMI—8a - Primary percutaneous coronary intervention (PCI) received w/in 90 minutes of hospital arrival

Heart Failure:
– HF—1 - Discharge instructions
– HF—2 - Left ventricular function assessment
– HF—3 - ACE inhibitor or ARB for left ventricular systolic dysfunction
– HF—4 - Adult smoking cessation advice/counseling
FY 2009 quality measures (continued)

Pneumonia:
– PN—1 - Oxygenation assessment (retired after 1/1/09)
– PN—2 - Pneumococcal vaccination status
– PN—3b - Blood culture performed before first antibiotic received in hospital
– PN—4 - Adult smoking cessation advice/counseling
– PN—5c - Initial antibiotic received w/in six hours of hospital arrival
– PN—6 - Appropriate initial antibiotic selection
– PN—7 - Influenza vaccination status

Surgical Care Improvement Project:
– SCIP—Inf-1 - Prophylactic antibiotic received w/in one hour prior to surgical incision
– SCIP—Inf-2 - Prophylactic antibiotic selection for surgical patients
– SCIP—Inf-3 - Prophylactic antibiotics discontinued w/in 24 hours after surgery end time
– SCIP—Inf-4 - Cardiac surgery patients w/ controlled 6AM postoperative serum glucose
– SCIP—Inf-6 - Surgery patients w/ appropriate hair removal
– SCIP—VTE-1 - Venous thromboembolism (VTE) prophylaxis ordered for surgery patients
– SCIP—VTE-2 - VTE prophylaxis w/in 24 hours pre/post surgery
FY 2009 quality measures (continued)

Mortality:
– AMI—30-day mortality—Medicare patients
– Heart failure—30-day mortality—Medicare patients
– Pneumonia—30-day mortality—Medicare patients

Patient Experience:
– HCAHPS patient survey
What does the public see?
Hospital Compare (CMS)—Core Measures
www.hospitalcompare.hhs.gov – accessed 1/27/09
What does the public see?
Hospital Compare (CMS)—Mortality Measures
www.hospitalcompare.hhs.gov – accessed 1/27/09
FY 2010 quality measures—added

Surgical Care Improvement Project:
– SCIP—Card 2 - Surgery patients on a beta-blocker prior to arrival who received a beta-blocker during the perioperative period (January 1, 2009, discharges)

Readmission Measures:
– Heart failure 30-day risk-standardized readmission (Medicare claims only)
– AMI and PN 30-day readmission will likely be finalized for FY10 at a later date
– Claims data for July 1, 2007–June 30, 2008

Nursing Sensitive Measures:
– Failure to rescue—Medicare claims only
– Claims data for July 1, 2007–June 30, 2008

Cardiac Surgery Measures:
– Participation in a Systematic Database for Cardiac Surgery
– Data collection window: July 1, 2009–August 15, 2009
FY 2010 quality measures (continued)

AHRQ—Patient Safety Indicators (PSI) and Inpatient Quality Indicators (IQI):
– Abdominal aortic aneurysm (AAA) mortality rate (with or without volume)
– Hip fracture mortality rate
– Mortality for selected medical conditions (composite)
– Mortality for selected surgical procedures (composite)
– Complication/patient safety for selected indicators (composite)
– Death among surgical patients with treatable serious complications
– Iatrogenic pneumothorax, adult
– Postoperative wound dehiscence
– Accidental puncture or laceration
– Medicare claims only (only for FY10—will change to all-payer beyond FY10)
– Medicare claims data for July 1, 2007–June 30, 2008
A word on risk adjustment

What measures need to be risk adjusted?
– Outcome measures (mortality, complications, readmissions)

What is the methodology?
– Many different methodologies:
  – APR-DRG™ (3M)
  – HealthGrades™ proprietary
  – CMS mortality risk adjustment

How is risk adjustment used?
– Actual rate versus expected rate (see the sketch after this list):
  – Adjusts for complexity of care and severity of illness
  – i.e., sicker patients would have a higher expected rate of mortality

What’s the bottom line?
– The more accurately the coding/claims data represent the severity of illness of the patient, the more accurately the risk-adjustment methodology will be applied
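As a rough illustration of “actual rate versus expected rate,” the sketch below computes an observed-to-expected (O/E) mortality ratio from hypothetical per-patient expected risks. The numbers and the simple averaging are assumptions for illustration only; they are not the APR-DRG, HealthGrades, or CMS methodology.

    # Illustrative sketch only: an observed-to-expected (O/E) mortality ratio.
    # Per-patient expected risks would come from a risk-adjustment model
    # (severity of illness, comorbidities); the values below are invented.
    expected_risk = [0.02, 0.10, 0.35, 0.05, 0.20]   # hypothetical expected probability of death
    died = [0, 0, 1, 0, 0]                           # hypothetical observed outcomes (1 = died)

    observed_rate = sum(died) / len(died)
    expected_rate = sum(expected_risk) / len(expected_risk)
    oe_ratio = observed_rate / expected_rate         # > 1.0 suggests worse than the case mix predicts

    print(f"Observed {observed_rate:.1%}, expected {expected_rate:.1%}, O/E = {oe_ratio:.2f}")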
Section 2
Abstraction 101
Understanding the abstraction process

Abstraction process:
– Cases are selected for review based on principal ICD-9 diagnosis/procedure (see the selection sketch after this slide)
– Entire inpatient medical record must be reviewed
– Data elements (demographic and measure-related) must be abstracted for each eligible case:
  – Up to 35 measure-related elements per record
– Data elements are entered into hospital software (a JC-accredited performance measurement program)
– Software determines exclusions from a measure set or individual measure

Quality control:
– Very complex set of abstraction rules issued nationally:
  – Updated/changed every April and October
– CMS audits five random cases (across all measure sets) each quarter:
  – Hospitals must achieve 80% agreement
– Joint Commission requires reconciliation of the principal diagnosis count for measure sets each quarter
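A minimal sketch of the first step above: software flagging discharges for abstraction by principal ICD-9 diagnosis code. The record layout and the code-to-measure-set mapping are assumptions made up for illustration; the real selection logic lives in the Specification Manual and the hospital’s performance measurement software.

    # Minimal, hypothetical sketch of case selection for Core Measure abstraction.
    # The ICD-9 codes, measure-set mapping, and record fields are illustrative only.
    MEASURE_SET_BY_PRINCIPAL_DX = {
        "410.01": "AMI",   # acute myocardial infarction
        "428.0": "HF",     # heart failure
        "486": "PN",       # pneumonia
    }

    discharges = [
        {"mrn": "0001", "principal_dx": "428.0"},
        {"mrn": "0002", "principal_dx": "715.90"},  # not in any Core Measure set
        {"mrn": "0003", "principal_dx": "486"},
    ]

    # The software flags eligible cases; an abstractor then reviews the entire record.
    for record in discharges:
        measure_set = MEASURE_SET_BY_PRINCIPAL_DX.get(record["principal_dx"])
        if measure_set:
            print(f"MRN {record['mrn']}: abstract for the {measure_set} measure set")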
Quality review and abstraction process

Abstraction on discharge:
1. Software determines what records to review based on ICD-9 codes
2. Quality abstractor reviews entire record and abstracts data
3. Abstracted data are entered into the software system
4. Software system transmits data to The Joint Commission and to the federal data clearinghouse

Note: Newly abstracted data may contradict findings from concurrent review. The Spec Manual has very specific rules on contradicting information.
Quality indicator challenges

– No certification or competency criteria for quality indicator abstractors
– Very complex abstraction rules:
  – Specification Manual is made up of many different sections
  – Rare to find a single-source answer to a question
– Limitations on documentation clarifications that can be added to the record after discharge
– Some quality indicator definitions/guidelines contradict coding guidelines
– If an incorrect principal diagnosis/working DRG is applied during admission, the wrong Core Measure set may be applied:
  – Pneumonia versus Heart Failure
– Even if the record is reviewed concurrently, the entire record must be reviewed on discharge
– Many ‘precedents’ must be understood in order to determine whether a measure was met:
  – ACEI/ARB for left ventricular systolic dysfunction: How does HQA define ‘left ventricular systolic dysfunction’?
Abstraction resources

Specification Manual:
– www.qualitynet.org—go to the “Hospital Inpatient” tab, and then “Specification Manual”, or
  http://qualitynet.org/dcs/ContentServer?cid=1141662756099&pagename=QnetPublic%2FPage%2FQnetTier2&c=Page
Abstraction resources

Specification Manual:
– Pick the appropriate discharge time period
Abstraction resources

Specification Manual:
– Different sections of the manual
Abstraction resources

Specification Manual:
– Qnet Quest—FAQs
Section 3
Coding guidelines versus
HQA Spec Manual
From the physician’s perspective …

Some examples of the many things we ask of our providers:

Joint Commission standards:
– Critical lab values
– Verbal orders
– Clinical indications for diagnostic tests

AHA/CMS/OIG inpatient coding guidelines for reimbursement:
– Acuity/specificity
– Spelling out up/down arrows
– Linking diagnoses to cause

Hospital Quality Alliance:
– Contraindications to medications
– Timing of antibiotics
– Discontinuation of antibiotics

CMS Professional Fee coding guidelines for reimbursement:
– Order for the consultation
– Complexity of care
Examples of conflicts between quality indicators and coding guidelines

1. Quality indicator rule: Left ventricular function may be derived directly from an echo report (does not have to be redocumented by the treating physician)
   Coding rule: Left ventricular function must be documented by a treating physician in order to be coded

2. Quality indicator rule: Patient’s smoking status and evidence of smoking cessation education may be derived from nursing notes
   Coding rule: Patient’s smoking status must be documented by a physician to be coded
Quality versus coding

Examples of documentation that may be provided by someone other than a treating physician in Quality Indicators:

Nursing:
– Medication administration or prescription on discharge
– Medication allergies
– Participation in clinical trials
– Smoking history & evidence of smoking cessation education
– Discharge instructions
– Vaccination status
– Blood culture collection

Non-treating physician:
– EF from echo report
– ST-segment elevation from ECG report
– Chest x-ray results
– Culture results from pathology report
From the physician’s perspective

– Collaborate with quality and accreditation staff on physician education
– Ensure multiple points of review for new medical record forms or reminder sheets:
  – Does it meet the needs of many different stakeholders?
  – Does it contradict any stakeholder’s guidelines?
– Encourage physician advisors to collaborate with one another
– Physicians and other providers need to hear one message
Section 4
Concurrent Review of Quality Indicators
Quality indicator concurrent review models

Possible models:
– Target review population (i.e., just Medicare) OR all payers
– All quality indicators OR a targeted selection of indicators
– All CDSs review for quality indicators OR specialty CDSs review for quality indicators

Concurrent review flow:
1. Concurrent reviewer reviews the record based on some trigger criteria (admitting dx, CM report, etc.)
2. Based on the ‘presumed’ final diagnosis, the chart is reviewed for quality indicators
3. Reviewer may query the physician for specific quality indicator criteria
4. Reviewer may or may not document findings on a worksheet or in software
5. Patient discharged
Pros/cons of CDSs performing quality indicator concurrent review

Opportunities:
– Decreases the number of reviewers interfacing with physicians
– Many top query opportunities are also Core Measure sets (HF, PN, AMI)
– Appropriate identification of principal diagnosis before discharge

Challenges:
– Increased material to learn
– Contradicting rules
– Continued competence will require updates every six months
– May be a disconnect in target population (i.e., Medicare versus all payers)
Section 5
Takeaways
What can you do Monday morning?

– Look at your quality scores online.
– Introduce yourself to the quality abstractors.
– Set up an education session or lunch & learn with the quality team.
– Find out what tools the quality team uses to help physicians with documentation:
  – Checklists
  – Reminder stickers
  – Preprinted progress notes
  – Order sets
– Meet with your physician advisor and discuss opportunities.
– COLLABORATE!
Section 6
Questions