Data Quality Toolbox for Registrars

MCSS Workshop
December 9, 2003
Elaine Collins
Quality Data Toolbox
• Artisan – Registrar
• Medium – Computerized data
• Raw Materials – Medical information
• Shaping tools – Knowledge, skills
• Directions – Standards
• Measuring tools – Editing “tools”
• Final Product – Cancer record
• Goodness – Match to standards
Quality Data - Goodness
• Accurate
• Consistent
• Complete
• Timely
• Maintains shape across transformation and transmission
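To make the criteria concrete, here is a minimal Python sketch (not from the presentation) of how completeness and timeliness might be measured over a batch of records; the field names, the unknown-value codes, and the six-month reporting window are assumptions for illustration only.

```python
from datetime import date

# Hypothetical abstracted records; field names and codes are illustrative only.
records = [
    {"grade": "9", "laterality": "1",
     "date_dx": date(2000, 3, 14), "date_reported": date(2000, 7, 2)},
    {"grade": "2", "laterality": "",
     "date_dx": date(2000, 5, 20), "date_reported": date(2001, 1, 15)},
]

CHECK_FIELDS = ["grade", "laterality"]   # fields counted toward completeness
UNKNOWN = {"", "9"}                      # blank or "unknown" codes
TIMELY_DAYS = 180                        # assumed six-month reporting window

known = sum(1 for r in records for f in CHECK_FIELDS if r[f] not in UNKNOWN)
completeness = known / (len(records) * len(CHECK_FIELDS))

timely = sum(1 for r in records
             if (r["date_reported"] - r["date_dx"]).days <= TIMELY_DAYS)
timeliness = timely / len(records)

print(f"completeness: {completeness:.0%}  timeliness: {timeliness:.0%}")
```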
Measuring Tools
• Reabstracting studies
• Structured queries and visual review
• Text editing
• EDITS
• MCSS routine review
Exercises
• MCSS reabstracting study – 2003
• Sites: Breast, Corpus uteri, Lung,
Melanoma, Testis, Soft tissue sarcoma
• 2000 diagnosis year
• 12 facilities
• Review of reported data – Structured query
• Review of reported data – Text editing
Reabstracting Studies
• Compares original medical record with
reported cancer record
• Considered the “gold standard”
• Labor-intensive; all records used at initial
abstracting may not be available; biased by
reabstractor’s training and skills
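As an illustration of the comparison a reabstracting study makes, a short Python sketch of field-by-field agreement between the reported record and the reabstracted record; the field names and code values are hypothetical.

```python
# Hypothetical reported vs. reabstracted values for one case.
reported     = {"primary_site": "C509", "histology": "8500", "grade": "2"}
reabstracted = {"primary_site": "C504", "histology": "8500", "grade": "2"}

discrepancies = {f: (reported[f], reabstracted[f])
                 for f in reported if reported[f] != reabstracted[f]}

agreement = 1 - len(discrepancies) / len(reported)
print(f"agreement: {agreement:.0%}  discrepancies: {discrepancies}")
```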
Structured Queries
• Compares coding across series of records
sorted by selected characteristics
• Useful for finding pattern discrepancies
across many records
• Manual process; some comparisons may be
converted to automated edits
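A sketch of the structured-query idea, under the assumption that reported cases are available as a CSV file named reported_cases.csv with columns such as facility_id, primary_site, and grade (all hypothetical here): group the records by a selected characteristic and print each facility's coding pattern for visual review.

```python
import csv
from collections import Counter, defaultdict

# Tally how each facility coded "grade" for breast primaries (site C50x).
by_facility = defaultdict(Counter)
with open("reported_cases.csv", newline="") as f:
    for row in csv.DictReader(f):
        if row["primary_site"].startswith("C50"):
            by_facility[row["facility_id"]][row["grade"]] += 1

# One line per facility, so an outlying coding pattern stands out on review.
for facility, grades in sorted(by_facility.items()):
    pct_unknown = grades.get("9", 0) / sum(grades.values())
    print(f"{facility}: {dict(grades)}  unknown grade: {pct_unknown:.0%}")
```

A facility whose share of unknown grades is far from the others' would be a candidate for follow-up, and a comparison like this one could later be converted to an automated edit.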
Text Editing
• Compares text with coded values for
individual records
• Useful for immediately identifying coding
problems
• Manual process; most effective on
completion of each individual case
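A sketch of the text-versus-code comparison meant here, assuming the abstract's text accompanies the coded record; the laterality codes and the keyword test are simplifications for illustration.

```python
# Flag a case where the pathology text and the coded laterality disagree.
LATERALITY = {"1": "right", "2": "left"}   # simplified code meanings

case = {
    "laterality": "2",
    "path_text": "Right breast, upper outer quadrant: infiltrating ductal carcinoma.",
}

side = LATERALITY.get(case["laterality"])
if side and side not in case["path_text"].lower():
    print(f"review: text does not mention '{side}' but laterality is coded {case['laterality']}")
```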
EDITS
• Checks range validity for many fields and comparability of a few fields for individual records
• Automated process; can be applied on completion of each record or on preparation of a batch report; warnings and overrides are alternatives to failures
• Expansion of interfield edits requires careful logic
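Roughly what a single-field range edit and an interfield edit check, sketched in Python; the allowable codes, the field names, and the warning-versus-failure split are assumptions for illustration, not the actual EDITS metafile logic.

```python
def run_edits(rec):
    """Return (failures, warnings) for one record."""
    failures, warnings = [], []

    # Range edit: single-field code validity.
    if rec["sex"] not in {"1", "2", "3", "4", "9"}:
        failures.append("Sex: code out of range")

    # Interfield edit: comparability of two fields; a warning invites an over-ride.
    if rec["sex"] == "1" and rec["primary_site"].startswith("C50"):
        warnings.append("Sex vs. Primary Site: male with breast primary - verify")

    return failures, warnings

failures, warnings = run_edits({"sex": "1", "primary_site": "C504"})
print("failures:", failures)
print("warnings:", warnings)
```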
Edits Analysis
• Edits to be included in MCSS Set
• Edits in Hospital/Staging Edit Sets – C edits are
included in confidential data set
• No Text Edits displayed
• Criteria
– Valid codes/dates
– Alpha/numeric
– Timing
– Interfield comparisons
– Absolute conditions
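The criteria above correspond to checks like the following sketch of a valid-date test and a timing comparison; the YYYYMMDD format and the rule that diagnosis cannot precede birth are used here only as examples.

```python
from datetime import datetime

def parse_date(value):
    """Return a date for a YYYYMMDD string, or None if it is not a valid date."""
    try:
        return datetime.strptime(value, "%Y%m%d").date()
    except ValueError:
        return None

rec = {"date_of_birth": "19450230", "date_of_dx": "20000615"}

dob = parse_date(rec["date_of_birth"])
dx = parse_date(rec["date_of_dx"])

if dob is None:
    print("Date of Birth: not a valid date")      # valid codes/dates
if dob and dx and dx < dob:
    print("Date of Dx precedes Date of Birth")    # timing comparison
```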
MCSS Review
• Requests values for missing or unknown data; resolves conflicts between data items from multiple facilities and between data items updated by a single facility
• Allows incorporation of information from
multiple facilities
• Review for limited number of conditions
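A sketch of the consolidation step behind this review, under the simplifying assumptions that a single known code outranks unknown codes and that conflicting known codes are set aside for manual review; the field names and unknown-value codes are illustrative.

```python
UNKNOWN = {"", "9", "999"}   # assumed unknown-value codes

def consolidate(field, reports):
    """Pick one value for a field from several facility reports."""
    known = {r[field] for r in reports if r[field] not in UNKNOWN}
    if len(known) == 1:
        return known.pop()   # a single known value outranks any unknowns
    if not known:
        return "9"           # still unknown after all reports
    return None              # conflicting known values: manual review

reports = [{"facility": "A", "grade": "9"}, {"facility": "B", "grade": "3"}]
print("consolidated grade:", consolidate("grade", reports))   # -> 3
```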
Same Discrepancies Found on Different Reviews

Review          CANCER   EXTENT   STAGE   SURGERY
Reabstracting      216      155     275       149
Visual              99      110     159        66
Text                79       74      77        42
EDITS                0       16       1         5
MCSS                22        4       4         0
Cancer Registrar – Resource for Quality Data

[Diagram: the registrar linked with the medical record, facility system, facility staff, committees, physicians, patients, the central registry, other registries, protocols, cancer research, cancer control, quality monitors, the public, and the standards bodies and programs ICD-O, AJCC, SEER, COC, CDC, NCDB, and NAACCR.]
Data Inputs
• Patient data from facility systems
• Medical record reports and notes
• Pathology reports
• Staging forms
• Communication with physician offices
• Communication with other registries
• Communication with patients
Process Inputs
• Registrar training, knowledge, skills
• Coding standards – ICD-O-3, COC, AJCC,
SEER, NAACCR
• Interpretations of standards – I&R, SEER
Inquiry, Ask NAACCR
• Medical literature – printed and online
• Registry software data implementations
Sources of Error
• Patient data from facility systems
• Medical record reports and notes
• Pathology reports
• Staging forms
• Communication with physician offices
• Communication with other registries
• Communication with patients
Sources of Error
• Registrar training, knowledge, skills
• Coding standards – ICD-O-3, COC, AJCC,
SEER, NAACCR
• Interpretations of standards – I&R, SEER
Inquiry, Ask NAACCR
• Medical literature – printed and online
• Registry software data implementations
Types of Errors
• Missing/conflicting data
• Shared data errors
• Timing/coding errors
• Standards and interpretations – ambiguities, omissions, confusions, contradictions
• Discrepancies among local/central registry
practice and national standards
Software Implementations
• Discrepancies between implementations and
national standards
• Lack of registrar knowledge/training on
correspondence between registry and
exported data
• Logic errors in matching registry data to
reporting formats
• Conversion errors
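One way to catch mapping and conversion errors of this kind (a sketch, using a fixed-column export layout invented for this example): re-read the exported record and compare each field against the value stored in the registry.

```python
# Assumed fixed-column export layout: (field, start, length), 0-based positions.
LAYOUT = [("primary_site", 0, 4), ("laterality", 4, 1), ("grade", 5, 1)]

registry_record = {"primary_site": "C504", "laterality": "2", "grade": "3"}
exported_line = "C504" + "2" + "9"   # grade mis-mapped during export

for field, start, length in LAYOUT:
    exported = exported_line[start:start + length]
    if exported != registry_record[field]:
        print(f"{field}: registry '{registry_record[field]}' vs. export '{exported}'")
```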
AJCC Staging Dilemma
• Are pathologic nodes required for
pathologic stage grouping?
• How do Minnesota registrars answer this
question?
Clinical/Pathologic Staging in Study

[Table: case counts for each study site (breast, corpus uteri, lung, melanoma, testis, soft tissue sarcoma) by staging combination: a single stage group with each mix of clinical and pathologic T, N, and M (cTcNcM, cST through pTpNpM, pST), clinical/pathologic stage-group availability (c99 p99, cST p99, c99 pST, cST pST), no staging, and two stage groups.]
Collaborative Staging
• Provides specific rules for coding known vs
unknown staging elements
• Accommodates “best” stage for AJCC stage
assignment
AHIMA 75th Annual Conference
October 2003, Minneapolis:
Coming Events
• Data mining
• ICD-10-CM
• SNOMED
• Natural language processing
AHIMA 75th Annual Conference
October 2003, Minneapolis:
Challenges
• What is our professional purpose?
• How do we envision ourselves as
professionals?
Foundation for Quality Data
• Registrar’s commitment to registry purpose
• Registrar’s knowledge, understanding of
cancer data
• Registrar’s management of communication
technologies
• Registrar’s advocacy for data use
SUMMARY
• Consistent recording and reporting of
quality cancer data requires commitment.
• Routine and regular review of data patterns
facilitates data knowledge and quality.
• Passing EDITS assists but does not ensure
data quality.
• Data standards change; use the manuals.
• Welcome Collaborative Stage.