Chapter 8
Assessing & Reducing the Human Error Risk
Human error by James Reason
Presented by:
Billy Greenwell &
Elisabeth Strunk
Probabilistic Risk Assessment
Models plant functions as trees:
Event trees – What could happen if…?
Fault trees – How can this failure occur?
Two objectives:
1. Identify potential areas of significant risk
and suggest improvements.
2. Quantify overall risk from plant operation.
[Figures: two tongue-in-cheek examples. First, an event tree: the initiating event "Vampire is stalking you" branches through three defenses in turn (crucifix, garlic, wooden stake), each either available or not available and, if used, either succeeding or failing; only a successful defense avoids the outcome "Bitten by Vampire." Second, a fault tree for the top event "Bitten by Vampire," whose contributing causes include: crucifix missing, crucifix fails, vampire immune; garlic fails; wooden stake fails, stake misses heart, stake/mallet missing.]
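The vampire event tree can be quantified by multiplying branch probabilities along each path. A minimal sketch (all probabilities are invented for illustration, not from the slides):

```python
# Illustrative event-tree quantification for the vampire example.
# All probability values are made-up assumptions.

# P(defense is available) and P(success | available) for each defense.
defenses = {
    "crucifix":     {"available": 0.9, "success": 0.8},
    "garlic":       {"available": 0.7, "success": 0.6},
    "wooden_stake": {"available": 0.5, "success": 0.9},
}

def p_defense_fails(name):
    """A defense fails if it is missing, or is used but fails."""
    d = defenses[name]
    return (1 - d["available"]) + d["available"] * (1 - d["success"])

# The bad outcome occurs only if every defense in sequence fails
# (branches assumed independent, as PRA often does).
p_bitten = 1.0
for name in defenses:
    p_bitten *= p_defense_fails(name)

print(f"P(bitten by vampire) = {p_bitten:.4f}")
```

With these numbers, even three individually weak defenses in series drive the bad-outcome probability below 9%.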
PRA Procedure
1. Identify sources of potential hazard.
2. Identify initiating events that could lead to
this hazard.
3. Establish possible sequences from initiating
events using event trees.
4. Quantify each event sequence.
Frequency of initiating event
Probability that safety systems fail when needed.
5. Determine overall plant risk.
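Steps 4-5 reduce to simple arithmetic once the inputs are fixed: each sequence's risk is its initiating-event frequency times the probability the safety systems fail, and overall plant risk is the sum. A sketch with invented sequence names and numbers:

```python
# Sketch of PRA steps 4-5: quantify each event sequence, then sum
# for overall plant risk. All frequencies/probabilities are illustrative.
sequences = [
    # (name, initiating-event frequency per year, P(safety systems fail))
    ("loss of coolant",       1e-3, 1e-2),
    ("loss of offsite power", 1e-1, 1e-4),
    ("internal fire",         1e-2, 1e-3),
]

overall_risk = 0.0
for name, freq, p_fail in sequences:
    seq_risk = freq * p_fail          # expected bad outcomes per year
    overall_risk += seq_risk
    print(f"{name}: {seq_risk:.2e}/yr")

print(f"overall plant risk = {overall_risk:.2e}/yr")
```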
PRA Shortcomings
Helps identify where additional safety
mechanisms are needed, but…
Events often assumed independent.
Neglects common-mode failures.
Quantification can be difficult.
Does not account for human failures.
Human Reliability Analysis
Attempts to quantify
likelihood of human
error in various
situations.
Magical HRA
number can then be
plugged into PRA.
A Jungle of Cute Names
THERP
OATS
HCR
TESEO
Confusion Matrix
SLIM
SHARP
Technique for Human Error
Rate Prediction (THERP)
Most popular means of performing HRA.
Models people as pieces of equipment.
You either perform or you screw up.
Probability trees model various tasks
with success/fail outcomes for each.
Performance-shaping factors can influence
success/failure probabilities.
THERP Procedure
1. Identify system functions susceptible to
human error.
2. List and analyze related human operations.
3. Estimate error probabilities using expert
judgment and available data.
Data for basic errors given in tables.
Analyst can adjust probabilities with PSFs.
4. Estimate effects of human errors on system
failure events.
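THERP's "people as equipment" view can be sketched directly: each human action carries a nominal human error probability (HEP), a performance-shaping factor scales it, and the task's probability tree multiplies the step outcomes. The task names, nominal HEPs, and the PSF multiplier below are illustrative assumptions, not values from the THERP tables:

```python
# THERP-style sketch: each action either succeeds or fails; a
# performance-shaping factor (PSF) scales the nominal error probability.
# Nominal HEPs and the stress multiplier are invented for illustration.

nominal_hep = {"read_gauge": 0.003, "operate_valve": 0.001}
psf_stress = 2.0  # assumed: moderate stress doubles each error probability

def adjusted_hep(task):
    return min(1.0, nominal_hep[task] * psf_stress)

# Task succeeds only if every step succeeds (steps assumed independent).
p_success = 1.0
for task in nominal_hep:
    p_success *= 1 - adjusted_hep(task)

p_task_failure = 1 - p_success
print(f"P(task failure) = {p_task_failure:.4f}")
```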
THERP Shortcomings
Focuses on procedural errors that
occur prior to an accident/incident.
Expert judgment is highly variable.
More of an art than a science.
Operator Action Trees (OATS)
Introduced notion of cognitive error.
Cognitive errors typically occur after an
accident/incident has occurred.
Failure to detect accident event.
Failure to diagnose event & devise remedy.
Failure to implement remedy correctly.
Time-Reliability Curves
Model probability an accident will persist
beyond a certain time interval.
tT = tO − tI − tA, where:
tT = time available for thinking
tO = time available for fixing the problem
tI = time before symptoms are displayed
tA = time taken to execute the remedy
How do we choose these values?
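A small sketch of how such a curve gets used. The lognormal form of the non-response curve is a common modeling assumption (not stated on the slide), and every parameter value here is invented:

```python
import math

# Time-reliability sketch. The lognormal non-response curve and all
# parameter values are illustrative assumptions.

t_o = 30.0   # minutes available to fix the problem
t_i = 5.0    # minutes before symptoms are displayed
t_a = 10.0   # minutes needed to execute the remedy

t_t = t_o - t_i - t_a   # time actually available for thinking/diagnosis
print(f"time available for thinking: {t_t} min")

# P(crew has not yet diagnosed the event by time t), assumed lognormal
# with an invented median diagnosis time of 8 min and shape sigma = 1.0.
median, sigma = 8.0, 1.0
def p_no_response(t):
    z = (math.log(t) - math.log(median)) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2))   # lognormal survival function

print(f"P(no diagnosis within {t_t} min) = {p_no_response(t_t):.3f}")
```

Which is exactly the problem the slide raises: the answer is only as good as the guessed median and shape parameters.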
OATS Shortcomings
Relies on “best guesses” to fill in values
for quantification curve equation.
Uses a single time-reliability curve for all
cognitive activities (skill-, rule-, & knowledge-based).
HCR addresses this issue, but doesn’t
actually model error.
TESEO
Yields error probability as the product of
five parameters:
K1 – type of activity (0.001..0.1)
K2 – temporary stress factor (routine: 0.5..10, non-routine: 0.1..10)
K3 – operator qualities (0.5..3)
K4 – activity anxiety factor (1..3)
K5 – activity ergonomic factor (0.7..10)
Now we have lots of things to guess about…
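The TESEO arithmetic itself is trivial; the guessing is in picking the Ks. A sketch with illustrative values chosen from the stated ranges:

```python
# TESEO sketch: error probability is the product of the five K factors.
# The specific values below are illustrative picks, not recommendations.

k1 = 0.01  # type of activity (0.001..0.1)
k2 = 1.0   # temporary stress factor (routine: 0.5..10)
k3 = 1.0   # operator qualities (0.5..3)
k4 = 2.0   # activity anxiety factor (1..3)
k5 = 1.0   # activity ergonomic factor (0.7..10)

p_error = min(1.0, k1 * k2 * k3 * k4 * k5)  # cap at 1, since Ks can exceed it
print(f"TESEO error probability = {p_error}")
```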
Confusion Matrix
Designed to model errors in responding
to abnormal circumstances.
Seeks to identify modes of misdiagnosis.
Still relies on experts’ qualitative
probability assessments.
But, it does have the coolest name.
Success Likelihood Index
Methodology (SLIM)
Developed to provide a means of
eliciting & structuring expert judgment.
Available as two software packages:
SLIM-SAM: derives SLIs
SLIM-SARAH: performs sensitivity & cost-benefit analyses
Results can be heavily biased by
calibration data.
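The core SLIM computation is a weighted sum of PSF ratings, calibrated to a probability via anchor tasks of known HEP (log-linear calibration is the standard SLIM assumption). All weights, ratings, and anchor values below are invented, which is precisely where the calibration bias enters:

```python
import math

# SLIM sketch: Success Likelihood Index = weighted sum of PSF ratings,
# mapped to an HEP via log10(HEP) = a*SLI + b, calibrated on two anchor
# tasks of known HEP. Every number here is an illustrative assumption.

weights = {"training": 0.4, "time_pressure": 0.35, "interface": 0.25}
ratings = {"training": 0.8, "time_pressure": 0.3, "interface": 0.6}  # 0..1

sli = sum(weights[f] * ratings[f] for f in weights)

# Calibration from two reference tasks with "known" SLIs and HEPs.
sli1, hep1 = 0.9, 1e-4
sli2, hep2 = 0.2, 1e-1
a = (math.log10(hep1) - math.log10(hep2)) / (sli1 - sli2)
b = math.log10(hep1) - a * sli1

hep = 10 ** (a * sli + b)
print(f"SLI = {sli:.3f}, estimated HEP = {hep:.2e}")
```

Shift either anchor HEP by an order of magnitude and every derived estimate moves with it, which is the calibration-bias complaint above.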
Systematic Human Action
Reliability Procedure (SHARP)
Assists practitioners in
selecting appropriate HRA
techniques.
It’s like the paperclip for HRA.
It looks like you’re
performing a human
risk assessment.
Would you like help?
If you need a tool to tell you
which tool you need, it’s time
for a different toolbox…
Validation, Anyone?
Williams: How accurate is all this stuff?
Experts’ direct assessments of human
error probabilities are typically better
than those generated by HRA models.
HRA models tended to exhibit low
accuracy and high variability.
So what should HRA do?
Compatible with and complement PRA
Scrutable, verifiable, and repeatable
Quantification of crew success probability as a
function of time
Account for different cognitive activities
Incorporate performance-shaping factors
Comparable to existing data
Simple to implement and use
Help generate insight and understanding
Assessment & Reduction
Assessment is the first step to reduction.
We can’t reduce errors if we don’t know
what errors are likely to occur.
PRA and HRA indicate where we should
focus our efforts to make systems error
tolerant.
PRA & Risk Management
PRA presents accepted risk
Inputs to PRA are expected risks
Must also eliminate additional risks:
Substandard components & materials
Difference between real plant & PRA model
Bad management
And Now Stuff We
Haven’t Tried…
Helping People Not Screw Up
“Push” and “Pull”
KIW vs. KIH (knowledge in the world vs. in the head)
Use both to create conceptual model
Simplify tasks to avoid thinking
Outline
Make execution & evaluation visible
Text games
Helping People Not Screw Up
Some More
Natural mappings
Intention – action
Action – effect
System state – perceived state
System state – needs
Intention – expectation
Decision Constraints
Plan for errors
Standardize
Helping Out the Operator
Felicitous extensions of normal
brainpower
Cognitive prostheses
Who should we tailor these for?
What about dependence?
Memory Aids for Maintenance
Personnel
Many nuclear accidents come from
maintenance failures
Certain steps in a process are error-prone
Why not provide electronic checklists?
Portable Interactive
Maintenance Auxiliary
(PIMA)
Training Issues
Combining theory and heuristics
Simulation has its limitations
Let operators learn from mistakes
Ecological Interface Design
Most errors are not stochastic
Want representations to support different
levels of reasoning
Addresses:
Errors related to learning and adaptation
Interference among control structures
Lack of resources
Human variability
at SB, RB, & KB levels
Ten Guidelines for Improved System Design
Self-knowledge About Error
Types and Mechanisms
We train pilots by showing them the bad
things their senses will do and telling
them why they do it
Why not train nuclear operators the
same way with error mechanisms?
Hmmm…the Korean Air pilots didn’t find
that very helpful
Historical Epilogue
Human engineers & ergonomists
Theoreticians & cognitive psychologists
Cognitive ergonomists (HCI) & HRA
– Tenerife, Flixborough, & TMI –
Academics and practitioners got together
– Chernobyl –
Organizational ergonomists or sociologists?
Happy Halloween!