Center for Operator Performance: Fall API 2014 Presentation


Center for Operator Performance
An Industry-Academia Collaboration
www.operatorperformance.org







How can I make expert operators faster?
Are alarm targets valid?
Is it worth changing existing displays to current
practices?
Are high-fidelity simulators worth the cost?
How do I create a hierarchy?
What does a good overview look like?
Should I use a large monitor for the console
operators?

Operating companies and DCS suppliers
researching ways to reduce error and enhance
management of abnormal situations
◦ Tools
◦ Benchmarks
◦ Knowledge


Operating company driven
Structure
◦ Board – Set all aspects of COP, fund projects
◦ Supporting – Share IP, input & participate on
projects (no voting rights)
◦ Permanent – Wright State University & Beville


No minimum participation requirement
Semi-annual meetings
Overview

[Project map: COP's research portfolio, organized around the goals "Optimal personnel, minimal errors," "Few, well-managed abnormal situations," and low workload, under the motto "Right information, right format, right time"]

Interface/System
· Display Content (I & II)
· Displays (T)
· Display Design Handbook
· Display Evaluation Toolkit (I & II)
· Large Screen Usage
· Color Use, Icons, Background Color
· Event-Based Delivery
· Formatting, Integrating Data

Alarm / Work Envelope
· Alarm Rates (I & II)
· Event Prediction (I & II)
· Event Prediction III & IV (T)
· Procedure Analyzer Formats (B)
· Procedure Automation & Alarm Mgt
· Handhelds
· Knowledge Management
· Control Room Scorecard
· Data Mining for Troubleshooting
· Data Mining Fatigue Data

Operator
· Decision Making Exercises
· Use of Cues
· ShadowBox (T)
· Simulator Survey, Simulator Benefit
· Operator Expertise
· Training Methods, Training Manuals
· Mental Models
· Fatigue
· Safety Culture/Risk Taking
· Workload Impact, Field vs. Board Tasks
· Technology Impact, Skill Degradation
· Incident DB Mining

Decision Making Exercises
◦ Improve console operator performance
◦ Increase process engineer understanding

Alarm rates
◦ Don’t add personnel to meet current industry limits

Fatigue
◦ When should you be concerned about an increase in operator
errors?

Job Aids
◦ Why they will reduce product contamination

Creation of overview display
◦ 4-second assessment of process health

Event Prediction & Mitigation
◦ Stop upsets before they happen
◦ Simple operating envelope

Semantic Procedure Analyzer
◦ 90% reduction in procedure volume
◦ Tailored delivery

Shadowbox
◦ Scenario based training
◦ Capture of expertise

Procedure Automation/Alarm Management
◦ Is it worth the money?







Agenda: Event Prediction & Mitigation · Overview Displays · Decision Making Exercises · Shadowbox · Alarm Rates · Large Screens · Job Aids

Event Prediction & Mitigation
Compressor surge
· 5 principal components
· Surge anticipated before operator observation
- Fault predicted in 25 of 28 cases
- Average anticipation time: 287 minutes
· Amenable to one scan – one point representation

[Radial plot: five process variables (x1–x5) as spokes, traced over a time axis from 0 to 120 hrs]
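To make the one-scan, one-point representation concrete, here is a minimal sketch assuming a PCA-based scheme like the one the slide describes; the synthetic data, tag count, and distance-from-centroid threshold are illustrative assumptions, not COP's actual implementation. The slide's hull and centroid variants differ in how the normal region is bounded; a convex hull of normal-operation scores could replace the simple radius test used here.

```python
# A minimal sketch of the "one scan -> one point" idea, assuming PCA-based
# monitoring (the data, tag count, and 99% radius rule are illustrative
# assumptions, not COP's documented method).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X_normal = rng.normal(size=(500, 20))     # stand-in for 500 scans of 20 process tags

pca = PCA(n_components=5).fit(X_normal)   # 5 principal components, as on the slide
scores = pca.transform(X_normal)
centroid = scores.mean(axis=0)
# Radius enclosing 99% of normal-operation points; the threshold is an assumption.
radius = np.quantile(np.linalg.norm(scores - centroid, axis=1), 0.99)

def scan_is_abnormal(x):
    """Reduce one scan to one 5-D point; flag it if it leaves the normal region."""
    point = pca.transform(x.reshape(1, -1))[0]
    return np.linalg.norm(point - centroid) > radius

print(scan_is_abnormal(X_normal[0]))        # False for a typical normal scan
print(scan_is_abnormal(X_normal[0] + 5.0))  # True for a strongly drifted scan
```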
[Comparison table: detection delay by fault number for the radial-plot methods (convex hulls and centroids) versus PCA, DPCA, and CVA statistics (T², Q, Ts², Tr²), improved KPCA, and KICA; comparison values drawn from Tamura and Zhang (2008); Russell, Chiang, and Braatz (2000); and Tsujita (2006)]

Detection delay (mins):

Fault #   Hulls   Centroids
3         –       1
10        110     1
11        19      5
14        9       3
15        201     34
16        50      31
19        –       18

Average delays for the other methods on these faults ranged from 6 to 865 minutes.
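For reference, the T² and Q columns in the comparison are the two textbook PCA monitoring statistics: Hotelling's T² weights the retained scores by their component variances, and Q (the squared prediction error) measures the residual left after reconstruction. Below is a hedged sketch of those definitions, reusing the fitted `pca` object from the earlier sketch; control-limit calculation is omitted. Detection delay is then the time from fault onset until the chosen statistic first exceeds its limit.

```python
# Textbook definitions of the two PCA monitoring statistics from the table,
# reusing the fitted `pca` from the previous sketch. Control limits omitted.
import numpy as np

def t2_statistic(pca, x):
    """Hotelling's T^2: retained scores weighted by their component variances."""
    t = pca.transform(x.reshape(1, -1))[0]
    return float(np.sum(t ** 2 / pca.explained_variance_))

def q_statistic(pca, x):
    """Q (squared prediction error): residual between a scan and its reconstruction."""
    x_hat = pca.inverse_transform(pca.transform(x.reshape(1, -1)))[0]
    return float(np.sum((x - x_hat) ** 2))
```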







Event Prediction & Mitigation
Overview Displays
Decision Making Exercises
Shadowbox
Alarm Rates
Large screens
Job Aids

Task 1:
◦ Review current displays, screen shots, and mockups,
and assess the interface designs against relevant
human factors research and design principles
◦ Assemble information requirements for a "Level 1"
Overview Display

Task 2:
◦ Develop initial concepts for an Overview Display and
navigation that reduce clutter and support system
assessment
◦ Develop a workflow for designing and implementing
Overview Displays

Task 3:
◦ Prototype, test, and collect metrics on Overview
Displays that utilize these concepts
[Diagram: overview display design considerations – supports key decisions; structure and layout; display hierarchy; conventions and coding; high-level situation awareness; quantitative vs. qualitative]
Content – What data should be included in a display?
- Do I need every single data point?
- What data points are important?
- What data should be fused into information?

Organization – How should the information be organized?
- What information is needed for high-level situation
awareness?
- How do I choose the information for overviews down to
details?
- What information should be grouped together?
- Which sets of information should be grouped across
screens?

Format – How should the information be formatted?
- What is the best frame of reference?
- What is the best way to move across screens and into
details?
- What colors should be used?







Decision Making Exercises
 Project
◦ Adapted military training exercises to process control.
The military uses DMXs to train platoon leaders to
make faster and more accurate decisions during urban
operations.

 Impact
◦ Proved to be a low-cost, easy-to-apply method to
enhance decision making.
◦ One hour, periodically, at the beginning of shift
◦ Identified skill/knowledge gaps
◦ Identified lost practices
◦ Helps build mental models







Shadowbox

Master’s thesis at Naval Postgraduate School.

Focus: Unconventional firefighting challenges.
◦ e.g., terrorist attack, earthquake.
Method

◦ Present complex scenarios.
◦ Trainees record their impressions, responses, and decisions in
a 1” square box.
◦ Trainees compare their responses to the responses of a panel
of SMEs.

Strengths
◦ Enable trainees to see the world through the eyes of an expert.
◦ Appreciate the mental models of experts.
◦ SME does not have to be present.

Description of a scenario,
including pictures and diagrams,
taking several pages.

Variety of boxes:

Separate booklet:
– Prioritize information to remember
– Prioritize goals
– Prioritize actions
– Seek information
– Anticipatory thinking
– Prioritize information
– Cue detection using video clips
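Since ShadowBox scoring hinges on comparing a trainee's priorities with the SME panel's, here is a hypothetical sketch of one way to quantify that agreement; the scenario items and the rank-correlation scoring rule are illustrative assumptions, not the documented ShadowBox method.

```python
# Hypothetical ShadowBox-style scoring: compare a trainee's priority
# ranking for one scenario box against the SME panel's consensus ranking.
# The items and the Kendall-tau scoring rule are illustrative assumptions.
from scipy.stats import kendalltau

sme_ranking     = ["isolate feed", "vent to flare", "notify field operator", "log event"]
trainee_ranking = ["vent to flare", "isolate feed", "notify field operator", "log event"]

# Express each ranking as the rank each item receives, then correlate the ranks.
sme_ranks     = [sme_ranking.index(item) for item in sme_ranking]
trainee_ranks = [trainee_ranking.index(item) for item in sme_ranking]

tau, _ = kendalltau(sme_ranks, trainee_ranks)
print(f"Agreement with SME panel: tau = {tau:+.2f}")  # +1.00 means identical priorities
```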







Alarm Rates
[Chart: mean reaction time (seconds, 0–250) to resolve an alarm, by alarm rate (1, 2, 5, 10, and 20 alarms per 10 minutes) and alarm priority (High, Low, Caution), for novice and expert operators]

Mean reaction time (seconds):

         1 alarm/min   2 alarms/min
Novice   31.8          93.0
Expert   24.2          47.7
Novice and expert operators' reaction times for solving
an alarm cannot be distinguished from one another
except at the alarm rate of 20 alarms per 10 minutes
◦ Novices performed significantly slower than expert operators at 20
alarms per 10 minutes
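As an illustration of the statistical comparison behind that conclusion, here is a sketch of a two-sample t-test on reaction times at the two highest alarm rates; the simulated samples, group sizes, and spreads are assumptions (the study's actual test and data may differ), and only the group means come from the table above.

```python
# Sketch of the novice-vs-expert comparison: two-sample t-tests on reaction
# times at 10 and 20 alarms per 10 minutes. Samples are simulated stand-ins;
# only the group means are taken from the table above.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(1)
groups = {
    "10 alarms/10 min": (rng.normal(31.8, 20, 30), rng.normal(24.2, 20, 30)),
    "20 alarms/10 min": (rng.normal(93.0, 20, 30), rng.normal(47.7, 20, 30)),
}
for rate, (novice_rt, expert_rt) in groups.items():
    t, p = ttest_ind(novice_rt, expert_rt)
    print(f"{rate}: t = {t:.2f}, p = {p:.4f}")
# With these assumed spreads, the difference is typically reliable only
# at 20 alarms per 10 minutes, matching the slide's conclusion.
```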







Large Screens

Average response times
◦ Nearby = 4.30 seconds
◦ Distant = 4.64 seconds

Distance of the target is
marginally significant
◦ p = 0.08

Longest response time in
sections 1 and 9







Job Aids
[Chart: accuracy and task time, plotted on a 0–1.2 scale]








Love to have ya
◦ We'd like two more members at board level ($50K/year)
◦ Supporting Members ($25K/year)
◦ Two-year commitment

Benefits
◦ Guide research – it's for your benefit
◦ Research tailored to your site through participation
◦ Exchange of ideas with industry colleagues
◦ Shape industry norms
◦ Cost sharing (you do it in exploration)

Attend a meeting
◦ No cost or obligation
◦ Test drive COP

Check out website
◦ www.operatorperformance.org
Center for Operator Performance
An Industry-Academia Collaboration
www.operatorperformance.org