
Managing Your Integrations: An Operations Science Approach
Steven Rdzak
Infrastructure Services
[email protected]
Some Background
• webMethods customer since 1999
– Started with ActiveWorks
• ES 5.0.1
– 80% of integrations are ES based
– 5 ES Broker Servers in production
– 125-150 Adapter instances (ILA, Db, XML), many co-located with the
resource they serve
• IS 4.6
– 20% of integrations are IS based
– 1 IS production server for external customers
– 1 IS production server for internal customers
• 350+ Integration Components
– Team of 8 integration developers
• Average of 1 million component activations a week
What is Integration Management?
• A continuous improvement story for EAI
• Application of improvement tools
– Measurement tools, Analysis tools
– Control Charts, Pareto Charts
• Application of improvement techniques
– Operations Reviews
– Corrective Action Plans
• Logical flow of analysis and improvement initiatives
– Measure the operation (you must inspect what you expect!)
– Change or modify the behavior
– Achieve the desired results (no errors, better performance)
• A.k.a. Operations Science
Steps to Successful Integration Management
1. Design the Integration
• Standard patterns
• Common audit components
2. Measure the Integration
• Result metrics
• Measurement framework
3. Analyze the Integration
• Key Analysis Questions
• Analysis framework
• Control Charts, Pareto Charts
4. Improve the Integration
• Operations reviews
• Corrective action plans
5. Optimize the Integration
• Standardization
• Institutionalization
Design the Integration
• Standard Patterns
– Each Integration Component (Flow or EI) sets up a standard Try/Catch block (a code sketch follows this list)
– Common audit sub-components are included in each component build-out
– Starting timestamp recorded before other processing
– On completion, success or error is recorded depending on outcome.
– Integration component duration is calculated from start timestamp
– Peer review ensures standards are followed
• Common Audit Components
– Error and Debug Logging
• Persists integration name, component name, key data, message
– Metrics Capture
• Persists component name, outcome, timing data (enqueue duration,
component duration), outcome message
– Custom design at Level 3
• Never used built-in capabilities such as the EntLogger adapter
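In code, the standard pattern looks roughly like this. A minimal Python sketch of the Flow/EI pattern above; the audit hooks and field names (log_error, capture_metrics, published_at) are illustrative stand-ins, not the actual webMethods services:

import time
import traceback

# Stand-ins for the common audit sub-components; names are illustrative.
def log_error(integration, component, key_data, message):
    print("ERROR", integration, component, key_data, message)

def capture_metrics(component, outcome, enqueue_ms, duration_ms, message):
    print("METRIC", component, outcome, enqueue_ms, duration_ms, message)

def run_component(integration, component, doc, body):
    """Standard pattern: start timestamp first, Try/Catch around the work,
    success or error recorded on completion, duration derived from the
    start timestamp."""
    start = time.time()
    # Enqueue duration: how long the document waited on the Broker before
    # this component picked it up (assumes a publish timestamp on the doc).
    enqueue_ms = int((start - doc.get("published_at", start)) * 1000)
    try:
        result = body(doc)  # the component's actual work
        capture_metrics(component, "SUCCESS", enqueue_ms,
                        int((time.time() - start) * 1000), "ok")
        return result
    except Exception as exc:
        log_error(integration, component, doc.get("key"), traceback.format_exc())
        capture_metrics(component, "ERROR", enqueue_ms,
                        int((time.time() - start) * 1000), str(exc))
        raise

# Hypothetical usage: wrap a component body in the standard pattern.
run_component("OrderSubmit", "submitOrder", {"key": "PO-123"}, lambda d: d)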
Design the Integration (cont.)
• Capture Architecture
[Diagram: capture architecture. Source and target adapters, each carrying the common audit components, exchange documents through the Broker and IS; on the IS server, common audit Flows/Services persist to the audit database through a JDBC service adapter.]
Measure the Integration
• Result Metrics
– Indicators that measure the integration for conformance or
non-conformance
– Should be represented in a quantifiable way
– Customers can and should validate the chosen indicators
• Result Metric Types
– Better
• Represents a quality indicator based on what is important to the
integration customer. Example: < 1% error rate for order submit
component.
– Faster
• Represents a time indicator based on overall time to perform the
integration. Example: component processing time of 5 seconds
or less.
– Cheaper
• Represents a value indicator based on what is important for the
company. Example: productivity measurement like orders/day.
The integration must process > 500 orders per day.
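A minimal sketch of checking the three result-metric types in code; the thresholds mirror the slide's examples, while the component name and measurement fields are assumptions:

# Illustrative thresholds from the slide's Better/Faster/Cheaper examples.
RESULT_METRICS = {
    "order_submit": {
        "better":  lambda m: m["error_rate"] < 0.01,       # < 1% error rate
        "faster":  lambda m: m["avg_duration_s"] <= 5.0,   # <= 5 s per component
        "cheaper": lambda m: m["orders_per_day"] > 500,    # > 500 orders/day
    },
}

def evaluate(component, measurements):
    """Report conformance or non-conformance per result-metric type."""
    return {name: check(measurements)
            for name, check in RESULT_METRICS[component].items()}

print(evaluate("order_submit",
               {"error_rate": 0.004, "avg_duration_s": 3.2, "orders_per_day": 612}))
# -> {'better': True, 'faster': True, 'cheaper': True}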
Measure the Integration (cont.)
• Our Measurement Framework
– Scheduled nightly batch run to summarize result metrics
[Diagram: measurement framework (L3_AWORKS_METRICS_UPDATE). A reporting adapter and a capture adapter work against the audit database: (1) query the MET_INTEGRATION, MET_COMPONENT, and MET_COMPLETION_TRANSLATION tables for the integrations/components to capture; (2) for each component, publish-and-wait (PR&W) for its daily summary data; (3) query for the daily summary data; (4) process the data and update the daily summaries, inserting into MET_SUMMARY, MET_SUMMARY_BREAKDOWN, and MET_SUMMARY_DURATION. A code sketch of these steps follows.]
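A sketch of those four steps in Python, with sqlite3 standing in for the audit database connection; the table names come from the slide, but their columns, roles, and the PR&W stub are assumptions:

import sqlite3  # stand-in for the real audit database connection

def request_daily_summary(component, day):
    """Stub for steps (2)/(3): in the real framework this is a Broker
    publish-and-wait (PR&W) to the capture adapter; here it just returns
    illustrative (outcome, count, avg_duration_ms) rows."""
    return [("SUCCESS", 1200, 830.0), ("ERROR", 4, 610.0)]

def nightly_summary(conn, day):
    cur = conn.cursor()
    # (1) Which integrations/components should be captured?
    cur.execute("SELECT component_name FROM MET_COMPONENT")
    for (component,) in cur.fetchall():
        # (2)/(3) Pull the day's summary data for this component.
        for outcome, count, avg_ms in request_daily_summary(component, day):
            # (4) Process and update the daily summary tables.
            cur.execute(
                "INSERT INTO MET_SUMMARY_BREAKDOWN "
                "(component_name, run_date, outcome, total, avg_duration_ms) "
                "VALUES (?, ?, ?, ?, ?)",
                (component, day, outcome, count, avg_ms))
    conn.commit()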
Analyze the Integration
• Key Questions To Ask During Analysis
– Past: Is the integration in control? (The integration produces consistent output.)
– Present: Is the integration performing to specifications? (The integration is capable in its current environment.)
– Future: Is the integration able to adapt or be flexible? (The integration is capable of performing in a future environment.)
Analyze the Integration (cont.)
• Our Analysis Framework
– Low-tech reporting architecture utilizing MS Excel pivot tables and charts, fed via an external data query from the Audit database
– Reports and charts are generated from the summarized result metrics
[Diagram: four Excel reports, each fed from the audit database via ODBC: Metrics by Day (e.g., August 2003), Metrics Counts YTD, Metrics Duration by Component over a date range, and Metrics Current Day.]
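The reports pull their data through ODBC external data queries; the same query can be scripted, for example with pyodbc (the DSN and column names are assumptions):

import pyodbc  # ODBC from Python, mirroring Excel's external data query

conn = pyodbc.connect("DSN=AuditDB")  # hypothetical DSN for the audit database
rows = conn.cursor().execute(
    "SELECT run_date, component_name, outcome, total "
    "FROM MET_SUMMARY_BREAKDOWN WHERE run_date BETWEEN ? AND ?",
    "2003-08-01", "2003-08-31").fetchall()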
Analyze the Integration (cont.)
• Control Chart
[Chart: control chart of integration results; the residue shows a 99.96% rate, but the plotted series is not recoverable.]

Analyze the Integration (cont.)
• Pareto Chart
[Chart: "Top 10 Errors" Pareto chart, error counts per category with a cumulative-percentage line (top count 9371; cumulative line from 50.50% to 100.00%). Recoverable categories: No serial-tracked part number record found for part; Received Null Data in NID Equipment Canonical; Clarify returned StatusCode (Java Exception); No site record found; Asset tag already exists; Serial number already exists for part.]
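Both chart types can be computed straight from the summary tables. A small Python sketch: standard 3-sigma p-chart limits for the control chart, and ranked counts with cumulative percentages for the Pareto chart (the example's category/count pairing is illustrative; the chart residue does not say which category had which count):

from collections import Counter

def p_chart_limits(errors, totals):
    """3-sigma limits for a daily error-proportion (p) control chart."""
    p_bar = sum(errors) / sum(totals)  # center line: overall error rate
    limits = []
    for n in totals:
        sigma = (p_bar * (1 - p_bar) / n) ** 0.5
        limits.append((max(0.0, p_bar - 3 * sigma), p_bar + 3 * sigma))
    return p_bar, limits

def pareto(error_counts):
    """Rank error categories by count and attach cumulative percentages."""
    ranked = Counter(error_counts).most_common()
    total, running, rows = sum(error_counts.values()), 0, []
    for name, count in ranked:
        running += count
        rows.append((name, count, 100.0 * running / total))
    return rows

print(p_chart_limits(errors=[4, 9, 2], totals=[1200, 1350, 1180]))
print(pareto({"No site record found": 9371, "Asset tag already exists": 1173}))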
Improve the Integration
• Operations Reviews
– Formally assess the operation weekly
– Prioritize improvement efforts
– Why have a review?
• Ensure integrations are meeting customer expectations and
achieving corporate objectives
• Review success of corrective actions and formulate new action
plans
– Who should attend?
• Integration designers/developers
• Integration stakeholders
Improve the Integration (cont.)
• Corrective Action Plans
– A plan targeted at a specific integration improvement, problem, or defect
– Can be formal or informal
– Key Points in the Plan (captured as a record in the sketch below)
• What happened
• Why did it happen
• What correction is being taken
• What will this accomplish
• Who is responsible
• When will it be completed
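The six key points amount to a simple record; a sketch in Python (field names and the example contents are ours, not from the deck):

from dataclasses import dataclass
from datetime import date

@dataclass
class CorrectiveActionPlan:
    what_happened: str     # What happened
    why: str               # Why did it happen
    correction: str        # What correction is being taken
    expected_result: str   # What will this accomplish
    owner: str             # Who is responsible
    due: date              # When will it be completed

cap = CorrectiveActionPlan(
    what_happened="Order submit errors spiked above the 1% result metric",
    why="Target adapter lost its database connection pool",
    correction="Add reconnect logic and alert on pool exhaustion",
    expected_result="Error rate back under 1%",
    owner="Component owner",
    due=date(2003, 9, 1),
)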
Optimize the Integration
• Standardization
– Simplification
• Streamline the methods and procedures used to create your
integrations
• Promote use of standard components
– Training
• Can be formal or informal
• Institutionalization
– Peer review of integration design
– Make each developer an owner of his or her integration
performance
Some Takeaways
• An integration that is not measured is not managed
• Don’t go overboard and try to measure the world; you will end up with no time left for improvement efforts
• Data collection, storage and analysis are activities that
add no value until they are used to control or improve
your integration.