A Metrics System for Continuous Improvement of Design Technology
Andrew B. Kahng and Stefanus Mantik
DARPA
Motivation: Complexity of the Design Process
- Ability to make silicon has outpaced ability to design it
- Complex data, system interactions
- SOC: more functionality and customization, in less time
  - design at higher levels of abstraction, reuse existing design components
  - customized circuitry must be developed predictably, with less risk
- Key question: “Will the project succeed, i.e., finish on schedule and under budget while meeting performance goals?”
- SOC design requires an organized, optimized design process
Value of CAD Tools Improvement Not Clear
- What is the $ value of a “better” scheduler, mapper, placer?
- What is the $ value of GUI, usability, …?
- What is the right objective?
  - min wirelength → routable?
  - min literals → amenable to layout?
- Value is well-defined only in the context of the overall design process
What is the Design Process?
- Not like any “flow/methodology” bubble chart
  - backs of envelopes, budgeting wars
  - changed specs, silent decisions, e-mails, lunch discussions
  - ad hoc assignments of people and tools to meet current needs
  - proprietary databases, incompatible scripts/tools, platform-dependent GUIs, lack of usable standards
  - design managers operate on intuition, engineers focus on tool shortcomings
- Why did it fail? “CAD tools”? “inexperienced engineers”?
- Must measure to diagnose, and diagnose to improve
What Should be Measured?
- Many possibilities:
  - running a tool with wrong options, wrong subset of standards
  - bug in a translator/reader
  - assignment of a junior designer to a project with multiple clocks
  - difference between 300 MHz and 200 MHz in the spec
  - changing an 18-bit adder into a 28-bit adder midstream
  - decision to use domino logic in critical paths
  - one group stops attending budget/floorplan meetings
- Solution: record everything, then mine the data
Design Process Data Collection
- What revision of what block was what tool called on?
  - by whom? when? how many times? with what keystrokes?
- What happened within the tool as it ran?
  - what was CPU/memory/solution quality?
  - what were the key attributes of the instance?
  - what iterations/branches were made, under what conditions?
- What else was occurring in the project?
  - e-mails, spec revisions, constraint and netlist changes, …
- Everything is fair game; bound only by server bandwidth
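
To make the target concrete, here is a minimal sketch of one possible record for a single collected metric; the fields mirror the name/value METRICS packet shown later, and the type name MetricRecord is illustrative rather than part of the actual system:

/* Sketch: one possible in-memory record for a single collected
   metric. Fields mirror the METRICS XML packet shown later;
   the name MetricRecord itself is illustrative. */
typedef struct {
    int  projectID;    /* which project produced the metric */
    int  flowID;       /* which flow within that project */
    char toolID[16];   /* which tool run, e.g., "P32" */
    long dateTime;     /* timestamp of the observation */
    char name[64];     /* standardized metric name, e.g., "TOOL_NAME" */
    char value[256];   /* value, transmitted as a string */
} MetricRecord;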
Unlimited Range of Possible Diagnoses
- User performs same operation repeatedly with nearly identical inputs
  - tool is not acting as expected
  - solution quality is poor, and knobs are being twiddled
- Email traffic in a project
  - missed deadline, missed revised deadline; people disengaged; project failed
- On-line docs always open to a particular page
  - command/option is unclear
METRICS System Architecture
[Architecture diagram: each tool, via a wrapper or embedded transmitter (“xmitter”), sends metrics across the inter/intranet to the server; the server maintains the Metrics Data Warehouse, which feeds Reporting and Data-Mining; web browsers running Java applets access the results.]
METRICS Transmitter
- No functional change to the tool
  - use API to send the available metrics: initToolRun(), sendMetrics()
- Low overhead
  - example: standard-cell placer using the Metrics API incurs < 2% runtime overhead
  - even less overhead with buffering of repeated sendMetrics() calls
- Won’t break the tool on transmittal failure
  - child process handles transmission while parent process continues its job (see the sketch below)
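
A minimal sketch of that child-process idea, assuming POSIX fork(); transmit_metric() is a hypothetical stand-in for the code that actually ships a packet, not part of the METRICS API:

/* Sketch: non-blocking transmittal via a child process (POSIX).
   If transmission fails or hangs, only the child is affected and
   the tool keeps running. A real transmitter would also reap
   children, e.g., via signal(SIGCHLD, SIG_IGN). */
#include <stdio.h>
#include <sys/types.h>
#include <unistd.h>

static void transmit_metric( const char *name, const char *value )
{
    /* placeholder for the real network send of one metric */
    fprintf( stderr, "sending %s = %s\n", name, value );
}

void sendMetricsNonBlocking( const char *name, const char *value )
{
    pid_t pid = fork();
    if ( pid == 0 ) {            /* child: do the slow network work */
        transmit_metric( name, value );
        _exit( 0 );              /* never return into the tool */
    }
    /* parent: continues immediately whether or not the fork or the
       transmission succeeds, so a transmittal failure cannot break
       the tool */
}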
METRICS Transmitter
[Data-flow diagram: an EDA tool with the embedded XML API, or an EDA tool whose log files are parsed by a tool wrapper, emits XML; the XML travels over the inter/intranet to a Java servlet, which issues SQL to an Oracle8i database.]
Transmitter Example
/** API Example **/
int main( int argc, char *argv[] )
{
  ...
  toolID = initToolRun( projectID, flowID );
  ...
  printf( "Hello World\n" );
  sendMetrics( projectID, flowID, toolID,
               "TOOL_NAME", "Sample" );
  sendMetrics( projectID, flowID, toolID,
               "TOOL_VERSION", "1.0" );
  ...
  terminateToolRun( projectID, flowID, toolID );
  return 0;
}
## Wrapper example
( $File, $PID, $FID ) = @ARGV;
$TID = `initToolRun $PID $FID`;   # new tool-run ID from the server
chomp $TID;                       # strip trailing newline from backticks
open( IN, "< $File" ) or die "cannot open $File: $!";
while ( <IN> )
{
  if ( /Begin\s+(\S+)\s+on\s+(\S+.*)/ )
  {
    system "sendMetrics $PID $FID $TID TOOL_NAME $1";
    system "sendMetrics $PID $FID $TID START_TIME $2";
  }
  ...
}
close( IN );
system "terminateToolRun $PID $FID $TID";
Example of METRICS XML
<?xml version="1.0"?>
<METRICSPACKET>
  <REQUEST>
    <TYPE> TOOL </TYPE>
    <PROJECTID> 173 </PROJECTID>
    <FLOWID> 9 </FLOWID>
    <PARAMETER> 32 </PARAMETER>
  </REQUEST>
  <METRICS>
    <PROJECTID> 173 </PROJECTID>
    <FLOWID> 9 </FLOWID>
    <TOOLID> P32 </TOOLID>
    <DATETIME> 93762541300 </DATETIME>
    <NAME> TOOL_NAME </NAME>
    <VALUE> CongestionAnalysis </VALUE>
  </METRICS>
</METRICSPACKET>
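
As an illustration of how a transmitter might build such a packet, here is a minimal C sketch; format_metrics_packet() and its buffer handling are assumptions for illustration, not part of the published METRICS API:

/* Sketch: serializing one metric into a METRICSPACKET. The tag
   set follows the example above; the function itself is
   illustrative. Returns the number of characters written. */
#include <stdio.h>

int format_metrics_packet( char *buf, size_t buflen,
                           int projectID, int flowID,
                           const char *toolID, long dateTime,
                           const char *name, const char *value )
{
    return snprintf( buf, buflen,
        "<?xml version=\"1.0\"?>\n"
        "<METRICSPACKET>\n"
        "  <METRICS>\n"
        "    <PROJECTID> %d </PROJECTID>\n"
        "    <FLOWID> %d </FLOWID>\n"
        "    <TOOLID> %s </TOOLID>\n"
        "    <DATETIME> %ld </DATETIME>\n"
        "    <NAME> %s </NAME>\n"
        "    <VALUE> %s </VALUE>\n"
        "  </METRICS>\n"
        "</METRICSPACKET>\n",
        projectID, flowID, toolID, dateTime, name, value );
}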
Current Testbed: A Metricized P&R Flow
[Flow diagram, with every step reporting to METRICS: DEF (plus LEF) → Capo Placer → placed DEF → QP ECO → legal DEF → CongestionAnalysis → congestion map → WRoute → routed DEF → Incr. WRoute → final DEF.]
METRICS Reporting
- Web-based
  - platform independent
  - accessible from anywhere
  - always up-to-date
- Example: correlation plots created on-the-fly (see the sketch below)
  - understand the relation between two metrics
  - find the importance of certain metrics to the flow
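
The reporting layer itself is a Java servlet with a graphing back-end, but the arithmetic behind a correlation plot is simple; here is a minimal C sketch of the Pearson correlation between two metric series (the function name and interface are illustrative):

/* Sketch: Pearson correlation between two metric series, the
   quantity behind a correlation plot such as "#vias vs. WL".
   Plain-C stand-in for arithmetic the reporting layer would do. */
#include <math.h>

double pearson( const double *x, const double *y, int n )
{
    double sx = 0, sy = 0, sxx = 0, syy = 0, sxy = 0;
    for ( int i = 0; i < n; i++ ) {
        sx  += x[i];
        sy  += y[i];
        sxx += x[i] * x[i];
        syy += y[i] * y[i];
        sxy += x[i] * y[i];
    }
    double cov = n * sxy - sx * sy;
    double den = sqrt( n * sxx - sx * sx ) * sqrt( n * syy - sy * sy );
    return den > 0 ? cov / den : 0.0;  /* 0 if either series is constant */
}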
METRICS Reporting
[Architecture diagram: a web browser sends report requests across the inter/intranet to a Java servlet, which queries Oracle8i via SQL and returns the report; plot data can be passed through a wrapper to a local graphing tool (GNUPlot); a future implementation would hand data directly to a 3rd-party graphing tool (Excel, Lotus).]
Example Reports
[Correlation plots: number of vias vs. wirelength; congestion vs. wirelength.]
METRICS Standards
- Standard metrics naming across tools
  - same name ⇒ same meaning, independent of tool supplier
  - generic metrics and tool-specific metrics
  - no more ad hoc, incomparable log files
- Standard schema for metrics database
Generic and Specific Tool Metrics

Generic Tool Metrics
  tool_name       char
  tool_version    char
  tool_vendor     char
  compiled_date   mm/dd/yyyy
  start_time      hh:mm:ss
  end_time        hh:mm:ss
  tool_user       char
  host_name       char
  host_id         char
  cpu_type        char
  os_name         char
  os_version      char
  cpu_time        hh:mm:ss

Placement Tool Metrics
  num_cells        integer
  num_nets         integer
  layout_size      double
  row_utilization  double
  wirelength       double
  weighted_wl      double

Routing Tool Metrics
  num_layers       integer
  num_violations   integer
  num_vias         integer
  wirelength       double
  wrong-way_wl     double
  max_congestion   double
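
One way a transmitter or server might use this standard vocabulary is to validate metric names before accepting them; a minimal C sketch under that assumption (the table and is_standard_metric() are illustrative, not part of the METRICS API):

/* Sketch: enforcing the standard naming at the transmitter.
   A single table of sanctioned generic metric names, checked
   before anything is shipped to the server. */
#include <string.h>

static const char *GENERIC_METRICS[] = {
    "tool_name", "tool_version", "tool_vendor", "compiled_date",
    "start_time", "end_time", "tool_user", "host_name", "host_id",
    "cpu_type", "os_name", "os_version", "cpu_time",
};

int is_standard_metric( const char *name )
{
    size_t n = sizeof GENERIC_METRICS / sizeof GENERIC_METRICS[0];
    for ( size_t i = 0; i < n; i++ )
        if ( strcmp( name, GENERIC_METRICS[i] ) == 0 )
            return 1;
    return 0;  /* tool-specific metrics would use per-tool tables */
}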
Current Status
- Completion of METRICS server with Oracle8i, Servlet, and XML parser
- Initial transmittal API in C++
- METRICS wrapper for Cadence P&R tools
- Simple reporting scheme for correlations
Additional Infrastructure
- Industry-standard network discovery
  - Jini, UPnP (Universal Plug and Play), SLP (Service Location Protocol), Salutation
- Security
  - encryption for XML data: SSL (Secure Sockets Layer)
  - user id & password authentication (reporting)
  - registered users (transmitting)
- 3rd-party reporting tools
  - MS Office integration, Crystal Reports, …
- Data mining
METRICS Demo
- Transmission of metrics
  - API inside tools
  - Perl wrapper for log files
- Reporting
  - correlation reports
  - progress on current tool run, flow, design
Potential Benefits to Project Management
- Accurate resource prediction at any point in the design cycle
  - up-front estimates for people, time, technology, EDA licenses, IP re-use, …
  - go/no-go at the earliest point
- Accurate project post-mortems
  - everything tracked: tools, flows, users, notes
  - optimize the next project based on past results
  - no “loose”, random data or information left at project end (log files!!!)
- Management console
  - web-based, status-at-a-glance view of tools, designs, and systems at any point in the project
- No wasted resources
  - prevent out-of-sync runs
  - no duplication of data or effort
Potential Benefits to Tools R&D
- Methodology for continuous tracking of data over the entire lifecycle of instrumented tools
- More efficient analysis of realistic data
  - no need to rely only on extrapolations of tiny artificial “benchmarks”
  - no need to collect source files for test cases and re-run them in house
- Facilitates identification of key design metrics and their effects on tools
  - standardized vocabulary and schema for design/instance attributes
- Improves benchmarking
  - apples to apples, and what are the apples in the first place?
  - apples to oranges as well, given enough correlation research
Potential Research Enabled by METRICS
- Tools:
  - scope of applicability
  - predictability
  - usability
- Designs:
  - difficulty of design or manufacturing
  - verifiability, debuggability/probe-ability
  - likelihood of a bug escape
  - $ cost (function of design effort, integratability, migratability, …)
- Statistical metrics, time-varying metrics
- What is the appropriate abstraction of the manufacturing process for design?
  - impact of manufacturing on design productivity
  - inter- and intra-die variation
  - topography effects
  - impact and tradeoffs of newer lithography techniques and materials
Ongoing Work
- Work with the EDA and designer communities to establish standards
  - tool users: list of metrics needed for design process optimization
  - tool vendors: implementation of the requested metrics with the standardized naming
- Improve the transmitter
  - add message buffering (see the sketch below)
  - “recovery” system for network/server failures
- Extend the METRICS system to include project management tools, email communications, etc.
- Additional reports, data mining
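
A minimal sketch of the planned buffering/recovery behavior, under simple assumptions: packets queue in a fixed-size ring, a flush sends them in order, and anything that fails transmission stays queued for the next attempt; try_transmit() and all sizes are illustrative:

/* Sketch: client-side buffering with retry, for the planned
   "buffering + recovery" transmitter improvements. try_transmit()
   stands in for the real network send and returns 0 on failure. */
#include <string.h>

#define BUF_CAP 128

static char pending[BUF_CAP][512];   /* queued XML packets */
static int  head = 0, count = 0;     /* ring-buffer state */

extern int try_transmit( const char *packet );   /* 0 = failed */

void buffer_metric( const char *packet )
{
    if ( count < BUF_CAP ) {         /* drop-newest when full; a
                                        drop-oldest policy also works */
        int slot = (head + count) % BUF_CAP;
        strncpy( pending[slot], packet, 511 );
        pending[slot][511] = '\0';
        count++;
    }
}

void flush_metrics( void )
{
    /* send in order; stop at the first failure so nothing is lost,
       and retry the remainder on the next flush */
    while ( count > 0 && try_transmit( pending[head] ) ) {
        head = (head + 1) % BUF_CAP;
        count--;
    }
}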