Metrics O'Plenty


Metrics
"A science is as mature
as its measurement tools."
-- Louis Pasteur
Starter Questions

- What can we measure?
- What value can those numbers have?
Why Measure?

- Accurate estimation
  - How productive are we?
  - How consistent are we?
- Quality improvement
  - What do we do well?
  - What do we do poorly?
Types of Metrics

- Product Metrics
  - direct measures: number of bugs, LOC
  - indirect measures: usability, maintainability
- Project and Process Metrics
  - direct measures: costs, LOC per month
  - indirect measures: quality assurance, reliability
Code Metrics

- Size
  - Lines of Code
  - Function Points
- Efficiency
  - Big-O
- Complexity
  - Cyclomatic Complexity
  - Halstead's complexity metrics (next slide)
- Maintainability
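To make one of these concrete: below is a minimal sketch (my own, not from the slides) that estimates cyclomatic complexity for a snippet of Python by counting decision points with the standard ast module, using V(G) = decision points + 1.

```python
import ast

# Node types that introduce a decision point (each adds a possible path).
DECISION_NODES = (ast.If, ast.For, ast.While, ast.IfExp,
                  ast.ExceptHandler, ast.And, ast.Or)

def cyclomatic_complexity(source: str) -> int:
    """Estimate V(G) = number of decision points + 1 for a code snippet."""
    tree = ast.parse(source)
    decisions = sum(isinstance(node, DECISION_NODES) for node in ast.walk(tree))
    return decisions + 1

example = """
def classify(x):
    if x < 0:
        return "negative"
    elif x == 0:
        return "zero"
    return "positive"
"""
print(cyclomatic_complexity(example))  # 3: two branches + 1
```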
Halstead's Complexity Metrics

- n1 = the number of distinct operators
- n2 = the number of distinct operands
- N1 = the total number of operators
- N2 = the total number of operands

- Program length: N = N1 + N2
- Program vocabulary: n = n1 + n2
- Volume: V = N * log2(n)
- Difficulty: D = (n1 / 2) * (N2 / n2)
- Effort: E = D * V
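The formulas chain together directly; here is a small helper (my own sketch, with made-up counts) that computes all five measures from the four raw counts:

```python
import math

def halstead(n1: int, n2: int, N1: int, N2: int) -> dict:
    """Compute Halstead's measures from distinct/total operator and operand counts."""
    N = N1 + N2                 # program length
    n = n1 + n2                 # program vocabulary
    V = N * math.log2(n)        # volume
    D = (n1 / 2) * (N2 / n2)    # difficulty
    E = D * V                   # effort
    return {"length": N, "vocabulary": n, "volume": V,
            "difficulty": D, "effort": E}

# Hypothetical counts for a small function: 10 distinct operators,
# 7 distinct operands, 25 operator occurrences, 18 operand occurrences.
print(halstead(n1=10, n2=7, N1=25, N2=18))
```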
McCall's Quality Factors

- Product Operations: Correctness, Reliability, Efficiency, Integrity, Usability
- Product Revision: Maintainability, Flexibility, Testability
- Product Transition: Portability, Reusability, Interoperability

Each factor is graded through lower-level quality criteria, including: Operability, Training, Communicativeness, Input/Output volume, Input/Output rate, Access Control, Access Audit, Storage Efficiency, Execution Efficiency, Traceability, Completeness, Accuracy, Error Tolerance, Consistency, Simplicity, Conciseness, Instrumentation, Expandability, Generality, Self-Descriptiveness, Modularity, Machine Independence, Software System Independence, Communications Commonality, and Data Commonality.
ISO 9126
Quality Characteristics and Guidelines for Their Use

Quality Factors:
1. Functionality
2. Reliability
3. Usability
4. Efficiency
5. Maintainability
6. Portability
Design Metrics

- Fan In
- Fan Out
- Morphology
  - based on number of nodes, depth, width
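As an informal illustration (my own sketch; the module names are hypothetical), fan-out is how many modules a module calls and fan-in is how many modules call it, both readable straight off a call graph:

```python
from collections import defaultdict

# Hypothetical call graph: module -> modules it calls directly.
calls = {
    "ui":     ["auth", "report"],
    "auth":   ["db"],
    "report": ["db", "format"],
    "format": [],
    "db":     [],
}

# Fan-out: number of modules each module depends on.
fan_out = {module: len(callees) for module, callees in calls.items()}

# Fan-in: number of modules that depend on each module.
fan_in = defaultdict(int)
for callees in calls.values():
    for callee in callees:
        fan_in[callee] += 1

print(fan_out)       # e.g. report -> 2
print(dict(fan_in))  # e.g. db -> 2 (called by auth and report)
```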
Module Design Metrics

- Cohesion
  - how many functions does a module perform
  - levels (weakest to strongest):
    - coincidental
    - logical - e.g. does all output
    - temporal - e.g. all the startup work
    - procedural - executed in this order
    - communicational - grouped because the operations work on the same data
    - functional
- Coupling
  - how is the module connected to other modules: global variables, parameters, stands alone
Object-Oriented Metrics

- Weighted Methods per Class
  - not only how many methods per class there are, but also how complex they are
- Depth of Inheritance Tree
- Number of Children
  - how many child classes does a class have
- Response for Class
  - number of local methods, plus the number of methods they call
- Lack of Cohesion Metric
  - number of non-intersecting (don't use the same variables) methods
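As a rough sketch of one of these (mine, not from the slides; the class and method names are invented), Response for Class can be approximated as the set of a class's own methods plus the distinct methods they invoke:

```python
# Hypothetical class summary: each method of class Order and the methods it calls.
order_methods = {
    "total":    ["line_total", "tax"],
    "add_item": ["validate"],
    "checkout": ["total", "charge_card", "send_receipt"],
}

local_methods = set(order_methods)                                   # methods defined on the class
called_methods = {m for calls in order_methods.values() for m in calls}

# RFC: local methods plus methods invoked from them, each counted once.
rfc = len(local_methods | called_methods)
print(rfc)  # 3 local methods + 5 other methods reachable = 8
```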
Lack of Cohesion Example

- Module A calls Module B
- B accesses Variable X
- C and D access Variable Y
- D calls E

The methods fall into two disjoint groups, {A, B} working with X and {C, D, E} working with Y, that share no data or calls, so this class should be split into two classes.
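A minimal sketch (my own; the method and variable names are hypothetical, roughly mirroring the two groups above) of a lack-of-cohesion count: method pairs that share no variables outnumber the pairs that do, which signals the class should be split:

```python
from itertools import combinations

# Hypothetical class: which instance variables each method touches.
uses = {
    "a": {"x"},
    "b": {"x"},
    "c": {"y"},
    "d": {"y"},
    "e": {"y"},
}

# Count method pairs with no shared variables vs. pairs that do share.
disjoint = sum(1 for m1, m2 in combinations(uses, 2) if not (uses[m1] & uses[m2]))
shared   = sum(1 for m1, m2 in combinations(uses, 2) if uses[m1] & uses[m2])

print(disjoint, shared)  # 6 disjoint pairs vs. 4 cohesive pairs -> low cohesion
```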
Project Metrics

- LOC or FP per month
- Errors per LOC (aka Defect Density)
- Defect Removal Efficiency
- Time required to make changes
- Test coverage
- Required Skills
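One of these is worth spelling out (the slide does not give the formula, so this is the commonly used definition): Defect Removal Efficiency is the fraction of total defects that were removed before release.

```python
def defect_removal_efficiency(found_before_release: int, found_after_release: int) -> float:
    """DRE = defects removed before release / total defects (before + after)."""
    return found_before_release / (found_before_release + found_after_release)

# Hypothetical project: 90 defects caught in review and test, 10 reported by users.
print(defect_removal_efficiency(90, 10))  # 0.9
```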
Obviously, Beware of Statistics
             Version 1                      Version 2
Module#   Defects   LOC   Defects/LOC   Defects    LOC   Defects/LOC   Relation
1              12   777       0.01544         3     55       0.05455      <
2               5   110       0.04545         6    110       0.05455      <
3               2   110       0.01818         3    110       0.02727      <
4               3   110       0.02727         4    110       0.03636      <
5               6  1000       0.00600        70  10000       0.00700      <
Sum            28  2107       0.01329        86  10385       0.00828      >
http://irb.cs.tu-berlin.de/~zuse/sme.html
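The point of the table is a classic caution about aggregating ratios: every individual module has a lower defect density in Version 1, yet Version 1's overall density is higher, because the totals are dominated by the large, low-density module 5. A quick sketch reproducing the arithmetic from the table:

```python
# (module, defects, LOC) for each version, taken from the table above.
v1 = [(1, 12, 777), (2, 5, 110), (3, 2, 110), (4, 3, 110), (5, 6, 1000)]
v2 = [(1, 3, 55), (2, 6, 110), (3, 3, 110), (4, 4, 110), (5, 70, 10000)]

def density(defects, loc):
    return defects / loc

# Per module: Version 1 has the lower defect density everywhere.
for (m, d1, l1), (_, d2, l2) in zip(v1, v2):
    print(m, round(density(d1, l1), 5), "<", round(density(d2, l2), 5))

# In aggregate the comparison flips, because module 5 dominates Version 2's LOC.
total1 = density(sum(d for _, d, _ in v1), sum(l for _, _, l in v1))
total2 = density(sum(d for _, d, _ in v2), sum(l for _, _, l in v2))
print(round(total1, 5), ">", round(total2, 5))  # 0.01329 > 0.00828
```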
Basic Questions

- What are the basic metrics that managers need to track?
- How do we gather all these numbers?
- When do we process all these numbers?
SEI CMM Level 2 (Repeatable)

- Software Requirements Management
  - status of allocated requirements
  - number of changes to requirements
- Software Project Planning
  - completion of milestones compared to the plan
  - work completed, funds expended, … compared to plan
- Software Project Tracking and Oversight
  - resources expended to conduct oversight
SEI CMM Level 3 (Defined)

- Training Program
  - number of training waivers approved
  - actual attendance vs. projected attendance
  - results on post-training tests
- Software Product Engineering
  - numbers, types, and severity of defects by stage
  - effort to analyze proposed changes
  - number of changes by category
Summary
To do something well, we must understand what we are doing. To understand something, we must be able to measure it.
We can measure what we are building, and we can measure our building process.
Next Topics…

- Configuration Management
- Agile Development
- Exam #2