Combination of random variables using probability calculus



5. Combination of random variables
• Understand why we need a bottom-up approach for reliability analysis.
• Learn how to compute the probability density function, mean value, and standard deviation of functions of random variables, and how to approximate the mean value and standard deviation when exact computation is impractical.
• We will assume static reliability models for the rest of the course.
Bottom-up approach for reliability analysis

[Flowchart, with data and judgment supporting the first two steps:
Select primitive random variables → probability distributions of primitive random variables → (combined with the relation between performance and the primitive random variables) → probability calculus → reliability or failure probability]
Why a bottom-up approach for reliability analysis
• Sometimes we do not have enough failure data to estimate the reliability of a system. Examples: buildings, bridges, nuclear power plants, offshore platforms, ships.
• Solution: a bottom-up approach to reliability assessment: start with the probability distributions of the primitive (generic) random variables and derive the probability distribution of the performance variables (e.g., failure time).
• Advantages:
– Estimate the probability distributions of the input random variables (e.g., yield stress of steel, wind speed) once, and reuse the same distributions of the generic random variables in many different problems.
– Identify and reduce important sources of uncertainty and variability.
Transformation of random variables
• y = g(x)
• Objective: given the probability distribution of X and the function g(·), derive the probability distribution of Y.
Transformation of random variables

[Figure: a one-to-one transformation y = g(x); the interval [x, x + Δx] on the X axis maps to the interval [y, y + Δy] on the Y axis.]

For a one-to-one transformation y = g(x):

$$f_Y(y) = \frac{f_X(x)}{\left|\dfrac{dg(x)}{dx}\right|}, \qquad \text{where } x = g^{-1}(y)$$
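A minimal numerical sketch of this formula (the choice g(x) = eˣ with X ~ N(0, 1) is an illustrative assumption, not from the slides): the density computed from the formula should match a histogram of transformed samples.

```python
# Sketch: one-to-one transformation formula, assumed example Y = exp(X), X ~ N(0,1)
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
y_samples = np.exp(rng.standard_normal(200_000))   # Y = g(X) = exp(X)

# Density from the formula: x = g^{-1}(y) = ln(y), |dg/dx| = exp(x) = y
hist, edges = np.histogram(y_samples, bins=80, range=(0.1, 6), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
f_Y = norm.pdf(np.log(centers)) / centers          # f_X(x) / |dg/dx|

print("max abs difference:", np.max(np.abs(hist - f_Y)))  # small
```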
General transformation

When the inverse function is multiple-valued, the contributions of all roots of y = g(x) are summed:

$$f_Y(y) = \frac{f_X(x_1)}{\left|\dfrac{dy}{dx}\right|_{x_1 = g^{-1}(y)}} + \frac{f_X(x_2)}{\left|\dfrac{dy}{dx}\right|_{x_2 = g^{-1}(y)}} + \ldots$$

where x_1, x_2, … are the roots of y = g(x).
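A concrete multiple-valued case (an assumed example, not from the slides): with X ~ N(0, 1) and Y = X², the inverse has two roots x₁ = +√y and x₂ = −√y, and the formula recovers the chi-square density with one degree of freedom.

```python
# Sketch: Y = X^2, X ~ N(0,1). Two inverse roots, |dy/dx| = 2|x| = 2 sqrt(y):
#   f_Y(y) = [f_X(sqrt(y)) + f_X(-sqrt(y))] / (2 sqrt(y))
import numpy as np
from scipy.stats import norm, chi2

y = np.linspace(0.05, 6, 200)
f_Y = (norm.pdf(np.sqrt(y)) + norm.pdf(-np.sqrt(y))) / (2 * np.sqrt(y))
print(np.allclose(f_Y, chi2.pdf(y, df=1)))   # True: matches chi-square(1)
```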
Functions of many variables

[Figure: a region A_x in the (X₁, X₂) plane maps to a region A_y in the (Y₁, Y₂) plane.]

$$f_{Y_1 Y_2}(y_1, y_2) = \frac{f_{X_1 X_2}(x_1, x_2)}{|J|}$$

where J = ∂(y₁, y₂)/∂(x₁, x₂) is the Jacobian of the transformation, so that A_y ≈ |J| A_x for small regions.
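A sketch under the assumption of a linear map Y = AX with independent standard normal inputs (the matrix A is a hypothetical choice); here the Jacobian is simply det(A), and the result can be cross-checked against the known normal density of Y.

```python
# Sketch: f_Y(y1, y2) = f_X(x1, x2) / |det(A)|, (x1, x2) = A^{-1}(y1, y2)
import numpy as np
from scipy.stats import multivariate_normal

A = np.array([[2.0, 1.0],
              [0.5, 1.5]])          # hypothetical transformation matrix
y = np.array([1.0, -0.5])           # point at which to evaluate f_Y
x = np.linalg.solve(A, y)           # inverse transformation

f_X = multivariate_normal(mean=[0, 0], cov=np.eye(2))
f_Y_jacobian = f_X.pdf(x) / abs(np.linalg.det(A))

# Cross-check: Y = A X is normal with covariance A A^T
f_Y_direct = multivariate_normal(mean=[0, 0], cov=A @ A.T).pdf(y)
print(f_Y_jacobian, f_Y_direct)     # the two values agree
```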
Expectation (mean value) and variance
• In many problems it is impractical to estimate probability density functions, so we work with mean values (expectations) and variances.
• Expectation:
– E(aX) = aE(X)
– E(X + Y) = E(X) + E(Y)
– If X, Y are independent, then E(XY) = E(X)E(Y)
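A minimal Monte Carlo check of these three rules (the distributions N(2, 1), N(3, 1) and the constant a = 4 are arbitrary illustrative choices):

```python
# Sketch: verifying the expectation rules by simulation (assumed distributions)
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
X = rng.normal(2.0, 1.0, n)
Y = rng.normal(3.0, 1.0, n)
a = 4.0

print(np.mean(a * X), a * np.mean(X))            # E(aX) = a E(X)
print(np.mean(X + Y), np.mean(X) + np.mean(Y))   # E(X+Y) = E(X) + E(Y)
print(np.mean(X * Y), np.mean(X) * np.mean(Y))   # E(XY) = E(X)E(Y) (independence)
```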
Variance

$$\sigma_{aX}^2 = a^2 \sigma_X^2$$

$$\sigma_X^2 = E(X^2) - [E(X)]^2$$

$$\sigma_{X+Y}^2 = \sigma_X^2 + \sigma_Y^2 + 2\rho_{XY}\,\sigma_X \sigma_Y$$
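A Monte Carlo sketch of these identities, assuming jointly normal X and Y with an illustrative correlation ρ = 0.6 (all parameter values are arbitrary):

```python
# Sketch: checking the variance identities with correlated normal samples
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
rho, sx, sy = 0.6, 2.0, 3.0                       # assumed parameters
cov = [[sx**2, rho * sx * sy], [rho * sx * sy, sy**2]]
X, Y = rng.multivariate_normal([0, 0], cov, n).T
a = 5.0

print(np.var(a * X), a**2 * np.var(X))                   # var(aX) = a^2 var(X)
print(np.var(X), np.mean(X**2) - np.mean(X)**2)          # var = E(X^2) - E(X)^2
print(np.var(X + Y), sx**2 + sy**2 + 2 * rho * sx * sy)  # var(X+Y) with correlation
```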
Covariance
• Covariance measures the degree to which two variables tend to increase or decrease together.

[Figure: two scatter plots — one showing positive covariance (Y tends to increase with X), one showing negative covariance (Y tends to decrease as X increases).]
Correlation coefficient
• Correlation coefficient, ρ: covariance normalized by the product of the standard deviations, ρ_XY = Cov(X, Y)/(σ_X σ_Y)
• Ranges from −1 to +1
• Uncorrelated variables: correlation coefficient = 0
Relation between correlation and statistical dependence
• If X, Y are independent, then they are uncorrelated.
• If X, Y are uncorrelated, then they may be dependent or independent.

[Figure: diagram showing the set of independent variables contained within the set of uncorrelated variables.]
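The classic counterexample (an illustration, not from the slides) is X ~ N(0, 1) and Y = X²: the variables are fully dependent, yet uncorrelated, since Cov(X, X²) = E(X³) − E(X)E(X²) = 0 by symmetry.

```python
# Sketch: uncorrelated but strongly dependent variables
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal(1_000_000)
Y = X**2                          # Y is completely determined by X
print(np.corrcoef(X, Y)[0, 1])    # approximately 0, despite full dependence
```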
Variance of uncorrelated variables

$$\sigma_{X+Y}^2 = \sigma_X^2 + \sigma_Y^2$$
Chebyshev’s inequality
• Upper bound on the probability of a random variable deviating by more than k standard deviations from its mean value:

$$P(|Y - E(Y)| \ge k\sigma_Y) \le \frac{1}{k^2}$$

• The upper bound is usually too large to be useful.
k     P(|Y−E(Y)| ≥ kσ_Y) ≤ 1/k²       P(|Y−E(Y)| ≥ kσ_Y),
      from Chebyshev’s inequality     Y normal
1     1                               0.317
2     0.25                            0.046
3     0.11                            0.003
4     0.06                            6×10⁻⁵
6     0.03                            ≈0
10    0.01                            ≈0
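The table can be reproduced in a few lines (a sketch using scipy's normal CDF; the exact two-sided normal tail probability is 2(1 − Φ(k))):

```python
# Sketch: Chebyshev bound 1/k^2 vs. exact two-sided normal tail probability
import numpy as np
from scipy.stats import norm

for k in [1, 2, 3, 4, 6, 10]:
    exact = 2 * (1 - norm.cdf(k))
    print(f"k={k:2d}  bound={1/k**2:6.3f}  normal={exact:.3g}")
```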
Approximations for the mean and variance of a function of random variables
• Function of one variable: g(X)
• E(g(X)) ≈ g(E(X))
• Standard deviation of g(X): σ_g ≈ |dg(X)/dX| × σ_X
• The derivative of g(X) is evaluated at the mean value of X.
Approximations for the mean and variance of a function of random variables (continued)
• Function of many variables: g(X₁, …, Xₙ)
• E(g(X₁, …, Xₙ)) ≈ g(E(X₁), …, E(Xₙ))
• Variance of g:

$$\mathrm{Var}(g) \approx \sum_i \left(\frac{\partial g}{\partial X_i}\right)^2 \mathrm{Var}(X_i) + 2 \sum_{i<j} \frac{\partial g}{\partial X_i}\frac{\partial g}{\partial X_j}\,\mathrm{Cov}(X_i, X_j)$$

• The derivatives of g are evaluated at the mean values of the Xᵢ.
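A sketch comparing these first-order approximations with Monte Carlo, assuming the illustrative function g(X₁, X₂) = X₁X₂ with independent normal inputs (the distributions are arbitrary choices, not from the slides):

```python
# Sketch: first-order (mean-value) approximation vs. Monte Carlo
import numpy as np

m1, s1, m2, s2 = 10.0, 1.0, 5.0, 0.5   # assumed means and standard deviations

# E(g) ≈ g(m1, m2); derivatives at the mean: dg/dX1 = m2, dg/dX2 = m1;
# X1, X2 independent, so the covariance term vanishes
mean_approx = m1 * m2
std_approx = np.sqrt((m2 * s1)**2 + (m1 * s2)**2)

rng = np.random.default_rng(4)
n = 1_000_000
g = rng.normal(m1, s1, n) * rng.normal(m2, s2, n)
print(mean_approx, np.mean(g))   # ~50 vs ~50 (exact mean is m1*m2 here)
print(std_approx, np.std(g))     # close, since the coefficients of variation are small
```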
When are the above approximations good?
• When the standard deviations of the independent variables are small compared to their mean values.
• When the function g(X) is mildly nonlinear, i.e., the derivatives do not change substantially when the independent variables change.