MCE2 - School of Computing
Overview of Probability Theory
In statistical theory, an experiment is any operation that can be replicated infinitely often
and gives rise to a set of elementary outcomes, which are deemed to be equally likely. The
sample space S of the experiment is the set of all possible outcomes of the experiment. Any
subset E of the sample space is called an event. We say that an event E occurs whenever any
of its elements is an outcome of the experiment. The probability of occurrence of E is
P{E} = (number of elementary outcomes in E) / (number of elementary outcomes in S)
(Venn diagram: an event E shown as a region inside the sample space S.)
The complement E′ of an event E is the set of all elements that belong to S but not to E. The union E1 ∪ E2 of two events is the set of all outcomes that belong to E1 or to E2 or to both. The intersection E1 ∩ E2 of two events is the set of all outcomes that belong to both E1 and E2. Two events are mutually exclusive if the occurrence of either precludes the occurrence of the other, i.e. their intersection is the empty set ∅. Two events are independent if the occurrence of either is unaffected by the occurrence or non-occurrence of the other event.
Theorem of Total Probability.
P{E1 ∪ E2} = P{E1} + P{E2} - P{E1 ∩ E2}
Proof.
Partition the outcomes of S into the n1,0 that belong to E1 only, the n0,2 that belong to E2 only, the n1,2 that belong to both, and the n0,0 that belong to neither (as in the Venn diagram), so that n = n0,0 + n1,0 + n0,2 + n1,2. Then
P{E1 ∪ E2} = (n1,0 + n1,2 + n0,2) / n
= (n1,0 + n1,2) / n + (n1,2 + n0,2) / n - n1,2 / n
= P{E1} + P{E2} - P{E1 ∩ E2}
Corollary.
If E1 and E2 are mutually exclusive, P{E1 ∪ E2} = P{E1} + P{E2}.
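The addition rule can be verified by brute-force counting over a small sample space. The sketch below (not part of the original notes) uses a single roll of a fair die, with illustrative events E1 and E2, and Python set operations for union and intersection.

# Sketch: checking P{E1 ∪ E2} = P{E1} + P{E2} - P{E1 ∩ E2} by counting
# equally likely outcomes. The sample space and events are illustrative.
from fractions import Fraction

S = set(range(1, 7))     # sample space: one roll of a fair die
E1 = {2, 4, 6}           # event: an even number is rolled
E2 = {4, 5, 6}           # event: a number greater than 3 is rolled

def prob(event):
    # Classical probability: favourable outcomes / total outcomes.
    return Fraction(len(event), len(S))

lhs = prob(E1 | E2)                          # P{E1 ∪ E2}
rhs = prob(E1) + prob(E2) - prob(E1 & E2)    # addition rule
print(lhs, rhs)                              # both are 2/3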
The probability P{E1 | E2} that E1 occurs, given that E2 has occurred (or must occur), is called the conditional probability of E1. Note that in this case the possible outcomes of the experiment are confined to E2 rather than to the whole of S.
Theorem of Compound Probability.
P{E1 ∩ E2} = P{E1 | E2} * P{E2}.
Proof.
P{E1 ∩ E2} = n1,2 / n
= {n1,2 / (n1,2 + n0,2)} * {(n1,2 + n0,2) / n}
= P{E1 | E2} * P{E2}
Corollary.
If E1 and E2 are independent, P{E1 ∩ E2} = P{E1} * P{E2}.
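The compound-probability rule can be checked by counting in the same way. The frequency counts in the following sketch are invented purely for illustration.

# Sketch: checking P{E1 ∩ E2} = P{E1 | E2} * P{E2} with illustrative counts.
from fractions import Fraction

n10, n12, n02, n00 = 3, 2, 4, 6       # in E1 only, in both, in E2 only, in neither
n = n10 + n12 + n02 + n00             # total number of equally likely outcomes

p_both = Fraction(n12, n)                    # P{E1 ∩ E2}
p_e2 = Fraction(n12 + n02, n)                # P{E2}
p_e1_given_e2 = Fraction(n12, n12 + n02)     # P{E1 | E2}: outcomes confined to E2

print(p_both, p_e1_given_e2 * p_e2)          # both are 2/15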
The ability to count the possible outcomes in an event is crucial to calculating probabilities.
By a permutation of size r of n different items, we mean an arrangement of r of the items,
where the order of the arrangement is important. If the order is not important, the arrangement
is called a combination.
Example. There are 5*4 = 20 permutations and 5*4 / (2*1) = 10 combinations of size 2 of A, B, C, D, E.
Permutations:
AB, BA, AC, CA, AD, DA, AE, EA
BC, CB, BD, DB, BE, EB
CD, DC, CE, EC
DE, ED
Combinations:
AB, AC, AD, AE, BC, BD, BE, CD, CE, DE
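A few lines of Python reproduce these counts with the standard itertools module; the listing is illustrative rather than part of the original notes.

# Sketch: enumerating permutations and combinations of size 2 of A, B, C, D, E.
from itertools import permutations, combinations

items = ['A', 'B', 'C', 'D', 'E']
perms = [''.join(p) for p in permutations(items, 2)]
combs = [''.join(c) for c in combinations(items, 2)]

print(len(perms))   # 20 arrangements, since order matters (AB and BA both appear)
print(len(combs))   # 10 selections, since order does not matter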
Standard reference books on probability theory give a comprehensive treatment of how these
ideas are used to calculate the probability of occurrence of the outcomes of games of chance.
Statistical Distributions
If a statistical experiment only gives rise to real numbers, the outcome of the experiment is called a random variable. If a random variable X takes values X1, X2, …, Xn with probabilities p1, p2, …, pn, then the expected or average value of X is defined to be
E[X] = Σj pj Xj (summing over j = 1, …, n)
and its variance is
VAR[X] = E[X²] - (E[X])² = Σj pj Xj² - (E[X])².
Example. Let X be a random variable measuring the distance in kilometres travelled by children to a school, and suppose that the following data apply.

Prob. pj    Distance Xj    pj Xj    pj Xj²
0.15        2.0            0.30     0.60
0.40        4.0            1.60     6.40
0.20        6.0            1.20     7.20
0.15        8.0            1.20     9.60
0.10        10.0           1.00     10.00
1.00        -              5.30     33.80

Then the mean and variance are
E[X] = 5.30 kilometres
VAR[X] = 33.80 - 5.30² = 5.71 kilometres²
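The same arithmetic can be checked in a few lines of Python; the lists below simply transcribe the pj and Xj columns of the table.

# Sketch: mean and variance of a discrete random variable from its table.
p = [0.15, 0.40, 0.20, 0.15, 0.10]     # probabilities pj
x = [2.0, 4.0, 6.0, 8.0, 10.0]         # distances Xj in kilometres

mean = sum(pj * xj for pj, xj in zip(p, x))              # E[X]
mean_sq = sum(pj * xj ** 2 for pj, xj in zip(p, x))      # E[X²]
variance = mean_sq - mean ** 2                           # VAR[X]

print(mean, mean_sq, variance)   # 5.30, 33.80 and (up to rounding) 5.71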
Similar concepts apply to continuous distributions. The distribution function is defined by
F(t) = P{X ≤ t}
and its derivative is the frequency function
f(t) = d F(t) / dt
so that
F(t) = ∫ f(x) dx, integrated from -∞ to t.
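As an illustration of this relationship, the sketch below compares F(t) with a numerical integral of f for the exponential distribution with rate 1; the choice of distribution is purely an example and is not part of the original notes.

# Sketch: numerical check that F(t) is the integral of f(x) up to t,
# using f(x) = exp(-x) for x >= 0 (the exponential distribution) as an example.
import math

def f(x):
    return math.exp(-x) if x >= 0 else 0.0        # frequency (density) function

def F(t):
    return 1.0 - math.exp(-t) if t >= 0 else 0.0  # distribution function

t, dx = 1.5, 1e-4
integral = sum(f(k * dx) * dx for k in range(int(t / dx)))   # Riemann sum from 0 to t

print(F(t))        # 0.77686...
print(integral)    # approximately the same value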
Sums and Differences of Random Variables
Define the covariance of two random variables to be
COVAR[X, Y] = E[(X - E[X])(Y - E[Y])] = E[X Y] - E[X] E[Y].
If X and Y are independent, COVAR [X, Y] = 0.
Lemma.
E[X + Y] = E[X] + E[Y]
VAR[X + Y] = VAR[X] + VAR[Y] + 2 COVAR[X, Y]
E[k X] = k E[X] and VAR[k X] = k² VAR[X] for a constant k.
Example. A company records the journey times X of a lorry from a depot to customers and the unloading times Y, as shown in the following table of frequencies.

          Y=1    Y=2    Y=3    Totals
X=1        7      2      1       10
X=2        5      6      2       13
X=3        4      8      5       17
X=4        4      3      3       10
Totals    20     19     11       50

E[X] = {1(10)+2(13)+3(17)+4(10)}/50 = 2.54
E[X²] = {1²(10)+2²(13)+3²(17)+4²(10)}/50 = 7.5
VAR[X] = 7.5 - (2.54)² = 1.0484
E[Y] = {1(20)+2(19)+3(11)}/50 = 1.82
E[Y²] = {1²(20)+2²(19)+3²(11)}/50 = 3.9
VAR[Y] = 3.9 - (1.82)² = 0.5876
E[X+Y] = {2(7)+3(5)+4(4)+5(4)+3(2)+4(6)+5(8)+6(3)+4(1)+5(2)+6(5)+7(3)}/50 = 4.36
E[(X+Y)²] = {2²(7)+3²(5)+4²(4)+5²(4)+3²(2)+4²(6)+5²(8)+6²(3)+4²(1)+5²(2)+6²(5)+7²(3)}/50 = 21.04
VAR[X+Y] = 21.04 - (4.36)² = 2.0304
E[X Y] = {1(7)+2(5)+3(4)+4(4)+2(2)+4(6)+6(8)+8(3)+3(1)+6(2)+9(5)+12(3)}/50 = 4.82
COVAR[X, Y] = 4.82 - (2.54)(1.82) = 0.1972
VAR[X] + VAR[Y] + 2 COVAR[X, Y] = 1.0484 + 0.5876 + 2(0.1972) = 2.0304, in agreement with the lemma.
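The whole example can be reproduced directly from the frequency table; the sketch below (not part of the original notes) does so and confirms the variance identity of the lemma.

# Sketch: moments of X, Y and X+Y from the joint frequency table above.
freq = {
    (1, 1): 7, (1, 2): 2, (1, 3): 1,
    (2, 1): 5, (2, 2): 6, (2, 3): 2,
    (3, 1): 4, (3, 2): 8, (3, 3): 5,
    (4, 1): 4, (4, 2): 3, (4, 3): 3,
}
n = sum(freq.values())   # 50 journeys in total

def mean(g):
    # Expected value of g(x, y) under the observed relative frequencies.
    return sum(g(x, y) * count for (x, y), count in freq.items()) / n

e_x, e_y = mean(lambda x, y: x), mean(lambda x, y: y)          # 2.54, 1.82
var_x = mean(lambda x, y: x * x) - e_x ** 2                    # 1.0484
var_y = mean(lambda x, y: y * y) - e_y ** 2                    # 0.5876
covar = mean(lambda x, y: x * y) - e_x * e_y                   # 0.1972
var_sum = mean(lambda x, y: (x + y) ** 2) - mean(lambda x, y: x + y) ** 2

print(var_sum)                       # 2.0304
print(var_x + var_y + 2 * covar)     # 2.0304, as the lemma predicts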