Transcript Slide 1
Lecture 8
Dustin Lueker
Experiment
◦ Any activity from which an outcome, measurement, or other such result is obtained
Random (or Chance) Experiment
◦ An experiment with the property that the outcome cannot be predicted with certainty
Outcome
◦ Any possible result of an experiment
Sample Space
◦ Collection of all possible outcomes of an experiment
Event
◦ A specific collection of outcomes
Simple Event
◦ An event consisting of exactly one outcome
Let A and B denote two events
Complement of A
◦ All the outcomes in the sample space S that do not belong to the event A
◦ P(Aᶜ) = 1 − P(A)
Union of A and B
◦ A∪B
◦ All the outcomes in S that belong to at least one of
A or B
Intersection of A and B
◦ A∩B
◦ All the outcomes in S that belong to both A and B
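For example, roll a fair die once and let A = {2, 4, 6} (an even number) and B = {4, 5, 6} (a number greater than 3)
◦ Aᶜ = {1, 3, 5}, A∪B = {2, 4, 5, 6}, A∩B = {4, 6}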
Let A and B be two events in a sample space S
◦ P(A∪B)=P(A)+P(B)-P(A∩B)
A and B are Disjoint (mutually exclusive)
events if there are no outcomes common to
both A and B
◦ A∩B=Ø
Ø = empty set or null set
◦ P(A∪B)=P(A)+P(B)
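Using the die example above with A = {2, 4, 6} and B = {4, 5, 6}
◦ P(A∪B) = P(A) + P(B) − P(A∩B) = 3/6 + 3/6 − 2/6 = 4/6
◦ If instead B = {1, 3}, then A and B are disjoint and P(A∪B) = P(A) + P(B) = 3/6 + 2/6 = 5/6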
Assigning probabilities to events can be difficult
There are different approaches to assigning probabilities to events
◦ Subjective
◦ Objective
Equally likely outcomes (classical approach)
Relative frequency
Relies on a person to make a judgment as to
how likely an event will occur
◦ Events of interest are usually events that cannot be replicated easily or cannot be modeled with the equally likely outcomes approach
◦ As such, these values will most likely vary from person to person
The only rule for a subjective probability is
that the probability of the event must be a
value in the interval [0,1]
The equally likely approach usually relies on
symmetry to assign probabilities to events
◦ As such, previous research or experiments are not
needed to determine the probabilities
Suppose that an experiment has only n outcomes
The equally likely approach to probability assigns a
probability of 1/n to each of the outcomes
Further, if an event A is made up of m outcomes then
P(A) = m/n
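For instance, rolling a fair die has n = 6 equally likely outcomes, so each outcome is assigned probability 1/6; the event A = {2, 4, 6} is made up of m = 3 outcomes, so P(A) = 3/6 = 1/2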
Borrows from calculus’ concept of the limit
P(A) = lim(n→∞) a/n
◦ We cannot repeat an experiment infinitely many
times so instead we use a ‘large’ n
Process
Repeat an experiment n times
Record the number of times an event A occurs, denote this
value by a
Calculate the value of a/n
P(A) ≈ a/n
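A minimal simulation sketch of this process in Python (the die event and the choice of n below are illustrative, not part of the slide):

```python
import random

# Repeat the experiment n times and count occurrences of the event A = "roll a 6"
n = 100_000          # a 'large' number of trials
a = 0                # number of times the event A occurs
for _ in range(n):
    roll = random.randint(1, 6)   # one repetition of the experiment
    if roll == 6:                 # did event A occur?
        a += 1

print(a / n)   # relative frequency; should be close to 1/6, about 0.167
```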
X is a random variable if the value that X will
assume cannot be predicted with certainty
◦ That’s why it’s called random
Two types of random variables
◦ Discrete
Can only assume a finite or countably infinite number
of different values
◦ Continuous
Can assume all the values in some interval
Are the following random variables discrete
or continuous?
◦ X = number of houses sold by a real estate
developer per week
◦ X = weight of a child at birth
◦ X = time required to run 800 meters
◦ X = number of heads in ten tosses of a coin
A list of the possible values of a random variable X, say xᵢ, and the probability associated with each, P(X = xᵢ)
◦ All probabilities must be nonnegative
◦ Probabilities sum to 1
0 ≤ P(xᵢ) ≤ 1
Σ P(xᵢ) = 1
X      0    1    2    3    4    5    6    7
P(X)   .1   .2   .2   .15  .1   .05  .05  .15
The table above gives the proportion of
employees who use X number of sick days in
a year
◦ An employee is to be selected at random
Let X = # of days of leave
P(X=2) =
P(X≥4) =
P(X<4) =
P(1≤X≤6) =
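These can be checked with a short Python sketch (the dictionary and helper function below are illustrative, not part of the slide):

```python
# Distribution of X = number of sick days used (from the table above)
dist = {0: .1, 1: .2, 2: .2, 3: .15, 4: .1, 5: .05, 6: .05, 7: .15}

def prob(condition):
    """Add up P(X = x) over all values x satisfying the condition."""
    return sum(p for x, p in dist.items() if condition(x))

print(prob(lambda x: x == 2))          # P(X = 2)       -> 0.20
print(prob(lambda x: x >= 4))          # P(X >= 4)      -> 0.35
print(prob(lambda x: x < 4))           # P(X < 4)       -> 0.65
print(prob(lambda x: 1 <= x <= 6))     # P(1 <= X <= 6) -> 0.75
```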
Expected Value (or mean) of a random
variable X
◦ Mean = E(X) = μ = Σ xᵢ P(X = xᵢ)
Example
X      2    4    6    8    10   12
P(X)   .1   .05  .4   .25  .1   .1
◦ E(X) =
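2(.1) + 4(.05) + 6(.4) + 8(.25) + 10(.1) + 12(.1) = .2 + .2 + 2.4 + 2.0 + 1.0 + 1.2 = 7 (arithmetic filled in from the table above)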
Variance
◦ Var(X) = E[(X − μ)²] = σ² = Σ (xᵢ − μ)² P(X = xᵢ)
Example
X      2    4    6    8    10   12
P(X)   .1   .05  .4   .25  .1   .1
◦ Var(X) =
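(2 − 7)²(.1) + (4 − 7)²(.05) + (6 − 7)²(.4) + (8 − 7)²(.25) + (10 − 7)²(.1) + (12 − 7)²(.1) = 2.5 + .45 + .4 + .25 + .9 + 2.5 = 7 (arithmetic filled in, using μ = 7 from the previous slide)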
A random variable X is called a Bernoulli r.v. if
X can only take either the value 0 (failure) or
1 (success)
Heads/Tails
Live/Die
Defective/Nondefective
◦ Probabilities are denoted by
P(success) = P(1) = p
P(failure) = P(0) = 1-p = q
◦ Expected value of a Bernoulli r.v. = p
◦ Variance = pq
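A quick check of these two facts: E(X) = 0·q + 1·p = p, and Var(X) = E(X²) − [E(X)]² = (0²·q + 1²·p) − p² = p − p² = p(1 − p) = pq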
Suppose we perform several, we’ll say n, Bernoulli experiments and they are all independent of each other (meaning the outcome of one event doesn’t affect the outcome of another)
◦ Label these n Bernoulli random variables in this manner: X1,
X2,…,Xn
The probability of success in a single trial is p
The probability of success doesn’t change from trial to trial
We will build a new random variable X using all of these Bernoulli random variables:
X = X₁ + X₂ + … + Xₙ = Σᵢ₌₁ⁿ Xᵢ
◦ What are the possible outcomes of X? What is X counting?
The probability of observing k successes in n independent trials is
P(X = k) = (n choose k) pᵏ qⁿ⁻ᵏ,  k = 0, 1, …, n
◦ Assuming the probability of success is p
◦ Note:
(n choose k) = n! / (k!(n − k)!)
Why do we need this?
For small n, the Binomial coefficient “n
choose k” can be derived without much
mathematics
(n choose k) = n! / (k!(n − k)!), where n! = 1·2·3·…·n and 0! = 1
Example:
(4 choose 2) = 4! / (2!(4 − 2)!) = 4! / (2!·2!) = (1·2·3·4) / ((1·2)·(1·2)) = 24/4 = 6
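For larger values, the same coefficient can be computed with Python’s standard library (a small sketch using math.factorial and math.comb, available in Python 3.8+):

```python
from math import comb, factorial

# "n choose k" via the factorial formula and via the built-in comb()
n, k = 4, 2
print(factorial(n) // (factorial(k) * factorial(n - k)))   # 6
print(comb(n, k))                                          # 6
```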
Assume Zolton is a 68% free throw shooter
◦ What is the probability of Zolton making 5 out of 6
free throws?
P(X = 5) = (6 choose 5) · 0.68⁵ · (1 − 0.68)⁶⁻⁵ = 6 · 0.1454 · 0.32 ≈ 0.279
◦ What is the probability of Zolton making 4 out of 6
free throws?
P(X = 4) = (6 choose 4) · 0.68⁴ · (1 − 0.68)⁶⁻⁴ = 15 · 0.2138 · 0.1024 ≈ 0.3284
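Both answers can be verified with a short Python sketch (the function name binom_pmf is just illustrative):

```python
from math import comb

def binom_pmf(k, n, p):
    """P(X = k) for a binomial random variable with n trials and success probability p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

print(binom_pmf(5, 6, 0.68))   # about 0.279
print(binom_pmf(4, 6, 0.68))   # about 0.328
```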
For a Binomial random variable X with n trials and success probability p
◦ Mean: μ = E(X) = np
◦ Variance: σ² = Var(X) = np(1 − p)
◦ Standard deviation: σ = √(np(1 − p))
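Applied to the free throw example above with n = 6 and p = 0.68: μ = 6(0.68) = 4.08, σ² = 6(0.68)(0.32) ≈ 1.31, and σ ≈ 1.14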