2.3 General Conditional Expectations
Presenter: 李振綱
Review
• Def 2.1.1 (P.51)
Let Ω be a nonempty set. Let T be a fixed positive number, and assume that for each t ∈ [0, T] there is a σ-algebra F(t). Assume further that if s ≤ t, then every set in F(s) is also in F(t). Then we call the collection of σ-algebras F(t), 0 ≤ t ≤ T, a filtration.
• Def 2.1.5 (P.53)
Let X be a r.v. defined on a nonempty sample space Ω. Let G be a σ-algebra of subsets of Ω. If every set in σ(X) is also in G, we say that X is G-measurable.
Review
• Def 2.1.6 (P.53)
Let Ω be a nonempty sample space equipped with a filtration F(t), 0 ≤ t ≤ T. Let X(t), 0 ≤ t ≤ T, be a collection of r.v.'s indexed by t. We say this collection is an adapted stochastic process if, for each t, the r.v. X(t) is F(t)-measurable.
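These definitions can be made concrete on a finite coin-toss space. Below is a minimal Python sketch (my own illustration, not from the slides): each F(n) is represented by the partition of Ω into blocks of outcomes that agree on the first n tosses, and a r.v. is F(n)-measurable exactly when it is constant on each block.

```python
from itertools import product

# Sample space for 2 coin tosses; F(n) is generated by the partition of
# Omega into blocks that agree on the first n tosses.
omega = [''.join(w) for w in product('HT', repeat=2)]   # ['HH','HT','TH','TT']

def partition(n):
    """Blocks of the partition generating F(n): outcomes sharing the first n tosses."""
    blocks = {}
    for w in omega:
        blocks.setdefault(w[:n], []).append(w)
    return list(blocks.values())

def is_measurable(X, n):
    """X (a dict omega -> value) is F(n)-measurable iff constant on each block."""
    return all(len({X[w] for w in block}) == 1 for block in partition(n))

# Number of heads among the first toss is F(1)-measurable (adapted at time 1);
# the total number of heads over both tosses is not.
X1 = {w: w[:1].count('H') for w in omega}
X2 = {w: w.count('H') for w in omega}
print(is_measurable(X1, 1))   # True
print(is_measurable(X2, 1))   # False -> depends on the second toss
```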
Introduction
• Consider a probability space (Ω, F, P) and a sub-σ-algebra G of F.
If X is G-measurable, the information in G is sufficient to determine the value of X.
If X is independent of G, then the information in G provides no help in determining the value of X.
In the intermediate case, we can use the information in G to estimate, but not precisely evaluate, X.
Toss coins
Let Ω be the set of all possible outcomes of N coin tosses, with
p : probability of a head
q = 1 − p : probability of a tail.
Define the conditional expectation of X based on the first n tosses by

$$E_n[X](\omega_1 \dots \omega_n) = \sum_{\omega_{n+1} \dots \omega_N} p^{\#H(\omega_{n+1} \dots \omega_N)}\, q^{\#T(\omega_{n+1} \dots \omega_N)}\, X(\omega_1 \dots \omega_n \omega_{n+1} \dots \omega_N),$$

where #H(·) and #T(·) count the heads and tails among the listed tosses.

Special cases n = 0 and n = N:

$$E_0[X] = \sum_{\omega_1 \dots \omega_N} p^{\#H(\omega_1 \dots \omega_N)}\, q^{\#T(\omega_1 \dots \omega_N)}\, X(\omega_1 \dots \omega_N) = E[X],$$

$$E_N[X](\omega_1 \dots \omega_N) = X(\omega_1 \dots \omega_N).$$
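A minimal Python sketch of this formula (the values p = q = 1/2 and N = 3 are assumptions for the demo): it computes E_n[X] by summing over all continuations of the first n tosses, and confirms the two special cases.

```python
from itertools import product

p, q, N = 0.5, 0.5, 3   # assumed illustrative values; the slide leaves p general

def weight(tosses):
    """Probability weight p^{#H} q^{#T} of a sequence of tosses."""
    return p ** tosses.count('H') * q ** tosses.count('T')

def E_n(X, n, first_n):
    """E_n[X](w_1...w_n): average X over all continuations w_{n+1}...w_N."""
    return sum(weight(tail) * X(first_n + ''.join(tail))
               for tail in product('HT', repeat=N - n))

# Example r.v.: number of heads in N tosses
X = lambda w: w.count('H')

print(E_n(X, 0, ''))      # E_0[X] = E[X] = N*p = 1.5
print(E_n(X, N, 'HTH'))   # E_N[X](HTH) = X(HTH) = 2
print(E_n(X, 1, 'H'))     # 1 + (N-1)*p = 2.0
```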
Example (discrete → continuous)
Three-period binomial tree:
S_0
S_1(H) = uS_0,  S_1(T) = dS_0
S_2(HH) = u²S_0,  S_2(HT) = S_2(TH) = udS_0,  S_2(TT) = d²S_0
S_3(HHH) = u³S_0,  S_3(HHT) = S_3(HTH) = S_3(THH) = u²dS_0,
S_3(HTT) = S_3(THT) = S_3(TTH) = ud²S_0,  S_3(TTT) = d³S_0
• Consider the three-period model. (P.66~68)

(discrete)
$$E_2[S_3](HH) = p\,S_3(HHH) + q\,S_3(HHT), \qquad (2.3.4)$$
equivalently, with A_HH = {ω : ω₁ = ω₂ = H},
$$E_2[S_3](HH)\,P(A_{HH}) = \sum_{\omega \in A_{HH}} S_3(\omega)\,P(\omega).$$

(continuous: Lebesgue integral)
$$\int_{A_{HH}} E_2[S_3](\omega)\, dP(\omega) = \int_{A_{HH}} S_3(\omega)\, dP(\omega). \qquad (2.3.8)$$
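A quick numeric check of (2.3.4) and the discrete form of (2.3.8); the parameters u = 2, d = 1/2, S₀ = 4, p = 1/2 are illustrative values in the spirit of the textbook's binomial examples:

```python
from itertools import product

u, d, S0, p = 2.0, 0.5, 4.0, 0.5   # assumed illustrative parameters
q = 1 - p

def S(n, w):                       # stock price after the first n tosses
    return S0 * u ** w[:n].count('H') * d ** w[:n].count('T')

def P(w):                          # probability of a full path of 3 tosses
    return p ** w.count('H') * q ** w.count('T')

omega = [''.join(w) for w in product('HT', repeat=3)]
A_HH = [w for w in omega if w[:2] == 'HH']

E2_S3_HH = p * S(3, 'HHH') + q * S(3, 'HHT')       # eq. (2.3.4)
lhs = E2_S3_HH * sum(P(w) for w in A_HH)           # E_2[S_3](HH) * P(A_HH)
rhs = sum(S(3, w) * P(w) for w in A_HH)            # sum over A_HH of S_3 * P
print(lhs, rhs)                                    # equal: partial averaging
```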
General Conditional Expectations
• Def 2.3.1.
Let (Ω, F, P) be a probability space, let G be a sub-σ-algebra of F, and let X be a r.v. that is either nonnegative or integrable. The conditional expectation of X given G, denoted E[X|G], is any r.v. that satisfies
(i) (Measurability) E[X|G] is G-measurable;
(ii) (Partial averaging)
$$\int_A E[X|G](\omega)\, dP(\omega) = \int_A X(\omega)\, dP(\omega) \quad \text{for all } A \in G.$$
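On a finite sample space, the definition also says how to construct E[X|G]: average X over each block of the partition generating G. A minimal sketch, assuming a fair three-toss coin space and G generated by the first toss:

```python
from itertools import product

omega = [''.join(w) for w in product('HT', repeat=3)]
p = 0.5
P = {w: p ** 3 for w in omega}             # fair coin: all paths equally likely
X = {w: w.count('H') for w in omega}       # example r.v.

# G generated by the first toss: partition {w1 = H}, {w1 = T}
blocks = [[w for w in omega if w[0] == a] for a in 'HT']

condE = {}
for block in blocks:
    avg = sum(X[w] * P[w] for w in block) / sum(P[w] for w in block)
    for w in block:
        condE[w] = avg                     # (i) constant on each block: G-measurable

# (ii) partial averaging on the generating blocks; it extends to every A in G
# since the other sets in G are unions of blocks and integrals add.
for block in blocks:
    lhs = sum(condE[w] * P[w] for w in block)
    rhs = sum(X[w] * P[w] for w in block)
    assert abs(lhs - rhs) < 1e-12
print(condE['HTT'], condE['THH'])          # 2.0 and 1.0
```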
Is E[X|G] unique?
• (See P.69)
Suppose Y and Z both satisfy conditions (i) and (ii) of Def 2.3.1. Since both Y and Z are G-measurable, their difference Y − Z is as well, and thus the set A = {Y − Z > 0} is in G. Partial averaging gives
$$\int_A Y(\omega)\, dP(\omega) = \int_A X(\omega)\, dP(\omega) = \int_A Z(\omega)\, dP(\omega),$$
and thus
$$\int_A (Y(\omega) - Z(\omega))\, dP(\omega) = 0.$$
The integrand is strictly positive on the set A, so the only way this equation can hold is for A to have probability zero (i.e., Y ≤ Z almost surely). We can reverse the roles of Y and Z in this argument and conclude that Z ≤ Y almost surely. Hence Y = Z almost surely.
General Conditional Expectations
Properties
• Theorem 2.3.2
Let (Ω, F, P) be a probability space and let G be a sub-σ-algebra of F.
(i) (Linearity of conditional expectation) If X and Y are integrable r.v.'s and c₁ and c₂ are constants, then
E[c₁X + c₂Y | G] = c₁E[X|G] + c₂E[Y|G].
(ii) (Taking out what is known) If X, Y, and XY are integrable r.v.'s and X is G-measurable, then
E[XY|G] = X E[Y|G].
General Conditional Expectations
Properties (conti.)
(iii) (Iterated conditioning) If H is a sub-σ-algebra of G and X is an integrable r.v., then
E[E[X|G] | H] = E[X|H].
(iv) (Independence) If X is integrable and independent of G, then
E[X|G] = E[X].
(v) (Conditional Jensen's inequality) If φ(x) is a convex function of a dummy variable x and X is integrable, then
E[φ(X)|G] ≥ φ(E[X|G]).
Proof: see Volume 1, P.30.
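These properties are easy to sanity-check by simulation. A sketch of (ii) and (v), under the assumption G = σ(Z) for a discrete r.v. Z, so that E[·|G] reduces to within-group averaging:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10 ** 6
Z = rng.integers(0, 4, size=n)     # conditioning r.v. with 4 values
U = rng.standard_normal(n)         # independent noise
X = Z.astype(float)                # X is sigma(Z)-measurable
Y = Z + U

def cond(V, Z):
    """E[V | sigma(Z)] by averaging V within each level of Z."""
    out = np.empty_like(V, dtype=float)
    for z in np.unique(Z):
        out[Z == z] = V[Z == z].mean()
    return out

# (ii) taking out what is known: E[XY|G] = X E[Y|G]
print(np.abs(cond(X * Y, Z) - X * cond(Y, Z)).max())   # ~0

# (v) conditional Jensen with phi(x) = x**2: E[Y^2|G] >= (E[Y|G])^2
print((cond(Y ** 2, Z) >= cond(Y, Z) ** 2).all())      # True
```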
Example 2.3.3. (P.73)
• Let X and Y be a pair of jointly normal random variables. Define W = Y − (ρσ₂/σ₁)X, so that X and W are independent; we know W is normal with mean μ₃ = μ₂ − (ρσ₂/σ₁)μ₁ and variance σ₃² = σ₂²(1 − ρ²). Let us take the conditioning σ-algebra to be G = σ(X), and estimate Y based on X.
So Y = (ρσ₂/σ₁)X + W, and
$$E[Y|X] = \frac{\rho\sigma_2}{\sigma_1}X + E[W] = \frac{\rho\sigma_2}{\sigma_1}(X - \mu_1) + \mu_2,$$
$$Y - E[Y|X] = W - E[W].$$
(The error is random, with expected value zero, and is independent of the estimate E[Y|X].)
• In general, the error and the conditioning r.v. are uncorrelated, but not necessarily independent.
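A Monte Carlo illustration of this example; the parameter values μ₁ = 1, μ₂ = 2, σ₁ = 1, σ₂ = 1.5, ρ = 0.6 are assumptions for the demo:

```python
import numpy as np

rng = np.random.default_rng(1)
mu1, mu2, s1, s2, rho = 1.0, 2.0, 1.0, 1.5, 0.6   # assumed demo parameters
n = 10 ** 6

X = mu1 + s1 * rng.standard_normal(n)
W = (mu2 - rho * s2 / s1 * mu1) + s2 * np.sqrt(1 - rho ** 2) * rng.standard_normal(n)
Y = rho * s2 / s1 * X + W              # (X, Y) jointly normal, W independent of X

est = rho * s2 / s1 * (X - mu1) + mu2  # claimed E[Y|X]
err = Y - est                          # = W - E[W]

print(err.mean())                      # ~0: the error has expected value zero
print(np.corrcoef(err, est)[0, 1])     # ~0: error uncorrelated with the estimate
```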
Lemma 2.3.4. (Independence)
• Let (Ω, F, P) be a probability space, and let G be a sub-σ-algebra of F. Suppose the r.v.'s X₁, ..., X_K are G-measurable and the r.v.'s Y₁, ..., Y_L are independent of G. Let f(x₁, ..., x_K, y₁, ..., y_L) be a function of the dummy variables x₁, ..., x_K and y₁, ..., y_L, and define
$$g(x_1, \dots, x_K) = E\,f(x_1, \dots, x_K, Y_1, \dots, Y_L).$$
Then
$$E[f(X_1, \dots, X_K, Y_1, \dots, Y_L) \mid G] = g(X_1, \dots, X_K).$$
Example 2.3.3. (conti.) (P.73)
• Estimate some function f(x, y) of the r.v.'s X and Y, based on knowledge of X. By Lemma 2.3.4, define
$$g(x) = E\,f\!\left(x, \frac{\rho\sigma_2}{\sigma_1}x + W\right).$$
Then
$$E[f(X, Y) \mid X] = g(X).$$
Our final answer is random, but σ(X)-measurable.
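For instance (a worked case, not on the slide), take f(x, y) = xy. Then

$$g(x) = E\!\left[x\!\left(\frac{\rho\sigma_2}{\sigma_1}x + W\right)\right] = \frac{\rho\sigma_2}{\sigma_1}x^2 + x\,E[W], \qquad E[XY \mid X] = g(X) = \frac{\rho\sigma_2}{\sigma_1}X^2 + X\,E[W],$$

which agrees with Theorem 2.3.2(ii): E[XY|X] = X E[Y|X] = X((ρσ₂/σ₁)X + E[W]).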
Martingale
• Def 2.3.5.
Let (Ω, F, P) be a probability space, let T be a fixed positive number, and let F(t), 0 ≤ t ≤ T, be a filtration of sub-σ-algebras of F. Consider an adapted stochastic process M(t), 0 ≤ t ≤ T.
(i) If E[M(t)|F(s)] = M(s) for all 0 ≤ s ≤ t ≤ T, we say this process is a martingale. It has no tendency to rise or fall.
(ii) If E[M(t)|F(s)] ≥ M(s) for all 0 ≤ s ≤ t ≤ T, we say this process is a submartingale. It has no tendency to fall; it may have a tendency to rise.
(iii) If E[M(t)|F(s)] ≤ M(s) for all 0 ≤ s ≤ t ≤ T, we say this process is a supermartingale. It has no tendency to rise; it may have a tendency to fall.
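In the binomial model this is a one-line computation: E[S_{n+1}|F(n)] = (pu + qd)S_n, so the stock price is a martingale iff pu + qd = 1, a submartingale if pu + qd > 1, and a supermartingale if pu + qd < 1. A minimal Python check (u = 2, d = 1/2, S₀ = 4 are assumed values, with p chosen so that pu + qd = 1):

```python
from itertools import product

u, d, S0 = 2.0, 0.5, 4.0
p = (1 - d) / (u - d)        # martingale choice (zero interest): p*u + q*d = 1
q = 1 - p

def S(w):
    return S0 * u ** w.count('H') * d ** w.count('T')

for w in [''.join(t) for t in product('HT', repeat=2)]:   # every time-2 node
    cond = p * S(w + 'H') + q * S(w + 'T')                # E[S_3 | F(2)] at that node
    print(w, cond, S(w), abs(cond - S(w)) < 1e-12)        # martingale: equal
```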
Markov process
• Def 2.3.6.
Continuing from Def 2.3.5, consider an adapted stochastic process X(t), 0 ≤ t ≤ T. Assume that for all 0 ≤ s ≤ t ≤ T and for every nonnegative, Borel-measurable function f, there is another Borel-measurable function g such that
$$E[f(X(t)) \mid F(s)] = g(X(s)).$$
Then we say that X is a Markov process.
In particular, when a function f satisfies
$$E[f(t, X(t)) \mid F(s)] = f(s, X(s)), \quad 0 \le s \le t \le T,$$
the process f(t, X(t)) is a martingale.
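A small check that the binomial stock price is Markov: computed path by path, E[f(S₃)|F(2)] takes the same value on the paths HT and TH, which share the same S₂. The payoff f below is an assumed example:

```python
from itertools import product

u, d, S0, p = 2.0, 0.5, 4.0, 0.5         # assumed illustrative parameters
q = 1 - p
f = lambda s: max(s - 5.0, 0.0)          # example payoff (call struck at 5)

def S(w):
    return S0 * u ** w.count('H') * d ** w.count('T')

cond = {w: p * f(S(w + 'H')) + q * f(S(w + 'T'))
        for w in (''.join(t) for t in product('HT', repeat=2))}
print(cond['HT'], cond['TH'])            # equal: both nodes have S_2 = u*d*S0
print(cond)                              # g(S_2): one value per stock price, not per path
```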
Thank you for listening!