TCOM 501:
Networking Theory & Fundamentals
Lecture 6
February 19, 2003
Prof. Yannis A. Korilis
6-2
Topics
Time-Reversal of Markov Chains
Reversibility
Truncating a Reversible Markov Chain
Burke’s Theorem
Queues in Tandem
6-3
Time-Reversed Markov Chains
{Xn: n=0,1,…} irreducible aperiodic Markov chain with
transition probabilities Pij
$\sum_{j \ge 0} P_{ij} = 1, \quad i = 0,1,\ldots$
Unique stationary distribution (πj > 0) if and only if:
$\pi_j = \sum_{i \ge 0} \pi_i P_{ij}, \quad j = 0,1,\ldots$
Process in steady state:
$\Pr\{X_n = j\} = \pi_j = \lim_{n \to \infty} \Pr\{X_n = j \mid X_0 = i\}$
Starts at n=-∞, that is {Xn: n = …,-1,0,1,…}
Choose initial state according to the stationary distribution
How does {Xn} look “reversed” in time?
6-4
Time-Reversed Markov Chains
Define Y_n = X_{τ−n}, for arbitrary τ > 0
{Yn} is the reversed process.
Proposition 1:
{Yn} is a Markov chain with transition probabilities:
$P^*_{ij} = \frac{\pi_j P_{ji}}{\pi_i}, \quad i, j = 0,1,\ldots$
{Yn} has the same stationary distribution πj as the forward chain {Xn}
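As a quick numerical illustration (my addition, not part of the original slides), the sketch below constructs the reversed chain of a small discrete-time Markov chain and checks both claims of Proposition 1; the 3-state matrix P is an arbitrary example.

```python
import numpy as np

# Arbitrary (hypothetical) 3-state transition matrix; rows sum to 1
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

# Stationary distribution: left eigenvector of P for eigenvalue 1
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1))])
pi = pi / pi.sum()

# Reversed-chain transition probabilities: P*_ij = pi_j * P_ji / pi_i
P_star = (P.T * pi) / pi[:, None]

print("rows of P* sum to 1:", np.allclose(P_star.sum(axis=1), 1.0))
print("pi is stationary for P*:", np.allclose(pi @ P_star, pi))
```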
6-5
Time-Reversed Markov Chains
Proof of Proposition 1:
$P^*_{ij} = P\{Y_m = j \mid Y_{m-1} = i, Y_{m-2} = i_2, \ldots, Y_{m-k} = i_k\}$
$= P\{X_{\tau-m} = j \mid X_{\tau-m+1} = i, X_{\tau-m+2} = i_2, \ldots, X_{\tau-m+k} = i_k\}$
Setting $n = \tau - m$:
$= P\{X_n = j \mid X_{n+1} = i, X_{n+2} = i_2, \ldots, X_{n+k} = i_k\}$
$= \dfrac{P\{X_n = j, X_{n+1} = i, X_{n+2} = i_2, \ldots, X_{n+k} = i_k\}}{P\{X_{n+1} = i, X_{n+2} = i_2, \ldots, X_{n+k} = i_k\}}$
$= \dfrac{P\{X_{n+2} = i_2, \ldots, X_{n+k} = i_k \mid X_n = j, X_{n+1} = i\}\, P\{X_n = j, X_{n+1} = i\}}{P\{X_{n+2} = i_2, \ldots, X_{n+k} = i_k \mid X_{n+1} = i\}\, P\{X_{n+1} = i\}}$
$= \dfrac{P\{X_n = j, X_{n+1} = i\}}{P\{X_{n+1} = i\}}$ (the two conditional probabilities are equal by the Markov property)
$= \dfrac{P\{X_{n+1} = i \mid X_n = j\}\, P\{X_n = j\}}{P\{X_{n+1} = i\}} = \dfrac{P_{ji}\, \pi_j}{\pi_i}$
Hence $P\{Y_m = j \mid Y_{m-1} = i, \ldots, Y_{m-k} = i_k\} = P\{Y_m = j \mid Y_{m-1} = i\} = P^*_{ij}$, so {Yn} is a Markov chain with the stated transition probabilities.
Moreover, {πj} is also the stationary distribution of the reversed chain:
$\sum_{i \ge 0} \pi_i P^*_{ij} = \sum_{i \ge 0} \pi_i \frac{\pi_j P_{ji}}{\pi_i} = \pi_j \sum_{i \ge 0} P_{ji} = \pi_j$
Reversibility
6-6
Stochastic process {X(t)} is called reversible if
(X(t1), X(t2),…, X(tn)) and (X(τ-t1), X(τ-t2),…, X(τ-tn))
have the same probability distribution, for all τ, t1,…, tn
Markov chain {Xn} is reversible if and only if the transition probabilities of the forward and reversed chains are equal, $P_{ij} = P^*_{ij}$,
or equivalently, if and only if
$\pi_i P_{ij} = \pi_j P_{ji}, \quad i, j = 0,1,\ldots$
Detailed Balance Equations ↔ Reversibility
Reversibility – Discrete-Time Chains
6-7
Theorem 1: If there exists a set of positive numbers {πj}, that
sum up to 1 and satisfy:
$\pi_i P_{ij} = \pi_j P_{ji}, \quad i, j = 0,1,\ldots$
Then:
1. {πj} is the unique stationary distribution
2. The Markov chain is reversible
Example: Discrete-time birth-death processes are reversible,
since they satisfy the DBE
6-8
Example: Birth-Death Process
[State transition diagram: birth-death chain with states 0, 1, 2, …, n, n+1, …; transition probabilities P01, P10, P00, …, P_{n,n+1}, P_{n+1,n}, P_{n,n}; cut between S = {0,…,n} and S^c]
One-dimensional Markov chain with transitions only
between neighboring states: Pij=0, if |i-j|>1
Detailed Balance Equations (DBE)
$\pi_n P_{n,n+1} = \pi_{n+1} P_{n+1,n}, \quad n = 0,1,\ldots$
Proof: GBE with S ={0,1,…,n} give:
$\sum_{j=0}^{n} \sum_{i=n+1}^{\infty} \pi_j P_{ji} = \sum_{j=0}^{n} \sum_{i=n+1}^{\infty} \pi_i P_{ij} \;\Rightarrow\; \pi_n P_{n,n+1} = \pi_{n+1} P_{n+1,n}$
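A small numerical check (my addition, with arbitrary illustrative probabilities): build a finite birth-death chain, compute {πn} from the DBE recursion, and confirm both the DBE and global balance.

```python
import numpy as np

# Hypothetical 5-state birth-death chain: up-probabilities p[n], down-probabilities q[n]
N = 5
p = np.array([0.4, 0.3, 0.3, 0.2, 0.0])   # P_{n,n+1}
q = np.array([0.0, 0.2, 0.3, 0.4, 0.5])   # P_{n,n-1}

P = np.zeros((N, N))
for n in range(N):
    if n + 1 < N: P[n, n + 1] = p[n]
    if n - 1 >= 0: P[n, n - 1] = q[n]
    P[n, n] = 1.0 - p[n] - q[n]            # self-loop keeps rows stochastic

# Stationary distribution via the DBE recursion pi_{n+1} = pi_n * P_{n,n+1} / P_{n+1,n}
pi = np.ones(N)
for n in range(N - 1):
    pi[n + 1] = pi[n] * P[n, n + 1] / P[n + 1, n]
pi /= pi.sum()

print("pi P = pi ?", np.allclose(pi @ P, pi))
dbe = [np.isclose(pi[n] * P[n, n + 1], pi[n + 1] * P[n + 1, n]) for n in range(N - 1)]
print("DBE hold?  ", all(dbe))
```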
Time-Reversed Markov Chains (Revisited)
6-9
Theorem 2: Irreducible Markov chain with transition probabilities Pij. If
there exist:
A set of transition probabilities Qij, with ∑j Qij=1, i ≥ 0, and
A set of positive numbers {πj}, that sum up to 1, such that
$\pi_i P_{ij} = \pi_j Q_{ji}, \quad i, j = 0,1,\ldots \qquad (1)$
Then:
Qij are the transition probabilities of the reversed chain, and
{πj} is the stationary distribution of the forward and the reversed chains
Remark: Use this result to find the stationary distribution by guessing the transition probabilities of the reversed chain – even if the process is not reversible
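A minimal checker for this recipe (my own sketch; P, Q and pi are supplied by the user) might look as follows; here it is exercised on a reversible birth-death example with Q = P.

```python
import numpy as np

def check_theorem_2(P, Q, pi, tol=1e-10):
    """Return True if (pi, Q) satisfy the hypotheses of Theorem 2 for chain P."""
    rows_ok = np.allclose(P.sum(axis=1), 1.0) and np.allclose(Q.sum(axis=1), 1.0)
    pi_ok = np.all(pi > 0) and np.isclose(pi.sum(), 1.0)
    # Condition (1): pi_i P_ij = pi_j Q_ji for all i, j
    cond1 = np.allclose(pi[:, None] * P, (pi[:, None] * Q).T, atol=tol)
    return rows_ok and pi_ok and cond1

# Sanity check on a reversible example, where the correct guess is Q = P
P = np.array([[0.7, 0.3, 0.0],
              [0.2, 0.5, 0.3],
              [0.0, 0.4, 0.6]])
pi = np.array([1.0, 1.5, 1.125]); pi /= pi.sum()
print(check_theorem_2(P, P, pi))          # True: hypotheses of Theorem 2 hold
print(np.allclose(pi @ P, pi))            # and indeed pi is stationary
```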
6-10
Continuous-Time Markov Chains
{X(t): -∞< t <∞} irreducible aperiodic Markov chain with
transition rates qij, i≠j
Unique stationary distribution (pi > 0) if and only if:
$p_j \sum_{i \ne j} q_{ji} = \sum_{i \ne j} p_i q_{ij}, \quad j = 0,1,\ldots$
Process in steady state – e.g., started at t =-∞:
$\Pr\{X(t) = j\} = p_j = \lim_{t \to \infty} \Pr\{X(t) = j \mid X(0) = i\}$
If {πj} is the stationary distribution of the embedded discrete-time chain:
$p_j = \frac{\pi_j/\nu_j}{\sum_i \pi_i/\nu_i}, \quad \nu_j \equiv \sum_{i \ne j} q_{ji}, \quad j = 0,1,\ldots$
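The relation between {pj} and the embedded-chain distribution {πj} can be verified numerically; the sketch below (my addition) uses an arbitrary 3-state rate matrix.

```python
import numpy as np

# Hypothetical transition rates q_ij (off-diagonal); diagonal chosen so rows sum to 0
Q = np.array([[-3.0,  2.0,  1.0],
              [ 1.0, -4.0,  3.0],
              [ 2.0,  2.0, -4.0]])
nu = -np.diag(Q)                            # total rate out of each state

# Stationary distribution of the CTMC: p Q = 0, sum(p) = 1
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
p, *_ = np.linalg.lstsq(A, b, rcond=None)

# Embedded (jump) chain: P_ij = q_ij / nu_i for i != j
P = Q / nu[:, None]
np.fill_diagonal(P, 0.0)
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1))]); pi /= pi.sum()

# p_j should equal (pi_j / nu_j) / sum_i (pi_i / nu_i)
p_from_pi = (pi / nu) / (pi / nu).sum()
print(np.allclose(p, p_from_pi))            # True
```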
6-11
Reversed Continuous-Time Markov Chains
Reversed chain {Y(t)}, with Y(t) = X(τ−t), for arbitrary τ > 0
Proposition 2:
1. {Y(t)} is a continuous-time Markov chain with transition rates:
$q^*_{ij} = \frac{p_j q_{ji}}{p_i}, \quad i, j = 0,1,\ldots,\; i \ne j$
2. {Y(t)} has the same stationary distribution {pj} as the forward chain
Remark: The transition rate out of state i in the reversed chain is equal to the transition rate out of state i in the forward chain:
$\sum_{j \ne i} q^*_{ij} = \sum_{j \ne i} \frac{p_j q_{ji}}{p_i} = \frac{p_i \sum_{j \ne i} q_{ij}}{p_i} = \sum_{j \ne i} q_{ij} = \nu_i, \quad i = 0,1,\ldots$
Reversibility – Continuous-Time Chains
6-12
Markov chain {X(t)} is reversible if and only if the transition rates of the forward and reversed chains are equal, $q_{ij} = q^*_{ij}$, or equivalently
$p_i q_{ij} = p_j q_{ji}, \quad i, j = 0,1,\ldots,\; i \ne j$
Detailed Balance Equations ↔ Reversibility
Theorem 3: If there exists a set of positive numbers {pj}, that sum up to 1
and satisfy:
$p_i q_{ij} = p_j q_{ji}, \quad i, j = 0,1,\ldots,\; i \ne j$
Then:
1. {pj} is the unique stationary distribution
2. The Markov chain is reversible
6-13
Example: Birth-Death Process
[State transition diagram: birth-death chain with states 0, 1, 2, …, n, n+1, …; birth rates λ0, λ1, …, λn and death rates μ1, μ2, …, μ_{n+1}; cut between S = {0,…,n} and S^c]
Transitions only between neighboring states
$q_{i,i+1} = \lambda_i, \quad q_{i,i-1} = \mu_i, \quad q_{ij} = 0 \;\text{ for } |i-j| > 1$
Detailed Balance Equations
$\lambda_n p_n = \mu_{n+1} p_{n+1}, \quad n = 0,1,\ldots$
Proof: GBE with S ={0,1,…,n} give:
$\sum_{j=0}^{n} \sum_{i=n+1}^{\infty} p_j q_{ji} = \sum_{j=0}^{n} \sum_{i=n+1}^{\infty} p_i q_{ij} \;\Rightarrow\; \lambda_n p_n = \mu_{n+1} p_{n+1}$
M/M/1, M/M/c, M/M/∞
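For instance, iterating the DBE gives p_{n+1} = (λn/μ_{n+1}) p_n; the sketch below (my addition, with illustrative rates and a truncated state space) applies this to the M/M/1 queue and recovers the geometric distribution.

```python
import numpy as np

lam, mu = 2.0, 5.0                  # illustrative arrival and service rates (rho < 1)
N = 60                              # truncation level for the numerical comparison

# DBE recursion: lambda p_n = mu p_{n+1}  =>  p_{n+1} = (lam/mu) p_n for M/M/1
p = np.ones(N)
for n in range(N - 1):
    p[n + 1] = p[n] * lam / mu
p /= p.sum()

rho = lam / mu
geometric = (1 - rho) * rho ** np.arange(N)
print(np.allclose(p, geometric, atol=1e-6))   # differs only by truncation error
```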
6-14
Reversed Continuous-Time Markov Chains (Revisited)
Theorem 4: Irreducible continuous-time Markov chain with transition rates
qij. If there exist:
A set of transition rates φij, with ∑j≠i φij=∑j≠i qij, i ≥ 0, and
A set of positive numbers {pj}, that sum up to 1, such that
$p_i \varphi_{ij} = p_j q_{ji}, \quad i, j = 0,1,\ldots,\; i \ne j$
Then:
φij are the transition rates of the reversed chain, and
{pj} is the stationary distribution of the forward and the reversed chains
Remark: Use this result to find the stationary distribution by guessing the transition rates of the reversed chain – even if the process is not reversible
6-15
Reversibility: Trees
[Tree diagram: states 0–7 as nodes, with arcs carrying rates q01/q10, q12/q21, q23/q32, q16/q61, q67/q76, etc.; the graph contains no loops]
Theorem 5:
For a Markov chain, form a graph where the states are the nodes and, for each qij > 0, there is a directed arc i→j
Assume the chain is irreducible, with transition rates that satisfy qij > 0 ↔ qji > 0
If the graph is a tree – it contains no loops – then the Markov chain is reversible
Remarks:
Sufficient condition for reversibility
Generalization of one-dimensional birth-death process
6-16
Kolmogorov’s Criterion (Discrete Chain)
Detailed balance equations determine whether a Markov chain is
reversible or not, based on stationary distribution and transition
probabilities
Should be able to derive a reversibility criterion based only on the
transition probabilities!
Theorem 6: A discrete-time Markov chain is reversible if and only if:
$P_{i_1 i_2} P_{i_2 i_3} \cdots P_{i_{n-1} i_n} P_{i_n i_1} = P_{i_1 i_n} P_{i_n i_{n-1}} \cdots P_{i_3 i_2} P_{i_2 i_1}$
for any finite sequence of states: i1, i2,…, in, and any n
Intuition: Probability of traversing any loop i1→i2→…→in→i1 is equal
to the probability of traversing the same loop in the reverse direction
i1→in→…→i2→i1
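A brute-force numerical check of the criterion over all loops up to a given length (my own sketch; the two 3-state matrices are illustrative – a reversible birth-death chain and a chain with a biased 3-cycle):

```python
import numpy as np
from itertools import product

def kolmogorov_ok(P, max_len=5, tol=1e-12):
    """Check P_{i1 i2}...P_{in i1} == P_{i1 in}...P_{i2 i1} for all loops up to max_len."""
    S = range(P.shape[0])
    for n in range(2, max_len + 1):
        for states in product(S, repeat=n):
            loop = list(states) + [states[0]]
            fwd = np.prod([P[loop[k], loop[k + 1]] for k in range(n)])
            bwd = np.prod([P[loop[k + 1], loop[k]] for k in range(n)])
            if abs(fwd - bwd) > tol:
                return False
    return True

# Reversible birth-death example vs. a chain with a "one-way" 3-cycle bias
P_rev = np.array([[0.7, 0.3, 0.0],
                  [0.2, 0.5, 0.3],
                  [0.0, 0.4, 0.6]])
P_cyc = np.array([[0.1, 0.6, 0.3],
                  [0.3, 0.1, 0.6],
                  [0.6, 0.3, 0.1]])
print(kolmogorov_ok(P_rev), kolmogorov_ok(P_cyc))   # True False
```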
6-17
Kolmogorov’s Criterion (Continuous Chain)
Detailed balance equations determine whether a Markov chain is
reversible or not, based on stationary distribution and transition rates
Should be able to derive a reversibility criterion based only on the
transition rates!
Theorem 7: A continuous-time Markov chain is reversible if and only if:
$q_{i_1 i_2} q_{i_2 i_3} \cdots q_{i_{n-1} i_n} q_{i_n i_1} = q_{i_1 i_n} q_{i_n i_{n-1}} \cdots q_{i_3 i_2} q_{i_2 i_1}$
for any finite sequence of states: i1, i2,…, in, and any n
Intuition: Product of transition rates along any loop i1→i2→…→in→i1
is equal to the product of transition rates along the same loop traversed
in the reverse direction i1→in→…→i2→i1
6-18
Kolmogorov’s Criterion (proof)
Proof of Theorem 6:
Necessary: If the chain is reversible, the DBE hold:
$\pi_{i_1} P_{i_1 i_2} = \pi_{i_2} P_{i_2 i_1}$
$\pi_{i_2} P_{i_2 i_3} = \pi_{i_3} P_{i_3 i_2}$
$\vdots$
$\pi_{i_{n-1}} P_{i_{n-1} i_n} = \pi_{i_n} P_{i_n i_{n-1}}$
$\pi_{i_n} P_{i_n i_1} = \pi_{i_1} P_{i_1 i_n}$
Multiplying these equations and canceling the π's:
$P_{i_1 i_2} P_{i_2 i_3} \cdots P_{i_{n-1} i_n} P_{i_n i_1} = P_{i_1 i_n} P_{i_n i_{n-1}} \cdots P_{i_3 i_2} P_{i_2 i_1}$
Sufficient: Fixing two states $i_1 = i$ and $i_n = j$ and summing over all intermediate states $i_2, \ldots, i_{n-1}$ we have
$\sum P_{i i_2} P_{i_2 i_3} \cdots P_{i_{n-1} j}\, P_{ji} = P_{ij} \sum P_{j i_{n-1}} \cdots P_{i_3 i_2} P_{i_2 i} \;\Rightarrow\; P^{\,n-1}_{ij} P_{ji} = P_{ij} P^{\,n-1}_{ji}$
where $P^{\,n-1}_{ij}$ denotes the (n−1)-step transition probability. Taking the limit n→∞:
$\lim_{n\to\infty} P^{\,n-1}_{ij}\, P_{ji} = P_{ij} \lim_{n\to\infty} P^{\,n-1}_{ji} \;\Rightarrow\; \pi_j P_{ji} = P_{ij}\, \pi_i$
6-19
Example: M/M/2 Queue with Heterogeneous Servers
[State transition diagram: states 0, 1A, 1B, 2, 3, …; rates αλ (0→1A), (1−α)λ (0→1B), μA (1A→0), μB (1B→0), λ (1A→2, 1B→2, and n→n+1 for n ≥ 2), μB (2→1A), μA (2→1B), μA+μB (n→n−1 for n ≥ 3)]
M/M/2 queue. Servers A and B with service rates μA and μB respectively.
When the system empty, arrivals go to A with probability α and to B with
probability 1-α. Otherwise, the head of the queue takes the first free server.
Need to keep track of which server is busy when there is 1 customer in the
system. Denote the two possible states by: 1A and 1B.
Reversibility: we only need to check the loop 0→1A→2→1B→0:
$q_{0,1A}\, q_{1A,2}\, q_{2,1B}\, q_{1B,0} = (\alpha\lambda)\cdot\lambda\cdot\mu_A\cdot\mu_B$
$q_{0,1B}\, q_{1B,2}\, q_{2,1A}\, q_{1A,0} = ((1-\alpha)\lambda)\cdot\lambda\cdot\mu_B\cdot\mu_A$
Reversible if and only if α=1/2.
What happens when μA=μB, and α≠1/2?
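The loop products can be checked directly; a tiny sketch (my addition, with assumed parameter values):

```python
lam, muA, muB = 1.0, 2.0, 3.0        # illustrative rates, not from the slides

for alpha in (0.3, 0.5):
    forward  = (alpha * lam) * lam * muA * muB          # q_{0,1A} q_{1A,2} q_{2,1B} q_{1B,0}
    backward = ((1 - alpha) * lam) * lam * muB * muA    # q_{0,1B} q_{1B,2} q_{2,1A} q_{1A,0}
    print(alpha, forward == backward)                   # equal only when alpha == 0.5
```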
6-20
Example: M/M/2 Queue with Heterogeneous Servers
[Same state transition diagram, with cuts S1, S2, S3 indicating the balance equations used below]
$p_n = p_2 \left(\frac{\lambda}{\mu_A + \mu_B}\right)^{n-2}, \quad n = 2, 3, \ldots$
Balance equations (across the cuts shown):
$\lambda p_0 = \mu_A p_{1A} + \mu_B p_{1B}$
$(\mu_A + \mu_B)\, p_2 = \lambda\, (p_{1A} + p_{1B})$
$(\lambda + \mu_A)\, p_{1A} = \alpha\lambda\, p_0 + \mu_B\, p_2$
Solving:
$p_{1A} = p_0\, \frac{\lambda\,(\lambda + \alpha(\mu_A + \mu_B))}{\mu_A\,(2\lambda + \mu_A + \mu_B)}, \qquad p_{1B} = p_0\, \frac{\lambda\,(\lambda + (1-\alpha)(\mu_A + \mu_B))}{\mu_B\,(2\lambda + \mu_A + \mu_B)}$
$p_2 = p_0\, \frac{\lambda^2\,(\lambda + \alpha\mu_B + (1-\alpha)\mu_A)}{\mu_A \mu_B\,(2\lambda + \mu_A + \mu_B)}$
Normalizing: $p_0 + p_{1A} + p_{1B} + \sum_{n=2}^{\infty} p_n = 1$, with $\sum_{n \ge 2} p_n = p_2\, \frac{\mu_A + \mu_B}{\mu_A + \mu_B - \lambda}$, determines $p_0$.
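As a sanity check (my addition), one can solve the global balance equations numerically on a truncated version of the chain and compare with the closed-form expressions above; the parameter values and truncation level N are assumptions of the sketch.

```python
import numpy as np

lam, muA, muB, alpha = 1.0, 2.0, 3.0, 0.3     # illustrative parameters
N = 40                                        # truncation level (assumed large enough)

# States: index 0 -> empty, 1 -> 1A, 2 -> 1B, 3 -> 2 customers, 4 -> 3 customers, ...
dim = N + 2
Q = np.zeros((dim, dim))
Q[0, 1] = alpha * lam;        Q[0, 2] = (1 - alpha) * lam
Q[1, 0] = muA;                Q[1, 3] = lam
Q[2, 0] = muB;                Q[2, 3] = lam
Q[3, 1] = muB                 # server B finishes -> remaining customer is at A
Q[3, 2] = muA                 # server A finishes -> remaining customer is at B
for k in range(3, dim):
    if k + 1 < dim: Q[k, k + 1] = lam
    if k >= 4:      Q[k, k - 1] = muA + muB
np.fill_diagonal(Q, -Q.sum(axis=1))

A = np.vstack([Q.T, np.ones(dim)])
b = np.zeros(dim + 1); b[-1] = 1.0
p, *_ = np.linalg.lstsq(A, b, rcond=None)
p0, p1A, p1B, p2 = p[0], p[1], p[2], p[3]

# Compare with the closed-form expressions above
den = 2 * lam + muA + muB
print(np.isclose(p1A, p0 * lam * (lam + alpha * (muA + muB)) / (muA * den)))
print(np.isclose(p1B, p0 * lam * (lam + (1 - alpha) * (muA + muB)) / (muB * den)))
print(np.isclose(p2,  p0 * lam**2 * (lam + alpha * muB + (1 - alpha) * muA) / (muA * muB * den)))
```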
6-21
Multidimensional Markov Chains
Theorem 8: If
{X1(t)}, {X2(t)}: independent Markov chains
{Xi(t)}: reversible
and {X(t)}, with X(t)=(X1(t), X2(t)), is the corresponding vector-valued stochastic process, then:
{X(t)} is a Markov chain
{X(t)} is reversible
Multidimensional Chains:
Queueing system with two classes of customers, each having its own
stochastic properties – track the number of customers from each class
Study the “joint” evolution of two queueing systems – track the number
of customers in each system
Example: Two Independent M/M/1 Queues
6-22
Two independent M/M/1 queues. The arrival and service rates at queue i
are λi and μi respectively. Assume ρi= λi/μi<1.
{(N1(t), N2(t))} is a Markov chain.
Probability of n1 customers at queue 1, and n2 at queue 2, at steady-state
$p(n_1, n_2) = (1-\rho_1)\rho_1^{n_1}\,(1-\rho_2)\rho_2^{n_2} = p_1(n_1)\, p_2(n_2)$
“Product-form” distribution
Generalizes for any number K of independent queues, M/M/1, M/M/c,
or M/M/∞. If pi(ni) is the stationary distribution of queue i:
$p(n_1, n_2, \ldots, n_K) = p_1(n_1)\, p_2(n_2) \cdots p_K(n_K)$
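A quick numerical illustration (my addition, with assumed rates and a truncated grid): the product of the two geometric marginals satisfies the per-coordinate detailed balance equations shown on the next slide.

```python
import numpy as np

lam1, mu1 = 1.0, 3.0
lam2, mu2 = 2.0, 5.0
rho1, rho2 = lam1 / mu1, lam2 / mu2
N = 30                                      # truncation of the grid for the check

n = np.arange(N)
p1 = (1 - rho1) * rho1 ** n                 # marginal of queue 1
p2 = (1 - rho2) * rho2 ** n                 # marginal of queue 2
p = np.outer(p1, p2)                        # product-form joint distribution

# Detailed balance in each coordinate: lam1 p(n1-1, n2) = mu1 p(n1, n2), etc.
ok1 = np.allclose(lam1 * p[:-1, :], mu1 * p[1:, :])
ok2 = np.allclose(lam2 * p[:, :-1], mu2 * p[:, 1:])
print(ok1 and ok2)                          # True
```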
Example: Two Independent M/M/1 Queues
6-23
Stationary distribution:
$p(n_1, n_2) = \Big(1 - \frac{\lambda_1}{\mu_1}\Big)\Big(\frac{\lambda_1}{\mu_1}\Big)^{n_1}\,\Big(1 - \frac{\lambda_2}{\mu_2}\Big)\Big(\frac{\lambda_2}{\mu_2}\Big)^{n_2}$
Detailed Balance Equations:
$\lambda_1\, p(n_1 - 1, n_2) = \mu_1\, p(n_1, n_2)$
$\lambda_2\, p(n_1, n_2 - 1) = \mu_2\, p(n_1, n_2)$
Verify that the Markov chain is reversible – Kolmogorov criterion
[Two-dimensional state transition diagram: states (n1, n2) = 00, 01, …, 33, with horizontal transitions at rates λ1, μ1 and vertical transitions at rates λ2, μ2]
6-24
Truncation of a Reversible Markov Chain
Theorem 9: {X(t)} reversible Markov process with state space S, and stationary distribution {pj: j ∈ S}. Truncated to a set E ⊂ S, such that the resulting chain {Y(t)} is irreducible. Then, {Y(t)} is reversible and has stationary distribution:
$\tilde{p}_j = \frac{p_j}{\sum_{k \in E} p_k}, \quad j \in E$
Remark: This is the conditional probability that, in steady-state, the
original process is at state j, given that it is somewhere in E
Proof: Verify that:
$\tilde{p}_j\, q_{ji} = \tilde{p}_i\, q_{ij} \;\Leftrightarrow\; \frac{p_j}{\sum_{k \in E} p_k}\, q_{ji} = \frac{p_i}{\sum_{k \in E} p_k}\, q_{ij} \;\Leftrightarrow\; p_j q_{ji} = p_i q_{ij}, \quad i, j \in E,\; i \ne j$
which holds because the original chain is reversible; moreover
$\sum_{j \in E} \tilde{p}_j = \sum_{j \in E} \frac{p_j}{\sum_{k \in E} p_k} = 1$
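A small numerical check of Theorem 9 (my own sketch, with illustrative rates): truncate a reversible birth-death chain to a set E and verify that renormalizing {pj} over E gives the stationary distribution of the truncated chain.

```python
import numpy as np

# Reversible example: M/M/1-type birth-death chain on {0,...,9} (illustrative rates)
lam, mu, n_states = 1.0, 2.0, 10
Q = np.zeros((n_states, n_states))
for k in range(n_states - 1):
    Q[k, k + 1] = lam
    Q[k + 1, k] = mu
np.fill_diagonal(Q, -Q.sum(axis=1))

def stationary(Q):
    d = Q.shape[0]
    A = np.vstack([Q.T, np.ones(d)])
    b = np.zeros(d + 1); b[-1] = 1.0
    return np.linalg.lstsq(A, b, rcond=None)[0]

p = stationary(Q)

# Truncate to E = {0,...,3}: keep only transitions inside E, recompute the diagonal
E = [0, 1, 2, 3]
Q_E = Q[np.ix_(E, E)].copy()
np.fill_diagonal(Q_E, 0.0)
np.fill_diagonal(Q_E, -Q_E.sum(axis=1))
p_E = stationary(Q_E)

print(np.allclose(p_E, p[E] / p[E].sum()))   # True, as Theorem 9 predicts
```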
Example: Two Queues with Joint Buffer
6-25
The two independent M/M/1 queues of
the previous example share a common
buffer of size B – arrival that finds B
customers waiting is blocked
State space restricted to
$E = \{(n_1, n_2) : (n_1 - 1)^+ + (n_2 - 1)^+ \le B\}$
Distribution of truncated chain:
$p(n_1, n_2) = p(0,0)\, \rho_1^{n_1} \rho_2^{n_2}, \quad (n_1, n_2) \in E$
Normalizing:
$p(0,0) = \Big[ \sum_{(n_1, n_2) \in E} \rho_1^{n_1} \rho_2^{n_2} \Big]^{-1}$
Theorem specifies joint distribution up to the normalization constant
Calculation of normalization constant is often tedious
[State diagram for B = 2: two-dimensional grid of states (n1, n2) restricted to E, with horizontal transitions at rates λ1, μ1 and vertical transitions at rates λ2, μ2]
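For small B the normalization constant can be computed by direct enumeration; a sketch (my addition, with assumed rates and B = 2 as in the diagram):

```python
import numpy as np

lam1, mu1, lam2, mu2, B = 1.0, 3.0, 2.0, 5.0, 2
rho1, rho2 = lam1 / mu1, lam2 / mu2

def waiting(n):                      # customers waiting (not in service) in an M/M/1 queue
    return max(n - 1, 0)

# Enumerate E = {(n1, n2): waiting(n1) + waiting(n2) <= B}
N_max = B + 1                        # at most B+1 customers can be in either queue
E = [(n1, n2) for n1 in range(N_max + 1) for n2 in range(N_max + 1)
     if waiting(n1) + waiting(n2) <= B]

G = sum(rho1 ** n1 * rho2 ** n2 for n1, n2 in E)
p00 = 1.0 / G
print(f"|E| = {len(E)}, p(0,0) = {p00:.4f}")
print("probabilities sum to 1:",
      np.isclose(sum(p00 * rho1 ** n1 * rho2 ** n2 for n1, n2 in E), 1.0))
```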
6-26
Burke’s Theorem
{X(t)} birth-death process with stationary distribution {pj}
Arrival epochs: points of increase for {X(t)}
Departure epochs: points of decrease for {X(t)}
{X(t)} completely determines the corresponding arrival and departure
processes
[Sample path of {X(t)}: arrival epochs are the points of increase, departure epochs the points of decrease]
6-27
Burke’s Theorem
Poisson arrival process: λj=λ, for all j
Birth-death process called a (λ, μj)-process
Examples: M/M/1, M/M/c, M/M/∞ queues
Poisson arrivals → LAA:
For any time t, future arrivals are independent of {X(s): s≤t}
(λ, μj)-process at steady state is reversible: forward and reversed chains
are stochastically identical
Arrival processes of the forward and reversed chains are stochastically
identical
Arrival process of the reversed chain is Poisson with rate λ
The arrival epochs of the reversed chain are the departure epochs of the
forward chain
Departure process of the forward chain is Poisson with rate λ
6-28
Burke’s Theorem
Reversed chain: arrivals after time t are independent of the chain
history up to time t (LAA)
Forward chain: departures prior to time t and future of the chain
{X(s): s≥t} are independent
6-29
Burke’s Theorem
Theorem 10: Consider an M/M/1, M/M/c, or M/M/∞ system with
arrival rate λ. Suppose that the system starts at steady-state. Then:
1. The departure process is Poisson with rate λ
2. At each time t, the number of customers in the system is
independent of the departure times prior to t
Fundamental result for study of networks of M/M/* queues, where
output process from one queue is the input process of another
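Burke's theorem can be illustrated by simulation (my own sketch with assumed parameters): simulate an M/M/1 queue past a warm-up period and check that the inter-departure times have mean 1/λ, as a Poisson departure process with rate λ requires (a full verification would also test the independence properties).

```python
import random

random.seed(0)
lam, mu, T_warmup, T_end = 1.0, 2.0, 1_000.0, 200_000.0

t, n = 0.0, 0
next_arrival = random.expovariate(lam)
next_departure = float("inf")
departures = []

while t < T_end:
    if next_arrival < next_departure:          # arrival event
        t = next_arrival
        n += 1
        if n == 1:
            next_departure = t + random.expovariate(mu)
        next_arrival = t + random.expovariate(lam)
    else:                                      # departure event
        t = next_departure
        n -= 1
        if t > T_warmup:
            departures.append(t)
        next_departure = t + random.expovariate(mu) if n > 0 else float("inf")

gaps = [b - a for a, b in zip(departures, departures[1:])]
mean_gap = sum(gaps) / len(gaps)
print(f"mean inter-departure time ~ {mean_gap:.3f} (Burke's theorem predicts 1/lambda = {1/lam:.3f})")
```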
6-30
Single-Server Queues in Tandem
[Diagram: Poisson arrivals with rate λ → Station 1 (service rate μ1) → Station 2 (service rate μ2)]
Customers arrive at queue 1 according to Poisson process with rate λ.
Service times exponential with mean 1/μi. Assume service times of a
customer in the two queues are independent.
Assume ρi=λ/μi<1
What is the joint stationary distribution of N1 and N2 – number of
customers in each queue?
Result: in steady state the queues are independent and
$p(n_1, n_2) = (1-\rho_1)\rho_1^{n_1}\,(1-\rho_2)\rho_2^{n_2} = p_1(n_1)\, p_2(n_2)$
6-31
Single-Server Queues in Tandem
Q1 is an M/M/1 queue. At steady state its departure process is Poisson with rate λ. Thus Q2 is also an M/M/1 queue.
Marginal stationary distributions:
$p_1(n_1) = (1-\rho_1)\rho_1^{n_1}, \quad n_1 = 0,1,\ldots$
$p_2(n_2) = (1-\rho_2)\rho_2^{n_2}, \quad n_2 = 0,1,\ldots$
To complete the proof: establish independence at steady state
Q1 at steady state: at time t, N1(t) is independent of departures prior to t,
which are arrivals at Q2 up to t. Thus N1(t) and N2(t) independent:
$P\{N_1(t) = n_1, N_2(t) = n_2\} = P\{N_1(t) = n_1\}\, P\{N_2(t) = n_2\} = p_1(n_1)\, P\{N_2(t) = n_2\}$
Letting t→∞, the joint stationary distribution
$p(n_1, n_2) = p_1(n_1)\, p_2(n_2) = (1-\rho_1)\rho_1^{n_1}\,(1-\rho_2)\rho_2^{n_2}$
7-32
Queues in Tandem
Theorem 11: Network consisting of K single-server queues in tandem.
Service times at queue i exponential with rate μi, independent of service
times at any queue j≠i. Arrivals at the first queue are Poisson with rate
λ. The stationary distribution of the network is:
$p(n_1, \ldots, n_K) = \prod_{i=1}^{K} (1-\rho_i)\rho_i^{n_i}, \quad n_i = 0,1,\ldots;\; i = 1,\ldots,K$
At steady state the queues are independent; the distribution of queue i is
that of an isolated M/M/1 queue with arrival and service rates λ and μi
$p_i(n_i) = (1-\rho_i)\rho_i^{n_i}, \quad n_i = 0,1,\ldots$
Are the queues independent if not in steady state? Are the stochastic processes {N1(t)} and {N2(t)} independent?
7-33
Queues in Tandem: State-Dependent Service Rates
Theorem 12: Network consisting of K queues in tandem. Service times
at queue i exponential with rate μi(ni) when there are ni customers in the
queue – independent of service times at any queue j≠i. Arrivals at the
first queue are Poisson with rate λ. The stationary distribution of the
network is:
$p(n_1, \ldots, n_K) = \prod_{i=1}^{K} p_i(n_i), \quad n_i = 0,1,\ldots;\; i = 1,\ldots,K$
where {pi(ni)} is the stationary distribution of queue i in isolation with
Poisson arrivals with rate λ
Examples: ./M/c and ./M/∞ queues
If queue i is ./M/∞, then:
$p_i(n_i) = \frac{(\lambda/\mu_i)^{n_i}}{n_i!}\, e^{-\lambda/\mu_i}, \quad n_i = 0,1,\ldots$