TCOM 501: Lecture 9


TCOM 501: Networking Theory & Fundamentals
Lectures 9 & 10
M/G/1 Queue
Prof. Yannis A. Korilis
Topics

- M/G/1 Queue
- Pollaczek-Khinchin (P-K) Formula
- Embedded Markov Chain Observed at Departure Epochs
- Pollaczek-Khinchin Transform Equation
- Queues with Vacations
- Priority Queueing
M/G/1 Queue

- Arrival process: Poisson with rate $\lambda$
- Single server, infinite waiting room
- Service times: independent, identically distributed, following a general distribution, and independent of the arrival process
- Main results:
  - Determine the average time a customer spends in the queue waiting for service (Pollaczek-Khinchin Formula)
  - Calculation of the stationary distribution, for special cases only
M/G/1 Queue – Notation

- $W_i$: waiting time of customer $i$
- $X_i$: service time of customer $i$
- $Q_i$: number of customers waiting in queue (excluding the one in service) upon arrival of customer $i$
- $R_i$: residual service time seen by customer $i$, i.e., the time until the customer found in service by customer $i$ completes service
- $A_i$: number of arrivals during the service time $X_i$ of customer $i$

Service times:

- $X_1, X_2, \dots$: independent, identically distributed RVs
- Independent of the inter-arrival times
- Follow a general distribution, characterized by its pdf $f_X(x)$ or cdf $F_X(x)$
- Common mean $E[X] = 1/\mu$
- Common second moment $E[X^2]$
M/G/1 Queue

State representation:

- $\{N(t) : t \ge 0\}$ is not a Markov process – the time spent at each state is not exponential
- $R(t)$: the time until the customer that is in service at time $t$ completes service
- $\{(N(t), R(t)) : t \ge 0\}$ is a continuous-time Markov process, but the state space is not a countable set
- Finding the stationary distribution can be a rather challenging task

Goals:

- Calculate the average number of customers and the average time-delay without first calculating the stationary distribution – the Pollaczek-Khinchin (P-K) Formula:

$$E[W] = \frac{\lambda E[X^2]}{2(1 - \lambda E[X])}$$

- To find the stationary distribution, we use the embedded Markov chain, defined by observing $N(t)$ at departure epochs only, together with transform methods
A Result from Probability Theory

Proposition: Sum of a Random Number of Random Variables

- $N$: random variable taking values $0, 1, 2, \dots$, with mean $E[N]$
- $X_1, X_2, \dots, X_N$: iid random variables with common mean $E[X]$, independent of $N$

Then:

$$E[X_1 + \dots + X_N] = E[X] \cdot E[N]$$

Proof: Given that $N = n$, the expected value of the sum is

$$E\left[\sum_{j=1}^{N} X_j \;\middle|\; N = n\right] = E\left[\sum_{j=1}^{n} X_j\right] = \sum_{j=1}^{n} E[X_j] = n E[X]$$

Then:

$$E\left[\sum_{j=1}^{N} X_j\right] = \sum_{n=1}^{\infty} E\left[\sum_{j=1}^{N} X_j \;\middle|\; N = n\right] P\{N = n\} = \sum_{n=1}^{\infty} n E[X] P\{N = n\} = E[X] \sum_{n=1}^{\infty} n P\{N = n\} = E[X]\,E[N]$$
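The proposition is easy to check numerically. Below is a minimal Monte Carlo sketch; the choices $N \sim$ Poisson(3) and $X_i \sim$ Exponential with mean 2 are illustrative assumptions, not part of the lecture.

```python
# Monte Carlo check of E[X_1 + ... + X_N] = E[X] * E[N].
# Assumed example: N ~ Poisson(3), X_i ~ Exponential(mean 2), N independent of the X_i.
import numpy as np

rng = np.random.default_rng(0)
trials = 100_000
counts = rng.poisson(3.0, size=trials)                       # samples of N, E[N] = 3
sums = [rng.exponential(2.0, size=n).sum() for n in counts]  # X_1 + ... + X_N
print(np.mean(sums))   # close to E[X] * E[N] = 2 * 3 = 6
```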
Pollaczek-Khinchin Formula

- Assume FCFS discipline
- Waiting time for customer $i$:

$$W_i = R_i + X_{i-1} + X_{i-2} + \dots + X_{i-Q_i} = R_i + \sum_{j=i-Q_i}^{i-1} X_j$$

- Take the expectation on both sides and let $i \to \infty$, assuming the limits exist. Using the proposition of the previous slide for the random sum:

$$E[W_i] = E[R_i] + E\left[\sum_{j=i-Q_i}^{i-1} X_j\right] = E[R_i] + E[X]\,E[Q_i] \;\Rightarrow\; E[W] = E[R] + E[X]\,E[Q]$$

- The averages $E[Q]$, $E[R]$ in the above equation are those seen by an arriving customer
- Poisson arrivals and lack of anticipation: the averages seen by an arriving customer are equal to the averages seen by an outside observer – PASTA property
- Little's theorem for the waiting area only:

$$E[Q] = \lambda E[W]$$

- $\rho = \lambda E[X] = \lambda/\mu$: utilization factor = proportion of time the server is busy, $\rho = 1 - p_0$. Then:

$$E[W] = E[R] + \lambda E[X]\,E[W] = E[R] + \rho E[W] \;\Rightarrow\; E[W] = \frac{E[R]}{1-\rho}$$

- It remains to calculate the average residual time $E[R] = \lim_{i\to\infty} E[R_i]$
Average Residual Time

[Figure: sample path of the residual service time $R(t)$ – a sequence of isosceles right triangles with heights $X_1, X_2, \dots, X_{D(t)}$]

- Graphical calculation of the long-term average of the residual time
- Time-average residual time over $[0, t]$: $\frac{1}{t}\int_0^t R(s)\,ds$
- Consider a time $t$ with $R(t) = 0$. Let $D(t)$ be the number of departures in $[0, t]$ and assume that $R(0) = 0$. Since each service triangle has area $X_i^2/2$, from the figure we see that:

$$\frac{1}{t}\int_0^t R(s)\,ds = \frac{1}{t}\sum_{i=1}^{D(t)} \frac{X_i^2}{2} = \frac{1}{2}\cdot\frac{D(t)}{t}\cdot\frac{\sum_{i=1}^{D(t)} X_i^2}{D(t)}$$

$$\lim_{t\to\infty}\frac{1}{t}\int_0^t R(s)\,ds = \frac{1}{2}\lim_{t\to\infty}\frac{D(t)}{t}\cdot\lim_{t\to\infty}\frac{\sum_{i=1}^{D(t)} X_i^2}{D(t)}$$

- Ergodicity: long-term time averages = steady-state averages (with probability 1):

$$E[R] = \lim_{i\to\infty} E[R_i] = \lim_{t\to\infty}\frac{1}{t}\int_0^t R(s)\,ds$$
Average Residual Time (cont.)

$$\lim_{t\to\infty}\frac{1}{t}\int_0^t R(s)\,ds = \frac{1}{2}\lim_{t\to\infty}\frac{D(t)}{t}\cdot\lim_{t\to\infty}\frac{\sum_{i=1}^{D(t)} X_i^2}{D(t)}$$

- $\lim_{t\to\infty} D(t)/t$ is the long-term average departure rate, which should be equal to the long-term average arrival rate; long-term averages = steady-state averages (with probability 1):

$$\lim_{t\to\infty}\frac{D(t)}{t} = \lambda$$

- Law of large numbers: since $D(t)\to\infty$ as $t\to\infty$,

$$\lim_{t\to\infty}\frac{\sum_{i=1}^{D(t)} X_i^2}{D(t)} = \lim_{n\to\infty}\frac{\sum_{i=1}^{n} X_i^2}{n} = E[X^2]$$

- Average residual time:

$$E[R] = \frac{1}{2}\lambda E[X^2]$$

- P-K Formula:

$$E[W] = \frac{E[R]}{1-\rho} = \frac{\lambda E[X^2]}{2(1-\rho)}$$
P-K Formula

$$E[W] = \frac{E[R]}{1-\rho} = \frac{\lambda E[X^2]}{2(1-\rho)}$$

- Average time a customer spends in the system:

$$E[T] = E[X] + E[W] = \frac{1}{\mu} + \frac{\lambda E[X^2]}{2(1-\rho)}$$

- Average number of customers waiting for service:

$$E[Q] = \lambda E[W] = \frac{\lambda^2 E[X^2]}{2(1-\rho)}$$

- Average number of customers in the system (waiting or in service):

$$E[N] = \lambda E[T] = \rho + \frac{\lambda^2 E[X^2]}{2(1-\rho)}$$

The averages $E[W]$, $E[T]$, $E[Q]$, $E[N]$ depend on the service-time distribution only through its first two moments.
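The formula is easy to sanity-check by simulating the FCFS waiting times through Lindley's recursion $W_{i+1} = \max(0, W_i + X_i - \tau_{i+1})$, where $\tau_{i+1}$ is an interarrival time. A minimal sketch, assuming $\lambda = 0.5$ and Uniform(0, 1.5) service times (illustrative choices, not from the lecture):

```python
# Monte Carlo check of the P-K formula via Lindley's recursion for FCFS M/G/1.
# Assumed setup: lambda = 0.5, service times ~ Uniform(0, 1.5).
import random

random.seed(1)
lam, n = 0.5, 500_000
W, total = 0.0, 0.0
for _ in range(n):
    total += W                      # waiting time of the current customer
    X = random.uniform(0.0, 1.5)    # its service time
    tau = random.expovariate(lam)   # interarrival time to the next customer
    W = max(0.0, W + X - tau)       # Lindley recursion

EX, EX2 = 0.75, 1.5 ** 2 / 3        # first two moments of Uniform(0, 1.5)
rho = lam * EX                      # utilization = 0.375
print("simulated E[W]:", total / n)
print("P-K formula:   ", lam * EX2 / (2 * (1 - rho)))   # = 0.3
```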
P-K Formula: Examples

M/D/1 Queue: deterministic service times, all equal to $1/\mu$

$$E[X] = \frac{1}{\mu}, \qquad E[X^2] = \frac{1}{\mu^2}$$

$$E[W] = \frac{\lambda E[X^2]}{2(1-\rho)} = \frac{\rho}{2\mu(1-\rho)}, \qquad E[Q] = \frac{\lambda^2 E[X^2]}{2(1-\rho)} = \frac{\rho^2}{2(1-\rho)}$$

$$E[T] = \frac{1}{\mu} + \frac{\lambda E[X^2]}{2(1-\rho)} = \frac{2-\rho}{2\mu(1-\rho)}, \qquad E[N] = \lambda E[T] = \frac{\rho(2-\rho)}{2(1-\rho)}$$

M/M/1 Queue: exponential service times with mean $1/\mu$

$$E[X] = \frac{1}{\mu}, \qquad E[X^2] = \frac{2}{\mu^2}$$

$$E[W] = \frac{\lambda E[X^2]}{2(1-\rho)} = \frac{\rho}{\mu(1-\rho)}, \qquad E[Q] = \frac{\lambda^2 E[X^2]}{2(1-\rho)} = \frac{\rho^2}{1-\rho}$$

$$E[T] = \frac{1}{\mu} + \frac{\lambda E[X^2]}{2(1-\rho)} = \frac{1}{\mu - \lambda}, \qquad E[N] = \lambda E[T] = \frac{\rho}{1-\rho}$$
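The two examples differ only in $E[X^2]$, so a tiny helper that evaluates the four P-K averages makes the comparison concrete ($\lambda = 0.8$, $\mu = 1$ are assumed values). At the same utilization, the M/D/1 queue has exactly half the M/M/1 waiting time.

```python
# Evaluate the P-K averages from the arrival rate and the service moments.
def pk_averages(lam, EX, EX2):
    rho = lam * EX
    W = lam * EX2 / (2 * (1 - rho))   # P-K formula
    return {"W": W, "T": EX + W, "Q": lam * W, "N": rho + lam * W}

lam, mu = 0.8, 1.0                    # assumed rates, rho = 0.8
print("M/D/1:", pk_averages(lam, 1 / mu, 1 / mu**2))  # E[X^2] = 1/mu^2
print("M/M/1:", pk_averages(lam, 1 / mu, 2 / mu**2))  # E[X^2] = 2/mu^2
```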
Distribution Upon Arrival or Departure

Theorem 1: For an M/G/1 queue at steady-state, the distribution of customers seen by an arriving customer is the same as that left behind by a departing customer.

Proof: Customers arrive one at a time and depart one at a time.

- $A(t), D(t)$: number of arrivals and departures (respectively) in $(0, t)$
- $U_n(t)$: number of $(n, n+1)$ transitions in $(0, t)$ = number of arrivals that find the system at state $n$
- $V_n(t)$: number of $(n+1, n)$ transitions in $(0, t)$ = number of departures that leave the system at state $n$
- $U_n(t)$ and $V_n(t)$ differ by at most 1 [when a $(n, n+1)$ transition occurs, another $(n, n+1)$ transition can occur only if the state has moved back to $n$, i.e., after a $(n+1, n)$ transition has occurred]
- Stationary probability that an arriving customer finds the system at state $n$:

$$\alpha_n = \lim_{t\to\infty} P\{N(t) = n \mid \text{arrival at } t^+\}$$

  $\alpha_n$ is the proportion of arrivals that find the system at state $n$:

$$\alpha_n = \lim_{t\to\infty}\frac{U_n(t)}{A(t)}$$

- Similarly, the stationary probability that a departing customer leaves the system at state $n$ is:

$$\beta_n = \lim_{t\to\infty}\frac{V_n(t)}{D(t)}$$

- Noting that $\lim_{t\to\infty} A(t)/t = \lim_{t\to\infty} D(t)/t = \lambda$, we have:

$$\alpha_n = \lim_{t\to\infty}\frac{U_n(t)}{A(t)} = \lim_{t\to\infty}\frac{U_n(t)/t}{A(t)/t} = \lim_{t\to\infty}\frac{V_n(t)/t}{D(t)/t} = \lim_{t\to\infty}\frac{V_n(t)}{D(t)} = \beta_n$$
Distribution Upon Arrival or Departure (cont.)

Theorem 2: For an M/G/1 queue at steady-state, the probability that an arriving customer finds $n$ customers in the system is equal to the proportion of time that there are $n$ customers in the system. Therefore, the distribution seen by an arriving customer is identical to the stationary distribution.

Proof: This is the PASTA theorem, which applies because of:

- Poisson arrivals
- Lack of anticipation: future arrivals are independent of the current state $N(t)$

Theorem 3: For an M/G/1 queue at steady-state, the system appears statistically identical to an arriving and a departing customer. Both an arriving and a departing customer, at steady-state, see a system that is statistically identical to the one seen by an observer looking at the system at an arbitrary time.

Analysis of the M/G/1 Queue:

- Consider the embedded Markov chain resulting from observing the system at departure epochs
- At steady-state, the embedded Markov chain and $\{N(t)\}$ are statistically identical
- The stationary distribution $p_n$ of $\{N(t)\}$ is equal to the stationary distribution of the embedded Markov chain
Embedded Markov Chain

- $s_j$: time of the $j$th departure
- $L_j = N(s_j)$: number of customers left behind by the $j$th departing customer

Show that $\{L_j : j \ge 1\}$ is a Markov chain:

- If $L_{j-1} \ge 1$: customer $j$ enters service immediately at time $s_{j-1}$. Then:

$$L_j = L_{j-1} - 1 + A_j, \quad \text{if } L_{j-1} \ge 1$$

- If $L_{j-1} = 0$: customer $j$ arrives after time $s_{j-1}$ and departs at time $s_j$. Then:

$$L_j = A_j, \quad \text{if } L_{j-1} = 0$$

- Combining the above:

$$L_j = L_{j-1} + A_j - 1\{L_{j-1} > 0\}$$

- $A_j$: number of arrivals during the service time $X_j$:

$$P\{A_j = k\} = \int_0^\infty P\{A_j = k \mid X_j = t\}\, f_X(t)\,dt = \frac{1}{k!}\int_0^\infty e^{-\lambda t}(\lambda t)^k f_X(t)\,dt$$

- $A_1, A_2, \dots$ are independent – they count arrivals in disjoint intervals
- $L_j$ depends on the past only through $L_{j-1}$. Thus, $\{L_j : j \ge 1\}$ is a Markov chain.
Number of Arrivals During a Service Time

$A_1, A_2, \dots$ are iid. Drop the index $j$ – equivalent to considering the system at steady-state:

$$a_k = P\{A = k\} = \int_0^\infty P\{A = k \mid X = t\}\, f_X(t)\,dt = \frac{1}{k!}\int_0^\infty e^{-\lambda t}(\lambda t)^k f_X(t)\,dt, \quad k = 0, 1, \dots$$

Find the first two moments of $A$.

Proposition: For the number of arrivals $A$ during service time $X$, we have:

$$E[A] = \lambda E[X] = \rho$$
$$E[A^2] = \lambda^2 E[X^2] + \lambda E[X]$$

Proof: Given $X = t$, the number of arrivals $A$ follows the Poisson distribution with parameter $\lambda t$:

$$E[A] = \int_0^\infty E[A \mid X = t]\, f_X(t)\,dt = \int_0^\infty (\lambda t) f_X(t)\,dt = \lambda\int_0^\infty t f_X(t)\,dt = \lambda E[X]$$

$$E[A^2] = \int_0^\infty E[A^2 \mid X = t]\, f_X(t)\,dt = \int_0^\infty (\lambda^2 t^2 + \lambda t) f_X(t)\,dt = \lambda^2\int_0^\infty t^2 f_X(t)\,dt + \lambda\int_0^\infty t f_X(t)\,dt = \lambda^2 E[X^2] + \lambda E[X]$$

Lemma: Let $Y$ be a RV following the Poisson distribution with parameter $\alpha > 0$. Then:

$$E[Y] = \sum_{k=1}^{\infty} k\,\frac{e^{-\alpha}\alpha^k}{k!} = \alpha, \qquad E[Y^2] = \sum_{k=1}^{\infty} k^2\,\frac{e^{-\alpha}\alpha^k}{k!} = \alpha^2 + \alpha$$
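A quick simulation confirms both moment identities for a non-exponential service distribution; the choices $\lambda = 1.5$ and Uniform(0, 2) service times are assumptions for illustration.

```python
# Check E[A] = lam*E[X] and E[A^2] = lam^2*E[X^2] + lam*E[X] by simulation,
# where A is the number of Poisson arrivals during a service time X.
import numpy as np

rng = np.random.default_rng(2)
lam, n = 1.5, 200_000
X = rng.uniform(0.0, 2.0, size=n)      # assumed service times ~ Uniform(0, 2)
A = rng.poisson(lam * X)               # A | X = t is Poisson(lam * t)
EX, EX2 = 1.0, 4.0 / 3                 # first two moments of Uniform(0, 2)
print(A.mean(), lam * EX)                                       # both ~ 1.5
print((A.astype(float) ** 2).mean(), lam**2 * EX2 + lam * EX)   # both ~ 4.5
```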
Embedded Markov Chain

[Figure: state transition diagram of the embedded Markov chain over states $0, 1, 2, \dots$]

Transition probabilities: $P_{ij} = P\{L_{n+1} = j \mid L_n = i\}$

$$P_{0j} = a_j, \quad j = 0, 1, \dots$$

$$P_{ij} = \begin{cases} a_{j-i+1}, & j \ge i-1 \\ 0, & j < i-1 \end{cases} \qquad i \ge 1$$

Stationary distribution: $\pi_j = \lim_{n\to\infty} P\{L_n = j\}$

$$\pi_j = \pi_0 a_j + \sum_{i=1}^{j+1} \pi_i a_{j-i+1}, \; j \ge 0, \qquad \text{or} \qquad \pi = \pi P, \quad P = \begin{pmatrix} a_0 & a_1 & a_2 & a_3 & \cdots \\ a_0 & a_1 & a_2 & a_3 & \cdots \\ 0 & a_0 & a_1 & a_2 & \cdots \\ 0 & 0 & a_0 & a_1 & \cdots \\ \vdots & \vdots & \vdots & \vdots & \ddots \end{pmatrix}$$

- Unique solution: $\pi_j$ is the fraction of departing customers that leave $j$ customers behind
- From Theorem 3: $\pi_j$ is also the proportion of time that there are $j$ customers in the system
Calculating the Stationary Distribution

- Applying Little's theorem to the server alone, the proportion of time that the server is busy is:

$$1 - \pi_0 = \lambda E[X] = \rho \;\Rightarrow\; \pi_0 = 1 - \rho$$

- The stationary distribution can then be calculated iteratively from the balance equations (see the sketch after this slide):

$$\pi_0 = \pi_0 a_0 + \pi_1 a_0$$
$$\pi_1 = \pi_0 a_1 + \pi_1 a_1 + \pi_2 a_0$$
$$\vdots$$

- The iterative calculation might be prohibitively involved
- Often, we want to find only the first few moments of the distribution, e.g., $E[N]$ and $E[N^2]$
- We will present a general methodology based on z-transforms that can be used to:
  1. Find the moments of the stationary distribution without calculating the distribution itself
  2. Find the stationary distribution, in special cases
  3. Derive approximations of the stationary distribution
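As a sketch of the iteration: for exponential service times, $a_k = (1-\sigma)\sigma^k$ with $\sigma = \lambda/(\lambda+\mu)$ (a geometric pmf), and the recursion recovers the familiar M/M/1 distribution $\pi_n = (1-\rho)\rho^n$. The rates $\lambda = 0.8$, $\mu = 1$ are assumed values.

```python
# Iterative solution of pi = pi*P for the embedded chain, assuming exponential
# service times, for which a_k = (1 - s) * s**k with s = lam / (lam + mu).
lam, mu = 0.8, 1.0
rho = lam / mu
s = lam / (lam + mu)
a = lambda k: (1 - s) * s**k

K = 15
pi = [0.0] * K
pi[0] = 1 - rho                       # from 1 - pi_0 = rho
pi[1] = pi[0] * (1 - a(0)) / a(0)     # from the j = 0 balance equation
for j in range(1, K - 1):
    # pi_j = pi_0*a_j + sum_{i=1}^{j+1} pi_i*a_{j-i+1}; solve for pi_{j+1}
    acc = pi[0] * a(j) + sum(pi[i] * a(j - i + 1) for i in range(1, j + 1))
    pi[j + 1] = (pi[j] - acc) / a(0)

print([round(p, 4) for p in pi[:6]])                     # iterative result
print([round((1 - rho) * rho**n, 4) for n in range(6)])  # M/M/1 check
```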
Moment Generating Functions

Definition: the moment generating function of a random variable $X$ is, for any $t \in \mathbb{R}$:

$$M_X(t) = E[e^{tX}] = \begin{cases} \displaystyle\int_{-\infty}^{\infty} e^{tx} f(x)\,dx, & X \text{ continuous} \\ \displaystyle\sum_j e^{t x_j} P\{X = x_j\}, & X \text{ discrete} \end{cases}$$

Theorem 1: If the moment generating function $M_X(t)$ exists and is finite in some neighborhood of $t = 0$, it determines the distribution (pdf or pmf) of $X$ uniquely.

Theorem 2: For any positive integer $n$:

1. $\dfrac{d^n}{dt^n} M_X(t) = E[X^n e^{tX}]$
2. $\dfrac{d^n}{dt^n} M_X(t)\Big|_{t=0} = E[X^n]$

Theorem 3: If $X$ and $Y$ are independent random variables:

$$M_{X+Y}(t) = M_X(t)\,M_Y(t)$$
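A symbolic sketch of Theorem 2, differentiating the MGF $M_X(t) = \lambda/(\lambda - t)$ of an exponential random variable (the MGF itself is listed in the table two slides below):

```python
# Symbolic check of Theorem 2 for an Exponential(lambda) random variable.
import sympy as sp

t, lam = sp.symbols("t lambda", positive=True)
M = lam / (lam - t)                  # MGF of Exponential(lambda)
print(sp.diff(M, t, 1).subs(t, 0))   # E[X]   -> 1/lambda
print(sp.diff(M, t, 2).subs(t, 0))   # E[X^2] -> 2/lambda**2
```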
Z-Transforms of Discrete Random Variables

- For a discrete random variable, the moment generating function is a polynomial in $e^t$.
- It is more convenient to set $z = e^t$ and define the z-transform (or characteristic function):

$$G_X(z) = E[z^X] = \sum_{x_j} z^{x_j} P\{X = x_j\}$$

- Let $X$ be a discrete random variable taking values $0, 1, 2, \dots$, and let $p_n = P\{X = n\}$. The z-transform is well-defined for $|z| \le 1$:

$$G_X(z) = p_0 + z p_1 + z^2 p_2 + z^3 p_3 + \dots = \sum_{n=0}^{\infty} p_n z^n$$

- The z-transform uniquely determines the distribution of $X$
- If $X$ and $Y$ are independent random variables: $G_{X+Y}(z) = G_X(z)\,G_Y(z)$

Calculating factorial moments:

$$\lim_{z\to 1^-} G_X'(z) = \lim_{z\to 1^-} \sum_{n=1}^{\infty} n p_n z^{n-1} = \sum_{n=1}^{\infty} n p_n = E[X]$$

$$\lim_{z\to 1^-} G_X''(z) = \lim_{z\to 1^-} \sum_{n=2}^{\infty} n(n-1) p_n z^{n-2} = \sum_{n=2}^{\infty} n(n-1) p_n = E[X(X-1)]$$

Higher factorial moments can be calculated similarly.
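A short symbolic sketch of the factorial-moment rule, using the z-transform $G_X(z) = e^{\lambda(z-1)}$ of a Poisson($\lambda$) random variable as the example:

```python
# Factorial moments from a z-transform: G'(1) = E[X], G''(1) = E[X(X-1)].
import sympy as sp

z, lam = sp.symbols("z lambda", positive=True)
G = sp.exp(lam * (z - 1))                 # z-transform of Poisson(lambda)
EX = sp.diff(G, z, 1).subs(z, 1)          # -> lambda
EXX = sp.diff(G, z, 2).subs(z, 1)         # E[X(X-1)] -> lambda**2
print(EX, EXX, sp.simplify(EXX + EX - EX**2))   # variance -> lambda
```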
Continuous Random Variables

| Distribution (parameters) | Prob. Density Fun. $f_X(x)$ | Moment Gen. Fun. $M_X(t)$ | Mean $E[X]$ | Variance $\mathrm{Var}(X)$ |
|---|---|---|---|---|
| Uniform over $(a, b)$ | $\frac{1}{b-a}$, $\;a < x < b$ | $\frac{e^{tb} - e^{ta}}{t(b-a)}$ | $\frac{a+b}{2}$ | $\frac{(b-a)^2}{12}$ |
| Exponential $\lambda$ | $\lambda e^{-\lambda x}$, $\;x \ge 0$ | $\frac{\lambda}{\lambda - t}$ | $\frac{1}{\lambda}$ | $\frac{1}{\lambda^2}$ |
| Normal $(\mu, \sigma^2)$ | $\frac{1}{\sqrt{2\pi}\,\sigma} e^{-(x-\mu)^2/2\sigma^2}$, $\;-\infty < x < \infty$ | $e^{\mu t + \sigma^2 t^2/2}$ | $\mu$ | $\sigma^2$ |
Discrete Random Variables

| Distribution (parameters) | Prob. Mass Fun. $P\{X = k\}$ | Moment Gen. Fun. $M_X(t)$ | Mean $E[X]$ | Variance $\mathrm{Var}(X)$ |
|---|---|---|---|---|
| Binomial $(n, p)$ | $\binom{n}{k} p^k (1-p)^{n-k}$, $\;k = 0, 1, \dots, n$ | $(p e^t + 1 - p)^n$ | $np$ | $np(1-p)$ |
| Geometric $p$ | $(1-p)^{k-1} p$, $\;k = 1, 2, \dots$ | $\frac{p e^t}{1 - (1-p)e^t}$ | $\frac{1}{p}$ | $\frac{1-p}{p^2}$ |
| Negative Bin. $(r, p)$ | $\binom{k-1}{r-1} p^r (1-p)^{k-r}$, $\;k = r, r+1, \dots$ | $\left[\frac{p e^t}{1 - (1-p)e^t}\right]^r$ | $\frac{r}{p}$ | $\frac{r(1-p)}{p^2}$ |
| Poisson $\lambda$ | $e^{-\lambda}\frac{\lambda^k}{k!}$, $\;k = 0, 1, \dots$ | $e^{\lambda(e^t - 1)}$ | $\lambda$ | $\lambda$ |
P-K Transform Equation

We have established:

$$L_j = L_{j-1} - 1\{L_{j-1} > 0\} + A_j = (L_{j-1} - 1)^+ + A_j$$

Let $\pi_n = \lim_{j\to\infty} P\{L_j = n\}$ be the stationary distribution and $G_L(z) = \sum_{n=0}^{\infty} \pi_n z^n$ its z-transform. Noting that $(L_{j-1}-1)^+$ and $A_j$ are independent, we have:

$$E[z^{L_j}] = E[z^{(L_{j-1}-1)^+}]\,E[z^{A_j}]$$

At steady-state, $L_j$ and $L_{j-1}$ are statistically identical, with pmf $\pi_n$. Therefore:

$$E[z^{L_j}] = E[z^{L_{j-1}}] = G_L(z)$$

Moreover: $E[z^{A_j}] = G_A(z) = \sum_{n=0}^{\infty} a_n z^n$

Let $X$ be a discrete random variable taking values $0, 1, 2, \dots$, and let $p_n = P\{X = n\}$. Then:

$$E[z^{(X-1)^+}] = p_0 + p_1 + z p_2 + z^2 p_3 + \dots = p_0 + z^{-1}(E[z^X] - p_0)$$

Therefore:

$$E[z^{(L_{j-1}-1)^+}] = \pi_0 + z^{-1}(E[z^{L_{j-1}}] - \pi_0) = \pi_0 + z^{-1}(G_L(z) - \pi_0)$$

Then:

$$G_L(z) = \left[\pi_0 + z^{-1}(G_L(z) - \pi_0)\right] G_A(z) \;\Rightarrow\; G_L(z) = \frac{\pi_0 (z-1) G_A(z)}{z - G_A(z)}$$
P-K Transform Equation (cont.)

The probability $\pi_0$ can be calculated by requiring $\lim_{z\to 1} G_L(z) = \sum_{n=0}^{\infty} \pi_n = 1$. Using L'Hospital's rule:

$$1 = \lim_{z\to 1}\frac{\pi_0\left[G_A(z) + (z-1)G_A'(z)\right]}{1 - G_A'(z)} = \frac{\pi_0}{1 - E[A]} \;\Rightarrow\; \pi_0 = 1 - E[A]$$

Recall that $E[A] = \lambda E[X] = \rho$. For $\pi_0 > 0$, we must have $\rho = E[A] < 1$. Finally:

$$G_A(z) = \sum_{k=0}^{\infty} a_k z^k = \sum_{k=0}^{\infty} z^k \int_0^\infty e^{-\lambda x}\frac{(\lambda x)^k}{k!} f_X(x)\,dx = \int_0^\infty e^{-\lambda x}\sum_{k=0}^{\infty}\frac{(\lambda x z)^k}{k!} f_X(x)\,dx = \int_0^\infty e^{\lambda(z-1)x} f_X(x)\,dx = M_X(\lambda(z-1))$$

where $M_X(t) = E[e^{tX}] = \int_0^\infty e^{tx} f_X(x)\,dx$ is the moment generating function of the service time $X$.

At steady-state, the number of customers left behind by a departing customer and the number of customers in the system are statistically identical, i.e., $\{L_j\}$ and $\{N(t)\}$ have the same pmf. Concluding:

$$G_N(z) = G_L(z) = \frac{(1-\rho)(z-1)G_A(z)}{z - G_A(z)} = \frac{(1-\rho)(z-1)M_X(\lambda(z-1))}{z - M_X(\lambda(z-1))}$$
P-K Transform Equation (cont.)

Example 1: M/M/1 Queue. $X$ is exponentially distributed with mean $1/\mu$:

$$M_X(t) = E[e^{tX}] = \frac{\mu}{\mu - t}$$

Then, the z-transform of the number of arrivals during a service time is:

$$G_A(z) = M_X(\lambda(z-1)) = \frac{\mu}{\mu - \lambda(z-1)} = \frac{\mu}{\mu + \lambda - \lambda z}$$

The P-K transform equation then gives:

$$G_N(z) = \frac{(1-\rho)(z-1)G_A(z)}{z - G_A(z)} = \frac{(1-\rho)(z-1)\,\mu}{z(\mu + \lambda - \lambda z) - \mu} = \frac{1-\rho}{1-\rho z}$$

(the denominator factors as $(z-1)(\mu - \lambda z)$). For $|z| < 1/\rho$:

$$\frac{1}{1-\rho z} = 1 + \rho z + \rho^2 z^2 + \dots$$

Then:

$$G_N(z) = \sum_{n=0}^{\infty} (1-\rho)\rho^n z^n$$

Therefore:

$$p_n = (1-\rho)\rho^n, \quad n \ge 0$$
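The expansion step can be verified mechanically; the sketch below expands $G_N(z)$ for an assumed $\rho = 3/4$ and reads off the coefficients $p_n = (1-\rho)\rho^n$:

```python
# Series expansion of the M/M/1 transform G_N(z) = (1 - rho)/(1 - rho*z).
import sympy as sp

z = sp.symbols("z")
rho = sp.Rational(3, 4)                        # assumed utilization
GN = (1 - rho) / (1 - rho * z)
print(sp.series(GN, z, 0, 5))                  # 1/4 + 3*z/16 + 9*z**2/64 + ...
print([(1 - rho) * rho**n for n in range(5)])  # matches p_n = (1 - rho)*rho**n
```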
Expansion in Partial Fractions

Assume that the z-transform is of the form:

$$G(z) = \frac{U(z)}{V(z)}$$

with $U(z)$ and $V(z)$ polynomials without common roots. Let $z_1, \dots, z_m$ be the roots of $V(z)$, assumed distinct. Then:

$$V(z) = (z - z_1)(z - z_2)\cdots(z - z_m)$$

Expansion of $G(z)$ in partial fractions:

$$G(z) = \frac{\lambda_1}{z_1 - z} + \frac{\lambda_2}{z_2 - z} + \dots + \frac{\lambda_m}{z_m - z}$$

Given such an expansion, for $|z| < |z_k|$:

$$\frac{1}{z_k - z} = \frac{1}{z_k}\cdot\frac{1}{1 - z/z_k} = \frac{1}{z_k}\left[1 + \frac{z}{z_k} + \left(\frac{z}{z_k}\right)^2 + \dots\right]$$

Then:

$$G(z) = \sum_{k=1}^{m}\frac{\lambda_k}{z_k}\cdot\frac{1}{1 - z/z_k} = \sum_{k=1}^{m}\frac{\lambda_k}{z_k}\sum_{n=0}^{\infty}\left(\frac{z}{z_k}\right)^n = \sum_{n=0}^{\infty}\left[\sum_{k=1}^{m}\frac{\lambda_k}{z_k^{n+1}}\right] z^n$$

Therefore:

$$p_n = \sum_{k=1}^{m}\frac{\lambda_k}{z_k^{n+1}}$$
Expansion in Partial Fractions (cont.)

$$G(z) = \frac{U(z)}{V(z)}, \qquad V(z) = (z - z_1)(z - z_2)\cdots(z - z_m)$$

Expansion of $G(z)$ in partial fractions:

$$G(z) = \frac{\lambda_1}{z_1 - z} + \frac{\lambda_2}{z_2 - z} + \dots + \frac{\lambda_m}{z_m - z}$$

Determining the coefficients of the partial fractions:

$$\lambda_1 = \lim_{z\to z_1}(z_1 - z)\,G(z)$$

Note that:

$$\lim_{z\to z_1}(z_1 - z)\,G(z) = -\frac{U(z_1)}{(z_1 - z_2)\cdots(z_1 - z_m)} = -\frac{U(z_1)}{V'(z_1)}$$

Therefore, the coefficients can be determined as:

$$\lambda_k = \lim_{z\to z_k}(z_k - z)\,G(z) = -\frac{U(z_k)}{V'(z_k)}$$
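A worked sketch with an assumed transform $G(z) = 2/((2-z)(3-z))$, which has roots $z_1 = 2$, $z_2 = 3$ and is normalized so that $G(1) = 1$:

```python
# Partial-fraction inversion of an assumed transform G(z) = 2/((2-z)(3-z)).
import sympy as sp

z, n = sp.symbols("z n")
G = 2 / ((2 - z) * (3 - z))
print(sp.apart(G, z))               # equivalent to 2/(2 - z) - 2/(3 - z)
# lambda_1 = 2 at z_1 = 2 and lambda_2 = -2 at z_2 = 3, so:
pn = 2 / sp.Integer(2) ** (n + 1) - 2 / sp.Integer(3) ** (n + 1)
print([pn.subs(n, k) for k in range(4)])   # 1/3, 5/18, 19/108, ...
print(sp.series(G, z, 0, 4))               # cross-check the coefficients
```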
M/G/1 Queue with Priority Classes

- M/G/1 system with arriving customers divided into $c$ priority classes
- Class 1 has the highest priority, class 2 the second highest, and so on down to class $c$, the lowest priority class
- Class $k$ customers arrive according to a Poisson process with rate $\lambda_k$
- Service times of class $k$ customers are iid, following a general distribution with mean $\overline{X}_k = 1/\mu_k$ and second moment $\overline{X_k^2}$
- The arrival processes are independent of each other and independent of the service times
- $\rho_k = \lambda_k \overline{X}_k = \lambda_k/\mu_k$: utilization factor for class $k$
- $W_k$: average queueing time (at steady-state) of class $k$ customers
- Preemptive or non-preemptive priority discipline

Goal: develop a formula that gives the average queueing time for each priority class.
Non-Preemptive Priority

- Non-preemptive: the service of a customer completes uninterrupted, even if customers of higher priority arrive in the meantime
- A separate queue is maintained for each class; each time the server becomes free, the first customer in the highest-priority non-empty queue enters service
- Under a non-preemptive policy, the mean residual service time $R$ seen by an arriving customer is the same for all priority classes

Priority class 1:

- Queueing time of a class 1 customer = residual service time + time required to serve all class 1 customers found in the queue upon arrival
- Similarly to the derivation of the P-K formula, this implies:

$$W_1 = R + \frac{1}{\mu_1} Q_1$$

- Little's formula: $Q_1 = \lambda_1 W_1$
- Combining the two:

$$W_1 = \frac{R}{1 - \rho_1}$$
Non-Preemptive Priority (cont.)

Priority class 2: the queueing time of a class 2 customer is the sum of the following:

1. The residual service time
2. The time to serve all class 1 customers found in the queue upon arrival
3. The time to serve all class 2 customers found in the queue upon arrival
4. The time to serve all class 1 customers that arrive while the customer waits in the queue

Focusing on the averages of these times:

$$W_2 = R + \frac{1}{\mu_1}Q_1 + \frac{1}{\mu_2}Q_2 + \frac{1}{\mu_1}\lambda_1 W_2 = R + \rho_1 W_1 + \rho_2 W_2 + \rho_1 W_2$$

Solving for $W_2$ and using the expression for $W_1$:

$$W_2 = \frac{R}{(1-\rho_1)(1-\rho_1-\rho_2)}$$
Non-Preemptive Priority (cont.)

Priority class $k$: using induction,

$$W_k = \frac{R}{(1-\rho_1-\dots-\rho_{k-1})(1-\rho_1-\dots-\rho_{k-1}-\rho_k)}$$

- Mean residual service time: using the graphical method developed in the proof of the P-K formula, one can show:

$$R = \frac{1}{2}\sum_{k=1}^{c}\lambda_k \overline{X_k^2}$$

- Average time a class $k$ customer spends in the system:

$$T_k = \frac{1}{\mu_k} + W_k$$

- Using Little's formula we obtain the average number of customers in the system for each class; averaging over all customers, the average time-delay per customer is:

$$T = \frac{\lambda_1 T_1 + \dots + \lambda_c T_c}{\lambda_1 + \dots + \lambda_c}$$
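A direct evaluation of these formulas; the rates and service moments for $c = 3$ classes below are assumed for illustration.

```python
# Non-preemptive priority M/G/1: evaluate R, W_k, T_k, T for assumed data.
lam = [0.2, 0.3, 0.1]                  # class arrival rates (assumed)
EX = [1.0, 0.5, 2.0]                   # mean service times 1/mu_k (assumed)
EX2 = [2.0, 0.5, 8.0]                  # second moments of service (assumed)
rho = [l * x for l, x in zip(lam, EX)]
assert sum(rho) < 1                    # stability

R = 0.5 * sum(l * x2 for l, x2 in zip(lam, EX2))   # mean residual service time
W, s = [], 0.0
for k in range(len(lam)):
    W.append(R / ((1 - s) * (1 - s - rho[k])))     # W_k from the formula above
    s += rho[k]
T = [x + w for x, w in zip(EX, W)]                 # T_k = 1/mu_k + W_k
Tavg = sum(l * t for l, t in zip(lam, T)) / sum(lam)
print(W, T, Tavg)
```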
Preemptive Priority

- Under a non-preemptive policy, the average queueing time of a customer depends on the arrival rate of lower-priority customers
- Preemptive/resume policy: the service of a customer is interrupted when a higher-priority customer arrives and resumes from the point of interruption once all higher-priority customers have been served
- Priority $k$ customers are then not affected by the presence of lower-priority customers
- Calculate $T_k$ = average time a class $k$ customer spends in the system. This consists of:

  1. The average service time of the customer, $1/\mu_k$
  2. The average time required to serve the customers of priority 1 through $k$ found in the system upon arrival. This equals the average waiting time in an M/G/1 system where customers of priority lower than $k$ are neglected:

  $$\frac{R_k}{1-\rho_1-\dots-\rho_k}, \qquad R_k = \frac{1}{2}\sum_{i=1}^{k}\lambda_i \overline{X_i^2}$$

  3. The average time required to serve the customers of priority higher than $k$ that arrive while the customer is in the system:

  $$\sum_{i=1}^{k-1}\frac{1}{\mu_i}\lambda_i T_k = \sum_{i=1}^{k-1}\rho_i T_k, \quad k > 1$$
Preemptive Priority (cont.)

Combining terms:

$$T_k = \frac{1}{\mu_k} + \frac{R_k}{1-\rho_1-\dots-\rho_k} + T_k\sum_{i=1}^{k-1}\rho_i$$

Final solution:

$$T_k = \frac{(1/\mu_k)(1-\rho_1-\dots-\rho_k) + R_k}{(1-\rho_1-\dots-\rho_{k-1})(1-\rho_1-\dots-\rho_k)}$$

where:

$$R_k = \frac{1}{2}\sum_{i=1}^{k}\lambda_i \overline{X_i^2}$$