Transcript: Chapter 7

Chapter 7
Dynamic Job Shops
Advantages/Disadvantages
Planning, Control and Scheduling
Open Queuing Network Model
1
Definition of a Job Shop
• Several types of machines
– Similar machines grouped, focused on particular operations
• Storage for work in process (no blocking)
• Material handling between machine groups
• Different job types have different routings among machine
groups; may revisit the same machine group
• Manufacturing process in evolution
• “Job” is typically a batch of identical pieces
• May have to set up machines between different job types
2
Advantages & Disadvantages
Advantages (machine groups)
+ Ease of supervision/operation by skilled workers
+ High utilization of expensive machines
+ Flexibility – broad scope of products
Disadvantages (flexibility)
– High WIP and long flow times
– Conflict between resource utilization and customer service
– Difficult to meet promise dates
– Cost of variety: reduced learning, difficult scheduling
3
Planning, Control, Scheduling
Focus on produce-to-order (if produce-to-stock, treat the production
authorization (PA) as a "customer order")
• Design and layout
– evolutionary
– based on real or perceived bottlenecks
• Order acceptance and resource planning
– Quoted delivery date: need distribution of flow time
– Required resource levels: complicated by random order arrivals
4
Planning, Control, Scheduling (cont.)
• Loading and resource commitment
– Schedule a new job on critical machine groups first; this schedule
determines sequencing on noncritical groups and timing of release
to the shop
– Forward load jobs on all machines, looking ahead from present
– Backward load jobs on all machines, working back from due dates
– Production schedule based on deterministic processing and
transport times; slack time built in to handle uncertainty
• Allocation and resource assignment
– Assign jobs at a machine group to worker/machine
– Often from resource perspective, ignoring promise dates
5
Why Models?
• Estimate flow times for use in quoting delivery dates
• Identify policies
– order release
– scheduling
– sequencing
to reduce unnecessary flow time and work in process
Model a particular policy, then observe numbers of jobs in
system, at each machine group, and in transition between
machine groups
6
Job Assumptions
1. A job released to the system goes directly to a machine center
2. Job characteristics are statistically independent of other jobs
3. Each job visits a specified sequence of machine centers
4. Finite processing times, identically distributed for each job of a specific type at a given machine center
5. Jobs may wait between machine centers
7
Machine Assumptions
1. A machine center consists of a number (perhaps one) of identical machines
2. Each machine operates independently of other machines; can operate at its own maximum rate
3. Each machine is continuously available for processing jobs
(in practice, rate in #2 is adjusted down to account for
breakdowns, maintenance, etc.)
8
Operation Assumptions
1. Each job is indivisible
2. No cancellation or interruption of jobs
3. No preemption
4. Each job is processed on no more than one machine at a time
5. Each machine center has adequate space for jobs awaiting processing there
6. Each machine center has adequate output space for completed jobs to wait
9
Aggregation of Job Flow
Follow Section 7.3.2 in the text to determine:
γ_i = probability that an aggregate job visits m/c center i first
λ_i = rate of job flow into/out of m/c center i
λ_ij = rate of job flow from center i to center j
p_ij = λ_ij / λ_i = fraction of jobs leaving i that go to j next
E[S_i] = expected service time of an aggregate job at center i
w_i = λ_i E[S_i] = average workload brought to center i
10
Example

Job type j | arrival rate λ(j) | C1(j) C2(j) C3(j) C4(j) C5(j) | E[S1(j)] E[S2(j)] E[S3(j)] E[S4(j)] E[S5(j)]
    1      |       0.15        |   3    1    2    -    -       |    6       4       1       -       -
    2      |       0.10        |   1    2    3    1    2       |    8       6       1       2       7
    3      |       0.05        |   2    3    1    3    -       |    4       9       4       2       -

(Ck(j) is the k-th machine center visited by a type-j job; E[Sk(j)] is the mean service time at that visit.)
11
Job Shop Capacity
Capacity is the maximum tolerable total arrival rate, λ*.
Job arrival rates to machine centers satisfy
λ_i = λ γ_i + Σ_{j=1}^m λ_j p_ji,  i = 1, …, m
Let v_i be the average number of times an aggregate job visits
machine center i during its stay in the system. Then λ_i = λ v_i and
v_i = γ_i + Σ_{j=1}^m v_j p_ji,  i = 1, …, m
λ* = min_{1≤i≤m} { c_i / (v_i E[S_i]) }
(λ* is unaffected by variability!)
12
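The visit-ratio equations and the capacity formula above can be sketched numerically. All data below (routing matrix P, first-visit probabilities γ, machine counts c, mean service times E[S]) are made-up illustrative values, not the text's example:

```python
import numpy as np

# gamma[i] = P(first visit is center i); P[i, j] = p_ij = fraction of
# jobs leaving center i that go to center j next (illustrative numbers).
gamma = np.array([0.5, 0.3, 0.2])
P = np.array([[0.0, 0.4, 0.2],
              [0.3, 0.0, 0.3],
              [0.1, 0.2, 0.0]])
c = np.array([2, 1, 1])          # machines per center
ES = np.array([0.5, 0.8, 0.6])   # mean service times E[S_i]

# Visit ratios solve v = gamma + P^T v, i.e. (I - P^T) v = gamma.
v = np.linalg.solve(np.eye(3) - P.T, gamma)

# Capacity: lambda* = min_i c_i / (v_i E[S_i]) -- no variability terms.
lam_star = np.min(c / (v * ES))
```

Note that λ* depends only on routing and mean workloads, matching the slide's point that capacity is unaffected by variability.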
Jackson Open Queuing Network
Assume:
• A single class (aggregated) of jobs arrives from outside according to a Poisson process with rate λ
• The service times of jobs at machine center i are iid exponential random variables with mean 1/μ_i
• If there are n jobs present at machine center i, then the total processing rate is μ_i r_i(n) (e.g., if there are c_i identical machines then r_i(n) = min{n, c_i})
• The service protocol (queue discipline) is independent of job service-time requirements, e.g., FCFS or random
13
Jackson Network
If N_i(t) is the number of jobs at machine center i at time t, and
p(n) = lim_{t→∞} P{N_1(t) = n_1, …, N_m(t) = n_m}
then
p(n) = ∏_{i=1}^m p_i(n_i)
that is, the network has a product-form solution. If each m/c
center has a single machine, then with ρ_i = λ_i/μ_i < 1,
p(n) = ∏_{i=1}^m (1 − ρ_i) ρ_i^{n_i}
The network can be decomposed into independent M/M/1
queues (M/M/c_i if multiple m/c's at each center)
14
Single Machine at Each Station
Average number of jobs in m/c center i: E[N_i] = ρ_i / (1 − ρ_i)
Total number of jobs in the system: E[N] = Σ_{i=1}^m ρ_i / (1 − ρ_i)
Average flow time of a job (by Little's law):
E[T] = (1/λ) Σ_{i=1}^m ρ_i / (1 − ρ_i) = Σ_{i=1}^m (v_i/λ_i) ρ_i / (1 − ρ_i) = Σ_{i=1}^m v_i E[T_i]
Variance of the number of jobs in the system:
Var[N] = Σ_{i=1}^m ρ_i / (1 − ρ_i)²
15
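A quick numeric sketch of these M/M/1 decomposition formulas; the total arrival rate λ, visit ratios v, and service rates μ below are illustrative:

```python
import numpy as np

# Illustrative network: 3 single-machine centers.
lam = 1.0                        # total external arrival rate
v = np.array([1.5, 1.0, 0.5])    # visit ratios
mu = np.array([2.5, 2.0, 1.0])   # service rates

lam_i = lam * v                  # arrival rate at each center
rho = lam_i / mu                 # utilizations, must all be < 1
EN = (rho / (1 - rho)).sum()     # total expected jobs in system
ET = EN / lam                    # mean flow time via Little's law
ET_i = 1 / (mu - lam_i)          # M/M/1 sojourn time at center i
VarN = (rho / (1 - rho) ** 2).sum()
```

The identity E[T] = Σ v_i E[T_i] holds exactly here, since (1/λ)·ρ_i/(1−ρ_i) = v_i/(μ_i − λ_i).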
Assigning Tasks to Machine Centers
Minimize E[T] or, equivalently, E[N]
s.t. average number of tasks for an arbitrary job = K
Then task allocation determines the values of v_i, with Σ_{i=1}^m v_i = K.
Assume that if a given task is assigned to m/c center i then it
will require an exponential(μ_i) amount of time; then
E[N] = Σ_{i=1}^m ρ_i / (1 − ρ_i) = Σ_{i=1}^m (λ v_i / μ_i) / (1 − λ v_i / μ_i)
Optimal:
v_i* = (1/λ) [ μ_i − √μ_i (Σ_{j=1}^m μ_j − λK) / (Σ_{j=1}^m √μ_j) ]
16
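The closed-form v_i* can be checked numerically; λ, μ, and K below are illustrative (feasibility requires λK < Σμ_j). By construction Σ v_i* = K, and perturbing the allocation should not reduce E[N]:

```python
import numpy as np

# Illustrative data for the visit-rate optimization above.
lam, K = 1.0, 4.0
mu = np.array([3.0, 2.0, 2.0])

# Closed-form optimal visit rates.
v_star = (mu - np.sqrt(mu) * (mu.sum() - lam * K) / np.sqrt(mu).sum()) / lam

def expected_n(v):
    """E[N] = sum of rho_i/(1 - rho_i) with rho_i = lam*v_i/mu_i."""
    rho = lam * v / mu
    return (rho / (1 - rho)).sum()
```

Summing the closed form telescopes: Σ v_i* = (Σμ_j − Σμ_j + λK)/λ = K.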
Achieving the Optimal Visit Rates
Total set of tasks for all job types: T = {1, …, L}
Let w_i be the average number of times task i needs to be
performed on an arbitrary job: Σ_{i=1}^L w_i = K
Ideally, find a partition T = S_1 ∪ S_2 ∪ … ∪ S_m such that Σ_{j∈S_i} w_j = v_i*
More practically, for each k = 1, …, m, find l(k) as the largest
task index that satisfies
Σ_{i=1}^{l(k)} w_i ≤ Σ_{j=1}^k v_j*   (with l(m) = L)
Then if Σ_{i=1}^{l(k)} w_i < Σ_{j=1}^k v_j*, assign task l(k)+1 to both m/c centers k and k+1
17
Generalized Jackson Networks
(Approximate Decomposition Methods)
• The arrival and departure processes are approximated by
renewal processes, and each station is analyzed individually
as a GI/G/m queue.
• The performance measures (E[N], E[W]) are approximated
from 4 parameters {C²_{a0j}, C²_{Sj}, λ_{0j}, μ_j}. (The exact
distributions are not needed.)
• References
– Whitt, W. (1983a), "The Queueing Network Analyzer," Bell System
Technical Journal, 62, 2779-2815.
– Whitt, W. (1983b), "Performance of the Queueing Network Analyzer,"
Bell System Technical Journal, 62, 2817-2843.
– Bitran, G. and R. Morabito (1996), "Open queueing networks:
optimization and performance evaluation models for discrete
manufacturing systems," Production and Operations Management,
5, 2, 163-193.
18
Single-class GI/G/1 OQNS
Notation:
• n = number of internal stations in the network.
For each j, j = 1, …, n:
• λ_0j = expected external arrival rate at station j [λ_0j = 1/E[A_0j]]
• C²_{a0j} = squared coefficient of variation (scv), or variability, of the external interarrival time at station j [C²_{a0j} = Var(A_0j)/E(A_0j)²]
• μ_j = expected service rate at station j [μ_j = 1/E(S_j)]
• C²_{Sj} = scv, or variability, of the service time at station j [C²_{Sj} = Var(S_j)/E(S_j)²]
19
Notation (cont)
- For each pair (i, j), i = 1, …, n, j = 1, …, n:
• p_ij = probability of a job going to station j after
completing service at station i.
(We assume no immediate feedback: p_ii = 0.)
- For n nodes, the input consists of n² + 4n numbers.
• The complete decomposition is essentially
described in 3 steps:
o Step 1. Analysis of the interaction between stations of the
network,
o Step 2. Evaluation of performance measures at each
station,
o Step 3. Evaluation of performance measures for the
whole network.
20
Step 1:
• Determine two parameters for each station j:
(i) the expected arrival rate λ_j = 1/E[A_j], where A_j is the interarrival time at station j;
(ii) the squared coefficient of variation (scv), or variability, of the interarrival time, C²_{aj} = Var(A_j)/E(A_j)².
• Consists of 3 sub-steps:
(i) Superposition of arrival processes
(ii) Overall departure process
(iii) Departure splitting according to destinations
21
Step 1(i): Superposed arrival rate
• λ_j can be obtained from the traffic rate equations:
λ_j = λ_0j + Σ_{i=1}^n λ_ij  for j = 1, …, n
where λ_ij = p_ij λ_i is the expected arrival rate at
station j from station i. We also get ρ_j = λ_j/μ_j,
0 ≤ ρ_j < 1 (the utilization or offered load).
• The expected (external) departure rate to station 0
from station j is given by λ_j0 = λ_j (1 − Σ_{i=1}^n p_ji).
• Throughput: λ_0 = Σ_{j=1}^n λ_0j = Σ_{j=1}^n λ_j0
• The expected number of visits: v_j = E(K_j) = λ_j/λ_0.
22
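The traffic rate equations are linear, so they can be solved in one step; the external rates λ_0j, routing matrix P, and service rates μ below are illustrative:

```python
import numpy as np

# Illustrative 3-station open network.
lam0 = np.array([1.0, 0.5, 0.0])     # external arrival rates lambda_0j
P = np.array([[0.0, 0.6, 0.2],       # P[i, j] = p_ij, p_ii = 0
              [0.0, 0.0, 0.7],
              [0.1, 0.0, 0.0]])
mu = np.array([4.0, 4.0, 3.0])

# lambda_j = lambda_0j + sum_i lambda_i p_ij  <=>  (I - P^T) lam = lam0
lam = np.linalg.solve(np.eye(3) - P.T, lam0)
rho = lam / mu                        # utilizations
lam_j0 = lam * (1 - P.sum(axis=1))    # external departure rates
lam_total = lam0.sum()                # throughput lambda_0
v = lam / lam_total                   # expected visits v_j
```

Flow conservation (total external in = total external out) is a useful sanity check on the solution.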
Step 1(i): (cont.)
• The scv of the interarrival time can be
approximated by the traffic variability equation
(Whitt (1983b)):
C²_{aj} = w_j Σ_{i=0}^n (λ_ij/λ_j) C²_{aij} + 1 − w_j   (1)
where
w_j = [1 + 4(1 − ρ_j)² (x_j − 1)]⁻¹
and
x_j = [Σ_{i=0}^n (λ_ij/λ_j)²]⁻¹
23
Step 1(ii): Departure process
C²_{dj} is the interdeparture time variability from station j:
C²_{dj} = ρ_j² C²_{Sj} + (1 − ρ_j²) C²_{aj}   (2)
Note that if the arrival and service processes are Poisson
(i.e., C²_{aj} = C²_{Sj} = 1), then (2) is exact and gives C²_{dj} = 1.
Note also that if ρ_j → 1, then we obtain C²_{dj} → C²_{Sj};
on the other hand, if ρ_j → 0, then we obtain C²_{dj} → C²_{aj}
24
Step 1(iii): Departure splitting
C²_{aji} = the interarrival time variability at station i for jobs
coming from station j
C²_{dji} = the departure time variability from station j to i
C²_{aji} = C²_{dji} = p_ji C²_{dj} + 1 − p_ji   (3)
Equations (1), (2), and (3) form a linear system (the traffic
variability equations) in {C²_{aj}, C²_{dj}, C²_{aij}, C²_{dij}}
25
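Substituting (2) and (3) into (1) collapses the three equation sets into one linear system in the arrival scv's C²_{aj}, which can then be solved directly. Everything below (rates, routing, scv's) is illustrative:

```python
import numpy as np

# Illustrative network data.
lam0 = np.array([1.0, 0.5, 0.0])   # external arrival rates
C2a0 = np.array([1.5, 0.8, 1.0])   # external interarrival scv's
P = np.array([[0.0, 0.6, 0.2],
              [0.0, 0.0, 0.7],
              [0.1, 0.0, 0.0]])
mu = np.array([4.0, 4.0, 3.0])
C2s = np.array([0.5, 1.2, 1.0])    # service-time scv's

n = len(mu)
lam = np.linalg.solve(np.eye(n) - P.T, lam0)   # traffic rate equations
rho = lam / mu
lam_ij = lam[:, None] * P                      # internal flows lambda_ij

# Weights w_j and x_j from equation (1); row 0 is the external stream.
frac = np.vstack([lam0 / lam, lam_ij / lam])
x = 1.0 / (frac ** 2).sum(axis=0)
w = 1.0 / (1.0 + 4.0 * (1.0 - rho) ** 2 * (x - 1.0))

# Build A @ C2a = b by substituting (2) and (3) into (1).
A = np.eye(n)
b = 1.0 - w + w * (lam0 / lam) * C2a0
for j in range(n):
    for i in range(n):
        coef = w[j] * (lam_ij[i, j] / lam[j])
        b[j] += coef * (P[i, j] * rho[i] ** 2 * C2s[i] + 1.0 - P[i, j])
        A[j, i] -= coef * P[i, j] * (1.0 - rho[i] ** 2)
C2a = np.linalg.solve(A, b)
```

The test below re-derives the fixed point of (1)-(3) from the solved C²_{aj} as an independent check.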
Step 2:
• Find the expected waiting time by using the Krämer &
Langenbach-Belz formula [modified by Whitt (1983a)]:
E(W_j) = ρ_j (C²_{aj} + C²_{Sj}) g(ρ_j, C²_{aj}, C²_{Sj}) / [2 μ_j (1 − ρ_j)]
       = [(C²_{aj} + C²_{Sj}) g(ρ_j, C²_{aj}, C²_{Sj}) / 2] E[W]_{M/M/1}
where
g(ρ_j, C²_{aj}, C²_{Sj}) = exp[ −2(1 − ρ_j)(1 − C²_{aj})² / (3 ρ_j (C²_{aj} + C²_{Sj})) ]  if C²_{aj} < 1
                         = 1  if C²_{aj} ≥ 1
Step 3:
• Expected lead time for an arbitrary job (waiting times +
service times):
E(T) = Σ_{j=1}^n v_j (E(W_j) + E(S_j))
26
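The Step 2 formula is easy to implement directly; the station parameters below are illustrative:

```python
import math

def g(rho, c2a, c2s):
    """Kraemer & Langenbach-Belz correction factor (Whitt's form)."""
    if c2a < 1.0:
        return math.exp(-2.0 * (1.0 - rho) * (1.0 - c2a) ** 2
                        / (3.0 * rho * (c2a + c2s)))
    return 1.0

def ew_gg1(lam, mu, c2a, c2s):
    """Approximate mean wait in a GI/G/1 station."""
    rho = lam / mu
    return rho * (c2a + c2s) * g(rho, c2a, c2s) / (2.0 * mu * (1.0 - rho))

# With c2a = c2s = 1 the formula reduces to the exact M/M/1 wait.
w_mm1 = ew_gg1(lam=0.8, mu=1.0, c2a=1.0, c2s=1.0)
```

The M/M/1 special case is a convenient correctness check: for λ = 0.8, μ = 1, the exact queueing delay is ρ/(μ(1 − ρ)) = 4.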
Single-class GI/G/m OQNS
with probabilistic routing
More general case where there are m_i identical machines at
each station i.
C²_{aj} = α_j + Σ_{i=1}^n C²_{ai} β_ij  for j = 1, …, n
where
α_j = 1 + w_j { (p_0j C²_{a0j} − 1) + Σ_{i=1}^n p_ij [(1 − q_ij) + q_ij ρ_i² x_i] }
β_ij = w_j p_ij q_ij (1 − ρ_i²)
w_j = [1 + 4(1 − ρ_j)² (x_j − 1)]⁻¹
q_ij = λ_ij/λ_j = p_ij λ_i/λ_j
x_i = 1 + (1/√m_i)(max{C²_{Si}, 0.2} − 1)
27
Single-class GI/G/m OQNS
with probabilistic routing (cont.)
• Then the approximation of the expected waiting time
is similar to the GI/G/1 case:
E(W_j) ≈ [(C²_{aj} + C²_{Sj}) / 2] E(W_j(M/M/m_j))
28
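The GI/G/m wait scales the exact M/M/m wait by (C²_a + C²_s)/2; the M/M/m wait itself comes from the Erlang-C delay probability. A minimal sketch with illustrative parameters:

```python
import math

def erlang_c(m, a):
    """Erlang-C delay probability for an M/M/m queue, a = lam/mu < m."""
    s = sum(a ** k / math.factorial(k) for k in range(m))
    top = a ** m / (math.factorial(m) * (1.0 - a / m))
    return top / (s + top)

def ew_mmm(lam, mu, m):
    """Exact mean wait in an M/M/m queue."""
    return erlang_c(m, lam / mu) / (m * mu - lam)

def ew_ggm(lam, mu, m, c2a, c2s):
    """Two-moment GI/G/m approximation from the slide above."""
    return 0.5 * (c2a + c2s) * ew_mmm(lam, mu, m)
```

With c2a = c2s = 1 the approximation returns the M/M/m wait exactly, and with m = 1 it reduces to the M/M/1 case.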
Symmetric Job Shop
A job leaving m/c center i is equally likely to go to any other
m/c center next:
p_ij = 1/m, j ≠ i;  p_i0 = 1/m (leaves system)
Then
λ_i = λ, i = 1, …, m;  C²_{ai} → 1 as m → ∞
So we can approximate each m/c center as M/G/1:
Ŵ(λ_i, μ_i, C²_{ai}, C²_{si}) can be replaced by the mean waiting
time in an M/G/1 queue,
E[W_i] = λ_i E[S_i²] / [2(1 − ρ_i)] = ρ_i (1 + C²_{Si}) / [2 μ_i (1 − ρ_i)]
29
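The M/G/1 (Pollaczek-Khinchine) wait used for the symmetric-shop approximation, as a small sketch; λ, E[S], and the service scv are illustrative:

```python
def ew_mg1(lam, ES, c2s):
    """Pollaczek-Khinchine mean wait: lam * E[S^2] / (2 (1 - rho))."""
    rho = lam * ES
    ES2 = (1.0 + c2s) * ES ** 2   # E[S^2] recovered from the scv
    return lam * ES2 / (2.0 * (1.0 - rho))

# Deterministic service (c2s = 0) gives exactly half the M/M/1 wait.
w_det = ew_mg1(lam=0.5, ES=1.0, c2s=0.0)
w_exp = ew_mg1(lam=0.5, ES=1.0, c2s=1.0)
```
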
Symmetric Job Shop
[Figure: routing diagram of a symmetric job shop]
30
Uniform Flow Job Shop
All jobs visit the same sequence of machine centers. Then
λ_i = λ, i = 1, …, m;
but C²_{ai} is unaffected by m.
In this case M/G/1 is a bad approximation, but GI/G/1 works
fine. The waiting time of a GI/G/1 queue can be approximated by
using equation 3.142, 3.143 or 3.144 above, and the lower
and upper bounds are
[ρ_i (C²_{ai} − 1 + ρ_i) + ρ_i² C²_{Si}] / [2 λ_i (1 − ρ_i)]  ≤  Ŵ(λ_i, μ_i, C²_{ai}, C²_{si})  ≤  [ρ_i (2 − ρ_i) C²_{ai} + ρ_i² C²_{Si}] / [2 λ_i (1 − ρ_i)]
31
Uniform-Flow Shop
[Figure: routing diagram of a uniform-flow job shop]
32
Job Routing Diversity
Assume high work load: ρ_i ≈ 1, i = 1, …, m,
and machine centers each with the same number of machines,
same service time distribution and same utilization. Also
v_i = 1, i = 1, …, m
Q: What job routing minimizes mean flow time?
A1: If C²_a ≥ C²_S and C²_S ≤ 1, then uniform-flow job routing
minimizes mean flow time;
A2: If C²_a ≤ C²_S and C²_S ≥ 1, then symmetric job routing
minimizes mean flow time.
33
Variance of Flow Time
The mean flow time is useful information but not sufficient for
setting due dates. The two distributions below have the
same mean, but if the due date is set as shown, tardy jobs
are a lot more likely under distribution B!
[Figure: two flow-time distributions A and B with the same mean E[T]; B has the larger variance, so more of its mass lies beyond the due date]
34
Flow Time of an Aggregate Job
Pretend we are following the progress of an arbitrary tagged
job. If K_i is the number of times it visits m/c center i, then
its flow time is (K_i is the r.v. for which v_i is the mean)
T = Σ_{i=1}^m Σ_{j=1}^{K_i} T_ij,  where T_ij is the time spent in center i on the j-th visit
Given the values of each K_i, the expected flow time is
E[T | K_i, i = 1, …, m] = Σ_{i=1}^m K_i E[T_i]
and then, "unconditioning",
E[T] = Σ_{i=1}^m E[K_i] E[T_i] = Σ_{i=1}^m v_i E[T_i]
35
Variance
In a similar way, we can find the variance of T conditional
upon Ki and then uncondition.
Assume {Tij, i = 1,…,m; j = 1, …} and {Ki, i = 1,…,m} are all
independent and that, for each i, {Tij, j = 1, …} are
identically distributed.
Similar to conditional expectation, there is a relation for
conditional variance:
Var[X] = E_Y[ Var(X | Y) ] + Var_Y[ E(X | Y) ]
36
Using Conditional Variance
Since T depends on {K_1, K_2, …, K_m}, we can say
Var[T] = E_{K_1,…,K_m}[ Var(T | K_1, …, K_m) ] + Var_{K_1,…,K_m}[ E(T | K_1, …, K_m) ]
The first term equals
E_{K_1,…,K_m}[ Var( Σ_{i=1}^m Σ_{j=1}^{K_i} T_ij | K_1, …, K_m ) ] = E_{K_1,…,K_m}[ Σ_{i=1}^m K_i Var(T_i) ]
= Σ_{i=1}^m E[K_i] Var(T_i) = Σ_{i=1}^m v_i Var(T_i)
37
Using Conditional Variance
The second term is
Var_{K_1,…,K_m}[ E( Σ_{i=1}^m Σ_{j=1}^{K_i} T_ij | K_1, …, K_m ) ] = Var_{K_1,…,K_m}[ Σ_{i=1}^m K_i E(T_i) ]
= Σ_{i=1}^m Var( K_i E[T_i] ) + 2 Σ_{1≤i<j≤m} Cov( K_i E[T_i], K_j E[T_j] )
= Σ_{i=1}^m E[T_i]² Var(K_i) + 2 Σ_{1≤i<j≤m} E[T_i] E[T_j] Cov(K_i, K_j)
= Σ_{i=1}^m Σ_{j=1}^m Cov(K_i, K_j) E[T_i] E[T_j]
38
Variance
The resulting formula for the variance is:
Var[T] = Σ_{i=1}^m v_i Var[T_i] + Σ_{i=1}^m Σ_{j=1}^m Cov(K_i, K_j) E[T_i] E[T_j]
If arrivals to each m/c center are approximately Poisson,
we can find Var[T_i] from the M/G/1 transform equation
(3.73), p. 61:
E[T_i] = − dF̃_{T_i}(s)/ds |_{s=0},  E[T_i²] = d²F̃_{T_i}(s)/ds² |_{s=0},  Var[T_i] = E[T_i²] − (E[T_i])²
But we still need Cov(K_i, K_j).
39
Markov Chain Model
Think of the tagged job as following a Markov chain with
states 1, …, m for the machine centers and absorbing state 0
representing having left the job shop.
The transition matrix (rows/columns ordered 0, 1, …, m) is

P = [ 1                   0     0     …  0    ]
    [ 1 − Σ_{j=1}^m p_1j   0     p_12  …  p_1m ]
    [ 1 − Σ_{j=1}^m p_2j   p_21  0     …  p_2m ]
    [ ⋮                                        ]
    [ 1 − Σ_{j=1}^m p_mj   p_m1  p_m2  …  0    ]

K_i is the number of times this M.C. visits state i
before being absorbed; let K_ji be the number of
visits given X_0 = j.
40
K_ji = δ_ji           with probability 1 − Σ_{l=1}^m p_jl      (7.83)
     = K_li + δ_ji    with probability p_jl,  l = 1, …, m
where δ_ji = 1 if j = i and 0 otherwise. Similarly,
K_ji K_jr = δ_ji δ_jr                    with probability 1 − Σ_{l=1}^m p_jl      (7.84)
          = (K_li + δ_ji)(K_lr + δ_jr)   with probability p_jl,  l = 1, …, m
m
41
Expectations
Take expectation of (7.83), we get

E  K ji 

j ,i 1,..., m
 I  P
1
and take expectation of (7.84),
E[ K ji K jr ]  E[ K jr ]E[ K ri ]  E[ K ji ]E[ K ir ], i  r
and
E[ K ]  E[ K ji ]{2E[ Kii ] 1}
2
ji
42
Because
E[K_i K_r] = Σ_{j=1}^m γ_j E[K_ji K_jr]  and  E[K_i²] = Σ_{j=1}^m γ_j E[K_ji²],
we obtain
E[K_i K_r] = E[K_i] E[K_ir] + E[K_r] E[K_ri],  i ≠ r
and
E[K_i²] = E[K_i] (2 E[K_ii] − 1)
where E[K_i] = Σ_{j=1}^m γ_j E[K_ji].
Therefore
Cov[K_i, K_j] = E[K_i] E[K_ij] + E[K_j] E[K_ji] − E[K_i] E[K_j],  i ≠ j
Cov[K_i, K_i] = E[K_i] { 2 E[K_ii] − E[K_i] − 1 },  i = j
43
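These visit-count moments all follow from the fundamental matrix (I − P)⁻¹; a sketch with an illustrative routing matrix P (among the centers only) and first-visit probabilities γ:

```python
import numpy as np

# Illustrative 3-center routing (row sums < 1; the deficit is absorption).
P = np.array([[0.0, 0.5, 0.2],
              [0.1, 0.0, 0.4],
              [0.3, 0.2, 0.0]])
gamma = np.array([0.6, 0.3, 0.1])   # first-visit probabilities

m = len(gamma)
EK_ji = np.linalg.inv(np.eye(m) - P)   # EK_ji[j, i] = E[K_ji]
EK = gamma @ EK_ji                     # E[K_i] = sum_j gamma_j E[K_ji]

# Covariances of visit counts from the slide's formulas.
cov = np.empty((m, m))
for i in range(m):
    for j in range(m):
        if i == j:
            cov[i, i] = EK[i] * (2.0 * EK_ji[i, i] - EK[i] - 1.0)
        else:
            cov[i, j] = (EK[i] * EK_ji[i, j] + EK[j] * EK_ji[j, i]
                         - EK[i] * EK[j])
```

For a symmetric shop (p_ij = 1/m off-diagonal, absorption 1/m) these formulas reproduce the closed forms on the next slide, e.g. Cov = (m−1)/(m+1) for i ≠ j.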
Job Routing Extremes
• Symmetric job shop
E[K_i] = 1,  Cov[K_i, K_j] = (m − 1)/(m + 1) for i ≠ j,  and 2(m − 1)/(m + 1) for i = j
If all m/c centers are identical and m is large, Var[T] ≈ (E[T])²,
and we can use an exponential dist'n to approximate T.
• Uniform-flow job shop
E[K_i] = 1,  Cov[K_i, K_j] = 0
Then
E[T] = Σ_{i=1}^m E[T_i],  Var[T] = Σ_{i=1}^m Var[T_i]
44
Setting Due Dates
Use the mean and variance of T to obtain an approximate distribution
F_T(t) = P[T ≤ t], e.g.,
if Var[T] < E[T]², fit an Erlang distribution;
if Var[T] = E[T]², use an exponential dist'n with mean E[T];
if Var[T] > E[T]², use a hyperexponential distribution.
If t_d is the due date, what matters is
F̄_T(t_d) = 1 − F_T(t_d) = P[T > t_d].
45
Costs Related to Due Dates
• Setting the due date too far away: c_1 (t_d − A)⁺
• Completing the job after its due date: c_2 (T − t_d)⁺
• Completing the job before its due date: c_3 (t_d − T)⁺
Total expected cost
C(t_d) = c_1 (t_d − A)⁺ + c_2 E[(T − t_d)⁺] + c_3 E[(t_d − T)⁺]
has derivative
dC(t_d)/dt_d = c_1 − c_2 F̄_T(t_d) + c_3 F_T(t_d),  t_d > A
             = −c_2 F̄_T(t_d) + c_3 F_T(t_d),       t_d < A
46
Optimal Due Date
t_d* = F̄_T⁻¹( (c_1 + c_3)/(c_2 + c_3) ),  if this ≥ A
     = F̄_T⁻¹( c_3/(c_2 + c_3) ),          if this ≤ A
     = A,                                  otherwise
47
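The case analysis above is easy to implement once T has a fitted distribution. A sketch for exponential flow time (so F̄_T(t) = exp(−t/E[T]) and F̄_T⁻¹ has a closed form); the mean flow time, cost rates, and order date A are illustrative:

```python
import math

ET = 10.0                      # mean flow time; Fbar(t) = exp(-t / ET)
c1, c2, c3 = 1.0, 5.0, 0.5     # quoting, tardiness, earliness costs
A = 2.0                        # order/arrival date

def fbar_inv(p):
    """Inverse survival function of an exponential with mean ET."""
    return -ET * math.log(p)

# Case analysis from the slide above.
cand = fbar_inv((c1 + c3) / (c2 + c3))
if cand >= A:
    td = cand
else:
    cand2 = fbar_inv(c3 / (c2 + c3))
    td = cand2 if cand2 <= A else A
```

At the chosen t_d the derivative c_1 − c_2 F̄_T + c_3 F_T vanishes, so small perturbations of the due date cannot lower the expected cost.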