Transcript: Lecture 16

CSE 326: Data Structures
Sorting
Lecture 16: Friday, Feb 14, 2003
Review: QuickSort
procedure quickSortRecursive(Array A, int left, int right) {
    if (left == right) return;
    int pivot = choosePivot(A, left, right);
    /* partition A s.t.:
       A[left], A[left+1], …, A[i]   ≤ pivot
       A[i+1], A[i+2], …, A[right]   ≥ pivot  */
    quickSortRecursive(A, left, i);      /* i comes from the partition step */
    quickSortRecursive(A, i+1, right);
}
Review: The Partition
i = left; j = right;
repeat {
    while (A[i] < pivot) i++;
    while (A[j] > pivot) j--;
    if (i < j) { swap(A[i], A[j]); i++; j--; }
    else break;
}
Why do we need i++, j-- ? If A[i] = A[j] = pivot, neither while loop advances, so without them the repeat loop would swap the same pair forever.
There exists a sentinel A[k] ≥ pivot, so the i scan always terminates.
Invariant during the scan:
A[left] … A[i-1] ≤ pivot  |  A[i] … A[j] not yet examined  |  A[j+1] … A[right] ≥ pivot
Review: The Partition
At the end (the indices have crossed, j < i):
A[left] … A[j] ≤ pivot  |  A[j+1] … A[i-1]  |  A[i] … A[right] ≥ pivot
Q: How are the elements between A[j] and A[i] ?
A: They are = pivot !
quickSortRecursive(A, left, j);
quickSortRecursive(A, i, right);
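For reference, a minimal runnable Java rendering of this scheme. This is a sketch, not the slides' exact code: it uses the textbook do-while form of Hoare's partition (recursing on (left, j) and (j+1, right)) and a middle-element pivot, both assumptions on my part, since the slides leave choosePivot and the loop details open:

import java.util.Arrays;

public class QuickSort {
    // Hoare-style partition: returns j such that
    // A[left..j] <= pivot and A[j+1..right] >= pivot.
    static int partition(int[] A, int left, int right) {
        int pivot = A[(left + right) / 2];      // assumed pivot rule
        int i = left - 1, j = right + 1;
        while (true) {
            do { i++; } while (A[i] < pivot);   // scan right for an element >= pivot
            do { j--; } while (A[j] > pivot);   // scan left for an element <= pivot
            if (i >= j) return j;               // scans crossed
            int t = A[i]; A[i] = A[j]; A[j] = t;
        }
    }

    static void quickSortRecursive(int[] A, int left, int right) {
        if (left >= right) return;              // 0 or 1 element: already sorted
        int j = partition(A, left, right);
        quickSortRecursive(A, left, j);
        quickSortRecursive(A, j + 1, right);
    }

    public static void main(String[] args) {
        int[] a = {5, 2, 8, 5, 1, 9, 3};
        quickSortRecursive(a, 0, a.length - 1);
        System.out.println(Arrays.toString(a)); // [1, 2, 3, 5, 5, 8, 9]
    }
}

The do-while form guarantees both scans advance on every round, which makes termination easy to argue.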
Why is QuickSort Faster than Merge Sort?
• Quicksort typically performs more comparisons than Mergesort, because partitions are not always perfectly balanced
– Mergesort: n log n comparisons
– Quicksort: 1.38 n log n comparisons on average
• Quicksort performs many fewer copies, because on average half of the elements are already on the correct side of the partition, while Mergesort copies every element when merging
– Mergesort: 2n log n copies (using a “temp array”), or n log n copies (using an “alternating array”)
– Quicksort: (n/2) log n copies on average
Stable Sorting Algorithms
Typical sorting scenario:
• Given N records: R[1], R[2], ..., R[N]
• They have N keys: R[1].A, ..., R[N].A
• Sort the records s.t.: R[1].A ≤ R[2].A ≤ ... ≤ R[N].A
A sorting algorithm is stable if:
• whenever i < j and R[i].A = R[j].A, R[i] comes before R[j] in the output
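As a concrete illustration (my own example, not from the lecture): Java's Arrays.sort on object arrays is documented to be stable, so records with equal keys keep their input order. Requires Java 16+ for records:

import java.util.Arrays;
import java.util.Comparator;

public class StableDemo {
    record Student(String name, int grade) {}

    public static void main(String[] args) {
        // Input order: Ann before Bob, both with grade 90
        Student[] r = {
            new Student("Ann", 90), new Student("Cid", 85), new Student("Bob", 90)
        };
        // Arrays.sort on objects is a stable merge sort (TimSort)
        Arrays.sort(r, Comparator.comparingInt(Student::grade));
        System.out.println(Arrays.toString(r));
        // [Student[name=Cid, grade=85], Student[name=Ann, grade=90], Student[name=Bob, grade=90]]
    }
}

Ann still precedes Bob after sorting by grade: equal keys keep their input order.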
Stable Sorting Algorithms
Which of the following are stable sorting algorithms?
• Bubble sort
• Insertion sort
• Selection sort
• Heap sort
• Merge sort
• Quick sort
Stable Sorting Algorithms
Which of the following are stable sorting algorithms?
• Bubble sort: yes
• Insertion sort: yes
• Selection sort: yes (in the insert/shift form; note the common swap-based version is not stable)
• Heap sort: no
• Merge sort: no (as presented here; a merge that takes from the left run on ties makes it stable)
• Quick sort: no
We can always transform a non-stable sorting algorithm into a stable one. How ? (One way is sketched below.)
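One standard answer, as a sketch of my own (the slide does not spell it out): extend each key with its original position and break ties by position. Any correct sort on the extended key is then stable:

import java.util.Arrays;
import java.util.Comparator;

public class Stabilize {
    // Make ANY sorting algorithm stable: sort (key, original index) pairs,
    // breaking key ties by the original index.
    static int[] stableSortByKey(int[] keys) {
        Integer[] idx = new Integer[keys.length];
        for (int p = 0; p < keys.length; p++) idx[p] = p;
        // Any comparison sort may be used here; ties on the key are
        // broken by original position, which forces stability.
        Arrays.sort(idx, Comparator.<Integer>comparingInt(p -> keys[p])
                                   .thenComparingInt(p -> p));
        int[] out = new int[keys.length];
        for (int p = 0; p < keys.length; p++) out[p] = keys[idx[p]];
        return out;
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(stableSortByKey(new int[]{3, 1, 3, 2})));
        // [1, 2, 3, 3]
    }
}

The extra index array costs O(N) space, but it turns even quicksort or heapsort into a stable sort.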
Detour: Computing the Median
• The median of A[1], A[2], …, A[N] is some A[k] s.t.:
– There exist N/2 elements ≤ A[k]
– There exist N/2 elements ≥ A[k]
• Think of it as the perfect pivot !
• Very important in applications:
– Median income vs. average income
– Median grade vs. average grade
• To compute: sort A[1], …, A[N], then median = A[N/2]
– Time O(N log N)
• Can we do it in O(N) time ?
Detour: Computing the Median
int medianRecursive(Array A, int left, int right) {
    if (left == right) return A[left];
    . . . Partition . . .
    if (N/2 ≤ j) return medianRecursive(A, left, j);
    if (N/2 ≥ i) return medianRecursive(A, i, right);
    return pivot;   /* why? N/2 then falls between j and i, where every element = pivot */
}

int median(Array A, int N) { return medianRecursive(A, 0, N-1); }

(After the partition: A[left] … A[j] ≤ pivot; A[i] … A[right] ≥ pivot; everything strictly between equals the pivot.)
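A runnable version of this idea, often called quickselect (a sketch under my own assumptions: 0-based rank k, middle-element pivot, and the same Hoare-style partition as before):

import java.util.Arrays;

public class Median {
    // Quickselect: returns the k-th smallest element (k is 0-based).
    // Unlike quicksort, it recurses into only ONE side of the partition.
    static int select(int[] A, int left, int right, int k) {
        if (left == right) return A[left];
        int pivot = A[(left + right) / 2];
        int i = left - 1, j = right + 1;
        while (true) {
            do { i++; } while (A[i] < pivot);
            do { j--; } while (A[j] > pivot);
            if (i >= j) break;
            int t = A[i]; A[i] = A[j]; A[j] = t;
        }
        // A[left..j] <= pivot <= A[j+1..right]
        if (k <= j) return select(A, left, j, k);
        else        return select(A, j + 1, right, k);
    }

    static int median(int[] A) {
        return select(A.clone(), 0, A.length - 1, A.length / 2);
    }

    public static void main(String[] args) {
        System.out.println(median(new int[]{7, 1, 5, 3, 9})); // 5
    }
}

Recursing into one side only is where the T(N) = T(N/2) + cN recurrence on the next slide comes from.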
Detour: Computing the Median
• Best case running time:
T(N) = T(N/2) + cN
     = T(N/4) + cN(1 + 1/2)
     = T(N/8) + cN(1 + 1/2 + 1/4)
     = ...
     = T(1) + cN(1 + 1/2 + 1/4 + … + 1/2^k)
     = O(N)
• Worst case: O(N²)
• Average case: O(N)
• Question: how can you compute the median in O(N) worst-case time ? Note: it’s tricky.
Back to Sorting
• Naïve sorting algorithms:
– Bubble sort, insertion sort, selection sort
– Time = O(N²)
• Clever sorting algorithms:
– Merge sort, heap sort, quick sort
– Time = O(N log N)
• I want to sort in O(N) !
– Is this possible ?
Could We Do Better?
• Consider any sorting algorithm based on comparisons
• Run it on A[1], A[2], ..., A[N]
– Assume they are distinct
• At each step it compares some A[i] with some A[j]
– If A[i] < A[j] then it does something...
– If A[i] > A[j] then it does something else...
⇒ Decision Tree !
Decision tree to sort list A,B,C
[Figure: decision tree for sorting A, B, C. The root compares A and B; each internal node lists the facts established so far, each edge carries the outcome of one comparison (e.g. A<B or B<A), and the six leaves give the resulting orders: A,B,C; A,C,B; C,A,B; B,A,C; B,C,A; C,B,A.]
Internal node, with facts known so far
Leaf node, with ordering of A,B,C
Edge, with result of one comparison
Every possible execution of the algorithm corresponds to a root-to-leaf
path in the tree.
Max depth of the decision tree
• How many permutations are there of N numbers?
• How many leaves does the tree have?
• What’s the shallowest tree with a given number of leaves?
• What is therefore the worst running time (number of comparisons) by the best possible sorting algorithm?
Max depth of the decision tree
• How many permutations are there of N numbers? N!
• How many leaves does the tree have? N!
• What’s the shallowest tree with a given number of leaves? One of depth log(N!)
• What is therefore the worst running time (number of comparisons) by the best possible sorting algorithm? log(N!)
Stirling’s approximation:

    n! ≈ √(2πn) · (n/e)^n

so

    log(n!) ≈ log( √(2πn) · (n/e)^n )
            = log √(2πn) + n log(n/e)
            = Θ(n log n)

At least one branch in the decision tree has this depth.
If you forget Stirling’s formula...
ln n! = ln 1 + ln 2 + … + ln n
      = Σ_{k=1..n} ln k
      ≥ ∫₁ⁿ ln x dx = [x ln x − x]₁ⁿ = n ln n − n + 1

Θ(log n!) = Θ(ln n!) = Θ(n ln n) = Θ(n log n)

Theorem:
Every algorithm that sorts by comparing keys takes Ω(n log n) time
Bucket Sort
• Now let’s sort in O(N)
• Assume:
A[0], A[1], …, A[N-1] ∈ {0, 1, …, M-1}
M = not too big
• Example: sort 1,000,000 person records on the first character of their last names:
– Hence M = 128 (in practice: M = 27)
Bucket Sort
Queue bucketSort(Array A, int N) {
    for k = 0 to M-1
        Q[k] = new Queue;
    for j = 0 to N-1
        Q[A[j]].enqueue(A[j]);        /* stable: equal keys keep their arrival order */
    Result = new Queue;
    for k = 0 to M-1
        Result = Result.append(Q[k]);
    return Result;
}
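A self-contained Java version of the same idea (my own rendering; the slides' Array and Queue types are abstract):

import java.util.ArrayDeque;

public class BucketSort {
    // Distribute each key into its bucket queue, then concatenate the
    // buckets in order. Queues preserve arrival order, so the sort is stable.
    static int[] bucketSort(int[] A, int M) {          // keys in {0, …, M-1}
        @SuppressWarnings("unchecked")
        ArrayDeque<Integer>[] Q = new ArrayDeque[M];
        for (int k = 0; k < M; k++) Q[k] = new ArrayDeque<>();
        for (int x : A) Q[x].addLast(x);               // O(N) distribution
        int[] result = new int[A.length];
        int r = 0;
        for (int k = 0; k < M; k++)                    // O(M+N) concatenation
            while (!Q[k].isEmpty()) result[r++] = Q[k].pollFirst();
        return result;
    }

    public static void main(String[] args) {
        int[] a = {3, 1, 4, 1, 5, 2};
        System.out.println(java.util.Arrays.toString(bucketSort(a, 6)));
        // [1, 1, 2, 3, 4, 5]
    }
}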
Bucket Sort
• Running time: O(M+N)
• Space: O(M+N)
• Recall that M << N, hence time = O(N)
• What about the Theorem that says sorting takes Ω(N log N) ??
This is not “real” sorting, because the keys are trivial: bucket sort never compares keys, so the lower bound for comparison sorts does not apply.
Radix Sort
• I still want to sort in time O(N): non-trivial keys
• A[0], A[1], …, A[N-1] are strings
– Very common in practice
• Each string is: c_{d-1} c_{d-2} … c_1 c_0,
where c_0, c_1, …, c_{d-1} ∈ {0, 1, …, M-1}, with M = 128
• Other example: decimal numbers
RadixSort
• Radix = “The base of a number system” (Webster’s dictionary)
– alternate terminology: radix is the number of bits needed to represent 0 to base-1; can say “base 8” or “radix 3”
• Used in the 1890 U.S. census by Hollerith
• Idea: BucketSort on each digit, bottom up.
The Magic of RadixSort
• Input list:
126, 328, 636, 341, 416, 131, 328
• BucketSort on lower digit:
341, 131, 126, 636, 416, 328, 328
• BucketSort result on next-higher digit:
416, 126, 328, 328, 131, 636, 341
• BucketSort that result on highest digit:
126, 131, 328, 328, 341, 416, 636
Inductive Proof that RadixSort Works
• Keys: d-digit numbers, base B
– (that wasn’t hard!)
• Claim: after the i-th BucketSort pass, the least significant i digits are sorted.
– Base case: i = 0. 0 digits are sorted.
– Inductive step: assume the claim for i, prove it for i+1.
Consider two numbers X, Y, and let X_i denote the i-th digit of X:
• X_{i+1} < Y_{i+1}: the (i+1)-th BucketSort puts them in order
• X_{i+1} > Y_{i+1}: same thing
• X_{i+1} = Y_{i+1}: the order depends on the last i digits, which the induction hypothesis says are already sorted; BucketSort is stable, so that order is preserved
Radix Sort
Array radixSort(Array A, int N) {
    for k = 0 to d-1
        A = bucketSort(A, on digit position k)   /* stable BucketSort per digit */
    return A;
}
Running time: T = O(d(M+N)) = O(dN) = O(Size)
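A runnable LSD radix sort along these lines (a sketch assuming base-10 digits and non-negative int keys; the stable per-digit pass is implemented here with a counting pass rather than explicit queues):

import java.util.Arrays;

public class RadixSort {
    // LSD radix sort: one stable counting/bucket pass per digit, least
    // significant digit first. d passes of O(N + M) each.
    static void radixSort(int[] A, int d) {            // keys have at most d base-10 digits
        final int M = 10;
        int[] out = new int[A.length];
        for (int k = 0, exp = 1; k < d; k++, exp *= 10) {
            int[] count = new int[M];
            for (int x : A) count[(x / exp) % M]++;    // histogram of digit k
            for (int m = 1; m < M; m++) count[m] += count[m - 1];
            for (int p = A.length - 1; p >= 0; p--)    // backwards scan keeps it stable
                out[--count[(A[p] / exp) % M]] = A[p];
            System.arraycopy(out, 0, A, 0, A.length);
        }
    }

    public static void main(String[] args) {
        int[] a = {126, 328, 636, 341, 416, 131, 328}; // the example from slide 24
        radixSort(a, 3);
        System.out.println(Arrays.toString(a));
        // [126, 131, 328, 328, 341, 416, 636]
    }
}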
Radix Sort
[Figure: worked example with two-digit keys A = 35, 52, 53, 32, 25, 55, 33. Pass 1 distributes on the ones digit into queues Q[0] … Q[9] (Q[2] = 52, 32; Q[3] = 53, 33; Q[5] = 35, 25, 55) and concatenates; pass 2 repeats on the tens digit, giving A = 25, 32, 33, 35, 52, 53, 55.]
Running time of Radixsort
• N items, D digit keys of max value M
• How many passes?
• How much work per pass?
• Total time?
Running time of Radixsort
• N items, D digit keys of max value M
• How many passes?
D
• How much work per pass? N + M
– just in case M > N, we need to account for the time to empty out the buckets between passes
• Total time?
O( D(N+M) )
Radix Sort
• What is the size of the input ? Size = DN
[Figure: the array as an N × D grid of characters c_{D-1} c_{D-2} … c_0, one row per key, e.g. A[0] = ‘S’ ‘m’ ‘i’ ‘t’ ‘h’ (“Smith”), A[1] = ‘J’ ‘o’ ‘n’ ‘e’ ‘s’ (“Jones”), …, A[N-1].]
• Radix sort takes time O(Size) !!
Radix Sort
• Variable length strings:
[Figure: A[0] … A[4] drawn as strings of different lengths.]
• Can adapt Radix Sort to sort in time O(Size) !
– What about our Theorem ??
Radix Sort
• Suppose we want to sort N distinct numbers
• Represent them in decimal:
– Need D=log N digits
• Hence RadixSort takes time
O(DN) = O(N log N)
• The total Size of the N keys is O(N log N) !
• No conflict with the theorem !
Sorting HUGE Data Sets
• US Telephone Directory:
– 300,000,000 records
• About 96 bytes per record:
– Name: 32 characters
– Address: 54 characters
– Telephone number: 10 characters
– Roughly 29 gigabytes of data in total
– Sort this on a machine with 128 MB RAM…
• Other examples?
Merge Sort Good for Something!
• Basis for most external sorting routines
• Can sort any number of records using a tiny amount of main memory
– in the extreme case, only need to keep 2 records in memory at any one time!
External MergeSort
• Split input into two “tapes” (or areas of disk)
• Merge tapes so that each group of 2 records is sorted
• Split again
• Merge tapes so that each group of 4 records is sorted
• Repeat until data is entirely sorted
log N passes
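The heart of each pass is merging two sorted "tapes" while holding only the current record from each in memory. A sketch in Java (the file names and the integers-one-per-line format are hypothetical, for illustration only):

import java.io.*;

public class TapeMerge {
    // Merge two sorted "tapes" into one, keeping only two records in memory.
    static void mergeTapes(BufferedReader t1, BufferedReader t2, PrintWriter out)
            throws IOException {
        String a = t1.readLine(), b = t2.readLine();
        while (a != null && b != null) {
            if (Integer.parseInt(a) <= Integer.parseInt(b)) {
                out.println(a); a = t1.readLine();
            } else {
                out.println(b); b = t2.readLine();
            }
        }
        for (; a != null; a = t1.readLine()) out.println(a);  // drain tape 1
        for (; b != null; b = t2.readLine()) out.println(b);  // drain tape 2
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical file names
        try (BufferedReader t1 = new BufferedReader(new FileReader("run1.txt"));
             BufferedReader t2 = new BufferedReader(new FileReader("run2.txt"));
             PrintWriter out = new PrintWriter(new FileWriter("merged.txt"))) {
            mergeTapes(t1, t2, out);
        }
    }
}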
Better External MergeSort
• Suppose main memory can hold M records.
• Initially read in groups of M records and sort them (e.g. with QuickSort).
• Number of passes reduced to log(N/M)
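The initial run-formation step might look like this (a hypothetical sketch: integer records, chunks of M sorted in memory, one output file per run):

import java.io.*;
import java.util.*;

public class RunFormation {
    // Read the input M records at a time, sort each chunk in memory,
    // and write it out as a sorted run; merging then needs log(N/M) passes.
    static int writeRuns(BufferedReader in, int M) throws IOException {
        int runs = 0;
        List<Integer> chunk = new ArrayList<>(M);
        String line;
        while ((line = in.readLine()) != null) {
            chunk.add(Integer.parseInt(line));
            if (chunk.size() == M) flush(chunk, runs++);
        }
        if (!chunk.isEmpty()) flush(chunk, runs++);
        return runs;                       // number of initial sorted runs
    }

    static void flush(List<Integer> chunk, int id) throws IOException {
        Collections.sort(chunk);           // in-memory sort (quicksort would do too)
        try (PrintWriter out = new PrintWriter(new FileWriter("run" + id + ".txt"))) {
            for (int x : chunk) out.println(x);
        }
        chunk.clear();
    }

    public static void main(String[] args) throws IOException {
        try (BufferedReader in = new BufferedReader(new FileReader("input.txt"))) {
            System.out.println(writeRuns(in, 1_000_000) + " runs written");
        }
    }
}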