CIS 5511. Programming Techniques

Sorting (1)

### 1. The sorting problem

The sorting problem: input, output, value, order.

Why study it: practical and conceptual reasons.

Major operations: element comparisons and assignments.

Common algorithms and their time costs:

• Bubble Sort, Selection Sort, Insertion Sort: Θ(n²)
• Merge Sort, Quicksort, Heap Sort: Θ(n lg n)

Other considerations:

• Whether all inputs must be available at the beginning, and whether all outputs are produced at the same time (e.g., Insertion Sort vs. Selection Sort)
• Space cost: whether the algorithm sorts "in place" (e.g., Merge Sort does not)

### 2. Heaps

A tree is a data structure in which every node, except one called the "root", has exactly one predecessor (parent) and zero or more successors (children), usually distinguished by order (1st, 2nd, 3rd, etc.).

A binary tree is a tree data structure in which each node has at most two children, distinguished as the left child and the right child.

A (binary) heap is an array that can be viewed as a nearly complete binary tree. Each node of the tree corresponds to an element of the array. The tree is completely filled on all levels except possibly the lowest on the right end.

In a heap, the root of the tree is A[1], and given the index i of a node, the index of its parent is Parent(i) = floor(i/2), the index of its left child is Left(i) = 2i, and the index of its right child is Right(i) = 2i+1.
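These index formulas can be written directly as one-line helpers. A minimal Python sketch, assuming the 1-based convention above (with A[0] left unused):

```python
# 1-based heap index arithmetic, matching the formulas above.
def parent(i):
    return i // 2      # floor(i/2)

def left(i):
    return 2 * i

def right(i):
    return 2 * i + 1
```

On most machines these reduce to a shift and an increment, which is one reason the array representation of a heap is efficient: no pointers need to be stored.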

A heap is "sorted vertically" in the sense that the values along every root-to-leaf path are sorted, or equivalently, that a parent is never larger (or never smaller) than its children.

For example, the following binary tree (a) is also a heap, and it corresponds to the array (b).

In a max-heap, the value of a parent is never smaller than that of its children (as the previous example); in a min-heap, the value of a parent is never larger than that of its children.
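The max-heap property can be checked mechanically by comparing every non-root node with its parent. A small Python sketch (the function name is mine), again using 1-based indexing with A[0] unused:

```python
def is_max_heap(A, heap_size):
    """Check the max-heap property on a 1-based array A (A[0] unused)."""
    for i in range(2, heap_size + 1):
        if A[i // 2] < A[i]:   # a parent smaller than a child violates the property
            return False
    return True
```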

The height of a node in a heap is defined as the number of edges on the longest simple downward path from the node to a leaf. The height of a heap is the height of its root. If a heap has n nodes, its height is Θ(lg n).

An array can be incrementally turned into a heap in-place. When an array A represents a heap that is being built, it has two attributes: length[A] and heap-size[A]. Although all of A[1 .. length[A]] may contain valid numbers, only A[1 .. heap-size[A]] are elements of the heap, where heap-size[A] ≤ length[A].

### 3. Heap maintenance

The algorithm Max-Heapify(A, i) fixes a heap in A by putting A[i] into its proper position, under the assumption that its children Left(i) and Right(i) are already (roots of) heaps.

Example: Max-Heapify(A, 2), where heap-size[A] = 10.

It takes constant time to compare one node with its (at most) two children. The children's subtrees each have size at most 2n/3; the worst case occurs when the bottom level of the tree is exactly half full.

Therefore, the running time of the algorithm can be described as T(n) ≤ T(2n/3) + Θ(1). The master theorem solves this recurrence with result T(n) = O(lg n), which is the same as the height of the heap.
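A Python sketch of the procedure described above, using 1-based indexing with A[0] unused; the tail recursion could equally be written as a loop:

```python
def max_heapify(A, i, heap_size):
    """Float A[i] down until the subtree rooted at i is a max-heap.
    Assumes the subtrees rooted at the children of i are already max-heaps."""
    l, r = 2 * i, 2 * i + 1            # Left(i) and Right(i)
    largest = i
    if l <= heap_size and A[l] > A[largest]:
        largest = l
    if r <= heap_size and A[r] > A[largest]:
        largest = r
    if largest != i:
        A[i], A[largest] = A[largest], A[i]
        max_heapify(A, largest, heap_size)   # continue down the affected subtree
```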

### 4. Heap building

We can use the above algorithm to convert an arbitrary array into a heap. The basic idea is that the leaves are already heaps, so the algorithm calls Max-Heapify on the internal nodes one by one, from the lowest level up to the root.

Example:

Loop invariant: At the start of each iteration of the for loop, each node in A[i+1..n] is the root of a max-heap. It can be proved as usual: Initialization, Maintenance, Termination.

Since Max-Heapify is O(lg n), and it is called fewer than n times in Build-Max-Heap, the latter is surely O(n lg n). However, this upper bound is not tight: it can be proved that Build-Max-Heap is O(n), because most of the calls are on nodes of small height.
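A sketch of Build-Max-Heap under the same 1-based conventions; Max-Heapify is repeated so the fragment is self-contained:

```python
def max_heapify(A, i, heap_size):
    """Float A[i] down until the subtree rooted at i is a max-heap."""
    l, r = 2 * i, 2 * i + 1
    largest = i
    if l <= heap_size and A[l] > A[largest]:
        largest = l
    if r <= heap_size and A[r] > A[largest]:
        largest = r
    if largest != i:
        A[i], A[largest] = A[largest], A[i]
        max_heapify(A, largest, heap_size)

def build_max_heap(A, n):
    """Turn the 1-based array A[1..n] into a max-heap, bottom-up."""
    for i in range(n // 2, 0, -1):   # nodes n//2+1 .. n are leaves already
        max_heapify(A, i, n)
```

The loop starts at ⌊n/2⌋ because every node with a larger index is a leaf, and a leaf is trivially the root of a max-heap.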

### 5. Heapsort

It is easy to sort a heap: just repeatedly exchange the root and the last element, then fix the heap after each step.

Heapsort is an "in place" algorithm. Example:

Heapsort is O(n lg n), because Build-Max-Heap takes time O(n), and each of the n-1 calls to Max-Heapify takes O(lg n) time.
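The whole algorithm can be sketched in Python as follows (self-contained, 1-based with A[0] unused):

```python
def max_heapify(A, i, heap_size):
    """Float A[i] down until the subtree rooted at i is a max-heap."""
    l, r = 2 * i, 2 * i + 1
    largest = i
    if l <= heap_size and A[l] > A[largest]:
        largest = l
    if r <= heap_size and A[r] > A[largest]:
        largest = r
    if largest != i:
        A[i], A[largest] = A[largest], A[i]
        max_heapify(A, largest, heap_size)

def build_max_heap(A, n):
    for i in range(n // 2, 0, -1):
        max_heapify(A, i, n)

def heapsort(A, n):
    """Sort A[1..n] in place in O(n lg n) time."""
    build_max_heap(A, n)
    for i in range(n, 1, -1):
        A[1], A[i] = A[i], A[1]     # move the current maximum to its final slot
        max_heapify(A, 1, i - 1)    # shrink the heap and fix the root
```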

### 6. Priority queues

A priority queue is an abstract data type in which each item has a priority value attached, and there is an operation, usually combined with deletion, that removes the item with the highest priority.

To use a heap to implement a priority queue, the item with the highest priority is kept at the root. After the root is removed from the heap, the last item is moved into the root position, then the heap is fixed in a top-down way.

On the other hand, the "insert" operation adds a new item at the end of the heap, then fixes the heap in a bottom-up way, by calling the "increase-key" operation.

Example of "increase-key":

All the above operations cost O(lg n) time.
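The three operations can be sketched in Python as follows (1-based, A[0] unused; heap-size is passed and returned explicitly here rather than stored as an attribute of A):

```python
def max_heapify(A, i, heap_size):
    """Float A[i] down until the subtree rooted at i is a max-heap."""
    l, r = 2 * i, 2 * i + 1
    largest = i
    if l <= heap_size and A[l] > A[largest]:
        largest = l
    if r <= heap_size and A[r] > A[largest]:
        largest = r
    if largest != i:
        A[i], A[largest] = A[largest], A[i]
        max_heapify(A, largest, heap_size)

def heap_extract_max(A, heap_size):
    """Remove and return the root, then fix the heap top-down."""
    maximum = A[1]
    A[1] = A[heap_size]              # move the last item into the root
    heap_size -= 1
    max_heapify(A, 1, heap_size)
    return maximum, heap_size

def heap_increase_key(A, i, key):
    """Raise A[i] to key, then fix the heap bottom-up."""
    assert key >= A[i], "new key must not be smaller than the current one"
    A[i] = key
    while i > 1 and A[i // 2] < A[i]:
        A[i], A[i // 2] = A[i // 2], A[i]   # swap with the parent
        i = i // 2

def max_heap_insert(A, key, heap_size):
    """Append a -infinity sentinel, then raise it to key."""
    heap_size += 1
    if heap_size < len(A):
        A[heap_size] = float('-inf')
    else:
        A.append(float('-inf'))
    heap_increase_key(A, heap_size, key)
    return heap_size
```

Each operation touches at most one root-to-leaf path, which is why all of them are O(lg n).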

If a priority queue is implemented by a sorted array, then insertion takes O(n) time, and deletion takes O(1) time. If a priority queue is implemented by an unsorted array, then insertion takes O(1) time, and deletion takes O(n) time.

If the priorities of items take only a finite number of possible values, then a priority queue can be implemented with multiple queues (one per priority value), so that both insertion and deletion take only O(1) time.
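One way to realize this idea (sometimes called a "bucket queue") is sketched below in Python; the class and method names are mine. Insertion is O(1), and deletion is O(1) when the number of distinct priority values is a fixed constant, since the scan for the next nonempty bucket is bounded by that constant:

```python
from collections import deque

class BoundedPriorityQueue:
    """Priority queue for integer priorities 0..max_priority,
    one FIFO queue per priority value; larger number = higher priority."""

    def __init__(self, max_priority):
        self.buckets = [deque() for _ in range(max_priority + 1)]
        self.highest = -1            # highest currently nonempty priority

    def insert(self, item, priority):
        self.buckets[priority].append(item)
        if priority > self.highest:
            self.highest = priority

    def delete_max(self):
        item = self.buckets[self.highest].popleft()
        # drop down to the next nonempty bucket (bounded by max_priority)
        while self.highest >= 0 and not self.buckets[self.highest]:
            self.highest -= 1
        return item
```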