**Sorting (1)**

Why study it: practical and conceptual reasons.

Default data structure: array. Default order: non-decreasing. Major operations: element comparisons and assignments.

Common algorithms of comparison sort and their time costs:

- Θ(*n*^{2}): Bubble Sort, Selection Sort, Insertion Sort
- Θ(*n*^{1.x}): [Shellsort]
- Θ(*n* lg *n*): Mergesort, [Timsort], Heapsort, Quicksort

Other considerations:

- Whether all inputs must be available at the start, and whether all outputs appear only at the end. E.g., Insertion Sort can consume its input one element at a time, whereas Selection Sort needs the entire input before it starts but emits its outputs one by one.
- Space cost: whether the algorithm sorts "in place" or not. E.g., Merge Sort needs 2*n* space, while Insertion Sort needs only a constant amount of extra space.
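As an illustration of the considerations above, here is a minimal Python sketch of Insertion Sort: it sorts in place with constant extra space, and each new element could arrive only when the loop reaches it.

```python
def insertion_sort(a):
    """Sort list a in place into non-decreasing order."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift larger elements one slot right to open a gap for key.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a
```

The inner while loop performs the element comparisons and assignments counted in the Θ(*n*^{2}) bound.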

A *binary tree* is a tree data structure in which each node has at most two children: a left child and a right child.

A heap is usually stored in an array, with the elements laid out level by level, left to right, in the order the tree is filled. The root of the tree is *A*[1], and given the index *i* of a node, the index of its parent is Parent(*i*) = floor(*i*/2) (except for the root), the index of its left child is Left(*i*) = 2*i*, and the index of its right child is Right(*i*) = 2*i*+1.
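The index formulas can be written directly as code; a minimal sketch using the 1-based convention of the text (index 0 is unused):

```python
def parent(i):
    # 1-based indexing, as in the text; valid for i > 1.
    return i // 2

def left(i):
    return 2 * i

def right(i):
    return 2 * i + 1
```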

For example, the following max-heap (a) is stored in the array (b).

The *height of a node* in a heap is the number of edges on the longest path from the node to a leaf. The *height of a heap* is the height of its root. If a heap has *n* nodes, its height is Θ(lg *n*).

Example: Max-Heapify(*A*, 2), where heap-size[*A*] = 10.
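A minimal Python sketch of Max-Heapify in the 1-based convention above (index 0 is an unused placeholder; the sample array in the test is illustrative, not taken from the notes):

```python
def max_heapify(A, i, heap_size):
    """Restore the max-heap property at node i (1-based; A[0] unused),
    assuming the subtrees rooted at Left(i) and Right(i) are max-heaps."""
    l, r = 2 * i, 2 * i + 1
    largest = i
    if l <= heap_size and A[l] > A[largest]:
        largest = l
    if r <= heap_size and A[r] > A[largest]:
        largest = r
    if largest != i:
        # The larger child floats up; recurse into the disturbed subtree.
        A[i], A[largest] = A[largest], A[i]
        max_heapify(A, largest, heap_size)
```

Each call does a constant amount of work at node *i* and then descends into at most one child's subtree, which is what the recurrence below captures.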

It takes constant time to handle the comparison of one node and its (at most) two children. The children's subtrees each have size at most 2*n*/3, and the worst case occurs when the bottom level of the tree is exactly half full.

Therefore, the running time of the algorithm can be described as T(*n*) ≤ T(2*n*/3) + Θ(1). The master theorem solves this recurrence with result T(n) = O(lg *n*), which is the same as the height of the heap.

Example:

Since Max-Heapify is O(lg *n*), and it is called fewer than *n* times in Build-Max-Heap, the latter is surely O(*n* lg *n*). However, this upper bound is not tight: it can be proved that Build-Max-Heap is O(*n*), because most nodes sit near the leaves, where Max-Heapify has little work to do.
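A minimal sketch of Build-Max-Heap on top of the Max-Heapify sketch (repeated here so the block is self-contained; the sample array in the test is illustrative):

```python
def max_heapify(A, i, heap_size):
    """Restore the max-heap property at node i (1-based; A[0] unused)."""
    l, r = 2 * i, 2 * i + 1
    largest = i
    if l <= heap_size and A[l] > A[largest]:
        largest = l
    if r <= heap_size and A[r] > A[largest]:
        largest = r
    if largest != i:
        A[i], A[largest] = A[largest], A[i]
        max_heapify(A, largest, heap_size)

def build_max_heap(A, n):
    # Nodes n//2+1 .. n are leaves, hence already 1-element heaps;
    # fix the internal nodes bottom-up, from the last one to the root.
    for i in range(n // 2, 0, -1):
        max_heapify(A, i, n)
```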

Example:

Heapsort is O(*n* lg *n*), because Build-Max-Heap takes time O(*n*), and each of the *n* − 1 calls to Max-Heapify takes O(lg *n*) time.
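Putting the pieces together, a minimal self-contained Heapsort sketch (same 1-based convention; the input in the test is illustrative):

```python
def max_heapify(A, i, heap_size):
    """Restore the max-heap property at node i (1-based; A[0] unused)."""
    l, r = 2 * i, 2 * i + 1
    largest = i
    if l <= heap_size and A[l] > A[largest]:
        largest = l
    if r <= heap_size and A[r] > A[largest]:
        largest = r
    if largest != i:
        A[i], A[largest] = A[largest], A[i]
        max_heapify(A, largest, heap_size)

def build_max_heap(A, n):
    for i in range(n // 2, 0, -1):
        max_heapify(A, i, n)

def heapsort(A, n):
    build_max_heap(A, n)          # O(n)
    for i in range(n, 1, -1):
        A[1], A[i] = A[i], A[1]   # move the current maximum to its final slot
        max_heapify(A, 1, i - 1)  # shrink the heap and fix the root: O(lg n)
```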

To use a heap to implement a priority queue, the item with the highest priority is kept at the root. After the root is removed from the heap, the last item is moved to the root, and the heap is then fixed in a top-down way.

On the other hand, the "insert" operation adds a new item at the end of the heap, then fixes the heap in a bottom-up way by calling the "increase-key" algorithm, which increases the key of *x* to *k*.

Example of "increase-key":

All the above operations cost O(lg *n*) time. A heap can be built by repeating Max-Heap-Insert, but it will be less efficient than the Build-Max-Heap algorithm when all the values are available at the beginning.
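A minimal sketch of the three priority-queue operations described above (1-based, with `A[0]` unused and the heap occupying `A[1..heap_size]`; the use of a `-inf` placeholder in insert is one common way to reduce insert to increase-key):

```python
def max_heapify(A, i, heap_size):
    """Top-down fix, used after extracting the maximum."""
    l, r = 2 * i, 2 * i + 1
    largest = i
    if l <= heap_size and A[l] > A[largest]:
        largest = l
    if r <= heap_size and A[r] > A[largest]:
        largest = r
    if largest != i:
        A[i], A[largest] = A[largest], A[i]
        max_heapify(A, largest, heap_size)

def heap_extract_max(A, heap_size):
    if heap_size < 1:
        raise IndexError("heap underflow")
    maximum = A[1]
    A[1] = A[heap_size]               # move the last item to the root...
    max_heapify(A, 1, heap_size - 1)  # ...and fix the heap top-down
    return maximum

def heap_increase_key(A, i, key):
    if key < A[i]:
        raise ValueError("new key is smaller than current key")
    A[i] = key
    # Bottom-up fix: float the increased key up toward the root.
    while i > 1 and A[i // 2] < A[i]:
        A[i], A[i // 2] = A[i // 2], A[i]
        i //= 2

def max_heap_insert(A, key, heap_size):
    heap_size += 1
    A.append(float("-inf"))           # new last leaf with a -infinity key
    heap_increase_key(A, heap_size, key)
    return heap_size
```

Each operation follows one root-to-leaf or leaf-to-root path, hence the O(lg *n*) cost.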

If a priority queue is implemented by a sorted array, then insertion takes O(*n*) time, and deletion takes O(1) time; if it is implemented by an unsorted array, then insertion takes O(1) time, and deletion takes O(*n*) time.

If the priority of items only takes *m* (a finite number) possible values, a priority queue can be implemented by an array of queues, where insertion and deletion take only O(1) time.
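A minimal sketch of this array-of-queues idea (class and method names are illustrative). Insert is O(1); extraction scans at most *m* buckets, which counts as O(1) when *m* is a fixed constant:

```python
from collections import deque

class BucketPQ:
    """Priority queue for integer priorities 0..m-1 (larger = higher)."""

    def __init__(self, m):
        self.buckets = [deque() for _ in range(m)]  # one FIFO queue per priority
        self.top = -1  # highest priority that might be non-empty

    def insert(self, item, priority):
        self.buckets[priority].append(item)
        self.top = max(self.top, priority)

    def extract_max(self):
        # Skip empty buckets downward from the last known top.
        while self.top >= 0 and not self.buckets[self.top]:
            self.top -= 1
        if self.top < 0:
            raise IndexError("empty priority queue")
        return self.buckets[self.top].popleft()
```

Items of equal priority come out in insertion order, since each bucket is a FIFO queue.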