**Growth of Functions**

A central topic in algorithm analysis is comparing the time (or space) efficiency of algorithms for the same problem. Since the running time of an algorithm is represented as a function of instance size, the problem becomes the categorization of these functions.

- A *size* measurement can be established among the instances of a problem to indicate their *difficulty*;
- The *time* spent by an algorithm on an instance is a *nondecreasing function* of its size;
- The *time expense* of an algorithm becomes an issue when the size is *large*;
- The major factor indicating the time efficiency of an algorithm is *how fast* it grows with the size;
- To compare the time efficiency of algorithms, it is often enough to find the *order of growth* for each of them.

Each asymptotic notation corresponds to a set of functions, represented by a simple function g(n):

- Θ(g(n)) = {f(n) : there exist positive constants c_{1}, c_{2}, and n_{0} such that 0 ≤ c_{1}g(n) ≤ f(n) ≤ c_{2}g(n) for all n ≥ n_{0}}. We say that g(n) is an *asymptotically tight bound* of f(n), and write f(n) = Θ(g(n)). [It actually means f(n) ∈ Θ(g(n)). The same for the following.]
- O(g(n)) = {f(n) : there exist positive constants c and n_{0} such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n_{0}}. We say that g(n) is an *asymptotic upper bound* of f(n), and write f(n) = O(g(n)).
- Ω(g(n)) = {f(n) : there exist positive constants c and n_{0} such that 0 ≤ cg(n) ≤ f(n) for all n ≥ n_{0}}. We say that g(n) is an *asymptotic lower bound* of f(n), and write f(n) = Ω(g(n)).
- o(g(n)) = {f(n) : for any positive constant c > 0, there exists a constant n_{0} > 0 such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n_{0}}. We say that g(n) is an *upper bound* of f(n) that is *not asymptotically tight*, and write f(n) = o(g(n)).
- ω(g(n)) = {f(n) : for any positive constant c > 0, there exists a constant n_{0} > 0 such that 0 ≤ cg(n) ≤ f(n) for all n ≥ n_{0}}. We say that g(n) is a *lower bound* of f(n) that is *not asymptotically tight*, and write f(n) = ω(g(n)).

Examples:

- 4n^{3} − 10.5n^{2} + 7n + 103 = Θ(n^{3})
- 4n^{3} − 10.5n^{2} + 7n + 103 = O(n^{3})
- 4n^{3} − 10.5n^{2} + 7n + 103 = Ω(n^{3})
- 4n^{3} − 10.5n^{2} + 7n + 103 = O(n^{4})
- 4n^{3} − 10.5n^{2} + 7n + 103 = Ω(n^{2})
- 4n^{3} − 10.5n^{2} + 7n + 103 = o(n^{4})
- 4n^{3} − 10.5n^{2} + 7n + 103 = ω(n^{2})
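These memberships can be spot-checked numerically. In the sketch below, the constants c_{1} = 1, c_{2} = 5, n_{0} = 10 are hand-picked witnesses for the Θ(n^{3}) claim, not unique or optimal choices:

```python
def f(n):
    """The example polynomial 4n^3 - 10.5n^2 + 7n + 103."""
    return 4 * n**3 - 10.5 * n**2 + 7 * n + 103

# Theta(n^3): c1*n^3 <= f(n) <= c2*n^3 for all n >= n0,
# with hand-picked witnesses c1 = 1, c2 = 5, n0 = 10.
assert all(1 * n**3 <= f(n) <= 5 * n**3 for n in range(10, 10_000))

# O(n^4) and Omega(n^2) hold with c = 1 from the same point on.
assert all(n**2 <= f(n) <= n**4 for n in range(10, 10_000))
```

A finite check like this is evidence, not a proof; the definitions require the inequalities for *all* n ≥ n_{0}, which here follows from comparing leading terms.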

Relations among the five:

- f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).
- f(n) = O(g(n)) if and only if f(n) = Θ(g(n)) or f(n) = o(g(n)).
- f(n) = Ω(g(n)) if and only if f(n) = Θ(g(n)) or f(n) = ω(g(n)).
- f(n) = O(g(n)) if and only if g(n) = Ω(f(n)).
- f(n) = o(g(n)) if and only if g(n) = ω(f(n)).

- f(n) = Θ(g(n)) if lim_{n→∞}[f(n)/g(n)] = c for some positive constant c (and conversely, whenever the limit exists).
- f(n) = o(g(n)) if and only if lim_{n→∞}[f(n)/g(n)] = 0.
- f(n) = ω(g(n)) if and only if lim_{n→∞}[f(n)/g(n)] = ∞.
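The limit tests can be illustrated by evaluating the ratio at a large n (a stand-in for the limit, not the limit itself):

```python
def f(n):
    return 4 * n**3 - 10.5 * n**2 + 7 * n + 103

n = 10**6  # large n approximates the limit behavior
assert abs(f(n) / n**3 - 4) < 0.001   # ratio tends to 4, so f(n) = Theta(n^3)
assert f(n) / n**4 < 1e-5             # ratio tends to 0, so f(n) = o(n^4)
assert f(n) / n**2 > 1e5              # ratio tends to infinity, so f(n) = omega(n^2)
```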

- constant: Θ(1)
- logarithmic: Θ(log_{a}n), *a* > 1
- polynomial: Θ(n^{a}), *a* > 0
- exponential: Θ(a^{n}), *a* > 1
- factorial: Θ(n!)
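The ordering of these classes shows up even at small sizes; a brief check with one representative function from each class:

```python
import math

# One representative per class: constant, logarithmic, polynomial,
# exponential, factorial, evaluated at a few sizes.
for n in (16, 32, 64):
    values = [1, math.log2(n), n**3, 2**n, math.factorial(n)]
    # Each class already exceeds the one before it at these sizes.
    assert values == sorted(values)
```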

Asymptotic notions can be used in equations and inequalities, as well as in certain calculations. For example,

2n^{2} + 3n + 1 = 2n^{2} + Θ(n) = Θ(n^{2}).

In algorithm analysis, the most common conclusions are worst-case costs expressed as O(f(n)), which can be obtained by focusing on the dominant term of the function, as in the analysis of the Insertion Sort algorithm.
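For instance, in a standard Insertion Sort sketch, the inner loop runs at most *i* times at step *i*, so a reverse-sorted input costs on the order of n^{2}:

```python
def insertion_sort(a):
    """Sort the list a in place and return it. At step i the inner
    loop runs at most i times, so the worst case (a reverse-sorted
    input) costs Theta(n^2)."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]   # shift larger elements one slot right
            j -= 1
        a[j + 1] = key
    return a
```

For example, `insertion_sort([5, 2, 4, 6, 1, 3])` returns `[1, 2, 3, 4, 5, 6]`.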

In general, the "divide-and-conquer" approach uses *D(n)* time to divide a problem into *a* smaller subproblems of the same type, each with *1/b* of the original size. After the subproblems are solved, their solutions are combined in *C(n)* time to get the solution for the original problem. This process is repeated on the subproblems, until the input size becomes so small that the problem can be solved in constant time. This approach gives us the following recurrence:

T(n) = Θ(1) if *n* is below some constant,
T(n) = *a*T(*n/b*) + D(n) + C(n) otherwise.

For example, for the Merge Sort algorithm, we have

T(n) = 2T(*n*/2) + Θ(*n*) for *n* > 1, with T(1) = Θ(1).

This is the case because each time an array of size *n* is divided into two halves, and merging the results together takes linear time.
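A minimal top-down, out-of-place Merge Sort sketch exhibits this recurrence directly: each call makes two half-size recursive calls and then a linear-time merge.

```python
def merge_sort(a):
    """Sort and return a new list. Each call divides the array into two
    halves (the 2T(n/2) term) and merges in linear time (the Theta(n) term)."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]   # append the leftover tail
```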

Often we need to solve the recurrence, so as to get a running time function that contains no recursive calls.

One way to solve a recurrence is the *substitution method*, which uses mathematical induction to verify a bound that was guessed beforehand.

For example, if the function is

T(n) = 2T(⌊*n*/2⌋) + *n*,

one reasonable guess is T(n) = O(*n* lg *n*). To show that this is indeed the case, we can prove that T(n) ≤ *c n* lg *n* for an appropriate choice of the constant *c* > 0. We start by assuming that this bound holds for halves of the array; then substituting into the recurrence yields

T(n) ≤ 2(*c*⌊*n*/2⌋ lg(⌊*n*/2⌋)) + *n*
     ≤ *cn* lg(*n*/2) + *n*
     = *cn* lg *n* − *cn* lg 2 + *n*
     = *cn* lg *n* − *cn* + *n*
     ≤ *cn* lg *n*,

where the last step holds as long as c ≥ 1.

Furthermore, mathematical induction requires us to show that our solution holds for the boundary conditions, that is, that we can choose the constant *c* large enough so that T(n) ≤ *c n* lg *n* holds there. For this example, if the boundary condition is T(1) = 1, then no choice of *c* works there, because *c*·1·lg 1 = 0 < T(1). To sidestep this issue, we only need to show that T(n) ≤ *c n* lg *n* holds when *n* is above a certain value. For n = 2, we do have T(2) ≤ *c*·2·lg 2 as long as c ≥ 2.

In summary, we can use *c ≥ 2*, and we have shown that T(n) ≤ *c n* lg *n* for all *n* ≥ 2.
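A numeric spot check of this bound (evidence for the induction, not a replacement for it), assuming the boundary value T(1) = 1 from the text:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """The recurrence T(n) = 2T(floor(n/2)) + n with boundary T(1) = 1."""
    return 1 if n == 1 else 2 * T(n // 2) + n

# Check T(n) <= 2 n lg n (i.e. c = 2) over a range of sizes.
assert all(T(n) <= 2 * n * math.log2(n) for n in range(2, 5000))
```

Note that the bound is tight at n = 2: T(2) = 4 = 2·2·lg 2, which is why no smaller constant works.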

To get a good guess for a recurrence, one way is to draw a *recursion tree*. For example, if the function is

T(n) = 3T(⌊*n*/4⌋) + *c n*^{2},

we can create a recursion tree: the root costs *c n*^{2} and has three children, each the root of a subtree for an instance of size *n*/4, and so on down to the leaves.

To determine the depth *i* of the leaf level, we solve *n* / 4^{i} = 1, that is, *i* = log_{4}*n*. So the tree has log_{4}*n* + 1 levels.

At level *i*, there are 3^{i} nodes, each with a cost *c*(n / 4^{i})^{2}, so the total is (3/16)^{i}cn^{2}.

At the leaf level, the number of nodes is 3^{log_{4}n}, which is equal to *n*^{log_{4}3}. At that level, each node costs T(1), so the total is *n*^{log_{4}3}T(1), which is Θ(*n*^{log_{4}3}).

After some calculation (see the textbook for details), the guess T(n) = O(n^{2}) is obtained.
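As a sanity check on the guess, one can compute the recurrence directly and confirm that T(n)/n^{2} stays bounded; a minimal sketch assuming T(1) = 1 and c = 1:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    """The recurrence T(n) = 3T(floor(n/4)) + n^2, taking c = 1 and T(1) = 1."""
    return 1 if n == 1 else 3 * T(n // 4) + n * n

# The per-level costs form a geometric series with ratio 3/16, so the
# whole tree is bounded by a small constant times the root cost n^2.
assert all(T(4**k) <= 2 * (4**k) ** 2 for k in range(1, 12))
```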

Finally, there is a "master method" that provides a general solution for divide-and-conquer recurrences of the form T(n) = *a*T(*n/b*) + f(n). It depends on the following theorem:

Let *a* ≥ 1 and *b* > 1 be constants, let f(n) be a function, and let T(n) be defined by the recurrence T(n) = *a*T(*n/b*) + f(n). Then:

1. If f(n) = O(n^{log_{b}a − ε}) for some constant ε > 0, then T(n) = Θ(n^{log_{b}a}).
2. If f(n) = Θ(n^{log_{b}a}), then T(n) = Θ(n^{log_{b}a} lg n).
3. If f(n) = Ω(n^{log_{b}a + ε}) for some constant ε > 0, and if *a*f(*n/b*) ≤ *c*f(n) for some constant *c* < 1 and all sufficiently large *n*, then T(n) = Θ(f(n)).

Intuitively, the theorem says that the solution to the recurrence is determined by the larger of f(*n*) and *n*^{log_{b}a}. The above example falls into Case 3.

The proof of the master theorem is given in the textbook. It is important to realize that the three cases of the master theorem do not cover all the possibilities.
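For the common situation f(n) = n^{k}, the case analysis can be mechanized by comparing *k* against the critical exponent log_{b}*a*. The sketch below is a hypothetical helper (`master_theorem` is not from the text), limited to polynomial driving functions:

```python
import math

def master_theorem(a, b, k):
    """Classify T(n) = a*T(n/b) + n^k for constants a >= 1, b > 1, k >= 0.
    A hypothetical helper that handles only polynomial driving functions
    f(n) = n^k; it returns the Theta bound as a string."""
    crit = math.log(a, b)                  # critical exponent log_b(a)
    if math.isclose(k, crit):
        return f"Theta(n^{k:g} lg n)"      # Case 2: all levels cost about the same
    if k < crit:
        return f"Theta(n^{crit:.3g})"      # Case 1: the leaves dominate
    # Case 3: the root dominates; for polynomial f the regularity condition
    # a*f(n/b) <= c*f(n) holds automatically, since a/b^k < 1 when k > log_b(a).
    return f"Theta(n^{k:g})"
```

For the recursion-tree example, `master_theorem(3, 4, 2)` returns `"Theta(n^2)"` (Case 3), matching the earlier guess; Merge Sort, `master_theorem(2, 2, 1)`, lands in Case 2 with `"Theta(n^1 lg n)"`.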

As a special case, when *a = b* we have log_{b}*a* = 1. The above results are simplified to (1) Θ(