Growth of Functions

A central topic in algorithm analysis is to compare the time complexity (or efficiency) of algorithms (for the same problem).

Since the running time of an algorithm is represented as *T(n)*, a function of the instance size, the problem becomes one of categorizing these functions.

- A *size* measurement can be established among the (infinite number of) instances of a problem to indicate their *difficulty*, so that the *time* spent by an algorithm on an instance is a *nondecreasing function* of its size (at least after a certain point);
- To compare the efficiency of algorithms on all problem instances, what matters is the instances larger than a certain size, as there are only finitely many smaller ones but infinitely many larger ones;
- No matter where the threshold is set, in general an algorithm whose cost grows faster will eventually cost more than one whose cost grows slower, so what really matters is not the value of T(n) but its rate of growth;
- Since it is impossible to decide or compare all growth functions, it is often enough to categorize their *order of growth* using representative functions.
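The threshold idea in the list above can be illustrated with a small sketch; the two cost functions 0.5n^2 and 100n, and their constants, are invented purely for illustration:

```python
# Sketch: a "fast-growing" cost 0.5*n^2 vs a "slow-growing" cost 100*n.
# The constants are made up; only the orders of growth matter in the end.

def quadratic_cost(n):
    return 0.5 * n * n

def linear_cost(n):
    return 100.0 * n

# Below the crossover point (n = 200) the quadratic algorithm is cheaper...
assert quadratic_cost(10) < linear_cost(10)      # 50 < 1000
# ...but past the crossover the slower-growing function wins for every larger n.
assert all(linear_cost(n) < quadratic_cost(n) for n in range(201, 10_000))
```

No matter how the constants are tuned, the crossover point only moves; the slower-growing function always wins eventually.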

- Θ(g(n)) = {f(n) : there exist positive constants c_1, c_2, and n_0 such that 0 ≤ c_1 g(n) ≤ f(n) ≤ c_2 g(n) for all n ≥ n_0}. We say that g(n) is an *asymptotically tight bound* of f(n), and write f(n) = Θ(g(n)). [This actually means f(n) ∈ Θ(g(n)); the same holds for the notations below.]
- O(g(n)) = {f(n) : there exist positive constants c and n_0 such that 0 ≤ f(n) ≤ c g(n) for all n ≥ n_0}. We say that g(n) is an *asymptotic upper bound* of f(n), and write f(n) = O(g(n)).
- Ω(g(n)) = {f(n) : there exist positive constants c and n_0 such that 0 ≤ c g(n) ≤ f(n) for all n ≥ n_0}. We say that g(n) is an *asymptotic lower bound* of f(n), and write f(n) = Ω(g(n)).
- o(g(n)) = {f(n) : for any positive constant c > 0, there exists a constant n_0 > 0 such that 0 ≤ f(n) ≤ c g(n) for all n ≥ n_0}. We say that g(n) is an *upper bound* of f(n) that is *not asymptotically tight*, and write f(n) = o(g(n)).
- ω(g(n)) = {f(n) : for any positive constant c > 0, there exists a constant n_0 > 0 such that 0 ≤ c g(n) ≤ f(n) for all n ≥ n_0}. We say that g(n) is a *lower bound* of f(n) that is *not asymptotically tight*, and write f(n) = ω(g(n)).

- 4n^3 − 10.5n^2 + 7n + 103 = Θ(n^3)
- 4n^3 − 10.5n^2 + 7n + 103 = O(n^3)
- 4n^3 − 10.5n^2 + 7n + 103 = Ω(n^3)
- 4n^3 − 10.5n^2 + 7n + 103 = O(n^4)
- 4n^3 − 10.5n^2 + 7n + 103 = Ω(n^2)
- 4n^3 − 10.5n^2 + 7n + 103 = o(n^4)
- 4n^3 − 10.5n^2 + 7n + 103 = ω(n^2)
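The Θ(n^3) claim can be sanity-checked numerically against the definition; the witness constants c1 = 3, c2 = 5, n0 = 10 below are one assumed choice that happens to satisfy it, not the only one:

```python
# Numeric check of the Theta(n^3) claim for f(n) = 4n^3 - 10.5n^2 + 7n + 103.
# c1 = 3, c2 = 5, n0 = 10 are illustrative witnesses; any valid constants would do.

def f(n):
    return 4 * n**3 - 10.5 * n**2 + 7 * n + 103

c1, c2, n0 = 3, 5, 10
for n in range(n0, 100_000):
    assert c1 * n**3 <= f(n) <= c2 * n**3
```

Such a finite check is of course no proof, but it mirrors exactly the inequality in the definition of Θ.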

- f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).
- f(n) = O(g(n)) if and only if f(n) = Θ(g(n)) or f(n) = o(g(n)).
- f(n) = Ω(g(n)) if and only if f(n) = Θ(g(n)) or f(n) = ω(g(n)).
- f(n) = O(g(n)) if and only if g(n) = Ω(f(n)).
- f(n) = o(g(n)) if and only if g(n) = ω(f(n)).

- If lim_{n→∞} f(n)/g(n) = c (a positive constant), then f(n) = Θ(g(n)). (The converse need not hold, since the limit may not exist.)
- f(n) = o(g(n)) if and only if lim_{n→∞} f(n)/g(n) = 0.
- f(n) = ω(g(n)) if and only if lim_{n→∞} f(n)/g(n) = ∞.
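These limit tests can be approximated by evaluating the ratio at a single large n — a heuristic sketch, not a proof, using the polynomial from the earlier examples:

```python
# Heuristic limit test: evaluate f(n)/g(n) at a large n to guess the relationship.
# The true statements are about the limit as n -> infinity.

def ratio(f, g, n=10**6):
    return f(n) / g(n)

f = lambda n: 4 * n**3 - 10.5 * n**2 + 7 * n + 103

print(ratio(f, lambda n: n**3))  # near 4, a positive constant -> suggests Theta(n^3)
print(ratio(f, lambda n: n**4))  # near 0                      -> suggests o(n^4)
print(ratio(f, lambda n: n**2))  # very large                  -> suggests omega(n^2)
```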

- constant: Θ(*1*)
- logarithmic: Θ(*log_a n*), *a* > 1
- polynomial: Θ(*n^a*), *a* > 0
- exponential: Θ(*a^n*), *a* > 1
- factorial: Θ(*n!*)
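How far apart these categories are can be seen by tabulating representative members at a few sizes; the particular choices (log base 2, n^2, 2^n) are illustrative:

```python
import math

# Representative members of each category, evaluated at a few sizes.
for n in (10, 20, 40):
    print(f"n={n:3d}  log2(n)={math.log2(n):5.2f}  n^2={n**2:6d}  "
          f"2^n={2**n:15d}  n!={math.factorial(n):.3e}")
```

Even at n = 40, the factorial dwarfs the exponential, which in turn dwarfs every polynomial.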


In algorithm analysis, the most common conclusions are worst-case costs expressed as O(f(n)), which can be obtained by focusing on the most expensive step of the algorithm, as in the analysis of the Insertion Sort algorithm.
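The worst-case behavior can be observed directly by counting key comparisons; the following is a minimal instrumented sketch of Insertion Sort, where the counter and the reverse-sorted test input are illustration choices:

```python
def insertion_sort(a):
    """Sort list a in place; return the number of key comparisons made."""
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1
            if a[j] > key:
                a[j + 1] = a[j]   # shift larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return comparisons

# A reverse-sorted input triggers the worst case: n(n-1)/2 comparisons, i.e. O(n^2).
n = 100
worst = insertion_sort(list(range(n, 0, -1)))
assert worst == n * (n - 1) // 2   # 4950 for n = 100
```

On an already-sorted input the same counter reports only n − 1 comparisons, which is why the O(n^2) bound is specifically a worst-case statement.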

For example, for the Merge Sort algorithm, we have the recurrence

T(n) = 2T(n/2) + Θ(n).

This is the case because each time an array of size n is divided into two subarrays of size n/2, which are sorted recursively and then merged in Θ(n) time.

To solve the recurrence by the substitution method (taking the merge cost as n for simplicity), one reasonable guess is T(n) = O(n lg n), that is, T(n) ≤ cn lg n for some positive constant c. Substituting the guess into the recurrence gives

T(n) ≤ 2(c(n/2) lg(n/2)) + n = cn lg n − cn + n ≤ cn lg n,

where the last step holds as long as c ≥ 1. Furthermore, mathematical induction requires us to show that the solution holds for the boundary cases, that is, we can choose the constant c large enough that the bound also holds for small values of n.
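The substitution argument can be sanity-checked numerically; the concrete base case T(1) = 1, the merge cost n, and the witness c = 2 below are assumptions made only for this sketch:

```python
import math

# Exact values of T(n) = 2*T(n//2) + n with T(1) = 1 (an assumed base case),
# for n a power of two, checked against the guessed bound c*n*lg(n) with c = 2.

def T(n):
    return 1 if n == 1 else 2 * T(n // 2) + n

c = 2
for k in range(1, 20):          # n = 2, 4, ..., 2^19
    n = 2 ** k
    assert T(n) <= c * n * math.log2(n)
```

With this base case the exact solution is T(n) = n lg n + n, so c = 1 fails at small n but c = 2 works for all n ≥ 2 — exactly the boundary-case adjustment the induction requires.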

Alternatively, to visualize the total cost of the recurrence, we can create the following recursion tree:

To determine the height of the tree, note that the subproblem size at depth i is n/2^i, which reaches 1 when i = lg n; hence the tree has lg n + 1 levels, each contributing a cost of cn, for a total cost of cn(lg n + 1) = Θ(n lg n).
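The level-by-level costs of the recursion tree for T(n) = 2T(n/2) + n can be tabulated directly (a sketch assuming n is a power of two):

```python
import math

# Level i of the recursion tree has 2^i subproblems of size n/2^i,
# so every level costs exactly n, and there are lg(n) + 1 levels.

n = 1024
levels = int(math.log2(n)) + 1
level_costs = [(2 ** i) * (n // 2 ** i) for i in range(levels)]
assert all(cost == n for cost in level_costs)   # each level sums to n
total = sum(level_costs)                        # n * (lg n + 1)
assert total == n * (int(math.log2(n)) + 1)     # 1024 * 11 = 11264
```

Summing identical per-level costs over lg n + 1 levels is what yields the Θ(n lg n) total.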