**Basic Concepts**

*Algorithm*: a (finite and predetermined) sequence of (directly executable or already defined) computational steps that produces an output for a valid input in finite time.

Desired properties of an algorithm, as a solution to a problem:

- correctness: satisfying problem specification
- efficiency: low time and space expense
- simplicity: conceptual naturalness and length of description

*Programming*: to implement an algorithm in a computer language.

The actual cost of running the resulting program depends on:

- the problem-solving method of the algorithm,
- the concrete problem instance it is given,
- the software and hardware in which the algorithm is implemented and executed.

To analyze efficiency independently of software and hardware, the common simplifications are:

- assuming a constant time cost for each type of computational step, then only counting the number of steps,
- measuring the *size* of an instance, usually by the length of its description,
- for a given size, only considering the worst-case (or average-case, or best-case) situation,
- measuring the efficiency (time complexity) of an algorithm in a situation as a *function* of instance size,
- only considering how fast the function value increases as the size increases,
- classifying functions into categories by *order of growth*.
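As an illustration of step counting (a sketch, not from the source notes): linear search, taking one comparison as the unit-cost step. For an instance of size n, the worst case (target absent) costs n comparisons.

```python
def linear_search(items, target):
    """Return (index of target, number of comparisons); index is -1 if absent."""
    steps = 0
    for i, x in enumerate(items):
        steps += 1                 # one unit-cost comparison per element
        if x == target:
            return i, steps
    return -1, steps

# Worst case for an instance of size n = 4: the target is absent,
# so all n comparisons are performed.
idx, steps = linear_search([5, 3, 8, 1], 99)
```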

The *asymptotic* efficiency of algorithms, represented by a simple function g(n):

- Θ(g(n)) = {f(n) : there exist positive constants c₁, c₂, and n₀ such that 0 ≤ c₁g(n) ≤ f(n) ≤ c₂g(n) for all n ≥ n₀}; g(n) is an *asymptotically tight bound* of f(n).
- O(g(n)) = {f(n) : there exist positive constants c and n₀ such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n₀}; g(n) is an *asymptotic upper bound* of f(n).
- Ω(g(n)) = {f(n) : there exist positive constants c and n₀ such that 0 ≤ cg(n) ≤ f(n) for all n ≥ n₀}; g(n) is an *asymptotic lower bound* of f(n).
- o(g(n)) = {f(n) : for any constant c > 0, there exists a constant n₀ > 0 such that 0 ≤ f(n) < cg(n) for all n ≥ n₀}; g(n) is an upper bound of f(n) that is not asymptotically tight.
- ω(g(n)) = {f(n) : for any constant c > 0, there exists a constant n₀ > 0 such that 0 ≤ cg(n) < f(n) for all n ≥ n₀}; g(n) is a lower bound of f(n) that is not asymptotically tight.
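The Θ definition can be checked numerically for a concrete pair of functions. A minimal sketch, with hand-picked witness constants (c₁ = 3, c₂ = 4, n₀ = 10 are illustration choices, not from the source):

```python
# Sketch: f(n) = 3n^2 + 10n is in Θ(n^2), witnessed by constants from
# the definition: 3n^2 <= 3n^2 + 10n <= 4n^2 holds once n >= 10.
def f(n): return 3 * n * n + 10 * n
def g(n): return n * n

c1, c2, n0 = 3, 4, 10
sandwiched = all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, 10_000))
```

Any finite check like this is only evidence, not a proof; the inequality 10n ≤ n² for n ≥ 10 is what actually closes the argument.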

To decide the relation between two functions, take the limit of f(n)/g(n) as n → ∞:

- f(n) = o(g(n)) if and only if lim f(n)/g(n) = 0.
- f(n) = Θ(g(n)) if and only if lim f(n)/g(n) = c, a positive constant.
- f(n) = ω(g(n)) if and only if lim f(n)/g(n) = ∞.
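The limit test can be approximated numerically by sampling the ratio at one large n. A rough sketch (the thresholds and sample point are arbitrary choices, not from the source):

```python
import math

def classify(f, g, n=10**6):
    """Crudely approximate lim f(n)/g(n) by its value at a single large n."""
    r = f(n) / g(n)
    if r < 1e-3:
        return "o"       # ratio tends to 0: f = o(g)
    if r > 1e3:
        return "omega"   # ratio tends to infinity: f = omega(g)
    return "theta"       # ratio tends to a positive constant: f = Theta(g)
```

For example, `classify(lambda n: math.log(n), lambda n: n)` reports `"o"`, matching log n = o(n).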

The major growth orders, from low to high, are:

- constant: Θ(1)
- logarithmic: Θ(log_a n), a > 1
- polynomial: Θ(n^a), a > 0, further divided into *linear* (Θ(n)), *quadratic* (Θ(n^2)), *cubic* (Θ(n^3)), and so on
- exponential: Θ(a^n), a > 1
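Tabulating representative functions at one size shows how the higher orders dominate. A small sketch (the sample size 20 is an arbitrary choice):

```python
import math

# Value of one representative function per growth order at size n.
def growth_row(n):
    return {
        "log2(n)": math.log2(n),
        "n": n,
        "n*log2(n)": n * math.log2(n),
        "n^2": n ** 2,
        "2^n": 2 ** n,
    }

row = growth_row(20)   # 2^20 = 1048576 already dwarfs n^2 = 400
```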

*Recursion*: part of a problem is a simpler problem of the same type.

Assume the algorithm

- directly solves problem instances in constant time if their size is smaller than a constant *c*,
- otherwise uses *D(n)* time to divide a problem into *a* smaller subproblems of the same type, each with *1/b* of the original size,
- and combines their solutions in *C(n)* time to get the solution for the original problem.

Its running time then satisfies the recurrence T(n) = Θ(1) if n < c, and T(n) = aT(n/b) + D(n) + C(n) otherwise.
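Merge sort is the standard instance of this template, with a = 2, b = 2, D(n) = Θ(1), and C(n) = Θ(n). A sketch:

```python
def merge_sort(xs):
    # Base case: instances of size < 2 are solved directly in constant time.
    if len(xs) < 2:
        return xs
    mid = len(xs) // 2               # divide: D(n) = Θ(1)
    left = merge_sort(xs[:mid])      # a = 2 subproblems,
    right = merge_sort(xs[mid:])     # each of size n/b = n/2
    # Combine: merge two sorted halves in C(n) = Θ(n) time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```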

To solve a recurrence:

- *substitution method*: guess the form of the solution, then use mathematical induction to verify it;
- *recursion tree method*: represent the recurrence as a tree to reveal the function: the height of the tree is log_b n, and the number of leaves is a^(log_b n) (same as n^(log_b a));
- *the master method*: for T(n) = aT(n/b) + f(n) with a ≥ 1 and b > 1, compare f(n) with n^(log_b a):
  1. if f(n) = O(n^(log_b a - ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a));
  2. if f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) lg n);
  3. if f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and af(n/b) ≤ cf(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).

In the recursion tree, the three cases correspond to whether the total cost is dominated by the leaves (case 1), spread evenly across the levels (case 2), or dominated by the root (case 3). As a special case, when a = b, log_b a = 1 and there are n leaves; the above results simplify to (1) Θ(n), (2) Θ(n lg n), and (3) Θ(f(n)), respectively.
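A prediction of the master method can be sanity-checked by evaluating the recurrence directly. The sketch below uses the special case a = b = 2 with f(n) = n (case 2, predicting Θ(n lg n)); T(1) = 1 is an arbitrary base-case choice.

```python
def T(n, a, b, f):
    """Evaluate T(n) = a*T(n/b) + f(n) with T(1) = 1, for n a power of b."""
    if n <= 1:
        return 1
    return a * T(n // b, a, b, f) + f(n)

# Case 2: a = b = 2, f(n) = n, prediction T(n) = Θ(n lg n).
# The ratio T(n) / (n * lg n) should settle near a constant as n grows;
# here T(2^k) = (k + 1) * 2^k exactly, so the ratio is (k + 1) / k.
r1 = T(2 ** 10, 2, 2, lambda n: n) / (2 ** 10 * 10)   # 11/10
r2 = T(2 ** 16, 2, 2, lambda n: n) / (2 ** 16 * 16)   # 17/16
```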