Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, and Clifford Stein. Introduction to Algorithms, Second Edition. The MIT Press, 2001.
The basics of proving and comparing computational runtimes.
3.1 Asymptotic Notation
Θ-notation:
Big theta is an asymptotically tight bound.
Θ(g(n)) = {f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1g(n) ≤ f(n) ≤ c2g(n) for all n ≥ n0}
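A quick worked example (the constants below are one illustrative choice): f(n) = n^2/2 - 3n is Θ(n^2). Taking c1 = 1/14, c2 = 1/2, and n0 = 7 gives 0 ≤ (1/14)n^2 ≤ n^2/2 - 3n ≤ (1/2)n^2 for all n ≥ 7: the upper bound holds for all n ≥ 0 because -3n ≤ 0, and the lower bound simplifies to 3n ≤ (3/7)n^2, i.e. n ≥ 7.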
O-notation:
Big O is an asymptotic upper bound. It is often used to bound the worst-case running time, but the notation itself only says that a function grows no faster than g(n).
O(g(n)) = {f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n0}
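For example, f(n) = 2n^2 + 3n is O(n^2): with c = 5 and n0 = 1, 0 ≤ 2n^2 + 3n ≤ 2n^2 + 3n^2 = 5n^2 for all n ≥ 1. (These constants are one illustrative choice; any larger c works as well.)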
Ω-notation:
Big omega is an asymptotic lower bound. It is often used to bound the best-case running time, but the notation itself only says that a function grows at least as fast as g(n).
Ω(g(n)) = {f(n) : there exist positive constants c and n0 such that 0 ≤ cg(n) ≤ f(n) for all n ≥ n0}
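Continuing the same example, f(n) = 2n^2 + 3n is also Ω(n^2): with c = 2 and n0 = 1, 0 ≤ 2n^2 ≤ 2n^2 + 3n for all n ≥ 1.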
Theorem 3.1
For any two functions f(n) and g(n), we have f(n)=Θ(g(n)) if and only if f(n)=O(g(n)) and f(n)=Ω(g(n)).
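Applied to the example above: since 2n^2 + 3n = O(n^2) and 2n^2 + 3n = Ω(n^2), Theorem 3.1 gives 2n^2 + 3n = Θ(n^2).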
3.2 Standard Notations and Common Functions
A function f(n) is monotonically increasing if m ≤ n implies f(m) ≤ f(n). Similarly, it is monotonically decreasing if m ≤ n implies f(m) ≥ f(n). A function f(n) is strictly increasing if m < n implies f(m) < f(n) and strictly decreasing if m < n implies f(m) > f(n).
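A minimal Python sketch of these definitions (the function names and sample range are illustrative choices, not from the text). Since the definitions quantify over all m ≤ n, a finite check like this can only refute monotonicity on the sampled points, never prove it in general:

def is_monotonically_increasing(f, ns):
    # f(m) <= f(n) for every m <= n among the sampled points; assumes ns is
    # sorted ascending, so checking consecutive pairs suffices by transitivity.
    values = [f(n) for n in ns]
    return all(a <= b for a, b in zip(values, values[1:]))

def is_strictly_increasing(f, ns):
    # f(m) < f(n) for every m < n among the sampled points (ns sorted ascending).
    values = [f(n) for n in ns]
    return all(a < b for a, b in zip(values, values[1:]))

ns = range(1, 100)
print(is_monotonically_increasing(lambda n: n * n, ns))   # True: n^2 is increasing
print(is_strictly_increasing(lambda n: n // 2, ns))       # False: floor(n/2) repeats values
print(is_monotonically_increasing(lambda n: n // 2, ns))  # True: floor(n/2) never decreases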