Asymptotic Notation

Posted by Beetle B. on Fri 25 April 2014

An asymptotically positive function \(f(n)\) is one that is positive for all sufficiently large \(n\). A similar definition holds for asymptotically non-negative functions.

\(\Theta(\cdots)\)

Let \(f(n)\) and \(g(n)\) be two asymptotically positive functions. Then \(\Theta(g(n))\) is the set of all functions \(f(n)\) for which there exist positive constants \(c_{1},c_{2},N\) such that \(c_{1}g(n)\le f(n)\le c_{2}g(n)\) for all \(n\ge N\).

Intuitively, this means that \(f(n)\) grows at about the same rate as \(g(n)\) for large enough \(n\). It is a tight bound.
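As a quick sanity check (not a proof), here is a minimal Python sketch: \(f(n)=3n^{2}+10n\) is \(\Theta(n^{2})\), and the witnesses \(c_{1}=3\), \(c_{2}=4\), \(N=10\) are choices made for this illustration.

```python
# Minimal sanity check (not a proof) of the Theta definition.
# f(n) = 3n^2 + 10n should be Theta(n^2); the witnesses c1 = 3, c2 = 4,
# N = 10 are choices made for this illustration.
def f(n):
    return 3 * n**2 + 10 * n

def g(n):
    return n**2

c1, c2, N = 3, 4, 10

# Check c1*g(n) <= f(n) <= c2*g(n) on a range of n > N.
for n in range(N + 1, 10_000):
    assert c1 * g(n) <= f(n) <= c2 * g(n)
print("bound holds for all tested n")
```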

\(O(\cdots)\)

\(O(g(n))\) is an upper bound. It involves only one inequality: there exist positive constants \(c_{2},N\) such that \(f(n)\le c_{2}g(n)\) for all \(n\ge N\).

If a given algorithm is \(O(g(n))\) in the worst case (for time, memory, or whatever resource you're measuring), then it is \(O(g(n))\) for all inputs. This is not the case for \(\Theta(g(n))\): insertion sort, for example, runs in \(\Theta(n)\) on certain inputs (already-sorted ones) but in \(\Theta(n^{2})\) in the worst case (see the sketch below).

When we say an algorithm is \(O(g(n))\), we do not qualify the statement by input type (e.g. we cannot assume the input is always unsorted, or sorted in a certain way, etc.). It means the algorithm is \(O(g(n))\) for every input. This serves as a guarantee.
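To make the insertion sort example concrete, here is a small experiment. Counting element comparisons as the cost measure is an assumption made for this sketch.

```python
# Count element comparisons made by insertion sort (the cost measure
# assumed for this sketch).
def insertion_sort_comparisons(items):
    a = list(items)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1       # compare key against a[j]
            if a[j] <= key:
                break
            a[j + 1] = a[j]        # shift the larger element right
            j -= 1
        a[j + 1] = key
    return comparisons

n = 1000
print(insertion_sort_comparisons(range(n)))         # sorted input: n - 1
print(insertion_sort_comparisons(range(n, 0, -1)))  # reversed input: n(n-1)/2
```

The sorted input costs \(n-1\) comparisons (\(\Theta(n)\)), the reversed input \(n(n-1)/2\) (\(\Theta(n^{2})\)), yet both fall within the \(O(n^{2})\) worst-case guarantee.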

\(\Omega(\cdots)\)

\(\Omega(g(n))\) is a lower bound. It involves only one inequality: there exist positive constants \(c_{1},N\) such that \(c_{1}g(n)\le f(n)\) for all \(n\ge N\).

The statement about the worst case in the section above has an analogue here for the best case: if an algorithm is \(\Omega(g(n))\) for its best-case input, then it is \(\Omega(g(n))\) for all inputs.

As with \(O(g(n))\), when we say an algorithm is \(\Omega(g(n))\), we do not qualify the statement by input type (e.g. we cannot assume the input is always unsorted, or sorted in a certain way, etc.). It means the algorithm is \(\Omega(g(n))\) for every input.

This serves as a goal that we strive to meet for all inputs.
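In the same spirit as the \(\Theta\) check earlier, a minimal sketch: \(f(n)=n^{2}/2-3n\) is \(\Omega(n^{2})\), and the witnesses \(c_{1}=1/4\), \(N=12\) are choices made for this illustration.

```python
# Minimal check of the Omega definition, in the same spirit as the
# Theta example: f(n) = n^2/2 - 3n should be Omega(n^2); c1 = 0.25 and
# N = 12 are choices made for this illustration.
def f(n):
    return n * n / 2 - 3 * n

def g(n):
    return n * n

c1, N = 0.25, 12
assert all(c1 * g(n) <= f(n) for n in range(N + 1, 10_000))
print("c1*g(n) <= f(n) holds for all tested n")
```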

\(o(\cdots)\)

\(o(g(n))\) is a strict upper bound (like \(<\)). It also differs from \(O(g(n))\) in another way: the inequality has to hold for every choice of the constant \(c\), not just for some particular one.

\(o(g(n))\) is the set of all functions \(f(n)\) such that for every \(c>0\) there exists a positive number \(N\) (which may depend on \(c\)) with \(f(n)<cg(n)\) for all \(n>N\).

In fact, \(f(n)\in o(g(n))\) if and only if:

\begin{equation*} \lim_{n\to\infty}\frac{f(n)}{g(n)}=0 \end{equation*}
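A quick numeric look at this limit (again, not a proof): with \(f(n)=n\log n\) and \(g(n)=n^{2}\), a pair chosen for this illustration, the ratio shrinks toward zero, consistent with \(n\log n\in o(n^{2})\).

```python
# Numeric look at the limit f(n)/g(n) -> 0 for f(n) = n*log(n) and
# g(n) = n^2, consistent with n*log(n) being o(n^2).
from math import log

for n in (10, 100, 1_000, 10_000, 100_000):
    print(n, (n * log(n)) / n**2)
```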

Tackling the Complexity of an Algorithm

Suppress Details

As a first approach, suppress the details of the algorithm's complexity: just get within a constant factor (i.e. don't worry about lower-order terms, coefficients, etc.).

Suppress Variability in Input

Assume the worst-case input. This eliminates variability across inputs.

Putting these two together gives a worst-case complexity. This serves as a guarantee for all inputs.

Then you can refine further. Ultimately you'll want the average-case performance.
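As a toy illustration of the two steps, consider linear search, with counting probes as the assumed cost measure: the worst case (target absent) is exactly \(n\) probes, which is the guarantee, and averaging over uniformly random target positions refines this to about \(n/2\).

```python
import random

# Count probes made by a linear search (the cost measure assumed here).
def probes(haystack, target):
    count = 0
    for x in haystack:
        count += 1
        if x == target:
            break
    return count

n = 1000
data = list(range(n))

# Worst case: target absent, exactly n probes. This is the guarantee.
print(probes(data, -1))

# Refinement: average over uniformly random target positions, ~ (n+1)/2.
trials = 10_000
avg = sum(probes(data, random.randrange(n)) for _ in range(trials)) / trials
print(avg)
```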

Some Useful “Identities”

  • \(\max\{f(n),g(n)\}=\Theta(f(n)+g(n))\)
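This holds for asymptotically non-negative \(f(n)\) and \(g(n)\): for large enough \(n\),

\begin{equation*} \frac{1}{2}\left(f(n)+g(n)\right)\le\max\{f(n),g(n)\}\le f(n)+g(n) \end{equation*}

so \(c_{1}=\frac{1}{2}\) and \(c_{2}=1\) witness the \(\Theta\) bound.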