
Assume I have a list of functions, for example

$\qquad n^{\log \log(n)}, 2^n, n!, n^3, n \ln n, \dots$

How do I sort them asymptotically, i.e. after the relation defined by

$\qquad f \leq_O g \iff f \in O(g),ドル

assuming they are indeed pairwise comparable (see also here)? Using the definition of $O$ seems awkward, and it is often hard to prove the existence of suitable constants $c$ and $n_0$.

This is about measures of complexity, so we're interested in asymptotic behavior as $n \to +\infty,ドル and we assume that all the functions take only non-negative values ($\forall n, f(n) \ge 0$).

asked Mar 27, 2012 at 15:47
  • Since the OP never came back, I'm removing the localised stuff and making a reference question out of this. Commented Feb 17, 2013 at 9:40

6 Answers


If you want a rigorous proof, the following lemma is often more convenient to work with than the definitions.

If $c = \lim_{n\to\infty} \frac{f(n)}{g(n)}$ exists, then

  • $c=0 \qquad \ ,円\iff f \in o(g)$,
  • $c \in (0,\infty) \iff f \in \Theta(g)$ and
  • $c=\infty \quad \ \ \ \iff f \in \omega(g)$.

With this, you should be able to order most of the functions coming up in algorithm analysis1. As an exercise, prove it!
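As a quick plausibility check before attempting a proof, one can tabulate $\log f(n) - \log g(n)$ for growing $n$: if the difference diverges to $-\infty,ドル that suggests $f \in o(g)$. Here is a heuristic sketch in Python, using the question's $n^{\log\log n}$ and 2ドル^n$ as examples; it produces evidence, not a proof.

```python
import math

# Heuristic (not a proof): compare log f(n) - log g(n) for growing n.
# Divergence to -infinity suggests f in o(g); to +infinity, f in omega(g);
# convergence to a constant suggests f in Theta(g).

log_f = lambda n: math.log(math.log(n)) * math.log(n)  # log of n^{log log n}
log_g = lambda n: n * math.log(2)                      # log of 2^n

for n in [10, 100, 1000]:
    print(n, log_f(n) - log_g(n))  # decreases without bound
```

Working with logarithms sidesteps overflow when the functions themselves grow astronomically.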

Of course you have to be able to calculate the limits accordingly. Some useful tricks to break complicated functions down to basic ones are:

  • Express both functions as $e^{\dots}$ and compare the exponents; if their ratio tends to 0ドル$ or $\infty$, so does the original quotient.

  • More generally: if you have a convex, continuously differentiable and strictly increasing function $h$ so that you can re-write your quotient as

    $\qquad \displaystyle \frac{f(n)}{g(n)} = \frac{h(f^*(n))}{h(g^*(n))}$,

    with $g^* \in \Omega(1)$ and

    $\qquad \displaystyle \lim_{n \to \infty} \frac{f^*(n)}{g^*(n)} = \infty$,

    then

    $\qquad \displaystyle \lim_{n \to \infty} \frac{f(n)}{g(n)} = \infty$.

    See here for a rigorous proof of this rule (in German).

  • Consider continuations of your functions over the reals. You can now use L'Hôpital's rule; be mindful of its conditions2!

  • Have a look at the discrete equivalent, Stolz–Cesàro.

  • When factorials pop up, use Stirling's formula:

    $\qquad \displaystyle n! \sim \sqrt{2 \pi n} \left(\frac{n}{e}\right)^n$
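    A quick numeric sanity check of the formula (again, no substitute for a proof): the difference between $\log n!$ and the logarithm of Stirling's approximation shrinks towards 0ドル,ドル at a rate of roughly 1ドル/(12n)$.

    ```python
    import math

    # Difference between log(n!) and the log of Stirling's approximation.
    def stirling_log_error(n):
        exact = math.lgamma(n + 1)  # lgamma(n+1) = log(n!)
        approx = 0.5 * math.log(2 * math.pi * n) + n * (math.log(n) - 1)
        return exact - approx

    for n in [10, 100, 1000]:
        print(n, stirling_log_error(n))  # roughly 1/(12 n), tending to 0
    ```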

It is also useful to keep a pool of basic relations you prove once and use often, such as:

  • logarithms grow slower than polynomials, i.e.

    $\qquad\displaystyle (\log n)^\alpha \in o(n^\beta)$ for all $\alpha, \beta > 0$.

  • order of polynomials:

    $\qquad\displaystyle n^\alpha \in o(n^\beta)$ for all $\alpha < \beta$.

  • polynomials grow slower than exponentials:

    $\qquad\displaystyle n^\alpha \in o(c^n)$ for all $\alpha$ and $c > 1$.
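For intuition on the first relation (the parameters $\alpha = 3,ドル $\beta = 1/2$ are chosen arbitrarily here): the ratio does tend to 0ドル,ドル but the convergence can be quite slow, which is exactly why evaluating at small inputs is misleading.

```python
import math

# (log n)^alpha / n^beta tends to 0 for any alpha, beta > 0,
# but convergence can be slow; here alpha = 3, beta = 0.5.
alpha, beta = 3.0, 0.5
for n in [10**2, 10**4, 10**8, 10**16]:
    print(n, math.log(n) ** alpha / n ** beta)  # eventually tends to 0
```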


It can happen that the above lemma is not applicable because the limit does not exist (e.g. when functions oscillate). In this case, consider the following characterisation of Landau classes using the limit superior/inferior:

With $c_s := \limsup_{n \to \infty} \frac{f(n)}{g(n)}$ we have

  • 0ドル \leq c_s < \infty \iff f \in O(g)$ and
  • $c_s = 0 \iff f \in o(g)$.

With $c_i := \liminf_{n \to \infty} \frac{f(n)}{g(n)}$ we have

  • 0ドル < c_i \leq \infty \iff f \in \Omega(g)$ and
  • $c_i = \infty \iff f \in \omega(g)$.

Furthermore,

  • 0ドル < c_i,c_s < \infty \iff f \in \Theta(g) \iff g \in \Theta(f)$ and
  • $ c_i = c_s = 1 \iff f \sim g$.

Check here and here if you are confused by my notation.
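As a concrete example of an oscillating quotient (constructed here for illustration): for $f(n) = (2 + (-1)^n)\,n$ and $g(n) = n,ドル the plain limit of $f/g$ does not exist, but $\limsup = 3$ and $\liminf = 1,ドル so the characterisation above still yields $f \in \Theta(g)$.

```python
# f(n)/g(n) = 2 + (-1)^n oscillates between 1 and 3: the plain limit
# does not exist, but limsup = 3 and liminf = 1, hence f in Theta(g).
ratios = [2 + (-1) ** n for n in range(1, 21)]
print(max(ratios), min(ratios))  # 3 1
```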


1 Nota bene: My colleague wrote a Mathematica function that does this successfully for many functions, so the lemma really reduces the task to mechanical computation.

2 See also here.

answered Mar 27, 2012 at 17:12
  • @Juho Not publicly, afaik, but it's elementary to write yourself; compute Limit[f[n]/g[n], n -> Infinity] and perform a case distinction. Commented Mar 11, 2015 at 7:44

Another tip: sometimes applying a monotone function (like log or exp) to the functions makes things clearer.
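A minimal numeric illustration of the pitfall raised in the comments: applying $\log$ can collapse distinct Landau classes, so conclusions drawn after the transformation need extra care.

```python
# log(2^(2n)) = 2n and log(2^n) = n are Theta of each other,
# yet 2^(2n) is NOT in O(2^n): the ratio equals 2^n and diverges.
for n in [1, 10, 20]:
    print(n, 2 ** (2 * n) // 2 ** n)  # ratio 2^n grows without bound
```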

answered Mar 27, 2012 at 15:54
  • This should be done carefully: 2ドルn\in O(n),ドル but 2ドル^{2n}\notin O(2^n)$. Commented Feb 17, 2013 at 11:35
  • Seconded. The "apply monotone function" thing seems to be some kind of folklore which does not work in general. We have been working on sufficient criteria and have come up with what I posted in the latest revision of my answer. Commented Nov 8, 2013 at 17:15

Skiena provides a sorted list of the dominance relations between the most common functions in his book, The Algorithm Design Manual:

$$n!\gg c^n \gg n^3 \gg n^2 \gg n^{1+\epsilon} \gg n \lg n \gg n \gg n^{1/2}$$ $$ \gg \lg^2n \gg \lg n \gg \frac{\lg n}{\lg\lg n} \gg \lg\lg n \gg \alpha(n) \gg 1$$

Here $\alpha(n)$ denotes the inverse Ackermann function.

siracusa
answered Mar 15, 2013 at 7:31
  • That's an oddly specific list. Many of the relations (whatever $\gg$ means exactly) can be summarised to a handful of more general lemmata. Commented Nov 8, 2013 at 17:16
  • It's his notation for a dominance relation. Commented Nov 10, 2013 at 10:14

Tip: draw graphs of these functions using something like Wolfram Alpha to get a feeling for how they grow. Note that this is not very precise, but if you try it for sufficiently large numbers, you should see the comparative patterns of growth. This of course is no substitute for a proof.

E.g., try: plot log(log(n)) from 1 to 10000 to see an individual graph or plot log(log(n)) and plot log(n) from 1 to 10000 to see a comparison.

answered Mar 27, 2012 at 15:51
  • Should we really recommend voodoo? Commented Mar 27, 2012 at 17:13
  • +1 for suggesting to draw graphs of the functions, although the linked graphs are rather confusing unless you know what they mean. Commented Mar 27, 2012 at 17:35
  • Take a graph as a hint for what you might want to prove. That hint may be wrong, of course. Commented May 21, 2017 at 20:44

I suggest proceeding according to the definitions of various notations. Start with some arbitrary pair of expressions, and determine the order of those, as outlined below. Then, for each additional element, find its position in the sorted list using binary search and comparing as below. So, for example, let's sort $n^{\log\log n}$ and 2ドル^n,ドル the first two functions of n, to get the list started.

We use the property that $n = 2^{\log n}$ to rewrite the first expression as $n^{\log\log n} = (2^{\log n})^{\log\log n} = 2^{\log n\log\log n}$. We could then proceed to use the definition to show that $n^{\log\log n} = 2^{\log n\log\log n} \in o(2^n),ドル since for any constant $c > 0,ドル there is an $n_0$ such that for $n \geq n_0,ドル $c(n^{\log\log n}) = c(2^{\log n\log\log n}) < 2^n$.

Next, we try 3ドル^n$. We compare it to 2ドル^n,ドル the largest element we have placed so far. Since 3ドル^n = (2^{\log 3})^n = 2^{n\log3},ドル we similarly show that 2ドル^n \in o(3^n) = o(2^{n \log 3})$.

Etc.
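The exponent comparisons above can be sketched numerically (a sanity check only; the logarithms here are base 2ドル$):

```python
import math

# After rewriting both functions as powers of 2, compare the exponents:
# log(n) * log(log(n)) for n^{log log n} versus n for 2^n.
log2 = lambda x: math.log(x, 2)
for n in [16, 256, 65536]:
    print(n, log2(n) * log2(log2(n)), "vs", n)  # first exponent grows far slower
```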

answered Mar 27, 2012 at 18:02

Here is a list from Wikipedia; the lower a row is in the table, the larger the complexity class: $$ \begin{array}{|l|l|} \hline \text{Name} & \text{Running Time} \\ \hline \text{Constant time} & \mathcal{O}(1) \\ \text{Inverse Ackermann time} & \mathcal{O}(\alpha(n)) \\ \text{Iterated logarithmic time} & \mathcal{O}(\log^* n) \\ \text{Log-logarithmic} & \mathcal{O}(\log \log n) \\ \text{Logarithmic time} & \mathcal{O}(\log n) \\ \text{Polylogarithmic time} & poly(\log n) \\ \text{Fractional power} & \mathcal{O}(n^c), \text{ where } 0<c<1 \\ \text{Linear time} & \mathcal{O}(n) \\ \text{"n log star n" time} & \mathcal{O}(n \log^* n) \\ \text{Quasilinear time} & \mathcal{O}(n \log n) \\ \text{Quadratic time} & \mathcal{O}(n^2) \\ \text{Cubic time} & \mathcal{O}(n^3) \\ \text{Polynomial time} & poly(n) = 2^{\mathcal{O}(\log n)} \\ \text{Quasi-polynomial time} & 2^{\mathcal{O}(poly(\log n))} \\ \text{Sub-exponential time (first definition)} & \mathcal{O}(2^{n^{\epsilon}}), \epsilon > 0 \\ \text{Sub-exponential time (second definition)} & 2^{o(n)} \\ \text{Exponential time (with linear exponent)} & 2^{\mathcal{O}(n)} \\ \text{Exponential time} & 2^{poly(n)} \\ \text{Factorial time} & \mathcal{O}(n!) \\ \hline \end{array} $$

note : $poly(x) = x^{\mathcal{O}(1)}$

answered Oct 6, 2018 at 15:18
  • Also, interesting how the table suggests that 2ドル^{n \log n} \in o(n!)$. While the table you link to is somewhat accurate, the one linked there (and which you copied) is about complexity classes, which is not a helpful thing to mix in here. Landau notation is not about "time". Commented Oct 19, 2018 at 6:13
  • I put this so the names of the complexity classes can be referred to directly here. Yes, Landau is more about a specific type of algorithm in Cryptography. Commented Oct 19, 2018 at 6:27
  • I object to some of @Raphael's views. I have been a mathematician and an instructor for many years. I believe, apart from proving those things, a big table like this increases people's intuition easily and greatly. And the names of the asymptotic classes help people remember and communicate a lot. Commented Oct 19, 2018 at 6:40
  • @Apass.Jack In my teaching experience, when given a table many students will learn it by heart and fail to order wrt any function not in the table. Note how that effect seems to account for many of the questions regarding asymptotic growth that land on this site. That said, of course we'll use lemmata implied by the table if it makes proofs easier, but that comes after learning how to prove the table. (To emphasize that point, people who come here don't need help reading stuff off a table. They need help proving relations.) Commented Oct 19, 2018 at 8:17
  • @kelalaka "Yes, Landau is more about a specific type of algorithm in Cryptography." -- that doesn't even make sense. Landau notation is a shorthand to describe properties of mathematical functions. It has nothing to do with algorithms, let alone cryptography, per se. Commented Oct 19, 2018 at 8:18
