
Are there any problems where there is no smallest time complexity, so there is a big-$\mathcal{O}$ but no big-$\Theta$?

More formally, I am looking for any problem $p$ where

  • for any algorithm that solves $p$ in time $f(n)$
  • there provably exists another algorithm that solves $p$ in time $g(n)$
  • where $g(n) = o(f(n))$.

An example would be a problem for which there exist algorithms of complexity $\mathcal{O}(n^k)$ for all $k > 0,ドル but for which there is no algorithm of lower complexity, like $\mathcal{O}(1)$ or $\mathcal{O}(\log n)$.

asked Dec 8, 2017 at 0:52
  • There is no $<$ relation between $\mathcal{O}(f)$ and $\mathcal{O}(g)$; you can use little-$o$ notation and say $g(n) = o(f(n))$. One example of what you describe in your last statement is finding an element in a sorted array. I think this post answers the question in the title. Commented Dec 8, 2017 at 12:20
  • @Ariel Thanks, I've never heard of Blum's Speedup Theorem before. I fixed the niggles in the post as well. Commented Dec 8, 2017 at 12:36
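The sorted-array example from the first comment can be made concrete. Binary search runs in $\mathcal{O}(\log n)$, which is in $\mathcal{O}(n^k)$ for every $k > 0$ yet is not $\mathcal{O}(1)$; a minimal sketch using Python's standard `bisect` module (the function name `find` is illustrative, not from the post):

```python
import bisect

def find(sorted_arr, x):
    """Return the index of x in sorted_arr, or -1 if absent.

    Binary search takes O(log n) comparisons, which is O(n^k)
    for every k > 0, but is not O(1).
    """
    i = bisect.bisect_left(sorted_arr, x)
    if i < len(sorted_arr) and sorted_arr[i] == x:
        return i
    return -1
```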

2 Answers


The existence of such problems follows from the Blum speedup theorem. The theorem shows, for example, that there exists a problem such that for any algorithm solving it in time $T(n),ドル there is another one solving it in time at most $\log T(n)$.
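One common formulation of the theorem, in the notation of Blum complexity measures (this statement and its notation are a sketch supplied here, not part of the original answer):

```latex
% Blum speedup theorem (informal): for every total computable speedup
% function r(n, t), there is a computable 0/1-valued function f such that
% every program computing f can be sped up by r almost everywhere:
\forall i \,\bigl(\varphi_i = f\bigr)\ \exists j \,\bigl(\varphi_j = f\bigr):
\quad r\bigl(n, \Phi_j(n)\bigr) \le \Phi_i(n) \ \text{ for almost all } n,
```

where $\Phi_i(n)$ is the running time of program $i$ on input $n$. Taking $r(n,t) = 2^t$ yields the example above: any time bound $T(n)$ can be improved to roughly $\log T(n)$.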

answered Dec 8, 2017 at 12:31
$\endgroup$
2
  • Thanks. Are there any good examples? I don't have time to read the proof right now. Commented Dec 8, 2017 at 12:37
  • Not really. The examples constructed by the theorem are very contrived. Commented Dec 8, 2017 at 12:40
1 Answer score

$\begingroup$

You might be interested in something similar to what you are describing: a hierarchy of algorithms whose runtimes get closer and closer to some limiting runtime. Polynomial-time approximation schemes (PTAS) have runtimes such as $O(2^{1/\epsilon} n^2)$ or $O(n^{1/\epsilon})$ for any $\epsilon > 0$, and you are free to pick your $\epsilon$ as a trade-off between the runtime and the accuracy of the approximate solution.

See http://www.win.tue.nl/~mdberg/Onderwijs/AdvAlg_Material/Course%20Notes/lecture7.pdf for an introduction to PTAS and an example of an $O(n^3/\epsilon)$ scheme for Knapsack that returns a solution of value at least $(1-\epsilon)\cdot OPT$.
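The standard Knapsack scheme works by scaling profits down so the dynamic-programming table stays polynomial in $n$ and $1/\epsilon$. A minimal Python sketch of that profit-scaling idea (function and variable names are my own, not from the linked notes):

```python
def knapsack_fptas(values, weights, capacity, eps):
    """(1 - eps)-approximation for 0/1 knapsack via profit scaling.

    Assumes eps > 0 and positive values. The DP has O(n * sum(scaled))
    entries, i.e. O(n^3 / eps) work overall.
    """
    n = len(values)
    K = eps * max(values) / n            # scaling factor
    scaled = [int(v // K) for v in values]
    max_t = sum(scaled)
    INF = float("inf")
    # best[t] = (minimum weight achieving scaled profit t,
    #            true profit of that item set)
    best = [(0, 0)] + [(INF, 0)] * max_t
    for v, sv, w in zip(values, scaled, weights):
        for t in range(max_t, sv - 1, -1):   # iterate downward: 0/1 items
            pw, pv = best[t - sv]
            if pw + w < best[t][0]:
                best[t] = (pw + w, pv + v)
    # Return the best true profit among feasible entries.
    return max(pv for pw, pv in best if pw <= capacity)
```

For instance, `knapsack_fptas([60, 100, 120], [10, 20, 30], 50, 0.5)` finds the optimum 220 here; in general the guarantee is only that the result is at least $(1-\epsilon)$ times the optimum.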

answered Dec 8, 2017 at 20:40