Considering the recent question "Is there an algorithm whose time complexity is between polynomial time and exponential time?", some commenters observed that merely exhibiting a function whose growth is strictly between polynomial and exponential answers the question, because it is easy to construct an algorithm with that complexity. For example, if you choose $f(n) = 2^{\sqrt{n}}$, then the following algorithm has complexity $O(f(n))$:
- Compute $x = 2^{\sqrt{n}}$.
- Increment an integer variable until you reach $x$.
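The two steps above can be sketched as follows (a minimal illustration; the exponent is floored so that $x$ stays an integer, and the cost of big-integer arithmetic is ignored):

```python
import math

def wait_f(n):
    # Step 1: compute x = 2^floor(sqrt(n)); math.isqrt gives the integer square root
    x = 2 ** math.isqrt(n)
    # Step 2: increment a counter until it reaches x, burning Theta(x) steps
    counter = 0
    while counter < x:
        counter += 1
    return counter
```

The whole point is that step 2 dominates: the loop runs for roughly $f(n)$ iterations, so the total running time is $\Theta(f(n))$ as long as step 1 is cheap by comparison.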
Sounds easy - but for many functions step 1 takes longer than step 2. For example, let $f(n)$ be the value of the Ackermann function $A(n, n)$ mod 37. Assuming that computing this requires actually evaluating $A(n, n)$ (and there is not some shortcut theorem like "$A(n, n)$ is always divisible by 37ドル$"), the procedure above will not produce an algorithm with the required complexity.
Of course, since this $f$ is $O(1)$, it's easy to construct an algorithm with the required complexity. But it's conceivable that there is some slow-growing hard-to-compute function which does not have the same "cheat".
My intuition, however, is that there is a cleverer technique that will always work. Is there?
- (It should not be too hard to construct a procedure to fit. An *algorithm* is defined as a solution to a problem. Non-monotone functions may be less useful.) – greybeard, Dec 30, 2021
- en.wikipedia.org/wiki/Constructible_function, math.stackexchange.com/q/51082/14578 – D.W. ♦, Dec 30, 2021