
We have a function that takes an array as input. It breaks the array into $\log_2(n)$ equal-sized parts, where $n$ is the size of the subarray being split. It keeps breaking each of the resulting subarrays in the same way until a subarray has only two elements left. What is the depth of this recursion?

Example of the process:

First we have $n$ elements and break them into $\log_2(n)$ equal-sized parts. Each of these parts has $\frac{n}{\log_2(n)}$ elements. At the next level of recursion we break each subarray into $\log_2\!\left(\frac{n}{\log_2(n)}\right)$ parts, again of equal size; each of these now has $\frac{n/\log_2(n)}{\log_2\!\left(n/\log_2(n)\right)}$ elements. We keep breaking the array in this manner until we reach a subarray with only two elements.
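
For reference, here is a minimal Python sketch of this process (an editor's illustration, not part of the original question). It assumes the array is split into $\lceil\log_2 n\rceil$ parts and that part sizes are rounded up; the question leaves these rounding details open.

```python
import math

def split_depth(n):
    """Depth of the recursion that repeatedly replaces a subarray of size n
    by pieces of size roughly n / log2(n), stopping once a piece has <= 2 elements.
    The ceil() rounding is an assumption; the question does not pin it down."""
    depth = 0
    while n > 2:
        parts = math.ceil(math.log2(n))  # number of pieces at this level
        n = math.ceil(n / parts)         # size of each piece, rounded up
        depth += 1
    return depth

for n in (10**3, 10**6, 10**9, 10**12):
    print(n, split_depth(n), round(math.log2(math.log2(n)), 1), round(math.log2(n), 1))
```

The last two columns print $\log_2(\log_2 n)$ and $\log_2 n$ for comparison; for these inputs the observed depth falls strictly between them.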

asked Jan 28, 2019 at 15:28
  • It's easily calculated, isn't it? Try n = 1,000,000 for example. Do that by hand, then it should be obvious. Commented Jan 28, 2019 at 15:32
  • @gnasher729 it's still not clear to me... Commented Jan 28, 2019 at 17:39
  • What is $\log_2 (1,000,000)$? Then if you split 1,000,000 items into that many equal-size subarrays, how big is each? What if you do it again? Commented Jan 28, 2019 at 19:32
  • I will not split the array into $\log_2 (1,000,000)$ parts every time. In the first step, yes, but then I will split it into $\log_2(\frac {1,000,000}{\log_2 (1,000,000)})$ and so on. Commented Jan 28, 2019 at 19:50
  • You did say "where n is the size of the array", not "where n is the size of the subarray". It makes little difference in practice. Commented Jan 28, 2019 at 23:34

4 Answers


Let $f(n) = n/\log n$, and denote by $g(n)$ the number of applications of $f$ it takes to get $n$ below some arbitrary constant. On the one hand, $f^{(t)}(n) \geq n/\log^t n$, and so
$$ g(n) \geq \log_{\log n} n = \frac{\log n}{\log \log n}. $$

To obtain an upper bound, notice that as long as $f^{(t)}(n) \geq \sqrt{n}$, we have $\log f^{(t)}(n) \geq (\log n)/2$, and so it takes at most $\log_{(\log n)/2} n = O(\log n/\log\log n)$ iterations to reduce $n$ to $\sqrt{n}$. The same argument applies for reducing $\sqrt{n}$ to $\sqrt[4]{n}$ and so on, and therefore
$$ g(n) \ll \frac{\log n}{\log\log n} + \frac{\log \sqrt{n}}{\log\log \sqrt{n}} + \frac{\log \sqrt[4]{n}}{\log\log \sqrt[4]{n}} + \cdots. $$
(Here $\cdot \ll \cdot$ is the same as $\cdot = O(\cdot)$.)

Notice that $\log \sqrt{n} = \frac{1}{2} \log n$. For large $n$, we will have $\log\log \sqrt{n} \geq \frac{2}{3} \log \log n$ (say), and so
$$ \frac{\log \sqrt{n}}{\log\log \sqrt{n}} \leq \frac{3}{4} \frac{\log n}{\log\log n}. $$
Therefore the sum is majorized by a convergent geometric series, and we conclude that
$$ g(n) = O\left(\frac{\log n}{\log\log n}\right). $$

In total, we get that
$$ g(n) = \Theta\left(\frac{\log n}{\log \log n}\right). $$
With more work, we can probably prove a more refined estimate such as
$$ g(n) \sim \frac{\log n}{\log \log n}. $$
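
As a rough numerical sanity check (an editor's sketch, not part of the original answer), one can iterate $f$ directly and compare the count with $\log n/\log\log n$. Here the "arbitrary constant" is taken to be 2ドル$ and all logarithms are base 2ドル$; the function name `g` simply follows the notation above.

```python
import math

def g(n, threshold=2.0):
    """Number of applications of f(x) = x / log2(x) needed to push x below the threshold."""
    x = float(n)
    t = 0
    while x > threshold:
        x /= math.log2(x)
        t += 1
    return t

for k in (10, 20, 40, 80, 160, 320):
    n = 2.0**k
    estimate = k / math.log2(k)  # log n / log log n, with base-2 logarithms
    print(f"n = 2^{k}: g(n) = {g(n)}, log n / log log n = {estimate:.1f}")
```

For inputs of this size the count stays within a small constant factor of the estimate, consistent with the $\Theta$ bound; convergence of the ratio toward 1ドル$ is very slow.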

answered Jan 29, 2019 at 6:26
  • What does $t$ represent? Commented Jan 29, 2019 at 8:46
  • It's the number of times you applied $f$. Commented Jan 29, 2019 at 8:49
  • I can't figure out how you got $g(n) \geq \log_{\log n} n$ from the first inequality. Commented Jan 29, 2019 at 9:06
  • By solving the equation $n/\log^t n = 1$. Here 1ドル$ is my "arbitrary" constant. Commented Jan 29, 2019 at 12:02
  • Now I don't understand how you got $\log_{(\log n)/2} n$ iterations from $\log f^{(t)}(n) \geq (\log n)/2$. Can you please explain it to me? Commented Jan 29, 2019 at 15:58

It seems that you are referring to the iterated logarithm, also known as the $\log^*$ function. It expresses how many times you can take the logarithm of a number before the result becomes less than or equal to 1ドル$.

The $\log^*$ function grows very slowly; it is the inverse of tetration.

See more of that here.
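
For concreteness, here is a small sketch (an editor's illustration, with illustrative helper names) of $\log^*$ next to a simulation of the splitting process from the question, so the two growth rates can be compared directly (see also the comment below):

```python
import math

def log_star(n):
    """Iterated logarithm: how many times log2 must be applied before the value drops to <= 1."""
    x = float(n)
    count = 0
    while x > 1.0:
        x = math.log2(x)
        count += 1
    return count

def splitting_depth(n):
    """Depth of the question's process, modelled as x -> x / log2(x) until x <= 2."""
    x = float(n)
    depth = 0
    while x > 2.0:
        x /= math.log2(x)
        depth += 1
    return depth

for k in (16, 64, 256, 1000):
    n = 2.0**k
    print(f"n = 2^{k}: log*(n) = {log_star(n)}, splitting depth = {splitting_depth(n)}")
```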

answered Jan 28, 2019 at 20:21
  • It's not the iterated logarithm. For example: at the first step I will have $n$ elements and separate them into $\log_2(n)$ parts. In each part I will have $\frac {n}{\log_2(n)}$ elements. That means that at the next level I will not take a logarithm of a logarithm; I will separate the subarray into $\log_2(\frac {n}{\log_2(n)})$ parts, which is definitely quite a bit more than $\log_2(\log_2(n))$. And this pattern goes on. I have actually written the program, and the depth of recursion is always bigger than $\log_2(\log_2(n))$ and always smaller than $\log_2(n)$. Commented Jan 28, 2019 at 20:37

Let $n$ be the size of the array and let $k = \log_2 n$. At the first step, you divide by $k$. As long as the array size is more than $n^{1/2}$, you divide by more than $k/2$. I'd say this is $O(\log n / \log\log n)$.

(Always splitting into $k$ parts takes $\log n / \log k$ splits. You asked for the special case $k = \log n$, therefore $\log n / \log\log n$. Then you need to decide whether the shrinking $k$ makes a difference or not.)
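
A small sketch (an editor's illustration, not from the answer) that counts only the splits made while the size stays above $\sqrt{n}$ and compares them with the bound $\log_{k/2}\sqrt{n} = \frac{k/2}{\log_2(k/2)}$ that the argument above implies for this first phase:

```python
import math

def first_phase_splits(n):
    """Applications of x -> x / log2(x) needed to get from n down to sqrt(n)."""
    x = float(n)
    steps = 0
    while x > math.sqrt(n):
        x /= math.log2(x)
        steps += 1
    return steps

for k in (32, 64, 128, 256):
    n = 2.0**k
    bound = (k / 2) / math.log2(k / 2)  # log base (k/2) of sqrt(n)
    print(f"n = 2^{k}: first-phase splits = {first_phase_splits(n)}, bound = {bound:.1f}")
```

The later phases (from $\sqrt{n}$ down to $n^{1/4}$, and so on) contribute geometrically fewer splits, which is how an $O(\log n / \log\log n)$ total arises.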

answered Jan 28, 2019 at 23:38
  • But the subarray size will be less than $n^\frac {1}{2}$ at some point. And even if that weren't true, I don't understand how you got $O(\frac {\log_2(n)} {\log_2(\log_2(n))})$. Commented Jan 29, 2019 at 6:10
  • Your edited answer provides a good explanation for a constant $k$, but here $k$ isn't constant, so it's still not helpful. Commented Jan 29, 2019 at 15:53

All logarithms below are base 2ドル$; that is, $\log$ means $\log_2(\cdot)$.


Let us name the function given in the question $\operatorname{slog}$. ("slog" comes from splitting with log. It can also imply "smaller than log".) In precise terms, $\operatorname{slog}$ on $\Bbb N$ is defined by the following conditions.

  • the base cases: $\operatorname{slog}(1)=0$ and $\operatorname{slog}(2)=1$;
  • the recurrence relation: $\operatorname{slog}(n)=1+ \operatorname{slog}\left(\left\lceil\dfrac{n}{ \lceil\log n\rceil}\right\rceil\right)$ for $n\gt2$.

We can prove the following proposition on the asymptotic behavior of $\operatorname{slog}(n)$: $$\lim_{n\to\infty}\dfrac{\operatorname{slog}(n)}{\dfrac{\log n}{\log\log n}}=1.$$

Here is the basic idea of the proof: each application of $n\mapsto n/\log n$ decreases the quantity $\dfrac{\log n}{\log\log n}$ by roughly 1ドル$, as the following limit computation shows.

$$\begin{align} \lim_{n\to\infty}\frac{\log n}{\log\log n}-\frac{\log \frac n{\log n}}{\log\log \frac n{\log n}} &=\lim_{m\to\infty}\frac{m}{\log m}-\frac{m-\log m}{\log(m-\log m)}\\ &=\lim_{m\to\infty}\frac{m}{\log m}-\frac{m-\log m}{\log m + \log(1- \frac{\log m}m)}\\ &=\lim_{m\to\infty}\frac{(\log m)^2+ m\log(1- \frac{\log m}m)}{(\log m)(\log m+ \log(1- \frac{\log m}m))}\\ &=\lim_{m\to\infty}\frac{(\log m)^2+ m(-\frac{\log m}m)}{(\log m)(\log m)}\\ &=1 \end{align}$$

Corollary. $\operatorname{slog}(n)=\Theta\left(\dfrac{\log n}{\log\log n}\right)$.

The same conclusion holds if we define $\operatorname{slog}(x)$ for $x>1$ by $\operatorname{slog}(x)=1$ for 1ドル\lt x\le 2$ and $\operatorname{slog}(x)=1+ \operatorname{slog}\left(\dfrac{x}{ \log x}\right)$ for $x>2$.

Exercise. Write the full proof of $\operatorname{slog}(n)\sim \dfrac{\log n}{\log\log n}$.
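
As a numerical sanity check of the proposition (not a proof; the code is an editor's sketch), the recurrence can be implemented directly with integer ceiling division:

```python
import math

def slog(n):
    """slog(1) = 0, slog(2) = 1, and slog(n) = 1 + slog(ceil(n / ceil(log2 n))) for n > 2."""
    if n <= 1:
        return 0
    if n == 2:
        return 1
    k = math.ceil(math.log2(n))  # number of parts at this level
    return 1 + slog(-(-n // k))  # ceiling division: size of each part, rounded up

for e in (10, 20, 40, 80, 160):
    n = 2**e
    print(f"n = 2^{e}: slog(n) = {slog(n)}, (log n)/(log log n) = {e / math.log2(e):.2f}")
```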

answered Jan 29, 2019 at 11:30
  • The asymptotic result here should be folklore or known. However, I have not found any reference yet. Commented Jan 29, 2019 at 15:46
