
LeetCode has an Integer Replacement problem defined as follows:

Given a positive integer $n$, you can apply one of the following operations:

  1. If $n$ is even, replace $n$ with $n / 2$.
  2. If $n$ is odd, replace $n$ with either $n + 1$ or $n - 1$.

Return the minimum number of operations needed for $n$ to become 1ドル$.

I would like to determine the time and space complexity of a naive tree-recursive algorithm that uses memoization (caching to avoid recalculation). Here is the Python implementation:

class Solution:
    def __init__(self):
        self.results = {1: 0}  # base case: 1 requires 0 operations
    def integerReplacement(self, n: int) -> int:
        if n not in self.results:
            if n % 2:  # n odd
                self.results[n] = 1 + min(
                    self.integerReplacement(n + 1),
                    self.integerReplacement(n - 1)
                )
            else:  # n even
                self.results[n] = 1 + self.integerReplacement(n // 2)  # integer division keeps the memo keys as ints
        return self.results[n]
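
For reference, a couple of small sanity checks on this implementation (hypothetical usage, not part of the original code):

print(Solution().integerReplacement(8))  # 8 -> 4 -> 2 -> 1, so 3
print(Solution().integerReplacement(7))  # 7 -> 8 -> 4 -> 2 -> 1 (or 7 -> 6 -> 3 -> 2 -> 1), so 4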

If this were tree recursive in all cases, the running time would be $O(2^n)$ because the branching factor is two; however, this is tree recursive only half the time. What is the time complexity of this algorithm? I am also interested in the space complexity, which I know would depend on the greater of the maximum stack depth and hash table size.

asked May 27, 2024 at 2:48

2 Answers


I figured out the answer to my question. The time complexity is $O(\log n)$: in the worst case $n$ is halved every two calls, and memoization ensures each intermediate value is computed only once, so the number of distinct subproblems is $O(\log n)$. For more detail, see my solution.
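
As a quick empirical check, here is a minimal sketch (assuming the Solution class from the question is available in the same module; not part of the original answer) that runs the memoized solution for increasing $n$ and prints how many distinct subproblems end up in the cache. The cache size grows on the order of $\log_2 n$ rather than linearly in $n$:

import math

for n in (10, 1_000, 1_000_000, 1_000_000_000):
    s = Solution()                     # fresh memo table for each n
    s.integerReplacement(n)
    # cache size vs. 2 * log2(n) for comparison
    print(n, len(s.results), round(2 * math.log2(n), 1))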

answered Jun 16, 2024 at 15:27

Here, you can get away with greedily branching in only one direction. Thus, you can even convert the recursion into a simple while loop, which takes $O(\log n)$ steps. Here is the code:

#include <limits.h> // for INT_MAX

int integerReplacement(int n) {
    int count = 0;
    if (n == INT_MAX) { // n + 1 would overflow; INT_MAX needs 32 operations
        return 32;
    }
    while (n > 1) {
        count++;
        if ((n & 1) == 0) {              // n is even
            n = n >> 1;                  // n = n / 2
        } else {                         // n is odd
            if ((n & 2) == 0 || n < 4) { // second-to-last bit is zero, or n is small (n == 3)
                n--;                     // n = n - 1
            } else {
                n++;                     // n = n + 1
            }
        }
    }
    return count;
}

If $n$ is odd, then adding or subtracting 1ドル$ makes it even, and the result will then be halved. If we can anticipate which of the two choices produces a multiple of 4ドル$, we would rather go in that direction.

Now if $n \bmod 4 = 1$, then subtracting helps, whereas if $n \bmod 4 = 3$, then adding helps. By 'helps' I mean that you can then perform (at least) two successive halvings: for example, 9ドル - 1 = 8$ can be halved twice, and so can 7ドル + 1 = 8$. Thus we have an optimal greedy strategy at hand (the one exception is $n = 3$, where subtracting is better, which the code handles with the $n < 4$ check).

For added efficiency, I have converted the $\bmod$ operations into equivalent bitwise operations.
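
As a quick illustration of this equivalence, here is a small Python check (a sketch, separate from the C code above): for odd $n$, the bitwise test (n & 2) == 0 holds exactly when $n \bmod 4 = 1$, i.e. exactly when subtracting 1ドル$ leaves a multiple of 4ドル$:

# For odd n, bit 1 being zero means n = 4k + 1 for some k.
for n in range(1, 1000, 2):  # odd n only
    assert ((n & 2) == 0) == (n % 4 == 1)
print("bitwise test matches n % 4 == 1 for every odd n checked")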


Now let us analyze your algorithm. Here we have $$T(n) = \begin{cases} T(n+1) + T(n-1) + c, & \text{when $n$ is odd}\\ T(n/2) + c, & \text{when $n$ is even} \end{cases}$$

Combining these two cases (an odd step followed by an even step), we have $$T(n) \le T\!\left(\frac{n+1}{2}\right) + T\!\left(\frac{n-1}{2}\right) + 3c,$$ and thus $T(n) = O(n)$.
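
To see why this recurrence gives $O(n)$ (a standard unrolling, bounding both terms by $T\!\left(\frac{n}{2}\right)$ up to lower-order adjustments):

$$T(n) \le 2\,T\!\left(\frac{n}{2}\right) + 3c \le 4\,T\!\left(\frac{n}{4}\right) + 3c\,(1 + 2) \le \cdots \le n\,T(1) + 3c\sum_{i=0}^{\log_2 n - 1} 2^i = O(n).$$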

answered May 27, 2024 at 8:04
  • You can see my solution there: leetcode.com/problems/integer-replacement/solutions/5215070/… Commented May 27, 2024 at 8:05
  • Thank you very much for your time and explanation, but what I'm looking for is a complexity analysis of my code. I will edit my question to make that clearer. Commented May 27, 2024 at 15:20
