I have this algorithm and I want to analyse its time complexity, but I am not sure whether I am correct:
n = int(input("Enter Target Value: "))
x = 1
count = 0
while n != x:
    if n % 2 == 0:
        n /= 2
        count += 1
    else:
        n -= 1
        count += 1
print(count)
For the while loop, n /= 2 will have a time complexity of O(log n) and n -= 1 will be O(n), so O(log n) + O(n) will still be O(log n) in the while loop. The three initializations will be O(1), so the running time complexity of this algorithm will be O(log n). Am I correct? Thanks
1 Answer
The outcome is correct, but the reasoning is not. The n -= 1 statement will not be executed O(n) times, and O(log n) + O(n) is actually O(n), not O(log n).
Imagine n in its binary representation. Then the n -= 1 operation will be executed just as many times as there are 1-bits in that representation. The n /= 2 statement will be executed just as many times as there are bits in the representation, regardless of whether they are 0 or 1. This is because a 1-bit will first be converted to a 0-bit by the n -= 1 operation, and then the next iteration will pick up that same bit (which has become 0) for the n /= 2 operation, which actually drops that bit.
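To make this concrete, here is a short trace of my own (not part of the original answer) that logs how each operation changes the binary representation, using 11 (0b1011) as an arbitrary sample value. It uses integer division (//=) in place of the question's n /= 2, which is equivalent here since n is even at that point:

```python
def trace_ops(n):
    """Return the sequence of (operation, new n) steps reducing n to 1."""
    steps = []
    while n != 1:
        if n % 2 == 0:
            n //= 2          # drops the lowest (zero) bit
            steps.append(("halve", n))
        else:
            n -= 1           # turns the lowest 1-bit into a 0-bit
            steps.append(("decrement", n))
    return steps

for op, value in trace_ops(11):  # 11 is 0b1011
    print(f"{op:9} -> {value:4} ({value:b})")
```

Each odd value is decremented (clearing its lowest 1-bit) and then the following halvings strip trailing zero bits, exactly as described above.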
So in the worst case, all the significant bits of n are 1-bits. Then you have O(log n) executions of the n -= 1 operation and O(log n) executions of n /= 2. In total the loop makes 2·O(log n) = O(log n) iterations, which gives this algorithm an O(log n) time complexity.
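If it helps, the bit-counting argument can be checked empirically. The sketch below is my own addition (not from the answer): it counts the loop iterations and compares them against a closed-form count derived from the explanation above — one halving per bit except the last, plus one decrement per 1-bit except the final 1 that terminates the loop:

```python
def iteration_count(n):
    """Count the loop iterations of the algorithm from the question."""
    count = 0
    while n != 1:
        if n % 2 == 0:
            n //= 2      # integer division; equivalent to n /= 2 for even n
        else:
            n -= 1
        count += 1
    return count

for n in range(2, 1000):
    # (bit_length - 1) halvings + (popcount - 1) decrements
    expected = (n.bit_length() - 1) + (bin(n).count("1") - 1)
    assert iteration_count(n) == expected
print("formula verified for n in [2, 1000)")
```

Both terms are at most the bit length of n, which is about log2(n), so the total is O(log n) as stated.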